A more reliable reduction algorithm for behavioral model extraction
Dmitry Vasilyev, Jacob White
Massachusetts Institute of Technology
Outline
• Background: projection framework for model reduction; the balanced truncation algorithm and its approximations; the AISIAD algorithm
• Description of the proposed algorithm
• Modified AISIAD and a low-rank square root algorithm
• Efficiency and accuracy
• Conclusions
Model reduction problem
• Reduction should be automatic
• Must preserve input-output properties
Many (> 10^4) internal states, with inputs and outputs → few (< 100) internal states, with the same inputs and outputs
Differential Equation Model
E dx/dt = A x + B u,  y = C x + D u
where A is stable, n×n (large); E is SPD, n×n; x is the state; u is the vector of inputs; y is the vector of outputs.
The model can represent:
• Finite-difference spatial discretizations of PDEs
• Circuits with linear elements
Model reduction problem
n is large (thousands)! The reduced order q is small (tens).
The reduction needs to be automatic and to preserve the input-output properties (transfer function).
Approximation error
Wide-band applications: the model should have a small worst-case error, i.e., the maximal difference between the original and reduced transfer functions over all frequencies ω:
||H − H_r||_∞ = max_ω ||H(jω) − H_r(jω)||
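As a concrete illustration, this worst-case error can be estimated on a frequency grid. The sketch below uses a toy diagonal system and a one-state truncation of it; all matrices and the helper name are illustrative, not from the original method:

```python
import numpy as np

def worst_case_error(A, B, C, Ar, Br, Cr, freqs):
    """Max transfer-function mismatch ||H(jw) - Hr(jw)|| over a frequency
    grid: a grid-based stand-in for the H-infinity norm of the error."""
    n, q = A.shape[0], Ar.shape[0]
    err = 0.0
    for w in freqs:
        H = C @ np.linalg.solve(1j * w * np.eye(n) - A, B)
        Hr = Cr @ np.linalg.solve(1j * w * np.eye(q) - Ar, Br)
        err = max(err, np.linalg.norm(H - Hr, 2))
    return err

# Toy example: stable diagonal system; keep only the slow state.
A = np.diag([-1.0, -100.0])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 1.0]])
Ar, Br, Cr = A[:1, :1], B[:1, :], C[:, :1]
freqs = np.logspace(-2, 4, 200)
print(worst_case_error(A, B, C, Ar, Br, Cr, freqs))  # ~0.01: the dropped pole's DC gain
```

Here the discarded state contributes 1/(jω+100) to the transfer function, so the worst-case error peaks near DC at about 1/100.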
Projection framework for model reduction
Most reduction methods are based on projection. Pick biorthogonal projection matrices W and V (so that W^T V = I); the projection bases are the columns of V and W.
The state x (length n) is approximated as x ≈ V x_r, where V is n×q and x_r is the reduced state (length q); correspondingly, A x is replaced by W^T A V x_r in the reduced model.
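A minimal NumPy sketch of the projection step. The basis V here is a hypothetical Krylov-style choice and W = V (an orthogonal projection; a Petrov-Galerkin method would pick W ≠ V), purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, q = 8, 3
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))  # random, nominally stable test matrix
B = rng.standard_normal((n, 1))
C = rng.standard_normal((1, n))

# Illustrative basis: a few Krylov vectors {A^-1 B, A^-2 B, B}, orthonormalized.
K = np.column_stack([np.linalg.solve(A, B).ravel(),
                     np.linalg.solve(A, np.linalg.solve(A, B)).ravel(),
                     B.ravel()])
V, _ = np.linalg.qr(K)
W = V  # orthogonal projection: W^T V = I automatically

Ar = W.T @ A @ V   # q x q reduced state matrix
Br = W.T @ B       # q x 1
Cr = C @ V         # 1 x q
print(Ar.shape, Br.shape, Cr.shape)
```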
LTI SYSTEM
input u(t) → state x(t) → output y(t)
P (controllability): which modes are easier to reach?
Q (observability): which modes produce more output?
The reduced model retains the most controllable and most observable modes; a mode must be both very controllable and very observable. The projection should preserve these important modes.
Balanced truncation reduction (TBR)
Reduced system: (W^T A V, W^T B, C V, D)
Compute the controllability and observability gramians P and Q (an ~n^3 operation):
A P + P A^T + B B^T = 0,  A^T Q + Q A + C^T C = 0
The reduced model keeps the dominant eigenspaces of the product P Q (also ~n^3):
P Q v_i = λ_i v_i,  w_i^T P Q = λ_i w_i^T
Very expensive: P and Q are dense even for sparse models.
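The TBR recipe above can be sketched densely for a small random system. SciPy's Lyapunov solver stands in for the O(n^3) gramian computation (exactly the cost that rules dense TBR out for large n); the matrices are random stand-ins:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

rng = np.random.default_rng(1)
n, q = 10, 3
A = -2.0 * np.eye(n) + 0.2 * rng.standard_normal((n, n))  # nominally stable test matrix
B = rng.standard_normal((n, 2))
C = rng.standard_normal((2, n))

# Dense O(n^3) gramian solves.
P = solve_continuous_lyapunov(A, -B @ B.T)    # A P + P A^T + B B^T = 0
Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # A^T Q + Q A + C^T C = 0

# Keep the dominant right/left eigenspaces of the product P Q.
lam, Vr = np.linalg.eig(P @ Q)
V = np.real(Vr[:, np.argsort(-lam.real)[:q]])
lam_l, Wl = np.linalg.eig((P @ Q).T)
W = np.real(Wl[:, np.argsort(-lam_l.real)[:q]])

# Oblique projection after biorthogonalization: (W^T V)^-1 W^T A V, etc.
Ar = np.linalg.solve(W.T @ V, W.T @ A @ V)
Br = np.linalg.solve(W.T @ V, W.T @ B)
Cr = C @ V
print(Ar.shape)
```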
Most reduction algorithms effectively approximate the dominant eigenspaces of P and Q separately:
• Arnoldi [Grimme '97]: V = colsp{A^-1 B, A^-2 B, …}, W = V; approximates P_dom only
• Padé via Lanczos [Feldmann and Freund '95]: colsp(V) = {A^-1 B, A^-2 B, …} approximates P_dom; colsp(W) = {A^-T C^T, (A^-T)^2 C^T, …} approximates Q_dom
• Frequency-domain POD [Willcox '02], Poor Man's TBR [Phillips '04]: colsp(V) = {(jω1 I − A)^-1 B, (jω2 I − A)^-1 B, …} approximates P_dom; colsp(W) = {(jω1 I − A)^-T C^T, (jω2 I − A)^-T C^T, …} approximates Q_dom
However, what matters is the product P Q.
RC line (symmetric circuit)
V(t) is the input, i(t) is the output.
The circuit is symmetric, so P = Q: all controllable states are observable and vice versa.
RLC line (nonsymmetric circuit)
Here P and Q are no longer equal! By keeping only the most controllable and/or only the most observable states, we may not find the dominant eigenvectors of P Q.
Lightly damped RLC circuit
R = 0.008, L = 10^-5, C = 10^-6, N = 100
Exact low-rank approximations of P and Q of order < 50 lead to P Q ≈ 0!
Lightly damped RLC circuit
Top 5 eigenvectors of P | Top 5 eigenvectors of Q
The union of the eigenspaces of P and Q does not necessarily approximate the dominant eigenspace of P Q.
AISIAD model reduction algorithm
Idea of the AISIAD approximation: approximate the eigenvectors using power iterations,
X_i = (P Q) V_i,  V_{i+1} = qr(X_i)  ("iterate"),
so that V_i converges to the dominant eigenvectors of P Q. The question is how to find the product (P Q) V_i.
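The power iteration itself can be sketched with explicit P and Q; forming these products explicitly is exactly what AISIAD avoids, so the dense products below are only for illustration, on random SPD stand-ins for the gramians:

```python
import numpy as np

def pq_power_iteration(P, Q, q, iters=500, seed=0):
    """Block power iteration V <- qr(P (Q V)): converges to the dominant
    invariant subspace of P Q."""
    rng = np.random.default_rng(seed)
    V, _ = np.linalg.qr(rng.standard_normal((P.shape[0], q)))
    for _ in range(iters):
        V, _ = np.linalg.qr(P @ (Q @ V))
    return V

# Small check: the iterate spans the top right-eigenvectors of P Q.
rng = np.random.default_rng(1)
Lp = rng.standard_normal((6, 6)); P = Lp @ Lp.T  # SPD stand-ins for gramians
Lq = rng.standard_normal((6, 6)); Q = Lq @ Lq.T
V = pq_power_iteration(P, Q, 2)
lam, U = np.linalg.eig(P @ Q)            # eigenvalues of P Q are real, >= 0
Ud = np.real(U[:, np.argsort(-lam.real)[:2]])
print(np.linalg.norm(V @ (V.T @ Ud) - Ud))  # small if span(V) contains Ud
```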
Approximation of the product V_{i+1} = qr(P Q V_i): the AISIAD algorithm
The product is split into two steps: W_i ≈ qr(Q V_i), then V_{i+1} ≈ qr(P W_i). Each of the two products, Q V_i and P W_i, is approximated using the solution of a Sylvester equation.
More detailed view of the AISIAD approximation
Right-multiply the Lyapunov equation A P + P A^T + B B^T = 0 by W_i and split P A^T W_i along W_i. This gives a Sylvester equation for X = P W_i:
A X + X H + M = 0, where H = W_i^T A^T W_i is q×q and M = P (I − W_i W_i^T) A^T W_i + B B^T W_i is n×q (the original AISIAD formulation).
Modified AISIAD approximation
The modified AISIAD approximates the unknown P inside M by a low-rank approximation P̂: M ≈ P̂ (I − W_i W_i^T) A^T W_i + B B^T W_i. We can thus take advantage of the numerous existing methods that approximate P and Q!
Specialized Sylvester equation
A X + X H = −M
where A is n×n, X and M are n×q, and H is q×q. Only the column span of X is needed.
Solving the Sylvester equation
Take the Schur decomposition of H: H = U T U^H, with T upper triangular. Substituting X̃ = X U and M̃ = M U gives A X̃ + X̃ T = −M̃, which is solved for the columns of X̃ one at a time; then X = X̃ U^H.
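A sketch of this column-by-column solver, using a complex Schur form of H for simplicity (so T is triangular even when H has complex eigenvalue pairs) and verifying the residual of A X + X H + M = 0 on random data:

```python
import numpy as np
from scipy.linalg import schur

def solve_sylvester_schur(A, H, M):
    """Solve A X + X H = -M for X (n x q) via a Schur decomposition of the
    small q x q matrix H, so only q shifted solves with A are needed."""
    T, U = schur(H, output='complex')  # H = U T U^H, T upper triangular
    Mt = M @ U                         # A Xt + Xt T = -Mt, where Xt = X U
    n, q = M.shape
    Xt = np.zeros((n, q), dtype=complex)
    for j in range(q):
        # Column j couples only to earlier columns through triangular T.
        rhs = -Mt[:, j] - Xt[:, :j] @ T[:j, j]
        Xt[:, j] = np.linalg.solve(A + T[j, j] * np.eye(n), rhs)
    return np.real(Xt @ U.conj().T)    # real solution for real A, H, M

# Verify against the residual on random data.
rng = np.random.default_rng(2)
A = -np.eye(5) - 0.1 * rng.standard_normal((5, 5))  # stable-ish test matrix
H = 0.1 * rng.standard_normal((3, 3))
M = rng.standard_normal((5, 3))
X = solve_sylvester_schur(A, H, M)
print(np.linalg.norm(A @ X + X @ H + M))  # small residual
```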
Solving the Sylvester equation (continued)
• Applicable to any stable A
• Requires solving a shifted system with A q times (once per diagonal entry of the Schur factor of H)
• The solution can be accelerated via fast matrix-vector products; another method exists, based on IRA, but it needs A > 0 [Zhou '02]
• For SISO systems and P̂ = 0, the approach is equivalent to moment matching at the frequency points −Λ(W^T A W)
Modified AISIAD algorithm
1. Obtain low-rank approximations P̂ and Q̂ of P and Q (e.g., by the low-rank square root method).
2. Solve A X_i + X_i H + M = 0, where H = W_i^T A^T W_i and M = P̂ (I − W_i W_i^T) A^T W_i + B B^T W_i; then X_i ≈ P W_i.
3. Perform the QR decomposition X_i = V_i R.
4. Solve A^T Y_i + Y_i F + N = 0, where F = V_i^T A V_i and N = Q̂ (I − V_i V_i^T) A V_i + C^T C V_i; then Y_i ≈ Q V_i.
5. Perform the QR decomposition Y_i = W_{i+1} R to get the new iterate.
6. Go to step 2 and iterate.
7. Bi-orthogonalize W and V and construct the reduced model (W^T A V, W^T B, C V, D).
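The steps above can be sketched as follows. For clarity, the low-rank approximations P̂, Q̂ of step 1 are replaced here by exact dense gramians (in which case the Sylvester solves recover P W and Q V exactly); a real use would substitute a low-rank method, and all matrices are random stand-ins:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve_sylvester

rng = np.random.default_rng(3)
n, q = 12, 3
A = -2.0 * np.eye(n) + 0.2 * rng.standard_normal((n, n))  # stable test matrix
B = rng.standard_normal((n, 2))
C = rng.standard_normal((2, n))

# Step 1 stand-ins: exact gramians in place of low-rank P-hat, Q-hat.
Ph = solve_continuous_lyapunov(A, -B @ B.T)
Qh = solve_continuous_lyapunov(A.T, -C.T @ C)

W, _ = np.linalg.qr(rng.standard_normal((n, q)))
for _ in range(5):
    H = W.T @ A.T @ W                                        # step 2
    M = Ph @ (np.eye(n) - W @ W.T) @ A.T @ W + B @ B.T @ W
    X = solve_sylvester(A, H, -M)                            # X ~ P W
    V, _ = np.linalg.qr(X)                                   # step 3
    F = V.T @ A @ V                                          # step 4
    N = Qh @ (np.eye(n) - V @ V.T) @ A @ V + C.T @ C @ V
    Y = solve_sylvester(A.T, F, -N)                          # Y ~ Q V
    W, _ = np.linalg.qr(Y)                                   # step 5

# Step 7: biorthogonalize and project.
Ar = np.linalg.solve(W.T @ V, W.T @ A @ V)
Br = np.linalg.solve(W.T @ V, W.T @ B)
Cr = C @ V
print(Ar.shape)
```

With exact P̂ = P, substituting X = P W into the step-2 equation reduces it to (A P + P A^T + B B^T) W = 0, which holds identically; that is why the solve returns P W exactly here.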
For systems in the descriptor form
For descriptor systems E dx/dt = A x + B u, the gramians satisfy generalized Lyapunov equations:
A P E^T + E P A^T + B B^T = 0,  A^T Q E + E^T Q A + C^T C = 0
These lead to similar approximate power iterations.
mAISIAD and low-rank square root
Low-rank gramians (an inexpensive step) feed both methods: the LR square root uses them directly, while mAISIAD adds further iterations (more expensive; the cost varies).
For the majority of nonsymmetric cases, mAISIAD works better than the low-rank square root.
RLC line example results
H-infinity norm of the reduction error (worst-case discrepancy over all frequencies); N = 1000, 1 input, 2 outputs.
Steel rail cooling profile benchmark
Taken from the Oberwolfach benchmark collection; N = 1357, 7 inputs, 6 outputs.
mAISIAD is useless for symmetric models
For symmetric systems (A = A^T, B = C^T), P = Q; therefore mAISIAD is equivalent to LR square root for P̂, Q̂ of order q.
RC line example.
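A quick numerical check of the claim that P = Q for symmetric systems (random symmetric test matrices, for illustration only):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# For a symmetric system (A = A^T stable, B = C^T), the two Lyapunov
# equations coincide, so P = Q -- which is why mAISIAD adds nothing
# over the low-rank square root in this case.
rng = np.random.default_rng(4)
n = 8
G = rng.standard_normal((n, n))
A = -(G @ G.T + n * np.eye(n))   # symmetric negative definite, hence stable
B = rng.standard_normal((n, 1))
C = B.T                          # B = C^T
P = solve_continuous_lyapunov(A, -B @ B.T)    # A P + P A^T + B B^T = 0
Q = solve_continuous_lyapunov(A.T, -C.T @ C)  # A^T Q + Q A + C^T C = 0
print(np.allclose(P, Q))  # True
```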
Cost of the algorithm
The cost of the algorithm is directly proportional to the cost of solving a shifted linear system: (A + s_jj I) x = b in the non-descriptor case, or (A + s_jj E) x = b in the descriptor case, where s_jj is a complex number (a diagonal entry of the Schur factor of H).
The cost does not depend on the number of inputs and outputs.
Conclusions
• The algorithm offers superior accuracy and extended applicability compared with the original AISIAD method
• It is a very promising low-cost approximation to TBR
• Applicable to any dynamical system; it will work (though usually worse) even without low-rank gramians
• Passivity and stability preservation are possible via post-processing
• Not beneficial if the model is symmetric