Quadratic Programming
Transcript of Quadratic Programming
Quadratic programming
Under the guidance of: Dr. G. Agarwal (Prof., Dept. of Mech. Engg., MNIT)
Presented by: Anil Kumar Sharma (2011PMM5023), Ramniwas Saran (2010PMM133)
Quadratic programming
A linearly constrained optimization problem with a quadratic objective function is called a quadratic program (QP). Because of its many applications, quadratic programming is often viewed as a discipline in and of itself. We begin this section by examining the Karush-Kuhn-Tucker conditions for the QP and see that they turn out to be a set of linear equalities and complementarity constraints.
The quadratic programming problem can be formulated as
Minimize with respect to X: F(X) = CᵀX + ½ XᵀQX
subject to:
AX ≤ B (inequality constraints)
EX = D (equality constraints)
X ≥ 0
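As a minimal sketch of this objective, the helper below (the name `qp_objective` is ours) evaluates F(X) = CᵀX + ½ XᵀQX for small dense data. The numbers recast the Wolfe's-method example from later slides (max 4X₁ + 6X₂ − 2X₁² − 2X₁X₂ − 2X₂²) as a minimization, giving c = (−4, −6) and Q = [[4, 2], [2, 4]]:

```python
# Sketch: evaluate the QP objective F(x) = c^T x + 0.5 * x^T Q x.
# c and Q below come from recasting the slides' Wolfe example as a
# minimization; Q is symmetric positive definite, so the QP is convex.

def qp_objective(c, Q, x):
    """F(x) = c^T x + 0.5 * x^T Q x, with c, x lists and Q a nested list."""
    n = len(x)
    linear = sum(c[i] * x[i] for i in range(n))
    quad = sum(x[i] * Q[i][j] * x[j] for i in range(n) for j in range(n))
    return linear + 0.5 * quad

c = [-4.0, -6.0]
Q = [[4.0, 2.0], [2.0, 4.0]]
print(qp_objective(c, Q, [1.0, 1.0]))  # -4.0
```

Because Q is positive definite here, any KKT point of this problem is a global minimizer.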
Equality Constraints
Maximize f(X)
s.t. gᵢ(X) = bᵢ
Set up the Lagrangian function:
L(X, λ) = f(X) − Σᵢ λᵢ(gᵢ(X) − bᵢ)
Constrained Optimization
• Equality constraints – often solvable by calculus
• Inequality constraints – sometimes solvable by numerical methods
Optimizing the Lagrangian
Differentiate the Lagrangian function with respect to X and λ. Set the partial derivatives equal to zero and solve the simultaneous equation system. Then examine the bordered Hessian for the concavity conditions. The "border" of this Hessian is comprised of the first partial derivatives of the constraint function, with respect to λ, X₁, and X₂.
The Principles and Geometries of KKT Optimization

min f(x₁, x₂, …, xₙ)
s.t. (x₁, x₂, …, xₙ) ∈ X
Geometries of KKT: Unconstrained
• Problem: Minimize f(x), where x is a vector whose components may take any values, positive or negative
• First Order Necessary Condition (min or max): ∇f(x) = 0 (∂f/∂xᵢ = 0 for all i)
• Second Order Necessary Condition: ∇²f(x) is positive semidefinite (PSD) [xᵀ∇²f(x)x ≥ 0 for all x]
• Second Order Sufficient Condition (given the FONC is satisfied): ∇²f(x) is positive definite (PD) [xᵀ∇²f(x)x > 0 for all x]

(Figure: one-dimensional sketch of f against xᵢ, with ∂f/∂xᵢ = 0 and ∂²f/∂xᵢ² > 0 at the minimum.)
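For a quadratic f(x) = cᵀx + ½xᵀQx, the FONC reduces to the linear system Qx = −c and the sufficient SOC to Q being positive definite. A minimal sketch with exact fractions, using a made-up convex instance (c = (−4, −6), Q = [[4, 2], [2, 4]]):

```python
# Sketch: first- and second-order checks for an unconstrained quadratic.
# FONC: Qx = -c (solved by Cramer's rule in the 2x2 case).
# SOC (sufficient): Q positive definite, i.e. leading principal minors > 0.
from fractions import Fraction as F

Q = [[F(4), F(2)], [F(2), F(4)]]
c = [F(-4), F(-6)]

det = Q[0][0]*Q[1][1] - Q[0][1]*Q[1][0]
x1 = ((-c[0])*Q[1][1] - Q[0][1]*(-c[1])) / det
x2 = (Q[0][0]*(-c[1]) - (-c[0])*Q[1][0]) / det
print(x1, x2)                     # stationary point: 1/3 4/3

# Positive definiteness: 4 > 0 and det(Q) = 12 > 0, so this is the
# global minimizer.
print(Q[0][0] > 0 and det > 0)    # True
```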
Geometries of KKT: Equality Constrained (one constraint)
• Problem: Minimize f(x), where x is a vector
• Subject to: h(x) = b
• First Order Necessary Condition (for a minimum or a maximum): ∇f(x) = λ∇h(x) for some free λ (λ is a scalar)
• The two surfaces must be tangent
• h(x) = b and −h(x) = −b are the same constraint, so there is no sign restriction on λ

(Figure: a contour of f tangent to the surface h(x) = b.)
Geometries of KKT: Equality Constrained (one constraint)
• First Order Necessary Condition: ∇f(x) = λ∇h(x) for some λ
• Lagrangian: L(x, λ) = f(x) − λ[h(x) − b]
• Minimize L(x, λ) over x and maximize L(x, λ) over λ, using the principles of unconstrained optimization:
∇ₓL(x, λ) = ∇f(x) − λ∇h(x) = 0
∂L/∂λ = −[h(x) − b] = 0, i.e. h(x) = b
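For a quadratic f and one linear constraint aᵀx = b, these stationarity conditions form a linear system in (x₁, x₂, λ). The sketch below reuses the convex quadratic obtained by recasting the slides' Wolfe example as a minimization, with the constraint x₁ + 2x₂ = 2 (the helper `solve3` is ours):

```python
# Sketch: solve the Lagrangian stationarity system
#   Qx + c = lam * a,   a.x = b
# for min -(4x1 + 6x2 - 2x1^2 - 2x1x2 - 2x2^2) s.t. x1 + 2x2 = 2.
from fractions import Fraction as F

def solve3(M, rhs):
    """Gauss-Jordan elimination on a 3x3 system with exact fractions."""
    A = [row[:] + [r] for row, r in zip(M, rhs)]
    for i in range(3):
        p = next(k for k in range(i, 3) if A[k][i] != 0)
        A[i], A[p] = A[p], A[i]
        A[i] = [v / A[i][i] for v in A[i]]
        for k in range(3):
            if k != i:
                A[k] = [vk - A[k][i]*vi for vk, vi in zip(A[k], A[i])]
    return [A[i][3] for i in range(3)]

# Unknowns ordered (x1, x2, lam):
#   4x1 + 2x2 -  lam = 4
#   2x1 + 4x2 - 2lam = 6
#    x1 + 2x2        = 2
M = [[F(4), F(2), F(-1)],
     [F(2), F(4), F(-2)],
     [F(1), F(2), F(0)]]
x1, x2, lam = solve3(M, [F(4), F(6), F(2)])
print(x1, x2, lam)   # 1/3 5/6 -1
```

Note the multiplier comes out as −1 here; the slides' Wolfe section, which maximizes the original objective, reports λ₁ = +1 for the same point (the sign flips with the max/min convention).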
Geometries of KKT: Equality Constrained (multiple constraints)
• Problem: Minimize f(x), where x is a vector
• Subject to: hᵢ(x) = bᵢ for i = 1, 2, …, m
• KKT Conditions (necessary conditions): there exist λᵢ, i = 1, 2, …, m, such that ∇f(x) = Σᵢ λᵢ∇hᵢ(x) and hᵢ(x) = bᵢ for i = 1, 2, …, m
• Such a point (x, λ) is called a KKT point, and λ is called the dual vector or the Lagrange multipliers. Furthermore, these conditions are sufficient if f(x) is convex and the hᵢ(x), i = 1, 2, …, m, are linear.
Geometries of KKT: Inequality Constrained (one constraint)
• Problem: Minimize f(x), where x is a vector
• Subject to: g(x) ≥ b
• First Order Necessary (KKT) Conditions: ∇f(x) = λ∇g(x), with g(x) ≥ b, λ ≥ 0, and λ[g(x) − b] = 0
Geometries of KKT: Inequality Constrained (two constraints)
• Problem: Minimize f(x), where x is a vector
• Subject to: g₁(x) ≥ b₁ and g₂(x) ≥ b₂
• First Order Necessary Conditions: ∇f(x) = λ₁∇g₁(x) + λ₂∇g₂(x), λ₁ ≥ 0, λ₂ ≥ 0
• ∇f(x) lies in the cone between ∇g₁(x) and ∇g₂(x)
• g₁(x) > b₁ ⇒ λ₁ = 0 and g₂(x) > b₂ ⇒ λ₂ = 0, i.e. λ₁[g₁(x) − b₁] = 0 and λ₂[g₂(x) − b₂] = 0
• The shaded area is the feasible set for the two constraints

(Figure: x₁–x₂ plane showing −∇g₁(x), −∇g₂(x), and −∇f(x) at a point where both constraints are binding.)
Geometries of KKT: Inequality Constrained (two constraints)
• Problem: Minimize f(x), where x is a vector
• Subject to: g₁(x) ≥ b₁ and g₂(x) ≥ b₂
• First Order Necessary Conditions: ∇f(x) = λ₁∇g₁(x), λ₁ ≥ 0; g₂(x) > b₂ ⇒ λ₂ = 0; g₁(x) − b₁ = 0
• The shaded area is the feasible set for the two constraints

(Figure: x₁–x₂ plane showing −∇g₁(x) and −∇f(x); only the first constraint is binding.)
Geometries of KKT: Inequality Constrained (two constraints)
• Problem: Minimize f(x), where x is a vector
• Subject to: g₁(x) ≥ b₁ and g₂(x) ≥ b₂
• First Order Necessary Conditions: ∇f(x) = 0; g₁(x) > b₁ ⇒ λ₁ = 0; g₂(x) > b₂ ⇒ λ₂ = 0
• The shaded area is the feasible set for the two constraints

(Figure: x₁–x₂ plane with the minimizer of f in the interior of the feasible set; no constraint is binding, so ∇f(x) = 0.)
Geometries of KKT: Inequality Constrained (two constraints)
• Lagrangian: L(x, λ₁, λ₂) = f(x) − λ₁[g₁(x) − b₁] − λ₂[g₂(x) − b₂]
• Minimize L(x, λ₁, λ₂) over x, using the principles of unconstrained optimization (gradient with respect to x only): ∇ₓL(x, λ₁, λ₂) = ∇f(x) − λ₁∇g₁(x) − λ₂∇g₂(x) = 0, thus ∇f(x) = λ₁∇g₁(x) + λ₂∇g₂(x)
• Maximize L(x, λ₁, λ₂) over λ₁ ≥ 0, λ₂ ≥ 0: if g₁(x) − b₁ > 0, then λ₁ = 0; if g₂(x) − b₂ > 0, then λ₂ = 0
Bordered Hessian

For two variables and one constraint g, the bordered Hessian is

    | 0       g′(x₁)   g′(x₂) |
    | g′(x₁)  f₁₁      f₁₂    |
    | g′(x₂)  f₂₁      f₂₂    |

For a max, the determinant of this matrix is positive; for a min, it is negative. For problems with 3 or more variables, the "even" bordered determinants are positive for a max and the "odd" ones negative; for a min, all are negative.

Note: this determinant is designated |H₂|.
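A minimal sketch of the |H₂| test on a hypothetical problem (not from the slides): maximize f(x₁, x₂) = x₁x₂ subject to x₁ + x₂ = 2, evaluated at the candidate point (1, 1), where g′ = (1, 1) and the second partials of f are f₁₁ = f₂₂ = 0, f₁₂ = f₂₁ = 1:

```python
# Sketch: bordered Hessian test, one equality constraint, two variables.
# Example (hypothetical): max x1*x2 s.t. x1 + x2 = 2 at the point (1, 1).

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

# Border: first partials of g(x) = x1 + x2; interior: second partials of f.
H2 = [[0, 1, 1],
      [1, 0, 1],
      [1, 1, 0]]
print(det3(H2))   # 2 > 0, consistent with a constrained maximum
```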
Aside on Bordered Hessians
You can also set these up so that the border carries negative signs, and you can set them up so that the border runs along the bottom and the right edge, with either positive or negative signs. Be sure that the concavity condition tests match the way you set up the bordered Hessian.
Multiple Constraints SOC

M is the number of constraints in a given problem, and N the number of variables. The bordered principal minor that contains f₂₂ as its last element is denoted |H₂|, as before; if f₃₃ is the last element, we write |H₃|, and so on.

Evaluate |H_{M+1}| through |H_N|. For a maximum, they alternate in sign; for a minimum, they all take the sign (−1)^M.
Interpreting the Lagrange Multipliers

The values of the Lagrange multipliers (λᵢ) are similar to the shadow prices from LP, except that they are true derivatives (λᵢ = ∂L/∂bᵢ) and are not usually constant over a range.
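This shadow-price reading can be sketched on a toy problem of our own: for min x² s.t. x = b, the optimal value is V(b) = b² and stationarity of L = x² − λ(x − b) gives λ = 2x = 2b, which is exactly dV/db:

```python
# Sketch: the multiplier equals the derivative of the optimal value with
# respect to the right-hand side (toy problem: min x^2 s.t. x = b).

def V(b):
    return b * b                  # optimal value as a function of b

b, eps = 3.0, 1e-6
lam = 2 * b                       # multiplier from dL/dx = 2x - lam = 0
dVdb = (V(b + eps) - V(b - eps)) / (2 * eps)   # central difference
print(abs(lam - dVdb) < 1e-6)     # True: lam matches dV/db
```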
Inequality Constraints
Maximize f(X)
s.t. g(X) ≤ b, X ≥ 0
Example
Minimize C = (x₁ − 4)² + (x₂ − 4)²
s.t. 2x₁ + 3x₂ ≥ 6
−3x₁ − 2x₂ ≥ −12
x₁, x₂ ≥ 0
Graph

(Figure: feasible region of the example on the x₁–x₂ plane, axes running 0–7.)

Optimum: x₁ = 2 2/13, x₂ = 2 10/13
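The stated optimum can be verified geometrically: the unconstrained minimizer (4, 4) of (x₁ − 4)² + (x₂ − 4)² violates only −3x₁ − 2x₂ ≥ −12 (i.e. 3x₁ + 2x₂ ≤ 12), so the optimum is the projection of (4, 4) onto the line 3x₁ + 2x₂ = 12. A sketch with exact fractions:

```python
# Sketch: project the unconstrained minimizer (4, 4) onto the binding
# constraint 3x1 + 2x2 = 12 to recover the optimum on the slide.
from fractions import Fraction as F

a, b = (F(3), F(2)), F(12)        # binding constraint a.x = b
p = (F(4), F(4))                  # unconstrained minimizer
t = (a[0]*p[0] + a[1]*p[1] - b) / (a[0]**2 + a[1]**2)
x = (p[0] - t*a[0], p[1] - t*a[1])
print(x[0], x[1])   # 28/13 36/13, i.e. 2 2/13 and 2 10/13 as on the slide
```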
A Nonlinear Restriction
Maximize Profit = 2x₁ + x₂
s.t. −x₁² + 4x₁ − x₂ ≤ 0
2x₁ + 3x₂ ≤ 12
x₁, x₂ ≥ 0
Graph – Profit Max Problem
(Figure: x₁–x₂ plane, axes 0–7, showing the two feasible regions F1 and F2.)

There is a local optimum at the edge of F1, but it isn't global.
KKT Conditions: Final Notes
• KKT conditions may not lead directly to a very efficient algorithm for solving NLPs. However, they do have a number of benefits:
• They give insight into what optimal solutions to NLPs look like
• They provide a way to set up and solve small problems
• They provide a method to check solutions to large problems
• The Lagrange multipliers can be seen as shadow prices of the constraints
The Kuhn-Tucker Conditions
• ∇ₓf(X*) − λ*∇ₓg(X*) ≤ 0
• [∇ₓf(X*) − λ*∇ₓg(X*)]·X* = 0
• X* ≥ 0
• g(X*) ≤ b
• λ*(g(X*) − b) = 0
• λ* ≥ 0

∇ represents the gradient vector (first derivatives)
Wolfe’s method
Max. Z = 4X₁ + 6X₂ − 2X₁² − 2X₁X₂ − 2X₂² = f(x), say
Subject to the constraint X₁ + 2X₂ ≤ 2 (= g(x), say)
X₁, X₂ ≥ 0
The non-negativity conditions X₁, X₂ ≥ 0 are treated as inequality constraints.
Wolfe’s method: standard form of the QP problem

X₁ + 2X₂ + S₁² = 2
−X₁ + r₁² = 0
−X₂ + r₂² = 0
and X₁, X₂, S₁, r₁, r₂ ≥ 0

Lagrange function for the necessary conditions:
L(X₁, X₂, S₁, λ₁, µ₁, µ₂, r₁, r₂) = (4X₁ + 6X₂ − 2X₁² − 2X₁X₂ − 2X₂²) − λ₁(X₁ + 2X₂ + S₁² − 2) − µ₁(−X₁ + r₁²) − µ₂(−X₂ + r₂²)
Wolfe’s method
Necessary and sufficient conditions for maximization of the Lagrange function, and hence of the objective function:

∂L/∂X₁ = 4 − 4X₁ − 2X₂ − λ₁ + µ₁ = 0
∂L/∂X₂ = 6 − 2X₁ − 4X₂ − 2λ₁ + µ₂ = 0
∂L/∂λ₁ = −(X₁ + 2X₂ + S₁² − 2) = 0 ⇒ X₁ + 2X₂ + S₁² − 2 = 0
∂L/∂S₁ = −2λ₁S₁ = 0 ⇒ λ₁S₁ = 0
∂L/∂µ₁ = −(−X₁ + r₁²) = 0 ⇒ −X₁ + r₁² = 0
∂L/∂µ₂ = −(−X₂ + r₂²) = 0 ⇒ −X₂ + r₂² = 0
∂L/∂r₁ = −2µ₁r₁ = 0 ⇒ µ₁r₁ = 0
∂L/∂r₂ = −2µ₂r₂ = 0 ⇒ µ₂r₂ = 0
Wolfe’s method
On rearranging the conditions:

4X₁ + 2X₂ + λ₁ − µ₁ = 4 ……(1)
2X₁ + 4X₂ + 2λ₁ − µ₂ = 6 ……(2)
X₁ + 2X₂ + S₁² = 2 ……(3)
λ₁S₁ = 0, µ₁X₁ = 0, µ₂X₂ = 0 (complementary slackness conditions)
X₁, X₂, λ₁, S₁, µ₁, µ₂ ≥ 0
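The rearranged conditions can also be solved directly once a complementarity pattern is guessed: trying S₁ = 0 (constraint binding) with µ₁ = µ₂ = 0 (both X₁, X₂ > 0) leaves three linear equations in (X₁, X₂, λ₁). A sketch of that shortcut, as a check on the simplex iterations below:

```python
# Sketch: solve the KKT equations under the guess S1 = 0, mu1 = mu2 = 0:
#   4X1 + 2X2 +  lam1 = 4
#   2X1 + 4X2 + 2lam1 = 6
#    X1 + 2X2         = 2
from fractions import Fraction as F

A = [[F(4), F(2), F(1), F(4)],
     [F(2), F(4), F(2), F(6)],
     [F(1), F(2), F(0), F(2)]]
for i in range(3):                    # Gauss-Jordan with exact fractions
    p = next(k for k in range(i, 3) if A[k][i] != 0)
    A[i], A[p] = A[p], A[i]
    A[i] = [v / A[i][i] for v in A[i]]
    for k in range(3):
        if k != i:
            A[k] = [vk - A[k][i]*vi for vk, vi in zip(A[k], A[i])]
X1, X2, lam1 = A[0][3], A[1][3], A[2][3]
print(X1, X2, lam1)   # 1/3 5/6 1 -- all nonnegative, so the guess is valid
```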
Wolfe’s method
On introducing artificial variables A₁ and A₂ in constraints (1) and (2) respectively, the modified (Phase I) LP is:

Min. Z* = A₁ + A₂
subject to
4X₁ + 2X₂ + λ₁ − µ₁ + A₁ = 4
2X₁ + 4X₂ + 2λ₁ − µ₂ + A₂ = 6
X₁ + 2X₂ + S₁² = 2
X₁, X₂, λ₁, µ₁, µ₂, A₁, A₂ ≥ 0
Initial solution
| CB | Basis | Xb | X1 | X2 | λ1 | µ1 | µ2 | S1 | A1 | A2 | Min. ratio |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | Cj | | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | |
| 1 | A1 | 4 | ④ | 2 | 1 | −1 | 0 | 0 | 1 | 0 | 1 → |
| 1 | A2 | 6 | 2 | 4 | 2 | 0 | −1 | 0 | 0 | 1 | 3 |
| 0 | S1 | 2 | 1 | 2 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
| Z* = 10 | Zj | | 6 | 6 | 3 | −1 | −1 | 0 | 1 | 1 | |
| | Cj − Zj | | −6 | −6 | −3 | 1 | 1 | 0 | 0 | 0 | |

X1 enters (↑); A1 leaves (→); pivot ④.
Cont…..
| CB | Basis | Xb | X1 | X2 | λ1 | µ1 | µ2 | S1 | A2 | Min. ratio |
|---|---|---|---|---|---|---|---|---|---|---|
| | Cj | | 0 | 0 | 0 | 0 | 0 | 0 | 1 | |
| 0 | X1 | 1 | 1 | 1/2 | 1/4 | −1/4 | 0 | 0 | 0 | 2 |
| 1 | A2 | 4 | 0 | 3 | 3/2 | 1/2 | −1 | 0 | 1 | 4/3 |
| 0 | S1 | 1 | 0 | 3/2 | −1/4 | 1/4 | 0 | 1 | 0 | 2/3 → |
| Z* = 4 | Zj | | 0 | 3 | 3/2 | 1/2 | −1 | 0 | 1 | |
| | Cj − Zj | | 0 | −3 | −3/2 | −1/2 | 1 | 0 | 0 | |

X2 enters (↑); S1 leaves (→).
Cont…..
| CB | Basis | Xb | X1 | X2 | λ1 | µ1 | µ2 | S1 | A2 | Min. ratio |
|---|---|---|---|---|---|---|---|---|---|---|
| | Cj | | 0 | 0 | 0 | 0 | 0 | 0 | 1 | |
| 0 | X1 | 2/3 | 1 | 0 | 1/3 | −1/3 | 0 | −1/3 | 0 | 2 |
| 1 | A2 | 2 | 0 | 0 | 2 | 0 | −1 | −2 | 1 | 1 → |
| 0 | X2 | 2/3 | 0 | 1 | −1/6 | 1/6 | 0 | 2/3 | 0 | (−)ve |
| Z* = 2 | Zj | | 0 | 0 | 2 | 0 | −1 | −2 | 1 | |
| | Cj − Zj | | 0 | 0 | −2 | 0 | 1 | 2 | 0 | |

λ1 enters (↑); A2 leaves (→).
| CB | Basis | Xb | X1 | X2 | λ1 | µ1 | µ2 | S1 |
|---|---|---|---|---|---|---|---|---|
| | Cj | | 0 | 0 | 0 | 0 | 0 | 0 |
| 0 | X1 | 1/3 | 1 | 0 | 0 | −1/3 | 1/6 | 0 |
| 0 | λ1 | 1 | 0 | 0 | 1 | 0 | −1/2 | −1 |
| 0 | X2 | 5/6 | 0 | 1 | 0 | 1/6 | −1/12 | 1/2 |
| Z* = 0 | Zj | | 0 | 0 | 0 | 0 | 0 | 0 |
| | Cj − Zj | | 0 | 0 | 0 | 0 | 0 | 0 |
Since all Cj − Zj = 0, the optimal solution of Phase I has been reached. Hence X₁ = 1/3, X₂ = 5/6, λ₁ = 1, µ₁ = 0, µ₂ = 0, S₁ = 0, Z* = 0, and Max. Z = 25/6.
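The reported optimum can be checked by substituting back into the original objective and constraint, again with exact fractions:

```python
# Sketch: verify Wolfe's-method output by direct substitution into
# Max Z = 4X1 + 6X2 - 2X1^2 - 2X1X2 - 2X2^2, s.t. X1 + 2X2 <= 2.
from fractions import Fraction as F

X1, X2 = F(1, 3), F(5, 6)
Z = 4*X1 + 6*X2 - 2*X1**2 - 2*X1*X2 - 2*X2**2
print(Z)               # 25/6, matching Max. Z above
print(X1 + 2*X2 <= 2)  # True: the constraint holds (and is binding)
```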
CONCLUSION

This presentation has studied a feature of operations research for solving problems in which the variables appear with squared powers. Such quadratic programming problems are solved with the help of Lagrange multipliers, Wolfe's method, and the KKT conditions.
References

• Hamdy A. Taha, Operations Research: An Introduction, Pearson, eighth edition, 2011
• J. K. Sharma, Operations Research, Macmillan, third edition
• H. E. Krogstad, Quadratic Programming Basics, Spring 2005 / Rev. 2008
• Paul A. Jensen and Jonathan F. Bard, Operations Research Models and Methods
THANKS