In this lecture, we talk about perturbation and sensitivity analysis. But what does that mean? Well, consider our good old optimization problem, which looks like this

$$
\begin{aligned}
\min_{x} \quad & f_0(x) \\
\text{s.t.} \quad & f_i(x) \le 0, \quad i = 1, \dots, m \\
& h_i(x) = 0, \quad i = 1, \dots, p
\end{aligned} \tag{1}
$$
One way to tell how the above problem reacts to perturbation is to actually perturb it and check how its optimal value behaves as a function of the perturbation parameters. To this end, consider the following perturbed problem

$$
\begin{aligned}
\min_{x} \quad & f_0(x) \\
\text{s.t.} \quad & f_i(x) \le u_i, \quad i = 1, \dots, m \\
& h_i(x) = v_i, \quad i = 1, \dots, p
\end{aligned} \tag{2}
$$
What did we just do? Well, instead of having the constraints hold with zero on the right-hand side, we “perturb” the zero boundary. If $u_i > 0$, we say that the $i$-th inequality constraint is relaxed by an amount of $u_i$. Likewise, the parameter $v_i$ shifts the $i$-th equality constraint by an amount of $v_i$. Also, we can see that problem (2) “boils down” to problem (1) when the perturbation parameters $u_i$’s and $v_i$’s are all set to zero, which makes sense, right? Now if $u_i < 0$, you can see that we “tighten” the inequality constraint. Going back to our main concern, that is, to “check how its optimal value behaves with perturbation parameters”, we have to quantify the effect of that perturbation on $p^\star$, the optimal value of problem (1). To this end, let us define the optimal value of problem (2) as
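To make the relax/tighten picture concrete, here is a toy one-dimensional example (my own illustration, not taken from the lecture): minimize $x^2$ subject to $1 - x \le 0$. Perturbing the constraint gives

```latex
\min_{x} \; x^2 \quad \text{s.t.} \quad 1 - x \le u
\qquad\Longrightarrow\qquad
p^\star(u) =
\begin{cases}
(1 - u)^2, & u < 1 \\
0, & u \ge 1
\end{cases}
```

since the constraint reads $x \ge 1 - u$: relaxing it ($u > 0$) lowers the optimal value, while tightening it ($u < 0$) raises the optimal value, exactly as described above.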
$$
p^\star(u, v) = \inf \big\{ f_0(x) \;\big|\; x \in \mathcal{D},\; f_i(x) \le u_i,\; i = 1, \dots, m,\; h_i(x) = v_i,\; i = 1, \dots, p \big\} \tag{3}
$$
The function $p^\star(u, v)$ tells us how the optimal value of problem (2) behaves as a function of the perturbation parameters

$$
p^\star : \mathbb{R}^m \times \mathbb{R}^p \to \mathbb{R}, \qquad (u, v) \mapsto p^\star(u, v) \tag{4}
$$
Note that for the particular case $u = 0$ and $v = 0$, we have that $p^\star(0, 0) = p^\star$. In the lecture, we prove an inequality that shows us how far $p^\star(u, v)$ is from $p^\star(0, 0)$, which is the following

$$
p^\star(u, v) \ge p^\star(0, 0) - {\lambda^\star}^T u - {\nu^\star}^T v \tag{5}
$$
where $(\lambda^\star, \nu^\star)$ are the optimal dual Lagrange multipliers. The above also provides a global view of how far we are from the optimal value of the unperturbed problem, in terms of the optimal dual Lagrange multipliers. Now, if we impose extra properties on $p^\star(u, v)$, i.e. differentiability at $(u, v) = (0, 0)$ and strong duality, we can get a feel for what happens locally around $(u, v) = (0, 0)$. In the above lecture, we also prove that given the previous two conditions (strong duality and differentiability at $(u, v) = (0, 0)$), we have that

$$
\lambda_i^\star = -\frac{\partial p^\star(0, 0)}{\partial u_i}, \qquad
\nu_i^\star = -\frac{\partial p^\star(0, 0)}{\partial v_i} \tag{6}
$$
The above allows us to quantify how active a constraint is at the optimal point $x^\star$.
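As a minimal numeric sketch, we can check both the global inequality (5) and the local sensitivity result (6) on a hypothetical toy problem (my own example, not from the lecture): minimize $x^2$ subject to $1 - x \le u$, whose perturbed optimal value has the closed form $p^\star(u) = (1-u)^2$ for $u < 1$ and $0$ otherwise, with optimal dual multiplier $\lambda^\star = 2$.

```python
import numpy as np

# Toy problem (hypothetical example): minimize x^2 s.t. 1 - x <= u.
# Unperturbed (u = 0): x* = 1, p* = 1, and stationarity of the Lagrangian
# x^2 + lambda (1 - x) gives 2x - lambda = 0 at x = 1, so lambda* = 2.
p_star = 1.0
lam_star = 2.0

def p_perturbed(u):
    """Optimal value of the perturbed problem.

    The constraint reads x >= 1 - u; once u >= 1 the unconstrained
    minimizer x = 0 becomes feasible, otherwise the constraint is
    active and the optimum sits at x = 1 - u.
    """
    return 0.0 if u >= 1 else (1.0 - u) ** 2

# Global inequality (5): p*(u) >= p*(0) - lambda* u for every u,
# whether we relax (u > 0) or tighten (u < 0) the constraint.
for u in np.linspace(-0.5, 2.0, 26):
    assert p_perturbed(u) >= p_star - lam_star * u - 1e-12

# Local sensitivity (6): lambda* = -dp*(u)/du at u = 0,
# checked here with a central finite difference.
h = 1e-6
dpdu = (p_perturbed(h) - p_perturbed(-h)) / (2 * h)
print(abs(-dpdu - lam_star))  # small: the slope matches -lambda*
```

The finite-difference slope at $u = 0$ comes out close to $-2 = -\lambda^\star$, and the lower bound of (5) holds for every tested perturbation, as the theory predicts.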
Ahmad Bazzi
Ahmad Bazzi is an Electrical Engineer and YouTuber. He is a signal processing engineer at CEVA DSP and holds a PhD in Electrical Engineering from EURECOM. He obtained his Master's degree (summa cum laude) in Electrical Engineering at SUPELEC, Paris. Ahmad Bazzi has many publications in well-known IEEE conferences, including a nomination award, and is a co-inventor on several patents.
Ahmad Bazzi dedicates time to publishing high-level lectures on mathematics (including convex optimization) and programming. He also focuses on machine learning, convex optimization, linear algebra, Python, SymPy, NumPy, Pandas, CVXOPT, MATLAB, and C++ for firmware systems.