Strong Alternatives

In a previous post of mine, we introduced weak alternatives. As a small reminder, consider the following two sets

(1)   \begin{equation*}S = \lbrace f_i(x) \leq 0, i = 1, \ldots, m ; h_i(x) = 0, i = 1, \ldots, p \rbrace\end{equation*}

and

(2)   \begin{equation*}T = \lbrace \lambda \geq 0 , g(\lambda, \nu) > 0 \rbrace\end{equation*}

where

(3)   \begin{equation*}g(\lambda,\nu) = \inf_{x\in \mathcal{D}} \Big\lbrace\sum\limits_{i=1}^m \lambda_i f_i(x) +\sum\limits_{i=1}^p \nu_i h_i(x) \Big\rbrace\end{equation*}

is the dual function and \mathcal{D} is the domain of the problem. Since we imposed no convexity assumption on the f_i's, nor did we assume that the h_i's are affine, all we can say about (S) and (T) is that they form weak alternatives. In other words,

  • If (T) is feasible, then (S) is infeasible.
  • If (S) is feasible, then (T) is infeasible.
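
As a toy illustration of how a point in (T) certifies that (S) is empty (my own example, not from the original post), take m = p = 1 with f_1(x) = x+1 and h_1(x) = x over \mathcal{D} = \mathbb{R}. The dual function is

\begin{equation*}g(\lambda,\nu) = \inf_{x\in \mathbb{R}} \big\lbrace \lambda(x+1) + \nu x \big\rbrace = \begin{cases} \lambda & \text{if } \nu = -\lambda \\ -\infty & \text{otherwise.} \end{cases}\end{equation*}

Choosing \lambda = 1, \nu = -1 gives \lambda \geq 0 and g(\lambda,\nu) = 1 > 0, so (T) is feasible and (S) must be infeasible. Indeed, x \leq -1 and x = 0 cannot hold simultaneously.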

In this lecture, we assume the following

  • the f_i's are convex
  • the h_i's are affine; stacked together, they read h(x) = Ax-b
  • \exists x \in \text{relint}(\mathcal{D}) such that Ax=b

In that case, we write (S) as

(4)   \begin{equation*}S = \lbrace f_i(x) \leq 0, i = 1, \ldots, m ; Ax= b \rbrace\end{equation*}

Thanks to the three conditions above, we can strengthen the weak alternatives so that they form strong alternatives. That is to say

  • (T) is feasible \iff (S) is infeasible.
  • (S) is feasible \iff (T) is infeasible.

Strong alternatives are indeed stronger: unlike with weak alternatives, if we know that one of the sets (S) or (T) is infeasible, then the other has to be feasible. Equivalently, exactly one of the two sets is feasible.
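
As a quick sanity check of the convex case (again a toy example of my own), take f_1(x) = x^2 - 1, which is convex, together with the affine constraint x = 2 over \mathcal{D} = \mathbb{R}; the relative-interior condition holds trivially. Here (S) is infeasible, since x^2 \leq 1 forces |x| \leq 1. Strong alternatives then guarantee a certificate in (T), and indeed, for \lambda > 0,

\begin{equation*}g(\lambda,\nu) = \inf_{x\in\mathbb{R}} \big\lbrace \lambda(x^2-1) + \nu(x-2) \big\rbrace = -\frac{\nu^2}{4\lambda} - \lambda - 2\nu,\end{equation*}

so the choice (\lambda,\nu) = (1,-4) yields g(1,-4) = 3 > 0.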

In my YouTube lecture, I give two applications relating to linear inequalities and the intersection of ellipsoids.
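
To make the linear-inequality application concrete, here is a minimal numerical sketch of my own (not taken from the lecture); it assumes NumPy and SciPy's linprog, and the data A, b below are made up. For the Farkas-type pair of alternatives Ax \leq b versus \lambda \geq 0, A^T\lambda = 0, b^T\lambda < 0, exactly one system is feasible, and both sides can be checked with off-the-shelf linear programming:

import numpy as np
from scipy.optimize import linprog

# Made-up data: x_1 <= -1 and -x_1 <= -1 together are infeasible.
A = np.array([[ 1.0, 0.0],
              [-1.0, 0.0]])
b = np.array([-1.0, -1.0])
m, n = A.shape

# (S): feasibility of A x <= b, posed as an LP with a zero objective.
res_S = linprog(c=np.zeros(n), A_ub=A, b_ub=b,
                bounds=[(None, None)] * n, method="highs")

# (T): minimize b^T lambda subject to A^T lambda = 0, lambda >= 0,
# with the normalization sum(lambda) = 1 to rule out lambda = 0.
# A strictly negative optimum certifies that (S) is infeasible.
res_T = linprog(c=b,
                A_eq=np.vstack([A.T, np.ones((1, m))]),
                b_eq=np.append(np.zeros(n), 1.0),
                bounds=[(0, None)] * m, method="highs")

print("(S) feasible:", res_S.status == 0)
print("(T) feasible:", res_T.status == 0 and res_T.fun < 0)

On this data, the first LP reports infeasibility while the second returns \lambda = (1/2, 1/2) with b^T\lambda = -1 < 0, exactly as the strong alternative predicts. The normalization \sum_i \lambda_i = 1 is harmless, because any certificate can be rescaled to satisfy it.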

