Log Barrier Minimization: An Intuitive Approach

This post deals with the minimization of the log barrier function, defined as:

    \[f(x) = - \sum\limits_{i=1}^m \log (b_i - a_i^T x)\]

where x \in \text{dom}(f), with the domain defined as

    \[\text{dom}(f) = \lbrace x \mid b_i > a_i^T x, \quad i = 1 \ldots m \rbrace\]
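
As a quick numerical check (a minimal NumPy sketch; the matrix A, vector b, and test points below are arbitrary illustrative choices, not taken from this post), the barrier can be evaluated directly, returning infinity for any x outside \text{dom}(f):

    import numpy as np

    def log_barrier(x, A, b):
        """Evaluate f(x) = -sum_i log(b_i - a_i^T x); +inf outside dom(f)."""
        slack = b - A @ x          # slack_i = b_i - a_i^T x
        if np.any(slack <= 0):     # at least one strict inequality is violated
            return np.inf
        return -np.sum(np.log(slack))

    # Toy polyhedron (hypothetical data): the box -1 < x_1 < 1, -1 < x_2 < 1
    A = np.array([[ 1.0,  0.0],
                  [-1.0,  0.0],
                  [ 0.0,  1.0],
                  [ 0.0, -1.0]])
    b = np.ones(4)

    print(log_barrier(np.array([0.0, 0.0]), A, b))   # 0.0, strictly inside
    print(log_barrier(np.array([2.0, 0.0]), A, b))   # inf, outside dom(f)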



We shall attempt an intuitive explanation of the solution to the following unconstrained minimization problem

    \[x^* = \arg\min_{x \in \text{dom}(f)} f(x)\]
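
Any descent method for this unconstrained problem needs the derivatives of f. Differentiating the definition above (a standard computation) gives

    \[\nabla f(x) = \sum\limits_{i=1}^m \frac{a_i}{b_i - a_i^T x}, \qquad \nabla^2 f(x) = \sum\limits_{i=1}^m \frac{a_i a_i^T}{(b_i - a_i^T x)^2}\]

so the optimality condition \nabla f(x^*) = 0 balances the reciprocal slacks 1/(b_i - a_i^T x^*) across all hyperplanes, and the Hessian is positive semidefinite, confirming that f is convex on its domain.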



First off, since f(x) is only defined on \text{dom}(f), the optimal solution of the problem \min_{x \in \text{dom}(f)} f(x) cannot lie outside the region described by the inequalities; we have no choice but to keep x^* strictly inside \text{dom}(f). Moreover, if x^* gets close to one of the hyperplanes, say the k^{th} one a_k^T x = b_k, then the term -\log (b_k - a_k^T x) blows up to infinity, so such a point cannot be the minimizer. The best compromise is therefore to place x^* as far as possible from all the hyperplanes a_i^T x = b_i; this point is termed the analytic center of the inequalities b_i > a_i^T x for all i = 1 \ldots m. See the above video for a nice illustration.
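
To make the picture concrete, here is a minimal damped-Newton sketch (the data A, b, the starting point, and the step/tolerance parameters are all hypothetical choices, not from this post) that computes the analytic center of a small set of inequalities; the backtracking step keeps every iterate strictly inside the polyhedron, precisely because any move toward a hyperplane sends the barrier toward infinity:

    import numpy as np

    def analytic_center(A, b, x0, tol=1e-8, max_iter=50):
        """Damped Newton iterations on f(x) = -sum_i log(b_i - a_i^T x).

        Assumes x0 is strictly feasible, i.e. A @ x0 < b componentwise.
        """
        x = x0.astype(float)
        for _ in range(max_iter):
            slack = b - A @ x                         # b_i - a_i^T x > 0
            grad = A.T @ (1.0 / slack)                # sum_i a_i / slack_i
            hess = A.T @ np.diag(1.0 / slack**2) @ A  # sum_i a_i a_i^T / slack_i^2
            step = np.linalg.solve(hess, grad)        # Newton direction
            # Backtracking: shrink the step until it stays strictly feasible
            # and actually decreases the barrier value.
            t, f_x = 1.0, -np.sum(np.log(slack))
            while True:
                x_new = x - t * step
                slack_new = b - A @ x_new
                if np.all(slack_new > 0) and -np.sum(np.log(slack_new)) < f_x:
                    break
                t *= 0.5
                if t < 1e-12:      # no descent possible; treat as converged
                    return x
            x = x_new
            if np.linalg.norm(t * step) < tol:
                break
        return x

    # Same toy box as above: -1 < x_1 < 1, -1 < x_2 < 1
    A = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
    b = np.ones(4)
    print(analytic_center(A, b, x0=np.array([0.5, -0.3])))  # ~[0, 0]

Newton's method is a natural choice here because the gradient and Hessian written above are cheap to form; for this toy box the iterates converge to the origin, the point equidistant from all four hyperplanes.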

