Convex Optimization

Definitions

  • Convex Set

    A set \(\mathcal X\) is called a convex set if for any \(x^{(1)}, x^{(2)} \in \mathcal X\) and any \(\alpha \in [0,1]\), the point \(\alpha x^{(1)} + (1 - \alpha)x^{(2)} \in \mathcal X\).

  • Convex Function

    A function \(f: \R^N \mapsto \R\) is convex if \(dom(f)\) is a convex set and \[ f(\alpha x^{(1)} + (1 - \alpha)x^{(2)}) \le \alpha f(x^{(1)}) + (1 - \alpha)f(x^{(2)}), \forall x^{(1)}, x^{(2)} \in dom(f), \alpha \in [0,1] \] \(f\) is concave if \(-f\) is convex.
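The defining inequality is easy to spot-check numerically. A minimal sketch, using \(f(x) = \lVert x \rVert_2^2\) (a standard convex function) and randomly sampled points and mixing weights:

```python
import numpy as np

# Spot-check the convexity inequality for f(x) = ||x||_2^2:
#   f(a*x1 + (1-a)*x2) <= a*f(x1) + (1-a)*f(x2)
def f(x):
    return float(np.dot(x, x))

rng = np.random.default_rng(0)
for _ in range(1000):
    x1, x2 = rng.normal(size=3), rng.normal(size=3)
    a = rng.uniform()
    lhs = f(a * x1 + (1 - a) * x2)
    rhs = a * f(x1) + (1 - a) * f(x2)
    assert lhs <= rhs + 1e-12, "convexity inequality violated"

print("inequality holds on all sampled points")
```

Sampling can only provide evidence, not a proof; the analytic conditions below are what actually establish convexity.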

A differentiable function \(f\) with a convex domain is convex if and only if \[ f(y) \ge f(x) + \nabla f(x)^T(y - x), \forall x, y \in dom(f) \] \(f\) is twice differentiable if \(\nabla^2f(x)_{ij} = \frac{\partial^2f(x)}{\partial x_i\partial x_j}\) exists at each \(x \in dom(f)\). A twice differentiable \(f\) with a convex domain is convex if and only if, for all \(x \in dom(f)\),

\[ \nabla^2f(x) \succeq 0 \]
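The second-order condition can be probed numerically as well. A sketch, assuming the log-sum-exp function \(f(x) = \log\sum_i e^{x_i}\) (a standard convex function) and a central-difference Hessian of my own construction:

```python
import numpy as np

# Second-order test: f is convex iff its Hessian is positive
# semidefinite everywhere on its (convex) domain.  We spot-check
# log-sum-exp by forming a finite-difference Hessian and inspecting
# its eigenvalues at random points.
def f(x):
    return np.log(np.sum(np.exp(x)))

def numerical_hessian(f, x, h=1e-4):
    n = x.size
    H = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.zeros(n), np.zeros(n)
            e_i[i], e_j[j] = h, h
            # central-difference approximation of d^2 f / dx_i dx_j
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * h * h)
    return H

rng = np.random.default_rng(1)
for _ in range(20):
    x = rng.normal(size=3)
    eigvals = np.linalg.eigvalsh(numerical_hessian(f, x))
    # PSD up to finite-difference error
    assert eigvals.min() > -1e-6

print("Hessian is PSD at all sampled points")
```

The smallest eigenvalue of the log-sum-exp Hessian is exactly zero (along the all-ones direction), so the tolerance only needs to absorb finite-difference noise.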

Theorem: Any local minimum of a convex function \(f: \R^N \mapsto \R\) is also a global minimum.

Proof:

Suppose \(y\) is a local minimum. Then there exists a number \(r > 0\) such that \(f(x) \ge f(y)\) for all \(x \in dom(f)\) with \(||x-y||_2 \le r\). Suppose, for contradiction, that \(y\) is not a global minimum, i.e. there exists \(z \in dom(f)\) with \(f(z) < f(y)\). Then \(z\) must lie outside the ball of radius \(r\) around \(y\), since otherwise the local-minimum property would give \(f(z) \ge f(y)\). Hence \[ \begin{gather} ||y-z||_2 > r \\ 0 < \frac{r}{||y-z||_2} < 1 \end{gather} \] Pick any \(0 < \epsilon < r\) and set \(\theta = 1 - \frac{r - \epsilon}{||y-z||_2}\), so that \(0 < \theta < 1\). Then the distance from \(y\) to the point \(\theta y + (1 - \theta)z\) is \[ ||y - (\theta y + (1 - \theta)z)||_2 = (1 - \theta)||y-z||_2 = \frac{r - \epsilon}{||y-z||_2}||y-z||_2 = r - \epsilon < r \] so this point lies inside the ball and therefore \[ f(\theta y + (1 - \theta)z) \ge f(y) \] However, since \(f\) is convex and \(0 < \theta < 1\), \[ f(\theta y + (1 - \theta)z) \le \theta f(y) + (1 - \theta)f(z) \] and since \(f(z) < f(y)\), \[ f(\theta y + (1 - \theta)z) \le \theta f(y) + (1 - \theta)f(z) < \theta f(y) + (1 - \theta)f(y) = f(y) \] which contradicts the inequality \(f(\theta y + (1 - \theta)z) \ge f(y)\) derived above. Hence no such \(z\) exists, and \(y\) is a global minimum.
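The theorem is what makes local descent methods trustworthy on convex problems: any minimizer they find is the global one. A minimal sketch, using plain gradient descent on the convex quadratic \(f(x) = \lVert x - c \rVert_2^2\) from several starting points:

```python
import numpy as np

# For a convex f, descent from any starting point reaches the same
# (global) minimum.  Here f(x) = ||x - c||^2 with gradient 2(x - c).
c = np.array([1.0, -2.0])

def grad(x):
    return 2 * (x - c)

minima = []
for x0 in [np.array([10.0, 10.0]), np.array([-5.0, 3.0]), np.zeros(2)]:
    x = x0
    for _ in range(200):
        x = x - 0.1 * grad(x)  # fixed-step gradient descent
    minima.append(x)

# All runs land on the unique global minimizer c.
for x in minima:
    assert np.allclose(x, c, atol=1e-6)
print("all starts converged to", minima[0])
```

With step size 0.1 the iterates contract toward \(c\) by a factor 0.8 per step, so 200 iterations are far more than enough here.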

Standard Form

The standard form of an optimization problem is \[ \begin{aligned} &\min f_0(x) \\ s.t.\quad &f_i(x) \le 0, \quad i = 1,\dots,r \\ &h_i(x) = 0, \quad i = 1,\dots,s \\ \end{aligned} \] The standard form of a convex optimization problem is \[ \begin{aligned} &\min f_0(x) \\ s.t.\quad &f_i(x) \le 0, \quad i = 1,\dots,r \\ &a_i^Tx = b_i \;(Ax = b), \quad i = 1,\dots,s \\ &\text{$f_0,\dots,f_r$ are convex, and $a_1,\dots,a_s \in \R^N$} \end{aligned} \]
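A small instance of this standard form can be solved with an off-the-shelf solver. A sketch, assuming SciPy is available; note that SciPy's `'ineq'` constraint convention is \(g(x) \ge 0\), so each standard-form constraint \(f_i(x) \le 0\) is passed as \(-f_i(x) \ge 0\):

```python
import numpy as np
from scipy.optimize import minimize

# A convex problem in standard form:
#   minimize   f0(x) = x1^2 + x2^2
#   s.t.       f1(x) = 1 - x1 <= 0       (i.e. x1 >= 1)
#              h1(x) = x1 + x2 - 3 = 0
res = minimize(
    fun=lambda x: x[0]**2 + x[1]**2,
    x0=np.array([0.0, 0.0]),
    method="SLSQP",
    constraints=[
        {"type": "ineq", "fun": lambda x: x[0] - 1.0},       # -f1(x) >= 0
        {"type": "eq",   "fun": lambda x: x[0] + x[1] - 3.0},  # h1(x) = 0
    ],
)
print(res.x)  # analytically, the optimum is x = (1.5, 1.5)
```

Since the problem is convex, by the theorem above the solver's local solution is the global one.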
