
Optimization inequalities cheatsheet

Most proofs in optimization consist in using inequalities for a particular function class in some creative way. This is a cheatsheet with the inequalities that I use most often. It considers classes of functions that are convex, strongly convex, and $L$-smooth.

Setting. $f$ is a function $\mathbb{R}^p \to \mathbb{R}$. Below is a set of inequalities that hold when $f$ belongs to a particular class of functions and $x, y \in \mathbb{R}^p$ are arbitrary elements in its domain. For simplicity I assume that functions are differentiable, but most of these remain true when the gradient is replaced by a subgradient.

$f$ is $L$-smooth. This is the class of functions that are differentiable and whose gradient is Lipschitz continuous with constant $L$.
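In symbols, for some $L > 0$:

$$\|\nabla f(x) - \nabla f(y)\| \leq L\|x - y\|$$

$$|f(y) - f(x) - \langle \nabla f(x), y - x \rangle| \leq \frac{L}{2}\|x - y\|^2$$

The second inequality, a quadratic upper and lower bound on $f$, is sometimes called the descent lemma.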

$f$ is convex:
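For any $\lambda \in [0, 1]$, the definition of convexity, the first-order condition, and monotonicity of the gradient read:

$$f(\lambda x + (1 - \lambda) y) \leq \lambda f(x) + (1 - \lambda) f(y)$$

$$f(y) \geq f(x) + \langle \nabla f(x), y - x \rangle$$

$$\langle \nabla f(x) - \nabla f(y), x - y \rangle \geq 0$$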

$f$ is both $L$-smooth and convex:
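Combining both assumptions tightens the previous bounds:

$$0 \leq f(y) - f(x) - \langle \nabla f(x), y - x \rangle \leq \frac{L}{2}\|x - y\|^2$$

$$f(y) \geq f(x) + \langle \nabla f(x), y - x \rangle + \frac{1}{2L}\|\nabla f(x) - \nabla f(y)\|^2$$

$$\langle \nabla f(x) - \nabla f(y), x - y \rangle \geq \frac{1}{L}\|\nabla f(x) - \nabla f(y)\|^2$$

The last inequality is known as co-coercivity of the gradient.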

$f$ is $\mu$-strongly convex. This is the set of functions $f$ such that $f - \frac{\mu}{2}\|\cdot\|^2$ is convex; with $\mu = 0$ it reduces to the set of convex functions. Here $x^*$ denotes the minimizer of $f$.
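The convexity inequalities improve by a quadratic term:

$$f(y) \geq f(x) + \langle \nabla f(x), y - x \rangle + \frac{\mu}{2}\|x - y\|^2$$

$$f(x) - f(x^*) \geq \frac{\mu}{2}\|x - x^*\|^2$$

$$\|\nabla f(x)\|^2 \geq 2\mu \left( f(x) - f(x^*) \right)$$

The last one is the Polyak–Łojasiewicz inequality; for $\mu > 0$ it follows from minimizing both sides of the first inequality with respect to $y$.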

$f$ is both $L$-smooth and $\mu$-strongly convex.
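The gradient is then both strongly monotone and Lipschitz, and the two properties combine into a sharper bound:

$$\mu\|x - y\| \leq \|\nabla f(x) - \nabla f(y)\| \leq L\|x - y\|$$

$$\langle \nabla f(x) - \nabla f(y), x - y \rangle \geq \frac{\mu L}{\mu + L}\|x - y\|^2 + \frac{1}{\mu + L}\|\nabla f(x) - \nabla f(y)\|^2$$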

References

Most of these inequalities appear in the book "Introductory Lectures on Convex Optimization: A Basic Course" by Nesterov (Springer Science & Business Media, 2013). Another good (and free) resource is the book "Convex Optimization" by Stephen Boyd and Lieven Vandenberghe.