# Author Archives: Sebastien Bubeck

## ORF523: Noisy oracles

Today we start the third (and last) part of our class, titled ‘Optimization and Randomness’. An important setting that we will explore is the one of optimization with noisy oracles. Noisy $0$th-order oracle: given $x \in \mathcal{X}$ it outputs $Y$ such that $\mathbb{E}[Y] = f(x)$. Noisy … Continue reading
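The post's definitions sit behind the cut, but the flavor of optimizing with a noisy first-order oracle can be sketched with stochastic gradient descent on a toy quadratic. Everything below (the function $f(x) = \|x\|^2/2$, the Gaussian noise model, the $1/\sqrt{t}$ step sizes) is my own illustrative choice, not taken from the lecture:

```python
import numpy as np

def noisy_gradient_oracle(x, rng, sigma=0.1):
    """Hypothetical noisy 1st-order oracle for f(x) = ||x||^2 / 2:
    returns g with E[g] = grad f(x) = x, corrupted by Gaussian noise."""
    return x + sigma * rng.standard_normal(x.shape)

def sgd(x0, steps=2000, sigma=0.1, seed=0):
    """Stochastic gradient descent with step size eta_t = 1/sqrt(t),
    returning the running average of the iterates."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = noisy_gradient_oracle(x, rng, sigma)
        x = x - g / np.sqrt(t)        # gradient step with decaying step size
        avg += (x - avg) / t          # online update of the iterate average
    return avg

x_star = sgd([3.0, -2.0])             # should end up near the minimizer 0
```

Averaging the iterates is what smooths out the oracle noise; a single final iterate would still carry noise of order sigma.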

## ORF523: Mirror Prox

Today I’m going to talk about a variant of Mirror Descent invented by Nemirovski in 2004 (see this paper). This new algorithm is called Mirror Prox and it is designed to minimize a smooth convex function over a compact convex … Continue reading
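As a rough illustration (not Nemirovski's original presentation), here is Mirror Prox with the Euclidean mirror map, where it reduces to the extragradient method: an exploratory gradient step, then a corrected step from the same base point. The test function, constraint set, and step size are my own choices:

```python
import numpy as np

def project_ball(x, radius=1.0):
    """Euclidean projection onto the ball of the given radius."""
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

def mirror_prox(grad, x0, eta, steps=200):
    """Mirror Prox with the Euclidean mirror map (extragradient):
    take a step using grad at x, then redo the step from x using the
    gradient at the intermediate point; output the average of the y's."""
    x = np.array(x0, dtype=float)
    avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        y = project_ball(x - eta * grad(x))   # exploratory step
        x = project_ball(x - eta * grad(y))   # corrected step
        avg += (y - avg) / t                  # average the intermediate iterates
    return avg

# minimize the smooth convex f(x) = ||x - c||^2 over the unit ball
c = np.array([2.0, 0.0])
grad = lambda x: 2.0 * (x - c)
x_hat = mirror_prox(grad, x0=np.zeros(2), eta=0.2)   # minimizer is (1, 0)
```

The extra gradient evaluation per iteration is what buys the improved $1/t$ rate for smooth problems, versus $1/\sqrt{t}$ for plain Mirror Descent.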

## COLT 2013 accepted papers

The accepted papers for COLT 2013 have just been revealed. You can also find the list below with a link to the arXiv version when I could find one (if you want me to add a link just send me … Continue reading

## ORF523: Mirror Descent, part II/II

We start with some of the standard setups for the Mirror Descent algorithm we described in the previous post. Standard setups for Mirror Descent. ‘Ball’ setup: the simplest version of Mirror Descent is obtained by taking $\Phi(x) = \frac{1}{2} \|x\|_2^2$ on $\mathcal{X} \subset \mathbb{R}^n$. The function $\Phi$ is … Continue reading

## ORF523: Mirror Descent, part I/II

In this post we are interested in minimizing a convex function $f$ over a compact convex set $\mathcal{X}$ under the assumption that $f$ is $L$-Lipschitz on $\mathcal{X}$ with respect to some arbitrary norm $\|\cdot\|$ (precisely we assume that $\|g\|_* \leq L$ for any $x \in \mathcal{X}$ and $g \in \partial f(x)$). As we have seen in this … Continue reading
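To make the setting concrete, here is a hedged sketch of Mirror Descent in the classical simplex setup, where the negative-entropy mirror map turns the update into a multiplicative one. The cost vector, step size, and iteration count below are illustrative choices of mine, not values from the post:

```python
import numpy as np

def mirror_descent_simplex(subgrad, n, eta, steps):
    """Mirror Descent over the probability simplex with the negative-entropy
    mirror map: the update is multiplicative, x_{t+1} proportional to
    x_t * exp(-eta * g_t), followed by renormalization."""
    x = np.full(n, 1.0 / n)          # uniform start (minimizer of entropy)
    avg = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = subgrad(x)
        x = x * np.exp(-eta * g)     # multiplicative-weights style update
        x /= x.sum()                 # 'projection' back onto the simplex
        avg += (x - avg) / t         # average the iterates
    return avg

# f(x) = <c, x> is 1-Lipschitz w.r.t. ||.||_1 since ||c||_inf <= 1
c = np.array([0.5, -0.2, 0.3])
x_hat = mirror_descent_simplex(lambda x: c, n=3, eta=0.1, steps=3000)
```

For a linear objective on the simplex the minimizer is the vertex with the smallest coefficient, and the iterates concentrate there exponentially fast.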

## ORF523: ISTA and FISTA

As we said in the previous lecture, it seems stupid to consider that we are in a black-box situation when in fact we know the function to be optimized entirely. Consider for instance the LASSO objective $\|Ax - b\|_2^2 + \lambda \|x\|_1$, which one wants to minimize over $x \in \mathbb{R}^n$. By … Continue reading
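A minimal ISTA sketch for the LASSO objective, assuming the standard soft-thresholding proximal operator for the $\ell_1$ term and a step size given by the Lipschitz constant of the gradient of the smooth part; the function names are mine:

```python
import numpy as np

def soft_threshold(v, tau):
    """Prox of tau * ||.||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, steps=500):
    """ISTA for min_x ||Ax - b||_2^2 + lam * ||x||_1:
    a gradient step on the smooth quadratic part, followed by the
    prox (soft-thresholding) of the nonsmooth l1 part."""
    L = 2.0 * np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = 2.0 * A.T @ (A @ x - b)       # gradient of ||Ax - b||^2
        x = soft_threshold(x - grad / L, lam / L)
    return x

# e.g. with A = identity the solution is soft_threshold(b, lam / 2)
```

FISTA keeps exactly the same two-step iteration but applies it at an extrapolated point, improving the rate from $1/t$ to $1/t^2$.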

## ORF523: Conditional Gradient Descent and Structured Sparsity

In the following table we summarize our findings from previous lectures in terms of oracle convergence rates (we denote $Q = \beta/\alpha$ for the condition number). Note that in the last two lines the upper bounds and lower bounds do not match. In both cases one can … Continue reading
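The structured-sparsity angle of Conditional Gradient Descent (Frank-Wolfe) can be illustrated over the $\ell_1$ ball, where the linear minimization oracle returns a signed vertex, so the iterate after $t$ steps has at most $t+1$ nonzero coordinates. This toy instance is my own, not from the lecture:

```python
import numpy as np

def frank_wolfe_l1(grad, n, radius, steps):
    """Conditional Gradient Descent over the l1 ball: each iteration solves
    a linear problem min_s <grad, s>, whose solution is a signed vertex
    (one nonzero coordinate), then moves toward it with step 2/(t+2)."""
    x = np.zeros(n)
    for t in range(steps):
        g = grad(x)
        i = np.argmax(np.abs(g))              # linear minimization oracle:
        s = np.zeros(n)
        s[i] = -radius * np.sign(g[i])        # vertex minimizing <g, s>
        gamma = 2.0 / (t + 2.0)               # standard step-size schedule
        x = (1 - gamma) * x + gamma * s       # convex combination, no projection
    return x

# minimize f(x) = ||x - c||^2 over the l1 ball of radius 1
c = np.array([2.0, 0.0, 0.0])
x_hat = frank_wolfe_l1(lambda x: 2.0 * (x - c), n=3, radius=1.0, steps=200)
```

No projection step is ever needed, which is the method's main appeal when projections onto the feasible set are expensive.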

## A new book on Concentration Inequalities by Boucheron, Lugosi and Massart

If you work in a field related to probability theory you most certainly have already come across the concentration of measure phenomenon: ‘a function that depends on many independent variables, but not too much on any one of them, is … Continue reading

## Guest post by Amir Ali Ahmadi: Sum of Squares (SOS) Techniques: An Introduction, Part II/II

The reader is encouraged to read Part I of this series before reading this post. For ease of reference, we recall the polynomial optimization problem (1) from our previous post: the task of minimizing a multivariate polynomial … Continue reading

## ORF523: Strong convexity

Today we will talk about another property of convex functions that can significantly speed up the convergence of first-order methods: strong convexity. We say that $f$ is $\alpha$-strongly convex if it satisfies $$f(x) - f(y) \leq \nabla f(x)^\top (x - y) - \frac{\alpha}{2} \|x - y\|_2^2 \qquad (1)$$ Of course this definition does not require differentiability … Continue reading
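Inequality (1) can be checked numerically on a quadratic $f(x) = \frac{1}{2} x^\top A x$, for which $\alpha$-strong convexity holds exactly with $\alpha = \lambda_{\min}(A)$. The sketch below (matrix, sample points, helper name all mine) evaluates the slack of (1) at random pairs of points:

```python
import numpy as np

def strong_convexity_gap(f, grad, x, y, alpha):
    """Slack in (1): f(x) - f(y) <= grad f(x).(x - y) - (alpha/2)||x - y||^2.
    A nonnegative return value means the inequality holds at (x, y)."""
    return grad(x) @ (x - y) - 0.5 * alpha * np.dot(x - y, x - y) - (f(x) - f(y))

# f(x) = 0.5 x^T A x is alpha-strongly convex with alpha = lambda_min(A)
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
alpha = np.linalg.eigvalsh(A).min()           # best possible alpha here

rng = np.random.default_rng(0)
gaps = [strong_convexity_gap(f, grad,
                             rng.standard_normal(2), rng.standard_normal(2),
                             alpha)
        for _ in range(100)]
min_gap = min(gaps)                           # should be >= 0 up to rounding
```

For this quadratic the slack works out to $\frac{1}{2}(x-y)^\top (A - \alpha I)(x-y)$, which is nonnegative precisely because $\alpha$ is the smallest eigenvalue of $A$.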