# Category Archives: Optimization

## Guest post by Dan Garber and Elad Hazan: The Conditional Gradient Method, A Linearly Convergent Algorithm – Part I/II

In a previous post Sebastien presented and analysed the conditional gradient method for minimizing a smooth convex function \(f\) over a compact and convex domain \(\mathcal{X}\). The update step of the method is as follows, \(x_{t+1} = x_t + \gamma_t (y_t - x_t)\), where \(y_t \in \arg\min_{y \in \mathcal{X}} \nabla f(x_t)^{\top} y\), \(\gamma_t \in [0,1]\). The … Continue reading
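The update step above fits in a few lines of code. Here is a minimal sketch (not the authors' implementation), assuming the classical step size \(\gamma_t = 2/(t+1)\) and, for illustration, a hypothetical least-squares objective over the \(\ell_1\) ball, whose linear minimization oracle returns a signed coordinate vertex:

```python
import numpy as np

def frank_wolfe(grad, linear_oracle, x0, n_steps):
    """Conditional gradient sketch: x_{t+1} = x_t + gamma_t * (y_t - x_t)."""
    x = x0.astype(float)
    for t in range(1, n_steps + 1):
        y = linear_oracle(grad(x))   # y_t minimizes <grad f(x_t), y> over the domain
        gamma = 2.0 / (t + 1)        # classical step size; line search also works
        x = x + gamma * (y - x)
    return x

def l1_ball_oracle(g):
    """Linear minimization over {||y||_1 <= 1}: pick a signed coordinate vertex."""
    i = np.argmax(np.abs(g))
    y = np.zeros_like(g)
    y[i] = -np.sign(g[i])
    return y

# Hypothetical data: minimize 0.5 * ||Ax - b||^2 over the l1 ball.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([0.6, -0.4, 0.0, 0.0, 0.0])  # sparse target with l1 norm 1
x = frank_wolfe(lambda z: A.T @ (A @ z - b), l1_ball_oracle, np.zeros(5), 2000)
```

Since every iterate is a convex combination of points of the domain, the method is projection-free: feasibility is automatic.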

## The aftermath of ORF523 and the final

It has been two weeks since my last post on the blog, and I have to admit that it felt really good to take a break from my 2-posts-per-week regime of the previous months. Now that ORF523 is over the … Continue reading

## ORF523: Optimization with bandit feedback

In this last lecture we consider the case where one can only access the function with a noisy zeroth-order oracle (see this lecture for a definition). This assumption models the ‘physical reality’ of many practical problems (in contrast to … Continue reading
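A standard device in this setting (a generic sketch, not necessarily the estimator used in the lecture) is to build a gradient estimate from two function evaluations along a random direction on the unit sphere:

```python
import numpy as np

def two_point_gradient_estimate(f, x, delta, u):
    """Estimate grad f(x) from two zeroth-order queries along direction u
    (u uniform on the unit sphere); unbiased for a smoothed version of f."""
    n = x.shape[0]
    return (n / (2.0 * delta)) * (f(x + delta * u) - f(x - delta * u)) * u

# Sanity check on f(x) = 0.5 * ||x||^2, whose gradient is x: averaging the
# estimator over many random directions should recover the gradient.
rng = np.random.default_rng(1)
x = np.array([1.0, -2.0, 0.5])
est = np.zeros(3)
n_samples = 20000
for _ in range(n_samples):
    u = rng.standard_normal(3)
    u /= np.linalg.norm(u)
    est += two_point_gradient_estimate(lambda z: 0.5 * z @ z, x, 1e-3, u)
est /= n_samples
```

In the bandit setting one would plug such an estimate into a gradient-based method, paying for the extra variance in the convergence rate.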

## ORF523: Acceleration by randomization for a sum of smooth and strongly convex functions

In this lecture we are interested in the following problem: minimize \(\frac{1}{m}\sum_{i=1}^m f_i(x)\), where each \(f_i\) is \(\beta\)-smooth and \(\alpha\)-strongly convex. We denote by \(Q = \beta/\alpha\) the condition number of these functions (which is also the condition number of the average). Using a gradient descent … Continue reading
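One randomized scheme for such finite sums keeps the most recent gradient of every summand and steps along their running average. The following is a sketch of the Stochastic Average Gradient (SAG) idea on a hypothetical least-squares instance; the details (step size, variant) may differ from the lecture:

```python
import numpy as np

def sag(grad_i, m, x0, step, n_iters, rng):
    """Stochastic Average Gradient sketch for min_x (1/m) * sum_i f_i(x):
    refresh one stored gradient per iteration, step along the average."""
    x = x0.astype(float)
    table = np.zeros((m, x0.shape[0]))  # last seen gradient of each f_i
    avg = np.zeros_like(x)
    for _ in range(n_iters):
        i = rng.integers(m)
        g = grad_i(i, x)
        avg += (g - table[i]) / m       # O(d) update of the running average
        table[i] = g
        x = x - step * avg
    return x

# Hypothetical data: least squares with f_i(x) = 0.5 * (a_i . x - b_i)^2.
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
step = 1.0 / (16 * max(np.sum(A**2, axis=1)))  # conservative: max row smoothness
x = sag(lambda i, z: (A[i] @ z - b[i]) * A[i], 20, np.zeros(5), step, 40000, rng)
```

The point of the gradient table is that each iteration costs one gradient of a single \(f_i\), yet the step direction converges to the full gradient.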

Posted in Optimization | 2 Comments

## ORF523: Noisy oracles

Today we start the third (and last) part of our class, titled ‘Optimization and Randomness’. An important setting that we will explore is the one of optimization with noisy oracles: Noisy zeroth-order oracle: given \(x\) it outputs \(\tilde{f}(x)\) such that \(\mathbb{E}\,\tilde{f}(x) = f(x)\). Noisy … Continue reading

Posted in Optimization | 1 Comment

## ORF523: Mirror Prox

Today I’m going to talk about a variant of Mirror Descent invented by Nemirovski in 2004 (see this paper). This new algorithm is called Mirror Prox, and it is designed to minimize a smooth convex function over a compact convex … Continue reading
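With the Euclidean mirror map, Mirror Prox reduces to the extragradient method; here is a minimal sketch under that assumption, on a toy projection problem:

```python
import numpy as np

def extragradient(grad, project, x0, step, n_iters):
    """Euclidean Mirror Prox (extragradient): step to a look-ahead point y,
    then update x using the gradient evaluated at y."""
    x = x0.astype(float)
    for _ in range(n_iters):
        y = project(x - step * grad(x))  # look-ahead (prox) point
        x = project(x - step * grad(y))  # actual step uses grad at y
    return x

def project_ball(v):
    """Euclidean projection onto the unit ball."""
    norm = np.linalg.norm(v)
    return v if norm <= 1 else v / norm

# Toy example: minimize 0.5 * ||x - c||^2 over the unit ball; when c lies
# outside the ball the minimizer is the boundary point c / ||c||.
c = np.array([2.0, 0.0])
x = extragradient(lambda z: z - c, project_ball, np.zeros(2), 0.5, 100)
```

The extra gradient evaluation per iteration is what buys the improved rate for smooth problems compared to plain Mirror Descent.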

Posted in Optimization | 2 Comments

## ORF523: Mirror Descent, part II/II

We start with some of the standard setups for the Mirror Descent algorithm we described in the previous post. Standard setups for Mirror Descent. ‘Ball’ setup. The simplest version of Mirror Descent is obtained by taking \(\Phi(x) = \frac{1}{2}\|x\|_2^2\) on \(\mathcal{D} = \mathbb{R}^n\). The function \(\Phi\) is … Continue reading
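Besides the ‘Ball’ setup, another standard setup takes the negative entropy as mirror map on the simplex, which turns Mirror Descent into a multiplicative update. A minimal sketch of that simplex setup (an illustration, not the post's code):

```python
import numpy as np

def exponentiated_gradient(grad, x0, step, n_iters):
    """Mirror Descent with the negative-entropy mirror map on the simplex:
    multiplicative weights update followed by renormalization."""
    x = x0.astype(float)
    for _ in range(n_iters):
        x = x * np.exp(-step * grad(x))
        x /= x.sum()
    return x

# Toy example: minimize the linear function c . x over the simplex; the mass
# should concentrate on the smallest coordinate of c (here index 1).
c = np.array([1.0, 0.5, 2.0])
x0 = np.ones(3) / 3
x = exponentiated_gradient(lambda z: c, x0, 0.1, 1000)
```

With the ‘Ball’ setup \(\Phi(x) = \frac{1}{2}\|x\|_2^2\), the same template reduces to ordinary projected subgradient descent.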

Posted in Optimization | 4 Comments

## ORF523: Mirror Descent, part I/II

In this post we are interested in minimizing a convex function \(f\) over a compact convex set \(\mathcal{X}\) under the assumption that \(f\) is \(L\)-Lipschitz on \(\mathcal{X}\) with respect to some arbitrary norm \(\|\cdot\|\) (precisely we assume that \(\|g\|_* \le L\) for all \(g \in \partial f(x)\), \(x \in \mathcal{X}\)). As we have seen in this … Continue reading

Posted in Optimization | 6 Comments

## ORF523: ISTA and FISTA

As we said in the previous lecture, it seems stupid to consider that we are in a black-box situation when in fact we know the function to be optimized entirely. Consider for instance the LASSO objective, where one wants to minimize \(\frac{1}{2}\|Ax - b\|_2^2 + \lambda\|x\|_1\) over \(x \in \mathbb{R}^n\). By … Continue reading
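For the LASSO, the ISTA step is a gradient step on the smooth quadratic part followed by soft-thresholding (the proximal operator of the \(\ell_1\) norm). A minimal sketch assuming step size \(1/L\) with \(L = \|A\|_2^2\), on hypothetical sparse-recovery data:

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, n_iters):
    """ISTA for min_x 0.5 * ||Ax - b||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2  # smoothness constant of the quadratic part
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        x = soft_threshold(x - A.T @ (A @ x - b) / L, lam / L)
    return x

# Hypothetical data: recover a sparse vector from clean measurements,
# with a small regularization parameter so the bias is negligible.
rng = np.random.default_rng(3)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.0, -0.5]
x = ista(A, A @ x_true, lam=0.01, n_iters=2000)
```

FISTA keeps the same per-iteration cost but adds a momentum term on top of this update to accelerate the convergence rate.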

Posted in Optimization | 5 Comments

## ORF523: Conditional Gradient Descent and Structured Sparsity

In the following table we summarize our findings from previous lectures in terms of oracle convergence rates. Note that in the last two lines the upper and lower bounds do not match. In both cases one can … Continue reading

Posted in Optimization | 1 Comment