# Category Archives: Optimization

## Guest post by Amir Ali Ahmadi: Sum of Squares (SOS) Techniques: An Introduction, Part I/II

Sum of squares optimization is an active area of research at the interface of algorithmic algebra and convex optimization. Over the last decade, it has made a significant impact on both discrete and continuous optimization, as well as several other disciplines, … Continue reading
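As a toy illustration of the idea (my example, not from the post): writing a polynomial as a sum of squares is an algebraic certificate of its global nonnegativity, and searching for such a decomposition can be cast as a semidefinite program. For instance,

```latex
\[
x^4 - 2x^3 + 2x^2 - 2x + 1 \;=\; (x^2 - x)^2 + (x - 1)^2 \;\ge\; 0
\qquad \text{for all } x \in \mathbb{R}.
\]
```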

## ORF523: Nesterov’s Accelerated Gradient Descent

In this lecture we consider the same setting as in the previous post (that is, we want to minimize a smooth convex function over $\mathbb{R}^n$). Previously we saw that the plain Gradient Descent algorithm has a rate of convergence of … Continue reading
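As a rough illustration (my sketch, not the lecture's code), Nesterov's scheme adds a momentum/extrapolation step on top of the plain gradient step. The quadratic objective, the step size $1/L$, and the iteration count below are assumptions for the toy example:

```python
import numpy as np

def nesterov_agd(grad, x0, L, n_iters=200):
    """Nesterov's accelerated gradient descent with the standard
    momentum schedule lam_{k+1} = (1 + sqrt(1 + 4 lam_k^2)) / 2."""
    x, y = x0.copy(), x0.copy()
    lam = 1.0
    for _ in range(n_iters):
        lam_next = (1 + np.sqrt(1 + 4 * lam**2)) / 2
        gamma = (lam - 1) / lam_next
        x_next = y - grad(y) / L            # gradient step from the extrapolated point
        y = x_next + gamma * (x_next - x)   # momentum (extrapolation) step
        x, lam = x_next, lam_next
    return x

# Toy smooth convex objective f(x) = 0.5 x^T A x - b^T x (an assumed example).
A = np.array([[3.0, 1.0], [1.0, 2.0]])      # positive definite, so f is smooth and convex
b = np.array([1.0, -1.0])
grad = lambda x: A @ x - b
L = np.linalg.eigvalsh(A).max()             # smoothness constant = largest eigenvalue

x_hat = nesterov_agd(grad, np.zeros(2), L)
x_star = np.linalg.solve(A, b)              # exact minimizer, for comparison
```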

## ORF523: Oracle complexity of smooth convex functions

In this lecture we consider the unconstrained minimization of a function $f$ that satisfies the following requirements: (i) $f$ admits a minimizer on $\mathbb{R}^n$. (ii) $f$ is continuously differentiable and convex on $\mathbb{R}^n$. (iii) $f$ is smooth in the sense that the gradient mapping … Continue reading

## ORF523: Oracle complexity of Lipschitz convex functions

In this lecture we consider the unconstrained minimization of a function $f$ that satisfies the following requirements: (i) $f$ admits a minimizer on $\mathbb{R}^n$. (ii) $f$ is convex on $\mathbb{R}^n$. Recall that in particular this implies $f(y) \ge f(x) + g^\top (y - x)$ for any $x, y$ and any subgradient $g \in \partial f(x)$. (iii) $f$ is … Continue reading

## ORF523: Oracle complexity, large-scale optimization

Our study of the computational complexity of mathematical optimization led us to realize that we are still far from a complete understanding of what exactly can be optimized with a computer. On the positive side, we proved that some problems can be … Continue reading

## ORF523: Classification, SVM, Kernel Learning

In this post I will go over some simple examples of applications of the optimization techniques that we have seen in the previous lectures. The general theme of these examples is the canonical Machine Learning problem of classification. Since the … Continue reading
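As a minimal sketch of the classification setup (my toy code, not the lecture's; the data, regularization $\lambda$, and step size are assumptions), one can train a linear soft-margin SVM by subgradient descent on the regularized hinge loss:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, size=(20, 2)),   # class -1 cluster
               rng.normal(+2.0, 0.5, size=(20, 2))])  # class +1 cluster
y = np.array([-1.0] * 20 + [1.0] * 20)

# Objective: (1/n) sum_i max(0, 1 - y_i (w.x_i + b)) + (lam/2) ||w||^2
w, b = np.zeros(2), 0.0
lam, eta, n = 0.01, 0.1, len(X)
for _ in range(500):
    active = y * (X @ w + b) < 1                      # points violating the margin
    grad_w = lam * w - (y[active, None] * X[active]).sum(axis=0) / n
    grad_b = -y[active].sum() / n
    w -= eta * grad_w
    b -= eta * grad_b

acc = float((np.sign(X @ w + b) == y).mean())         # training accuracy
```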

## ORF523: Lagrangian duality

In this post I review some very basic facts about Lagrangian duality. The presentation is extracted from Chapter 5 of Boyd and Vandenberghe. We consider the following problem: let $\mathcal{D}$ be the non-empty domain of definition for this problem (that is, the … Continue reading
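For reference, the basic objects in Boyd and Vandenberghe's treatment (standard notation, restated here rather than copied from the post) are:

```latex
\[
\begin{aligned}
&\text{primal:} && \min_{x}\; f_0(x) \quad \text{s.t.}\; f_i(x) \le 0,\; i = 1, \dots, m,\\
&\text{Lagrangian:} && L(x, \lambda) = f_0(x) + \sum_{i=1}^{m} \lambda_i f_i(x),\\
&\text{dual:} && \max_{\lambda \ge 0}\; g(\lambda), \qquad g(\lambda) = \inf_{x \in \mathcal{D}} L(x, \lambda),
\end{aligned}
\]
```

and weak duality, $g(\lambda) \le f_0(x)$, holds for every primal-feasible $x$ and every $\lambda \ge 0$.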

## ORF523: IPMs for LPs and SDPs

We have seen that, roughly, the complexity of Interior Point Methods to optimize linear functions on a convex set (with non-empty interior) equipped with a $\nu$-self-concordant barrier is of order $O(\sqrt{\nu}\, N \log(1/\epsilon))$, where $N$ is the complexity of computing the Newton direction (which can … Continue reading
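To make the ingredients concrete, here is a hypothetical toy sketch (not the lecture's code): a log-barrier path-following method for the trivial LP $\min c^\top x$ over the box $[0,1]^2$, whose barrier $-\sum_i [\log x_i + \log(1 - x_i)]$ is self-concordant with $\nu = 2n$. The schedule $t \leftarrow 10t$ and iteration counts are assumptions:

```python
import numpy as np

def barrier_lp(c, t0=1.0, mu=10.0, t_max=1e6, newton_iters=50):
    """Minimize c^T x over the box [0,1]^n by following the central path
    of the log-barrier  -sum_i [log(x_i) + log(1 - x_i)]."""
    x = np.full(len(c), 0.5)                  # strictly feasible start (box center)
    t = t0
    while t < t_max:
        for _ in range(newton_iters):         # centering with damped Newton steps
            g = t * c - 1.0 / x + 1.0 / (1.0 - x)
            h = 1.0 / x**2 + 1.0 / (1.0 - x)**2   # diagonal Hessian
            step = g / h
            alpha = 1.0
            while np.any((x - alpha * step <= 0) | (x - alpha * step >= 1)):
                alpha *= 0.5                  # backtrack to stay strictly inside the box
            x = x - alpha * step
        t *= mu                               # increase t: follow the central path
    return x

c = np.array([1.0, 2.0])                      # the LP optimum is x* = (0, 0)
x_hat = barrier_lp(c)
```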

## ORF523: Interior Point Methods

Khachiyan’s ellipsoid method was such a breakthrough that it made it to The New York Times in 1979. The next time that optimization theory was featured in The New York Times was only a few years later, in 1984, and … Continue reading

## ORF523: LP, SDP, and Conic Programming

Recall from the previous lecture that we can ‘solve’ the problem (with a convex body in $\mathbb{R}^n$ and a convex function) in ‘time’ proportional to the computational complexity of finding a separating hyperplane from the body at a given point, and computing the zeroth- and first-order … Continue reading