ORF523: The complexities of optimization

This page collects the posts from the graduate course on optimization I taught at Princeton in Spring 2013. The material has since been reorganized (some parts cut, others extended) into a monograph published in Foundations and Trends in Machine Learning, Vol. 8, No. 3-4, pp. 231-357, 2015 (see here for the free version).

Part I: Computational complexity of optimization

Post 1: Introduction

Post 2: The ellipsoid method

Post 3: LP, SDP, and Conic Programming

Post 4: Interior Point Methods

Post 5: IPMs for LPs and SDPs

Post 6: Lagrangian duality

Post 7: Classification, SVM, Kernel Learning

Post 8: Turing Machines

Post 9: P vs. NP, NP-completeness

Post 10: Getting around NP-hardness

Bonus lectures by Amir Ali Ahmadi: Sum of Squares (SOS) Techniques: An Introduction, Part I, and Part II

Part II: Oracle complexity of optimization

Post 11: Oracle complexity, large-scale optimization

Post 12: Oracle complexity of Lipschitz convex functions

Post 13: Oracle complexity of smooth convex functions

Post 14: Nesterov’s Accelerated Gradient Descent

Post 15: Strong convexity (added in 2014: Nesterov’s Accelerated Gradient Descent for Smooth and Strongly Convex Optimization)

Post 16: Conditional Gradient Descent and Structured Sparsity

Post 17: ISTA and FISTA

Post 18: Mirror Descent, part I/II

Post 19: Mirror Descent, part II/II

Post 20: Mirror Prox

Part III: Optimization and Randomness

Post 21: Noisy oracles (for stochastic approximation and acceleration by randomization)

Post 22: Acceleration by randomization for a sum of smooth and strongly convex functions

Post 23: Optimization with bandit feedback

Final exam

Post 24: An SDP based on Grothendieck’s inequality; Coordinate Descent
