A year ago I started this blog. I did not expect this experience to be so rewarding. For instance, seeing week after week the growing interest in the optimization posts gave me the stamina to pursue this endeavor to its end. Meeting with readers at various conferences (or housewarming parties…) has been a lot of fun too.
Now is probably a good time to look back on 2013 as well as to look forward to what this blog will become in 2014.
I’m a bandit in 2013
First of all I’m happy to report that ‘I’m a bandit’ was viewed more than 55,000 times in 2013! As you can see from the plot below (taken from Google Analytics), there have been three significant spikes.
– On March 22nd I received my first significant link from John Langford on his own blog hunch.net.
– On July 17th I made a post with ‘deep learning’ in the title. It was retweeted, reblogged, facebooked, G+’ed, etc.
– The last spike, on December 13th, is quite interesting, as it comes from my first link from the Machine Learning community on reddit.
Of course the stars of the blog so far have been the optimization lecture notes. But the star among the stars is the post on Nesterov’s Accelerated Gradient Descent, which has been viewed THREE TIMES MORE than any other post in this sequence! Apart from optimization, this blog has hosted a few other topics such as metric embeddings, convex geometry, and graph theory. Browsing these older posts should be easier now with the new Archives page.
I’m a bandit in 2014
My main objective for the first half of 2014 is to turn the optimization posts into an actual book (or rather a long survey). This will be published in the Foundations and Trends in Machine Learning series (alongside my previous survey on bandits). Of course I expect this project to take up a lot of my time, so I won’t post too much from February to May. On the other hand, I am hopeful that during this downtime I will host quite a few interesting guest posts.
Once the first draft of the optimization lecture notes is out (probably in early May) I will have more time to dedicate to the blog. I plan to start a new series of posts on a topic that I find fascinating: the recent theory of graph limits. I believe (and I’m not the only one!) that in the years to come this theory will prove to be a powerful tool for network analysis, and in particular for statistical analysis of network data. More on this in a few months!
By ashish soni February 18, 2014 - 2:42 am
I read your blog, though not right when it is updated, and I don’t usually comment. But many, many congratulations on completing one year. Way to go!!
By Igor January 15, 2014 - 2:06 am
congratulations Sebastien!
By Sebastien Bubeck January 15, 2014 - 8:49 am
Thanks Igor, and thanks for your links on Nuit Blanche, I got quite a bit of traffic from them!
By Moritz January 14, 2014 - 8:23 pm
It’s funny, I saw exactly the same December spike. It so happened that our two posts on gradient descent appeared simultaneously on Reddit. Who knew Reddit liked gradient descent so much 🙂
By Sebastien Bubeck January 15, 2014 - 8:48 am
Yeah I had no idea that ML was a thing on Reddit before that!