## Fall 2013: Information theoretic methods

While information theory has traditionally drawn on probabilistic methods, ideas and techniques from information theory have recently played an increasingly important role in various areas of probability theory itself (as well as in statistics, combinatorics, and other areas of mathematics). The goal of these informal lectures is to introduce some topics and tools at the intersection of probability theory and information theory. No prior knowledge of information theory will be assumed. Potential topics include: entropic central limit theorems, entropic Poisson approximation, and related information-theoretic and probabilistic inequalities; connections to logarithmic Sobolev inequalities and Stein’s method; entropic inequalities for additive, matroidal, and tree structures, and their applications; transportation cost-information inequalities and their relation to concentration of measure; and basic large deviations theory.
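To give a flavor of the transportation–concentration connection mentioned above, here is one standard statement (in the spirit of the Gozlan–Léonard survey): a probability measure $\mu$ on a metric space $(X,d)$ is said to satisfy the $T_1(C)$ transportation cost-information inequality if

```latex
% T_1(C): Wasserstein distance controlled by relative entropy
\[
  W_1(\nu, \mu) \;\le\; \sqrt{\,2C\, D(\nu \,\|\, \mu)\,}
  \qquad \text{for all probability measures } \nu,
\]
% where W_1 is the L^1 Wasserstein distance and D is relative entropy.
% By the Bobkov--G\"otze / Marton argument, T_1(C) yields Gaussian
% concentration: for every 1-Lipschitz function f and every t \ge 0,
\[
  \mu\!\left( f \ge \int f \, d\mu + t \right) \;\le\; e^{-t^2/(2C)}.
\]
```

Results of this type, and their two-sided refinements (e.g. $T_2$ and its relation to logarithmic Sobolev inequalities), are among the topics the lectures may touch on.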

**Prerequisites:** Probability at the level of ORF 526 is assumed.

**Time and location:** Thursdays, 4:30–6:00, Bendheim Center classroom 103.

The first lecture will be on September 19.

**References:**

- N. Gozlan and C. Léonard, “Transport Inequalities. A Survey”
- O. Johnson, “Information Theory and the Central Limit Theorem”
- M. Ledoux, “The Concentration of Measure Phenomenon”
- C. Villani, “Topics in Optimal Transportation”