Information theoretic methods – Stochastic Analysis Seminar

# Category Archives for Information theoretic methods

## Lecture 10. Concentration, information, transportation (2)

Recall the main proposition proved in the previous lecture, which is due to Bobkov and Götze (1999). Proposition. The following are equivalent for a probability measure: a subgaussian bound on exponential moments holds for every Lipschitz function with finite mean, and a transportation cost inequality holds for every probability measure. This result provides a characterization … Continue reading
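
For context, the Bobkov–Götze equivalence referenced here can be written out as follows. This is the standard statement of their 1999 result, reconstructed from the literature because the formulas in the excerpt did not survive extraction; the lecture's normalization may differ.

```latex
% Bobkov--Götze (1999): for a probability measure \mu on a metric space,
% the following are equivalent (\sigma^2 > 0 a fixed constant):
%
% (1) the transportation cost inequality
W_1(\nu, \mu) \;\le\; \sqrt{2\sigma^2 \, D(\nu \,\|\, \mu)}
\qquad \text{for every probability measure } \nu;
%
% (2) the subgaussian moment bound
\int e^{\lambda f} \, d\mu \;\le\; e^{\lambda^2 \sigma^2 / 2}
\qquad \text{for every } \lambda \in \mathbb{R}
\text{ and every 1-Lipschitz } f \text{ with } \int f \, d\mu = 0.
```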

15. December 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 9. Concentration, information, transportation (1)

The goal of the next two lectures is to explore the connections between concentration of measure, entropy inequalities, and optimal transportation. What is concentration? Roughly speaking, concentration of measure is the idea that nonlinear functions of many random variables often … Continue reading
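
As a quick numerical illustration of the phenomenon (a sketch, not taken from the lecture): the Euclidean norm is a 1-Lipschitz function of a standard Gaussian vector, and its fluctuations stay of constant order even as the dimension, and hence the mean, grows.

```python
import numpy as np

# The map x -> ||x||_2 is 1-Lipschitz. For X standard Gaussian in R^n,
# concentration of measure says ||X|| fluctuates by O(1) around its mean
# (which is close to sqrt(n)), independently of the dimension n.
rng = np.random.default_rng(0)
results = {}
for n in (10, 1000, 10000):
    norms = np.linalg.norm(rng.standard_normal((2000, n)), axis=1)
    results[n] = (norms.mean(), norms.std())
    print(f"n={n:6d}  mean={norms.mean():8.2f}  std={norms.std():.3f}")
```

The Gaussian Poincaré inequality already explains the output: the variance of any 1-Lipschitz function of a standard Gaussian vector is at most 1, whatever the dimension.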

11. December 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 8. Entropic cone and matroids

This lecture introduces the notion of the entropic cone and its connection with entropy inequalities. Entropic cone Recall that if $X$ is a discrete random variable with distribution $P$, the entropy of $X$ is defined as $H(X) = -\sum_x P(x) \log P(x)$. Now let $X_1,\ldots,X_n$ be (not … Continue reading
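
The discrete entropy recalled above is easy to compute directly; here is a minimal sketch (the function name and the nats convention are my choices, not the lecture's):

```python
import numpy as np

def entropy(p):
    # Shannon entropy H(P) = -sum_x P(x) log P(x) in nats, with 0 log 0 := 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # drop zero-probability outcomes
    return float(-(p * np.log(p)).sum())

print(entropy([0.5, 0.5]))    # log 2 ≈ 0.693: a fair coin
print(entropy([1.0, 0.0]))    # a deterministic outcome has zero entropy
print(entropy([0.25] * 4))    # log 4: uniform maximizes entropy on 4 outcomes
```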

06. December 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 7. Entropic CLT (4)

This lecture completes the proof of the entropic central limit theorem. From Fisher information to entropy (continued) In the previous lecture, we proved the following: Theorem. If $X_1,\ldots,X_n$ are independent random variables with finite Fisher information, then inequality (1) bounds the Fisher information of their weighted sum, where the weights satisfy … Continue reading

05. December 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 6. Entropic CLT (3)

In this lecture, we complete the proof of monotonicity of the Fisher information in the CLT, and begin developing the connection with entropy. The entropic CLT will be completed in the next lecture. Variance drop inequality In the previous lecture, … Continue reading

13. November 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 5. Entropic CLT (2)

The goal of this lecture is to prove monotonicity of Fisher information in the central limit theorem. Next lecture we will connect Fisher information to entropy, completing the proof of the entropic CLT. Two lemmas about the score function Recall … Continue reading
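
As a numerical sanity check on the score function (a sketch using assumed but standard notation: score $\rho = f'/f$ and Fisher information $I(X) = \mathbf{E}[\rho(X)^2]$, which may differ slightly from the lecture's conventions), one can verify that a centered Gaussian with variance $\sigma^2$ has Fisher information $1/\sigma^2$:

```python
import numpy as np

# For X ~ N(0, sigma^2): score rho(x) = -x / sigma^2, so
# I(X) = E[rho(X)^2] = Var(X) / sigma^4 = 1 / sigma^2.
sigma = 2.0
x = np.linspace(-40.0, 40.0, 400001)
f = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
rho = np.gradient(np.log(f), x)            # numerical score function f'/f
dx = x[1] - x[0]
fisher = float(np.sum(rho**2 * f) * dx)    # E[rho(X)^2] by quadrature
print(fisher)   # close to 1/sigma^2 = 0.25
```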

06. November 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 4. Entropic CLT (1)

The subject of the next lectures will be the entropic central limit theorem (entropic CLT) and its proof. Theorem (Entropic CLT). Let $X_1, X_2, \ldots$ be i.i.d. real-valued random variables with mean zero and unit variance, and let $S_n = (X_1 + \cdots + X_n)/\sqrt{n}$. If the entropy of $S_n$ is finite for some $n$, … Continue reading

23. October 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 3. Sanov’s theorem

The goal of this lecture is to prove one of the most basic results in large deviations theory. Our motivations are threefold: It is an example of a probabilistic question where entropy naturally appears. The proof we give uses ideas … Continue reading
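
A small numerical experiment (my own sketch, not part of the lecture) shows the large-deviations exponent predicted by Sanov's theorem emerging already for coin flips: $P(\text{empirical mean} \ge a)$ decays like $e^{-n D(a\|p)}$, where $D$ is the binary relative entropy.

```python
import math

def kl_bern(a, p):
    # Binary relative entropy D(a || p) in nats.
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def upper_tail(n, a, p=0.5):
    # Exact P(S_n >= a*n) for S_n ~ Binomial(n, p).
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(math.ceil(a * n), n + 1))

rates = {}
for n in (50, 200, 800):
    rates[n] = -math.log(upper_tail(n, 0.7)) / n   # empirical decay exponent
    print(n, rates[n])
print("Sanov exponent D(0.7 || 0.5):", kl_bern(0.7, 0.5))
```

The empirical exponent approaches $D(0.7\|0.5) \approx 0.082$ from above, with the usual $O(\log n / n)$ polynomial correction.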

10. October 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 2. Basics / law of small numbers

Due to scheduling considerations, we postpone the proof of the entropic central limit theorem. In this lecture, we discuss basic properties of the entropy and illustrate them by proving a simple version of the law of small numbers (Poisson limit … Continue reading
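
To illustrate the law of small numbers numerically (a sketch, not from the lecture): the total variation distance between Binomial$(n, \lambda/n)$ and Poisson$(\lambda)$ shrinks as $n$ grows, at rate $O(1/n)$.

```python
import math

def tv_binom_poisson(n, lam):
    # Total variation distance between Binomial(n, lam/n) and Poisson(lam),
    # using pmf recurrences to avoid factorials and overflow.
    p = lam / n
    b = (1 - p) ** n          # Binomial pmf at k = 0
    q = math.exp(-lam)        # Poisson pmf at k = 0
    diff, qsum = abs(b - q), q
    for k in range(n):
        b *= (n - k) * p / ((k + 1) * (1 - p))   # pmf(k+1) from pmf(k)
        q *= lam / (k + 1)
        diff += abs(b - q)
        qsum += q
    return 0.5 * (diff + max(0.0, 1.0 - qsum))   # Poisson mass beyond k = n

tvs = {n: tv_binom_poisson(n, 2.0) for n in (10, 100, 1000)}
for n, tv in tvs.items():
    print(n, tv)
```

The lecture's argument is information-theoretic (it controls a relative entropy); total variation is used here only as an easily computed distance, and Pinsker's inequality relates the two.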

03. October 2013 by Ramon van Handel
Categories: Information theoretic methods

## Lecture 1. Introduction

What is information theory? The first question that we want to address is: “What is information?” Although there are several ways in which we might think of answering this question, the main rationale behind our approach is to distinguish information … Continue reading

25. September 2013 by Ramon van Handel
Categories: Information theoretic methods
