An unpleasant calculation?

At the end of my previous post I claimed that proving that \sum_{s=2}^k \frac{{k \choose s} {n-k \choose k-s}}{{n \choose k}} 2^{{s \choose 2}} tends to 0 when n tends to infinity and k=(2-\epsilon)\log_2(n) was unpleasant. It appears that I was wrong, and I would like to show here a simple proof that was suggested to me by my colleague Philippe Rigollet.
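
Before going into the proof, here is a quick numerical sanity check, which is my own little script and not part of the argument: it evaluates the sum exactly at k = \lfloor (2-\epsilon)\log_2(n) \rfloor for the arbitrary illustrative choice \epsilon = 1/2 (the function name is of course just for illustration as well).

    # Exact evaluation of \sum_{s=2}^k C(k,s) C(n-k,k-s) 2^{C(s,2)} / C(n,k)
    # at k = floor((2 - eps) * log2(n)); eps = 0.5 is an arbitrary illustrative choice.
    from math import comb, log2, floor

    def unpleasant_sum(n, k):
        return sum(comb(k, s) * comb(n - k, k - s) * 2 ** comb(s, 2) / comb(n, k)
                   for s in range(2, k + 1))

    eps = 0.5
    for n in (10**3, 10**4, 10**5, 10**6):
        k = floor((2 - eps) * log2(n))
        print(f"n = {n:>7}, k = {k:>2}, sum = {unpleasant_sum(n, k):.2e}")

The printed values should shrink steadily as n grows, consistent with the claim.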

The key observation is that the term \frac{{k \choose s} {n-k \choose k-s}}{{n \choose k}} is the probability that a hypergeometric random variable H(k,k,n) takes the value s. Hypergeometric distributions are nasty, but when n is of larger order than k (which is the case here), they are well approximated by a simple binomial B(k, k/n). In the present case this suggests the upper bound \frac{{k \choose s} {n-k \choose k-s}}{{n \choose k}} \leq {k \choose s} \left(\frac{k}{n}\right)^s. The latter inequality can be proved formally by repeatedly applying the identity {n-1 \choose k-1} = \frac{k}{n} {n \choose k}.

Now one can simply use the inequalities 2^{{s \choose 2}} \leq 2^{s \frac{k}{2}} and {k \choose s} \leq k^s to obtain that the original unpleasant sum is upper bounded by \sum_{s=2}^{k} 2^{s (\frac{k}{2} + 2\log_2(k) - \log_2(n) )}, which clearly tends to 0 when n tends to infinity and k=(2-\epsilon)\log_2(n).
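
For the record, here is the whole chain written out in one display (nothing new, just the steps above put together):

\sum_{s=2}^{k} \frac{{k \choose s} {n-k \choose k-s}}{{n \choose k}} 2^{{s \choose 2}} \leq \sum_{s=2}^{k} {k \choose s} \left(\frac{k}{n}\right)^s 2^{s \frac{k}{2}} \leq \sum_{s=2}^{k} k^s \left(\frac{k}{n}\right)^s 2^{s \frac{k}{2}} = \sum_{s=2}^{k} 2^{s \left(\frac{k}{2} + 2\log_2(k) - \log_2(n)\right)} .

For k=(2-\epsilon)\log_2(n) the exponent satisfies \frac{k}{2} + 2\log_2(k) - \log_2(n) = -\frac{\epsilon}{2}\log_2(n) + 2\log_2(k) \to -\infty, so the right-hand side is a geometric series whose ratio, and hence whose value, tends to 0.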
