Chernoff Bound

Consider the sum of a sequence of independent random variables. One may expect its distribution to be concentrated around its expected value. This intuition is made precise by the Chernoff bound, which gives exponentially decreasing bounds on the tail probabilities of the sum.

Let \(X_i\), \(i=1,\ldots, n\), be independent Bernoulli variables, where
\[
{\bf Pr}[X_i = 1] = 1 - {\bf Pr}[X_i = 0] = p_i.
\]
Let \(X = \sum X_i\) and \(\mu = \sum p_i\). We wish to bound the probability that \(X\) is far from \(\mu\).

The moment generating function of \(X\) is defined as \(M_X(t) = {\bf E}[e^{tX}]\). For each \(X_i\),
\[
{\bf E}[e^{tX_i}] = p_i e^t + 1 - p_i \leq e^{p_i(e^t-1)},
\]
by the fact that \(1 + x \leq e^x\).
By the independence of the \(X_i\)'s, we then have
\[
{\bf E}[e^{tX}] = {\bf E}\left[\prod e^{tX_i}\right] = \prod {\bf E}[e^{tX_i}]
\leq \prod e^{p_i(e^t-1)} = e^{\mu(e^t-1)}.
\]
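The per-variable inequality \({\bf E}[e^{tX_i}] \leq e^{p_i(e^t-1)}\) can be spot-checked numerically; it is exactly \(1 + x \leq e^x\) with \(x = p_i(e^t - 1)\). The grid of \(p_i\) and \(t\) values below is arbitrary, chosen only for illustration.

```python
import math

# Spot-check E[e^{t X_i}] = p_i e^t + 1 - p_i <= e^{p_i(e^t - 1)}
# on an arbitrary grid of (p_i, t) values.
checks = []
for p_i in (0.1, 0.5, 0.9):
    for t in (0.1, 0.5, 1.0, 2.0):
        lhs = p_i * math.exp(t) + 1 - p_i        # exact MGF of X_i
        rhs = math.exp(p_i * (math.exp(t) - 1))  # the upper bound
        checks.append(lhs <= rhs)

ok = all(checks)
print(ok)  # True: the bound holds at every grid point
```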

For any \(t > 0\), Markov's inequality applied to \(e^{tX}\) gives
\[
{\bf Pr}[ X \geq (1+\delta)\mu ] = {\bf Pr}[ e^{tX} \geq e^{t(1+\delta)\mu} ]
\leq \frac{ {\bf E}[e^{tX}] }{ e^{t(1+\delta)\mu} }
\leq \frac{ e^{\mu(e^t-1)} }{ e^{t(1+\delta)\mu} }
= \left(\frac{ e^\delta }{ (1+\delta)^{(1+\delta)} }\right)^{\mu},
\]
where the last step follows by choosing \(e^t = 1+\delta\), i.e. \(t = \ln(1+\delta)\), which is positive for \(\delta > 0\).
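To see the bound in action, one can compare it against the exact tail probability in the identical-\(p_i\) case, where \(X\) is simply \(\mathrm{Binomial}(n, p)\). The parameters \(n = 100\), \(p = 0.5\), \(\delta = 0.2\) below are illustrative choices, not from the text.

```python
import math

# Compare the Chernoff upper-tail bound with the exact binomial tail
# for p_i = p constant, so X ~ Binomial(n, p) and mu = n * p.
n, p, delta = 100, 0.5, 0.2
mu = n * p
threshold = (1 + delta) * mu

# Exact tail probability Pr[X >= (1 + delta) * mu].
exact = sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
            for k in range(math.ceil(threshold), n + 1))

# Chernoff bound: (e^delta / (1+delta)^(1+delta))^mu.
bound = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu

# The bound is valid (exact <= bound), though not tight here.
print(exact, bound)
```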

A similar argument bounds the deviation below the mean: for \(0 < \delta < 1\),
\[
{\bf Pr}[ X \leq (1-\delta)\mu ]
\leq \left(\frac{ e^{-\delta} }{ (1-\delta)^{(1-\delta)} }\right)^{\mu}.
\]
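In applications, the following weaker but simpler forms of these two bounds are often more convenient. They follow from the expressions above by standard estimates on \(\ln(1+\delta)\) and \(\ln(1-\delta)\), and hold for \(0 < \delta < 1\):
\[
{\bf Pr}[ X \geq (1+\delta)\mu ] \leq e^{-\mu\delta^2/3},
\qquad
{\bf Pr}[ X \leq (1-\delta)\mu ] \leq e^{-\mu\delta^2/2}.
\]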


