## Chernoff Bound

Consider the sum of a sequence of independent random variables. One may expect its distribution to be concentrated around its expected value. This is characterized by the Chernoff bound, which gives exponentially decreasing bounds on the tail distributions of the sum. Let …

Posted in Probability
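The excerpt above cuts off before the statement; as a sketch, one common multiplicative form of the bound, for a sum $$X = \sum_i X_i$$ of independent $$\{0,1\}$$-valued random variables with $$\mu = \mathbb{E}[X]$$, is

\[ \Pr[X \ge (1+\delta)\mu] \le \left( \frac{e^{\delta}}{(1+\delta)^{1+\delta}} \right)^{\mu}, \qquad \delta > 0. \]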

## Kolmogorov Complexity

Kolmogorov complexity measures the complexity of describing an object, such as a binary string. The description is regarded as a program that can be run on a computer. By the Church–Turing thesis, all sufficiently powerful computational models are equivalent; thus, one can …

Posted in Probability, Theory
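Kolmogorov complexity is uncomputable, but a general-purpose compressor gives a computable upper-bound proxy. A minimal sketch in Python, using zlib's compressed size as the stand-in description length (an illustrative assumption, not the formal definition):

```python
import random
import zlib


def compressed_length(s: bytes) -> int:
    # zlib-compressed size of s: a computable upper-bound proxy for
    # Kolmogorov complexity, which is itself uncomputable.
    return len(zlib.compress(s, 9))


# A highly regular string compresses far below its raw length...
regular = b"01" * 500
# ...while a pseudo-random string of the same length barely compresses.
random_ish = random.Random(0).randbytes(1000)

print(compressed_length(regular))     # far below 1000 bytes
print(compressed_length(random_ish))  # close to (or above) 1000 bytes
```

The gap between the two compressed sizes mirrors the intuition that a short program ("repeat `01` five hundred times") describes the first string, while the second admits no description much shorter than itself.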

## Markov's Inequality

Markov's inequality provides a bound on the upper-tail probability of a non-negative random variable, expressed in terms of its expectation. [Markov's Inequality] Let $$X$$ be a non-negative random variable. Then, for any $$\lambda > 0$$,

\[ \Pr[X \ge \lambda] \le \frac{\mathbb{E}[X]}{\lambda}. \]

Posted in Probability

## Quadratic Forms of Random Variables

Quadratic Forms and Transformation. Let $$A = \{a_{ij}\}$$ be an $$n\times n$$ matrix. A quadratic function of $$n$$ variables $$x = (x_1,\ldots, x_n)'$$ is defined as

\[ f(x) = x' A x = \sum_{i,j} a_{ij} x_i x_j. \]

Without loss …

Posted in Probability
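The truncated excerpt defines the quadratic form $$f(x) = x'Ax$$. A minimal NumPy sketch, with a hypothetical matrix $$A$$ and vector $$x$$, evaluating it both as a matrix product and as the double sum from the definition:

```python
import numpy as np

# Hypothetical symmetric 3x3 matrix A and vector x for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
x = np.array([1.0, -1.0, 2.0])

# f(x) = x' A x as a matrix product.
f_matrix = x @ A @ x

# The same value as the explicit double sum over a_{ij} x_i x_j.
f_sum = sum(A[i, j] * x[i] * x[j] for i in range(3) for j in range(3))

print(f_matrix, f_sum)
```

The two computations agree; the matrix-product form is the one used in practice, while the double sum matches the definition term by term.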