In probability theory, concentration inequalities provide bounds on how a random variable deviates from some value (typically, its expected value). The laws of large numbers of classical probability theory state that sums of independent random variables are, under very mild conditions, close to their expectation with high probability. Such sums are the most basic examples of random variables concentrated around their mean. Recent results show that such behavior is shared by other functions of independent random variables.
Concentration inequalities can be sorted according to how much information about the random variable is needed in order to use them.
Markov's inequality requires only the following information on a random variable X: X is non-negative, and its expected value \( \operatorname{E}[X] \) is finite.
Then, for every constant \( a > 0 \):
\[ \Pr(X \ge a) \le \frac{\operatorname{E}[X]}{a}. \]
Markov's inequality extends to a strictly increasing and non-negative function \( \Phi \):
\[ \Pr(X \ge a) = \Pr\bigl(\Phi(X) \ge \Phi(a)\bigr) \le \frac{\operatorname{E}[\Phi(X)]}{\Phi(a)}. \]
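To make the bound concrete, the following is a minimal Python sketch (not part of the statement above; the exponential distribution, sample size, and the name markov_bound_check are illustrative assumptions) comparing the empirical tail probability of a non-negative random variable with the Markov bound \( \operatorname{E}[X]/a \).

```python
import random

def markov_bound_check(n_samples=100_000, a=3.0, seed=0):
    """Compare the empirical tail probability Pr(X >= a) with the Markov
    bound E[X]/a for a non-negative random variable X (illustrative choice:
    exponential with mean 1)."""
    rng = random.Random(seed)
    samples = [rng.expovariate(1.0) for _ in range(n_samples)]  # X >= 0, E[X] = 1
    mean = sum(samples) / n_samples                 # estimate of E[X]
    tail = sum(1 for x in samples if x >= a) / n_samples  # empirical Pr(X >= a)
    bound = mean / a                                # Markov bound E[X]/a
    return tail, bound

if __name__ == "__main__":
    tail, bound = markov_bound_check()
    print(f"empirical Pr(X >= 3): {tail:.4f}")   # roughly exp(-3) ~ 0.05
    print(f"Markov bound E[X]/3:  {bound:.4f}")  # roughly 1/3
```

The gap between the two printed numbers illustrates that Markov's inequality is valid but often loose: it uses only the mean of X and no further distributional information.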
Chebyshev's inequality requires the following information on a random variable X: its expected value \( \operatorname{E}[X] \) is finite, and its variance \( \operatorname{Var}(X) = \operatorname{E}\bigl[(X - \operatorname{E}[X])^2\bigr] \) is finite.
Then, for every constant \( a > 0 \):
\[ \Pr\bigl(|X - \operatorname{E}[X]| \ge a\bigr) \le \frac{\operatorname{Var}(X)}{a^2}, \]
or equivalently:
\[ \Pr\bigl(|X - \operatorname{E}[X]| \ge a \cdot \operatorname{Std}(X)\bigr) \le \frac{1}{a^2}, \]
where \( \operatorname{Std}(X) \) is the standard deviation of X.
Chebyshev's inequality can be seen as a special case of the generalized Markov's inequality applied to the random variable \( |X - \operatorname{E}[X]| \) with \( \Phi(x) = x^2 \).
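In the same spirit, here is a hedged Python sketch (the normal distribution, the threshold, and the name chebyshev_bound_check are illustrative assumptions, not from the source) comparing the empirical two-sided tail probability with the Chebyshev bound \( \operatorname{Var}(X)/a^2 \).

```python
import random
import statistics

def chebyshev_bound_check(n_samples=100_000, a=2.0, seed=0):
    """Compare the empirical probability Pr(|X - E[X]| >= a) with the
    Chebyshev bound Var(X)/a^2, with mean and variance estimated from the
    sample (illustrative choice: standard normal X)."""
    rng = random.Random(seed)
    samples = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    mean = statistics.fmean(samples)                    # estimate of E[X]
    var = statistics.pvariance(samples, mu=mean)        # estimate of Var(X)
    tail = sum(1 for x in samples if abs(x - mean) >= a) / n_samples
    bound = var / a**2                                  # Chebyshev bound
    return tail, bound

if __name__ == "__main__":
    tail, bound = chebyshev_bound_check()
    print(f"empirical Pr(|X - E[X]| >= 2): {tail:.4f}")  # about 0.0455 for a standard normal
    print(f"Chebyshev bound Var(X)/4:      {bound:.4f}")  # about 0.25
```

Because Chebyshev's inequality uses the variance in addition to the mean, it typically gives a tighter tail bound than Markov's inequality, though it is still far from sharp for well-behaved distributions such as the normal.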