Wald's equation


In probability theory, Wald's equation, Wald's identity or Wald's lemma is an important identity that simplifies the calculation of the expected value of the sum of a random number of random quantities. In its simplest form, it relates the expectation of a sum of randomly many finite-mean, independent and identically distributed random variables to the expected number of terms in the sum and the random variables' common expectation under the condition that the number of terms in the sum is independent of the summands.

The equation is named after the mathematician Abraham Wald. An identity for the second moment is given by the Blackwell–Girshick equation.

Let (Xn)n∈ℕ be a sequence of real-valued, independent and identically distributed random variables and let N be a nonnegative integer-valued random variable that is independent of the sequence (Xn)n∈ℕ. Suppose that N and the Xn have finite expectations. Then

E[X1 + · · · + XN] = E[N] E[X1].
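
In this simple version the identity can be obtained by conditioning on N; a sketch of the computation, using the independence of N from the sequence and the common expectation:

E[X1 + · · · + XN] = ∑n≥0 P(N = n) E[X1 + · · · + Xn | N = n]
                   = ∑n≥0 P(N = n) · n E[X1]
                   = E[N] E[X1].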

Roll a six-sided die. Take the number on the die (call it N) and roll that number of six-sided dice to get the numbers X1, . . . , XN, and add up their values. By Wald's equation, the resulting value on average is

E[N] E[X1] = 3.5 × 3.5 = 12.25.
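
A quick Monte Carlo check of the dice example, as a minimal sketch in Python (the helper dice_trial and the choice of 1,000,000 trials are illustrative, not from the article):

import random

def dice_trial():
    # First roll gives N; then roll N further dice and sum their values.
    n = random.randint(1, 6)
    return sum(random.randint(1, 6) for _ in range(n))

trials = 1_000_000
average = sum(dice_trial() for _ in range(trials)) / trials
print(average)  # should be close to 3.5 * 3.5 = 12.25

The simulated average settles near 12.25, in agreement with Wald's equation.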

Let (Xn)n∈ℕ be an infinite sequence of real-valued random variables and let N be a nonnegative integer-valued random variable. Assume that:

(1) the random variables (Xn)n∈ℕ are all integrable (finite-mean),
(2) E[Xn 1{N ≥ n}] = E[Xn] P(N ≥ n) for every natural number n, and
(3) the infinite series satisfies ∑n≥1 E[|Xn| 1{N ≥ n}] < ∞.

Then the random sums

SN := X1 + · · · + XN   and   TN := E[X1] + · · · + E[XN]

(defined to be zero on the event {N = 0}) are integrable and

E[SN] = E[TN].

If, in addition,

(4) the random variables (Xn)n∈ℕ all have the same expectation, and
(5) N has finite expectation,

then

E[SN] = E[N] E[X1].
Remark: Usually, the name Wald's equation refers to this last equality.

Clearly, assumption (1) is needed to formulate assumption (2) and Wald's equation. Assumption (2) controls the amount of dependence allowed between the sequence (Xn)n∈ℕ and the number N of terms; see the counterexample below for why it cannot be dropped. Note that assumption (2) is satisfied when N is a stopping time for the sequence (Xn)n∈ℕ. Assumption (3) is of a more technical nature, implying absolute convergence and therefore allowing arbitrary rearrangement of an infinite series in the proof.
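
To illustrate the stopping-time case, here is a minimal simulation sketch in Python (the particular stopping rule, the helper run_trial and the sample size are arbitrary illustrative choices, not from the article). Let the Xn be fair-die rolls and let N be the first n such that X1 + · · · + Xn reaches at least 20; N is a stopping time, so assumption (2) holds even though N depends on the Xn:

import random

def run_trial():
    # Roll a fair die until the running total reaches at least 20.
    # N (a stopping time) is the number of rolls; S is the total.
    total, rolls = 0, 0
    while total < 20:
        total += random.randint(1, 6)
        rolls += 1
    return total, rolls

trials = 200_000
total_S = total_N = 0
for _ in range(trials):
    s, n = run_trial()
    total_S += s
    total_N += n
print(total_S / trials, (total_N / trials) * 3.5)  # the two values should nearly agree

The two printed values estimate E[SN] and E[N] E[X1]; their agreement illustrates Wald's equation when N is a stopping time rather than independent of the sequence.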


...