The Blackwell-Girshick equation is an equation in probability theory that allows for the calculation of the variance of random sums of random variables.[1] It is the variance counterpart of Wald's equation, which gives the expectation of such compound distributions.
Let $N$ be a random variable with values in $\mathbb{N}_0$, let $X_1, X_2, X_3, \dotsc$ be independent and identically distributed random variables, which are also independent of $N$, and assume that the second moments of $N$ and of all the $X_i$ exist. Then, the random variable defined by

$$ Y := \sum_{i=1}^{N} X_i $$

has the variance

$$ \operatorname{Var}(Y) = \operatorname{Var}(X_1)\operatorname{E}(N) + \operatorname{E}(X_1)^2 \operatorname{Var}(N). $$
The Blackwell-Girshick equation can be derived using the conditional variance and the variance decomposition (law of total variance).
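In outline, conditioning on $N$ gives $\operatorname{E}(Y \mid N) = N\operatorname{E}(X_1)$ and, by the independence of the $X_i$, $\operatorname{Var}(Y \mid N) = N\operatorname{Var}(X_1)$, so the variance decomposition yields

$$ \operatorname{Var}(Y) = \operatorname{E}\bigl(\operatorname{Var}(Y \mid N)\bigr) + \operatorname{Var}\bigl(\operatorname{E}(Y \mid N)\bigr) = \operatorname{Var}(X_1)\operatorname{E}(N) + \operatorname{E}(X_1)^2\operatorname{Var}(N). $$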
If the $X_i$ are natural number-valued random variables, the derivation can also be carried out in an elementary way using the chain rule and the probability-generating function.[2]
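A brief sketch of this argument: writing $g_N$ and $g_{X_1}$ for the probability-generating functions of $N$ and $X_1$, conditioning on $N$ gives $g_Y = g_N \circ g_{X_1}$. Differentiating twice with the chain rule and evaluating at $1$ (where $g_{X_1}(1) = 1$) yields

$$ g_Y'(1) = \operatorname{E}(N)\operatorname{E}(X_1), \qquad g_Y''(1) = g_N''(1)\operatorname{E}(X_1)^2 + \operatorname{E}(N)\, g_{X_1}''(1), $$

and inserting these into $\operatorname{Var}(Y) = g_Y''(1) + g_Y'(1) - g_Y'(1)^2$ gives the formula above.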
Let $N$ have a Poisson distribution with expectation $\lambda$, and let $X_1, X_2, \dotsc$ follow a Bernoulli distribution with parameter $p$. In this case, $Y$ is also Poisson distributed with expectation $\lambda p$, so its variance must be $\lambda p$. We can check this with the Blackwell-Girshick equation: $N$ has variance $\lambda$ while each $X_i$ has mean $p$ and variance $p(1-p)$, so we must have

$$ \operatorname{Var}(Y) = \operatorname{Var}(X_1)\operatorname{E}(N) + \operatorname{E}(X_1)^2 \operatorname{Var}(N) = p(1-p)\lambda + p^2 \lambda = \lambda p. $$
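This example can also be checked numerically. The following is a minimal Monte Carlo sketch in Python using NumPy; the parameter values ($\lambda = 4$, $p = 0.3$) and the sample size are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Illustrative parameter choices for the example above.
lam, p, trials = 4.0, 0.3, 200_000

# N ~ Poisson(lam); given N, Y = X_1 + ... + X_N with X_i ~ Bernoulli(p),
# which is the same as drawing Y ~ Binomial(N, p).
n = rng.poisson(lam, size=trials)
y = rng.binomial(n, p)

print("empirical Var(Y):        ", y.var())
print("Blackwell-Girshick value:", p * (1 - p) * lam + p**2 * lam)  # Var(X1)E(N) + E(X1)^2 Var(N)
print("Poisson thinning value:  ", lam * p)                         # variance of Poisson(lam*p)
```

For a large number of trials, the empirical variance agrees with both the Blackwell-Girshick value and the Poisson thinning value $\lambda p$.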
For an example of an application, see Mühlenthaler, M.; Raß, A.; Schmitt, M.; Wanka, R. (2021). "Exact Markov chain-based runtime analysis of a discrete particle swarm optimization algorithm on sorting and OneMax". Natural Computing: 1–27.