Proof of the Rao–Blackwell Theorem

I walk the reader through a proof of the Rao–Blackwell Theorem.

The Rao–Blackwell Theorem (Rao, 1992; Blackwell, 1947) states:

Let $\hat{\theta}$ be an unbiased estimator of $\theta$ with a finite second moment for all $\theta$. Let $T(X)$ be a sufficient statistic for $\theta$. Then for all $\theta$,

  1. $\mathbb{E}[\theta_{\texttt{RB}}] = \theta$, where $\theta_{\texttt{RB}} \triangleq \mathbb{E}[\hat{\theta} \mid T(X)]$,

  2. $\mathbb{V}[\theta_{\texttt{RB}}] \leq \mathbb{V}[\hat{\theta}]$.

This is a remarkably general result. In words, it says: if we have an unbiased estimator of our statistical parameter $\theta$ and a sufficient statistic $T(X)$ for that parameter, then we can construct another estimator $\theta_{\texttt{RB}}$ that is still unbiased and whose variance is no greater than that of the original.
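To make this concrete, here is a small simulation of my own (not part of the original argument), assuming NumPy. For $X_1, \ldots, X_n$ i.i.d. Bernoulli($\theta$), the single observation $\hat{\theta} = X_1$ is unbiased but crude, $T(X) = \sum_i X_i$ is sufficient, and the Rao–Blackwellized estimator is $\mathbb{E}[X_1 \mid T] = T/n$, the sample mean:

```python
import numpy as np

# Sketch of Rao-Blackwellization for X_1, ..., X_n iid Bernoulli(theta).
# theta_hat = X_1 is unbiased but crude; T(X) = sum(X) is sufficient,
# and E[X_1 | T] = T / n, the sample mean.
rng = np.random.default_rng(seed=0)
theta, n, trials = 0.3, 10, 200_000

X = rng.binomial(1, theta, size=(trials, n))
theta_hat = X[:, 0]        # unbiased estimator: just the first observation
theta_rb = X.mean(axis=1)  # Rao-Blackwellized estimator: E[X_1 | T] = T / n

# Both estimators are unbiased, but conditioning on T shrinks the variance
# from theta * (1 - theta) to theta * (1 - theta) / n.
print(theta_hat.mean(), theta_rb.mean())  # both close to theta = 0.3
print(theta_hat.var(), theta_rb.var())    # roughly 0.21 vs. 0.021
```

Here the conditional expectation has a closed form, which is what makes Rao-Blackwellization practical in this example.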

The proof of the first claim is

$$
\begin{aligned}
\mathbb{E}[\theta_{\texttt{RB}}] &\triangleq \mathbb{E}[\mathbb{E}[\hat{\theta} \mid T(X)]] \\
&= \mathbb{E}[\hat{\theta}] \\
&= \theta.
\end{aligned} \tag{1}
$$

The first equality just applies our definition of this new estimator $\theta_{\texttt{RB}}$. The next applies the law of total expectation. The last holds because $\hat{\theta}$ is unbiased.

The proof of the second claim is

$$
\begin{aligned}
\mathbb{V}[\theta_{\texttt{RB}}] &= \mathbb{E}[(\theta_{\texttt{RB}} - \theta)^2] \\
&= \mathbb{E}[(\mathbb{E}[\hat{\theta} \mid T(X)] - \theta)^2] \\
&= \mathbb{E}[(\mathbb{E}[\hat{\theta} - \theta \mid T(X)])^2] \\
&\leq \mathbb{E}[\mathbb{E}[(\hat{\theta} - \theta)^2 \mid T(X)]] \\
&= \mathbb{E}[(\hat{\theta} - \theta)^2] \\
&= \mathbb{V}[\hat{\theta}].
\end{aligned} \tag{2}
$$

Once again, we just use the definition of $\theta_{\texttt{RB}}$ and, in the penultimate step, the law of total expectation. The third equality holds because $\theta$ is a constant, so $\mathbb{E}[\theta \mid T(X)] = \theta$, and by the linearity of conditional expectation. The inequality holds because

$$
\mathbb{V}[X] = \mathbb{E}[X^2] - \mathbb{E}[X]^2 \geq 0 \implies \mathbb{E}[X]^2 \leq \mathbb{E}[X^2], \tag{3}
$$

applied here to the conditional distribution given $T(X)$: pointwise, $(\mathbb{E}[\hat{\theta} - \theta \mid T(X)])^2 \leq \mathbb{E}[(\hat{\theta} - \theta)^2 \mid T(X)]$, and taking expectations of both sides gives the inequality in (2).
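As a quick sanity check of my own (assuming NumPy, with an arbitrarily chosen distribution), inequality (3) is easy to verify numerically:

```python
import numpy as np

# Numerical check of (3): for any X with a finite second moment,
# E[X]^2 <= E[X^2], since the gap between the two is V[X] >= 0.
rng = np.random.default_rng(seed=1)
x = rng.exponential(scale=2.0, size=100_000)  # arbitrary example distribution

lhs, rhs = np.mean(x) ** 2, np.mean(x ** 2)
print(lhs, rhs)  # lhs is roughly 4, rhs roughly 8 for Exponential(scale=2)
```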

In my mind, the Rao–Blackwell Theorem is remarkable in that the proof is quite simple and yet the result is quite general.


Acknowledgements

This proof is based on a blackboard proof by Matias Cattaneo in Princeton’s Statistical Theory and Methods.

  1. Rao, C. R. (1992). Information and the accuracy attainable in the estimation of statistical parameters. In Breakthroughs in statistics (pp. 235–247). Springer.
  2. Blackwell, D. (1947). Conditional expectation and unbiased sequential estimation. The Annals of Mathematical Statistics, 105–110.