This time we have a guest puzzle editor, Stanislav Volkov, who is Professor of Mathematical Statistics at the Centre for Mathematical Sciences at Lund University, Sweden. (Anirban DasGupta will return with the next puzzles.)
Here is Stanislav’s puzzle:
Puzzle 54
Let $T_n = Y_1 + Y_2 + \dots + Y_n$ be a “random walk” in the sense that $Y_i = \pm 1$ and $\mathbb{P}(Y_i = 1) = p$ for all $i$.
What are the minimum and the maximum values of the variance of $T_n$ that can be achieved over all possible joint distributions of $(Y_1, \dots, Y_n)$? (Note that we do not assume that the $Y_i$ are independent.)
Solution to Puzzle 53
Pictured here are IMS student members Andrew Czeizler (University of New England, Australia), Radmehr Karimian (Columbia University), and Reihaneh Malekian (Columbia University), who sent [mostly] correct and serious answers, and did a fine job. Congratulations to them!
Puzzle editor Anirban DasGupta explains:
Puzzle 53.1: Suppose X has a Poisson distribution with mean one. Prove that the distribution of X is determined by its moments, i.e., if Y is any real-valued random variable such that $E(Y^n) = E(X^n)$ for all $n = 1, 2, \dots$, then Y has a Poisson distribution with mean one.
The MGF of any Poisson distribution exists for all values of $t$. If a distribution on the real line has a finite MGF in a nonempty neighborhood of zero, then it is determined by its sequence of moments. Alternatively, the $n$th moment of a Poisson with mean one is the $n$th Bell number $B_n$. It can then be proved that $\sum_{n=1}^\infty B_{2n}^{-1/(2n)} = \infty$, and by Carleman's condition this implies that the Poisson distribution with mean one is determined by its moments.
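To sketch the divergence (using the elementary bound $B_n \leq n^n$, which is not part of the original argument but follows since each of the $n$ elements of a set can be assigned one of at most $n$ block labels):
$$B_{2n}^{-1/(2n)} \;\geq\; \bigl((2n)^{2n}\bigr)^{-1/(2n)} \;=\; \frac{1}{2n}, \qquad \text{so} \qquad \sum_{n=1}^\infty B_{2n}^{-1/(2n)} \;\geq\; \sum_{n=1}^\infty \frac{1}{2n} \;=\; \infty,$$
which is exactly what Carleman's condition requires.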
Puzzle 53.2 (Multi-part contest questions: answer true or false)
(a) If X is a non-negative random variable, and has a finite MGF everywhere, then $E(X^X) < \infty$.
False. The Poisson with mean one is a counterexample.
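To see why the counterexample works (a quick check, not part of the original answer): if $X \sim \mbox{Poi}(1)$, then
$$E(X^X) \;\geq\; e^{-1}\sum_{n=1}^\infty \frac{n^n}{n!} \;=\; \infty,$$
since by Stirling's formula $n^n/n! \sim e^n/\sqrt{2\pi n}$, so the terms do not even tend to zero, even though the MGF $E(e^{tX}) = e^{e^t - 1}$ is finite for every $t$.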
(b) For estimating the variance of a normal distribution with an unknown mean, the MLE of the variance is inadmissible under squared error loss.
True. A better estimator is $\frac{1}{n+1}\,\sum_{i = 1}^n\,(X_i-\bar{X})^2$.
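A brief justification (a standard calculation, supplied here): with $S = \sum_{i=1}^n (X_i - \bar{X})^2 \sim \sigma^2\chi^2_{n-1}$, the risk of $cS$ under squared error loss is
$$E(cS - \sigma^2)^2 \;=\; \sigma^4\bigl[c^2(n^2-1) - 2c(n-1) + 1\bigr],$$
which is minimized at $c = 1/(n+1)$; in particular, $\frac{1}{n+1}S$ has strictly smaller risk than the MLE $\frac{1}{n}S$ for every value of the parameters, so the MLE is inadmissible.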
(c) Suppose $X_1, X_2, \dots$ are i.i.d. Poisson with mean one. Let $S_n = \sum_{i=1}^n X_i$, $n \geq 1$. Let $N$ be the first $n$ for which $S_n > 1$. Then $E(S_N - 2) \leq 1$.
True. Just note that $P(N > n) = P(S_n \leq 1)$ (since $S_n$ is nondecreasing), and that $S_n \sim \mbox{Poi}(n)$. Now use the tail-sum formula to calculate $E(N)$, and then Wald's identity to obtain $E(S_N) = E(N)$.
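Carrying the calculation out (the numerical details are filled in here and were not part of the original answer):
$$E(N) \;=\; \sum_{n=0}^\infty P(N > n) \;=\; 1 + \sum_{n=1}^\infty (1+n)e^{-n} \;=\; 1 + \frac{e^{-1}}{1-e^{-1}} + \frac{e^{-1}}{(1-e^{-1})^2} \;\approx\; 2.50,$$
so by Wald's identity $E(S_N) = E(N)\,E(X_1) = E(N) \approx 2.50$, and hence $E(S_N - 2) \approx 0.50 \leq 1$.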
(d) Suppose X is uniformly distributed in the p-dimensional disk $\{x: ||x||^2 \leq \rho^2\}$. Treat $\rho$ as an unknown positive parameter. Then the bias of the MLE of $\rho$ converges to zero as $p \to \infty$.
True. This follows from a straightforward calculation of the density of ||X||.
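To sketch that calculation (the details are supplied here): for X uniform in the disk of radius $\rho$ we have $P(||X|| \leq r) = (r/\rho)^p$ for $0 \leq r \leq \rho$, the likelihood is proportional to $\rho^{-p}$ on $\{\rho \geq ||X||\}$, so the MLE is $\hat{\rho} = ||X||$, and
$$E(\hat{\rho}) \;=\; \int_0^\rho r\,\frac{p\,r^{p-1}}{\rho^p}\,dr \;=\; \frac{p}{p+1}\,\rho,$$
so the bias is $-\rho/(p+1)$, which converges to zero as $p \to \infty$.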
(e) Consider a 2×2 random matrix in which all four entries are i.i.d. standard normal. Then the probability that the determinant of this matrix exceeds 1 is an irrational number.
True. If $X, Y, Z, W$ are i.i.d. standard normal, then $XW - YZ$ has a standard double exponential (Laplace) distribution, with density $\frac{1}{2}e^{-|x|}$. So the probability that the determinant in question is larger than 1 is $\frac{1}{2}e^{-1}$, which is irrational.
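A short way to verify the distributional claim (the characteristic function argument is added here): conditioning on $W$ gives $E(e^{itXW}) = E(e^{-t^2W^2/2}) = (1+t^2)^{-1/2}$, and $-YZ$ has the same characteristic function, so by independence
$$E\bigl(e^{it(XW - YZ)}\bigr) \;=\; \frac{1}{1+t^2},$$
which is the characteristic function of the double exponential density $\frac{1}{2}e^{-|x|}$; hence $P(XW - YZ > 1) = \int_1^\infty \frac{1}{2}e^{-x}\,dx = \frac{1}{2}e^{-1}$.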