Student members of IMS are invited to submit solutions (with subject “Student Puzzle Corner”). The names of student members who submit correct solutions to either or both of these puzzles, and the answers, will be published in the issue following the deadline. The Puzzle Editor is Anirban DasGupta. His decision is final.

Student Puzzle Editor Anirban DasGupta poses another two problems, one each in probability and statistics. Send us your solution, to either or both.

Puzzle 43.1:
(i) Simulate 1,000 standard Cauchy variables, and plot $\bar{X}_n$ against $n$ for $n = 1, 2, \cdots, 1000$, where $\bar{X}_n = \frac{1}{n}\,\sum_{i = 1}^n\,X_i$.
(ii) Briefly discuss what you see in this plot that is interesting.
(iii) What is your guess for $\limsup_{n \to \infty}\, |\bar{X}_n - e^n|$? Justify your guess.
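A minimal sketch of the simulation in part (i), assuming NumPy is available (the plotting lines, using Matplotlib, are left commented so the script runs headless):

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility
x = rng.standard_cauchy(1000)   # 1,000 standard Cauchy draws
# running mean: X-bar_n = (X_1 + ... + X_n) / n for n = 1, ..., 1000
running_mean = np.cumsum(x) / np.arange(1, 1001)

# To plot X-bar_n against n (requires matplotlib):
# import matplotlib.pyplot as plt
# plt.plot(np.arange(1, 1001), running_mean)
# plt.xlabel("n"); plt.ylabel(r"$\bar{X}_n$"); plt.show()
```

Rerunning with different seeds makes the interesting behavior in part (ii) easier to spot than a single plot does.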

Puzzle 43.2:
Suppose $X$ has a discrete uniform distribution on the set $\{1, 2, \cdots , N\}$, where $N \geq 1$ is an unknown integer-valued parameter. Construct an explicit admissible estimator of $N$ under squared error loss, and provide a proof of your estimator’s admissibility.


Solution to Puzzle 42

Congratulations to Soham Bonnerjee, who is a PhD student in the Department of Statistics at the University of Chicago, for his correct solution to both parts. Puzzle Corner Editor Anirban DasGupta explains the answers:

Puzzle 42.1:
It is easily seen that for any $\epsilon > 0$, $\Phi (x+\epsilon) - \Phi(x)$ is maximized at $x = 0$ on $[0, \infty)$, since $\phi (x)$ is decreasing on $[0,\infty )$. Hence, we want $\epsilon $ to satisfy $\Phi (\epsilon ) - \frac{1}{2} = .001$, which gives $\epsilon = \Phi ^{-1}(.501) \approx .0025$.
Assuming, for example, that we can print 8 numbers in one line and can accommodate 40 lines in one page, i.e., $8 \times 40 = 320$ entries per page, we will need $\frac{5/.0025}{320} = \frac{2000}{320} = 6.25$ pages.
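The arithmetic can be checked with a short stdlib-only script; `Phi` and `Phi_inv` below are hypothetical helpers built from `math.erf`, not library routines:

```python
import math

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def Phi_inv(p, lo=-10.0, hi=10.0, tol=1e-12):
    # bisection; Phi is strictly increasing, so this converges
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if Phi(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

eps = Phi_inv(0.501)             # ~ 0.0025, the required grid spacing
pages = (5 / 0.0025) / (8 * 40)  # 2000 table entries / 320 per page = 6.25
```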

Puzzle 42.2:
On the second problem, a classic result is that if $S_0 \sim \mathcal{W}_p(n-1, I)$, then the Cholesky decomposition $S_0 = TT'$ (with $T$ lower triangular) satisfies: the $\{T_{ij}\}$ are mutually independent, $T_{ij} \sim N(0,1)$ for $i > j$, and $T_{ii}^2 \sim \chi ^2_{n-i}$ for $i = 1,2, \cdots , p$.
Hence, for $S \sim \mathcal{W}_p(n-1,\Sigma)$, \[ E\bigg [\frac{|S|}{|\Sigma|}\bigg ] = \prod_{i=1}^p\,(n-i), \]
so $|S|/\prod_{i=1}^p (n-i)$ is unbiased for $|\Sigma |$ for any $p$; it is the UMVUE by the fact that $(\bar{\bf{X}}, S)$ is a complete sufficient statistic, together with an application of the Lehmann–Scheffé theorem.
The variance of the UMVUE uses the second moment of a central chi-square variable and is straightforward.
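As a numerical sanity check (not part of the solution), the identity $E[|S|/|\Sigma|] = \prod_{i=1}^p (n-i)$ can be verified by Monte Carlo through the Cholesky/Bartlett result above, since $|S|/|\Sigma| = |S_0| = \prod_i T_{ii}^2$ with independent $T_{ii}^2 \sim \chi^2_{n-i}$. The sketch assumes NumPy and picks arbitrary illustrative values $n = 12$, $p = 3$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, reps = 12, 3, 200_000            # hypothetical dimensions for the check
dfs = [n - i for i in range(1, p + 1)]  # chi-square degrees of freedom: n-1, ..., n-p

# |S_0| = product of independent chi^2_{n-i} variables, one per diagonal entry
dets = np.prod(rng.chisquare(df=dfs, size=(reps, p)), axis=1)

expected = np.prod(dfs)  # (n-1)(n-2)(n-3) = 990 here
print(dets.mean(), expected)
```

The sample mean of `dets` should land close to `expected`, confirming the expectation formula.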