Editor Anirban DasGupta writes:

Well done to Lu Mao (pictured below) of the Department of Biostatistics at the University of North Carolina at Chapel Hill, who sent a carefully written solution.
Lu Mao

The problem was to find a minimax estimator of $m = |\mu|$ under squared error loss when there are $n$ iid observations from $N(\mu, 1)$, $\mu \in \mathcal{R}$. The claim is that the plug-in estimator $|\bar{X}|$ is the unique minimax estimator.

First, note that since $\big||\bar{X}| - |\mu|\big| \le |\bar{X} - \mu|$ by the triangle inequality,

$\sup_\mu E_\mu\big[(|\bar{X}| - m)^2\big] \le \sup_\mu E_\mu\big[(\bar{X} - \mu)^2\big] = \frac{1}{n}.$
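
As a quick numerical sanity check (not part of the proof), a short Monte Carlo sketch can illustrate the bound; the sample size, the grid of $\mu$ values, and the replication count below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10            # sample size (an arbitrary choice for this check)
reps = 400_000    # Monte Carlo replications per value of mu

for mu in np.linspace(-3.0, 3.0, 13):
    # Xbar ~ N(mu, 1/n), so it can be simulated directly
    xbar = rng.normal(mu, 1.0 / np.sqrt(n), size=reps)
    risk = np.mean((np.abs(xbar) - abs(mu)) ** 2)
    print(f"mu = {mu:+.2f}: risk of |Xbar| ~= {risk:.5f}  (bound 1/n = {1.0/n:.5f})")
```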

Consider now any other estimator $\delta(X_1, \ldots, X_n)$ of $m$; by the Rao–Blackwell theorem (conditioning on the sufficient statistic $\bar{X}$), we may assume $\delta$ to be a function of $\bar{X}$. Then,

$\sup_\mu E_\mu\big[(\delta - m)^2\big] \ge \sup_{\mu \ge 0} E_\mu\big[(\delta - \mu)^2\big] \ge \frac{1}{n},$

the first inequality holding because $m = \mu$ when $\mu \ge 0$, and the last following from a standard application of the Cramér–Rao inequality. Thus $|\bar{X}|$ is minimax; unique minimaxity follows because the last inequality above is strict, by completeness. The exact risk function of $|\bar{X}|$ equals

$R(\theta, |\bar{X}|) = \frac{1}{n}\left[1 + 4\theta^2(1 - \Phi(\theta)) - 4\theta\phi(\theta)\right]$, where $\theta = \sqrt{n}\,|\mu|$ and $\Phi$, $\phi$ denote the standard normal CDF and density.
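
To make the formula concrete, here is a small sketch that evaluates it and cross-checks it against simulation; the function name `risk_abs_xbar` and the chosen values of $n$ and $\mu$ are illustrative, not part of the solution.

```python
import numpy as np
from scipy.stats import norm

def risk_abs_xbar(mu, n):
    """Exact risk of |Xbar| from the formula above, with theta = sqrt(n)*|mu|."""
    theta = np.sqrt(n) * abs(mu)
    return (1 + 4 * theta**2 * (1 - norm.cdf(theta)) - 4 * theta * norm.pdf(theta)) / n

# Cross-check the closed form against Monte Carlo at a few points
rng = np.random.default_rng(1)
n = 10
for mu in (0.0, 0.2, 1.0):
    xbar = rng.normal(mu, 1.0 / np.sqrt(n), size=500_000)
    mc = np.mean((np.abs(xbar) - abs(mu)) ** 2)
    print(f"mu = {mu}: exact = {risk_abs_xbar(mu, n):.5f}, Monte Carlo = {mc:.5f}")
```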

By elementary calculus, the function $g(x) = x^2(1 - \Phi(x)) - x\phi(x)$, $x \ge 0$, for which $R(\theta, |\bar{X}|) = \frac{1}{n}[1 + 4g(\theta)]$, has a unique minimum at $x_0$, where $x_0$ is the unique root of $\frac{x(1 - \Phi(x))}{\phi(x)} = \frac{1}{2}$; $x_0 \approx 0.61200$.
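
The root is easy to confirm numerically; a minimal sketch, assuming scipy is available (the bracketing interval $[0.1, 2]$ is an arbitrary choice):

```python
from scipy.optimize import brentq
from scipy.stats import norm

# g'(x) = 2x(1 - Phi(x)) - phi(x), so g'(x) = 0 iff x(1 - Phi(x))/phi(x) = 1/2
gprime = lambda x: 2 * x * (1 - norm.cdf(x)) - norm.pdf(x)
x0 = brentq(gprime, 0.1, 2.0)   # bracketing interval chosen by inspection
print(x0)                        # ~= 0.61200
```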

Since $x_0(1 - \Phi(x_0)) = \phi(x_0)/2$ at this root, $g(x_0) = -\frac{1}{2}x_0\phi(x_0)$. Plugging this back into the exact risk formula, one gets that the minimum risk of $|\bar{X}|$ is $\frac{1 - 2x_0\phi(x_0)}{n} = \frac{0.59509}{n}$.
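
A final numerical check of this value, as a self-contained sketch re-deriving $x_0$ as above:

```python
from scipy.optimize import brentq
from scipy.stats import norm

x0 = brentq(lambda x: 2 * x * (1 - norm.cdf(x)) - norm.pdf(x), 0.1, 2.0)
print(1 - 2 * x0 * norm.pdf(x0))   # ~= 0.59509, the minimum of n times the risk
```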