Editor Anirban DasGupta writes:

Well done to Lu Mao (pictured below), of the department of biostatistics at the University of North Carolina at Chapel Hill, who sent a carefully written solution.
Lu Mao

The problem was to find a minimax estimator of $m = |\mu|$ under squared error loss when there are $n$ iid observations from $N(\mu, 1)$, $\mu \in \mathbb{R}$. The claim is that the plug-in estimator $|\bar{X}|$ is the unique minimax estimator.

First, note that by the triangle inequality, $\big||\bar{X}| - m\big| \le |\bar{X} - \mu|$, so

$$\sup_\mu E_\mu\big[(|\bar{X}| - m)^2\big] \le \sup_\mu E_\mu\big[(\bar{X} - \mu)^2\big] = \frac{1}{n}.$$

Consider now any other estimator $\delta(X_1, \dots, X_n)$ of $m$; we may assume $\delta$ to be a function of $\bar{X}$ by Rao–Blackwell. Then,

$$\sup_\mu E_\mu\big[(\delta - m)^2\big] \ge \sup_{\mu \ge 0} E_\mu\big[(\delta - \mu)^2\big] \ge \frac{1}{n},$$

the first inequality holding because $m = \mu$ when $\mu \ge 0$, and

the last inequality following from a standard application of the Cramér–Rao inequality. Thus $|\bar{X}|$ is minimax; unique minimaxity follows because the last inequality above is strict by completeness. The exact risk function of $|\bar{X}|$ equals

$$R(\theta, |\bar{X}|) = \frac{1}{n}\big[1 + 4\theta^2(1 - \Phi(\theta)) - 4\theta\varphi(\theta)\big], \quad \text{where } \theta = \sqrt{n}\,|\mu|.$$
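The exact risk formula can be checked numerically against a direct Monte Carlo estimate. The sketch below (the helper names `exact_risk` and `mc_risk` are mine, not part of the solution) compares the two for a few values of $\mu$; it also illustrates that the risk never exceeds $1/n$, with equality at $\mu = 0$, consistent with minimaxity.

```python
import math
import random

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def exact_risk(mu, n):
    # R(theta, |Xbar|) = (1/n) [1 + 4 theta^2 (1 - Phi(theta)) - 4 theta phi(theta)],
    # with theta = sqrt(n) |mu|
    theta = math.sqrt(n) * abs(mu)
    return (1.0 + 4.0 * theta ** 2 * (1.0 - Phi(theta))
            - 4.0 * theta * phi(theta)) / n

def mc_risk(mu, n, reps=200_000, seed=1):
    # Monte Carlo estimate of E[(|Xbar| - |mu|)^2], using Xbar ~ N(mu, 1/n)
    rng = random.Random(seed)
    sd = 1.0 / math.sqrt(n)
    total = 0.0
    for _ in range(reps):
        xbar = rng.gauss(mu, sd)
        total += (abs(xbar) - abs(mu)) ** 2
    return total / reps

for mu in (0.0, 0.2, 1.0):
    print(mu, exact_risk(mu, 10), mc_risk(mu, 10))
```

At $\mu = 0$ the formula reduces to exactly $1/n$, since then $|\bar{X}| - m = |\bar{X}|$ and $E\bar{X}^2 = 1/n$.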

By elementary calculus, the function $g(x) = x^2(1 - \Phi(x)) - x\varphi(x)$, $x \ge 0$, has a unique minimum at $x_0$, where $x_0$ is the unique root of $x(1 - \Phi(x))/\varphi(x) = \tfrac{1}{2}$; $x_0 \approx 0.61200$.

Plugging this back into the exact risk formula, one gets that the minimum risk of $|\bar{X}|$ is

$$\frac{1 - 2x_0\varphi(x_0)}{n} = \frac{0.59509}{n}.$$
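The constants above can be reproduced numerically. A minimal sketch: find $x_0$ by bisection on $h(x) = x(1 - \Phi(x))/\varphi(x) - \tfrac{1}{2}$, which is negative at $0$ and positive at $2$, then evaluate $1 - 2x_0\varphi(x_0)$ (the function name `h` and the bracketing interval are mine, chosen for illustration).

```python
import math

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def h(x):
    # x0 is the unique positive root of h; h(0) = -1/2 < 0, h(2) > 0
    return x * (1.0 - Phi(x)) / phi(x) - 0.5

# bisection on the bracketing interval [0, 2]
lo, hi = 0.0, 2.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if h(mid) < 0.0:
        lo = mid
    else:
        hi = mid
x0 = 0.5 * (lo + hi)

# minimum of n * R(theta, |Xbar|), i.e. 1 + 4 g(x0) = 1 - 2 x0 phi(x0)
min_risk_times_n = 1.0 - 2.0 * x0 * phi(x0)
print(x0, min_risk_times_n)
```

Note that at the root, $x_0^2(1 - \Phi(x_0)) = x_0\varphi(x_0)/2$, which is why $1 + 4g(x_0)$ simplifies to $1 - 2x_0\varphi(x_0)$.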