Puzzle editor Anirban DasGupta returns with these puzzles in our contest mode. Each correct answer receives 3 points, each incorrect answer receives -2 points, and each item left unanswered receives -1 point. You can answer just one of the two problems, 55.1 and 55.2, but it will be fantastic if you attempt both.

Puzzle 55.1

Let $X_1, \dots, X_n$ be i.i.d. $C(\mu, 1)$, the Cauchy distribution with median $\mu$ and scale parameter 1.

(a) Prove rigorously that the number of roots $T_n$ of the likelihood equation is an odd integer between 1 and $2n-1$.

(b) Find, with precise reasoning, $\lim_{n \to \infty} P(T_n > 1)$.
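For readers who want to experiment, here is a minimal numerical sketch (not a solution, and not part of the puzzle) that simply illustrates the object $T_n$: it counts sign changes of the Cauchy score function $\sum_{i=1}^{n} 2(x_i - \mu)/\big(1 + (x_i - \mu)^2\big)$ on a fine grid for one simulated sample. The use of `numpy`, the seed, the sample size and the grid resolution are all arbitrary choices of mine.

```python
# Rough illustration only: count roots of the Cauchy likelihood equation
# for one simulated sample by locating sign changes of the score on a grid.
import numpy as np

rng = np.random.default_rng(0)
n = 7
x = np.tan(np.pi * (rng.random(n) - 0.5))   # a C(0, 1) sample via the inverse CDF

def score(mu):
    """Derivative in mu of the Cauchy log-likelihood."""
    return np.sum(2 * (x - mu) / (1 + (x - mu) ** 2))

# The score is positive below min(x) and negative above max(x),
# so all roots lie in between.
grid = np.linspace(x.min() - 1, x.max() + 1, 200_001)
vals = np.array([score(m) for m in grid])
T_n = int(np.sum(np.sign(vals[:-1]) != np.sign(vals[1:])))
print("approximate number of roots of the likelihood equation:", T_n)
```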

Puzzle 55.2 is the contest problem. For each item, just answer True or False; no proof is required, but answers with some explanation are especially welcome. Here are the items.
(a) A quadratic $ax^2 + bx + c$ is called a Gaussian quadratic if $a, b, c$ are i.i.d. standard normal. If $X_1, X_2$ denote the two roots of a Gaussian quadratic, then the expectation of $\frac{1}{X_1} + \frac{1}{X_2}$ does not exist.

(b) In the standard linear model $E(Y) = X\beta$, no quadratic function $Y'AY$ can be an unbiased estimate of a linear function $c'\beta$.

(c) There exists a location parameter density on the real line such that the average of the three sample quartiles is asymptotically the most efficient among all convex combinations of the three sample quartiles.

(d) Suppose $f : [0,1] \to \mathbb{R}$ is a strictly increasing continuous function. Then there is a minimum value of $D(r) = \int_0^1 |f(x) - r|\,dx$, and the minimum is attained at a unique real number $r_0$.

(e) Let $X_n$ denote an $n \times n$ matrix all of whose elements are $\pm 1$. If $D_n$ denotes the supremum of the determinants of all such matrices $X_n$, then $(D_n)^{1/n}$ has a finite limit superior.

(f) Sixteen equally good soccer teams are going to play in a tournament in which teams are paired up at random, and a team that loses a match is eliminated from the tournament. The probability that teams 1 and 2 will meet each other at some point in the tournament is more than 10%.

Solution to Puzzle 54

Congratulations to IMS student member Yutong Wang (London School of Economics and Political Science), whose answers to Puzzle 54 were correct!

Guest puzzle editor Stanislav Volkov explains:

LEMMA 1:
Suppose that $X_i$, $i = 1, \dots, n$, are Bernoulli($p$) random variables and denote $S_n = X_1 + \cdots + X_n$.
Then $\mathrm{Var}(S_n) \le n^2 p(1-p)$ and this upper bound can be achieved.

PROOF:
Since $\mathrm{Var}(X_i) = p(1-p)$ and $\mathrm{Cov}(X_i, X_j) \le \sqrt{\mathrm{Var}(X_i)\,\mathrm{Var}(X_j)} = p(1-p)$ by the Cauchy-Schwarz inequality, we have
$$\mathrm{Var}(S_n) = \sum_{i=1}^{n} \mathrm{Var}(X_i) + 2\sum_{1 \le i < j \le n} \mathrm{Cov}(X_i, X_j) \le np(1-p) + n(n-1)p(1-p) = n^2 p(1-p).$$
On the other hand, this variance can be achieved by setting all $X_i \equiv X_1$.

◻
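As a quick sanity check of Lemma 1 (a sketch of mine, not part of the published solution), one can simulate the extremal coupling $X_i \equiv X_1$; the library (`numpy`), the seed and the parameter values below are arbitrary.

```python
# Sketch: when every X_i equals X_1, S_n = n * X_1 and its variance
# should match the upper bound n^2 p (1 - p) of Lemma 1.
import numpy as np

rng = np.random.default_rng(1)
n, p, reps = 5, 0.3, 200_000
x1 = (rng.random(reps) < p).astype(float)   # replications of X_1 ~ Bernoulli(p)
s_n = n * x1                                # S_n when X_1 = X_2 = ... = X_n
print("empirical Var(S_n): ", s_n.var())
print("bound n^2 p (1 - p):", n**2 * p * (1 - p))
```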


LEMMA 2:
Suppose that $X_i$, $i = 1, \dots, n$, are Bernoulli($p$) random variables and denote $S_n = X_1 + \cdots + X_n$. Let $m = \lfloor np \rfloor$, and let $a = np - m$ be the fractional part of $np$ (which can be 0). Then $\mathrm{Var}(S_n) \ge a(1-a)$, and this lower bound can be achieved.

PROOF:
(a) First, let us show that this is indeed the lowest possible value. For simplicity, assume that $0 < a < 1$ (the case when $np$ is an integer can be handled similarly).

We claim that for any integer-valued random variable $S$ with $E(S) = np \in (m, m+1)$, the variance cannot be less than $a(1-a)$; moreover, this minimum is achieved when $S \in \{m, m+1\}$ with probability one, in which case $P(S = m) = 1 - a$, $P(S = m+1) = a$, and the variance is indeed $a(1-a)$.

Suppose, to the contrary, that the minimum is achieved for some $S$ such that $P(S \notin \{m, m+1\}) > 0$. Then there exists either $i \le m-1$ such that $P(S = i) > 0$, or $j \ge m+2$ such that $P(S = j) > 0$. By symmetry, it suffices to study the first case.

Indeed, assume that there is an $i \le m-1$ such that $P(S = i) > 0$. We shall show that the variance of $S$ can then be made strictly smaller. Since $ES > m$, there must be an integer $j \ge m+1$ such that $P(S = j) > 0$. Let $\epsilon = \min(P(S = i), P(S = j)) > 0$. Note also that $P(S = m) < 1$. Take a very small $\delta > 0$ and create a new variable $S'$ by changing the probabilities only at the three points $i, m, j$, setting
$$P(S' = i) = P(S = i) - x, \quad P(S' = m) = P(S = m) + \delta, \quad P(S' = j) = P(S = j) - y.$$
To ensure that the probabilities sum to 1 and to keep $ES' = ES$ unchanged, we need
$$-x + \delta - y = 0, \qquad -ix + m\delta - jy = 0.$$
Solving for $x$ and $y$ gives
$$x = \delta\,\frac{j-m}{j-i} > 0, \qquad y = \delta\,\frac{m-i}{j-i} > 0,$$
which is possible (both $x$ and $y$ are at most $\delta$, so it suffices to take $\delta \le \epsilon$) as long as $\delta$ is small enough.
However, with this change the variance decreases: since $ES' = ES$,
$$\mathrm{Var}(S') - \mathrm{Var}(S) = E(S'^2) - E(S^2) = -i^2 x + m^2\delta - j^2 y = -(j-m)(m-i)\,\delta < 0,$$
where the last equality follows by substituting the expressions for $x$ and $y$. Thus $\mathrm{Var}(S') < \mathrm{Var}(S)$, contradicting the assumption that $S$ has the smallest variance. By the same argument, if $P(S = j) > 0$ for some $j \ge m+2$, then $S$ does not have the smallest variance.
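The displayed identity is easy to check numerically. Below is a small sketch of mine (not part of the published solution) in which the support points $i = 0$, $m = 2$, $j = 4$, the starting probabilities and $\delta$ are arbitrary choices.

```python
# Sketch: verify Var(S') - Var(S) = -(j - m)(m - i) * delta for one concrete
# three-point perturbation of the kind described above.
import numpy as np

support = np.array([0, 2, 4])          # i = 0, m = 2, j = 4 (illustrative)
probs = np.array([0.2, 0.5, 0.3])      # arbitrary starting law; its mean 2.2 lies in (m, m+1)
i, m, j = support
delta = 0.01
x = delta * (j - m) / (j - i)          # mass removed at i
y = delta * (m - i) / (j - i)          # mass removed at j
new_probs = probs + np.array([-x, delta, -y])

def var(q):
    mean = support @ q
    return (support**2) @ q - mean**2

print("Var(S') - Var(S):      ", var(new_probs) - var(probs))
print("-(j - m)(m - i) delta: ", -(j - m) * (m - i) * delta)
```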
(b) Now let us show how this minimum can be achieved. Let $C$ be the circumference of a circle centred at the origin, and let $U$ be a point chosen uniformly on $C$. Consider $n$ arcs on $C$, $I_1, I_2, \dots, I_n$, such that $I_k$ contains all the points located between angles $2\pi(k-1)p$ and $2\pi kp$ (angles taken modulo $2\pi$). Let $X_k = 1$ if $I_k$ contains $U$ and set $X_k = 0$ otherwise. It is easy to check that $P(X_k = 1) = p$ for all $k = 1, 2, \dots, n$.

On the other hand, it is easy to check that $S_n \in \{m, m+1\}$. As a result, $S_n - m$ is a Bernoulli($a$) random variable, which gives the statement of the claim.

◻
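A simulation sketch of the arc construction (again with arbitrary choices of `numpy`, seed, $n$ and $p$ on my part) confirms that $S_n$ only takes the values $m$ and $m+1$, with $P(S_n = m+1) \approx a$.

```python
# Sketch: place U uniformly on the circle (measured in fractions of a full
# turn) and let X_k indicate whether U lies in the wrapped arc
# I_k = [(k-1)p, kp) (mod 1), which has length p.
import numpy as np

rng = np.random.default_rng(2)
n, p, reps = 7, 0.4, 100_000
m = int(np.floor(n * p))
a = n * p - m

u = rng.random(reps)                              # U as a fraction of the circumference
starts = (np.arange(n) * p) % 1.0                 # left endpoints of I_1, ..., I_n
X = ((u[:, None] - starts[None, :]) % 1.0) < p    # X_k = 1 iff U falls in I_k
S = X.sum(axis=1)

print("values taken by S_n:", np.unique(S))       # expected: m and m+1
print("P(S_n = m + 1) ~", (S == m + 1).mean(), "  a =", a)
print("P(X_1 = 1)     ~", X[:, 0].mean(), "  p =", p)
```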


Now let $Y_i = 2X_i - 1$. Then $T_n = 2S_n - n$, and thus by Lemma 1, $\mathrm{Var}(T_n) = 4\,\mathrm{Var}(S_n)$, which has the largest achievable value $4n^2 p(1-p)$. By the same token, from Lemma 2, $\mathrm{Var}(T_n) = 4\,\mathrm{Var}(S_n)$ has the lowest achievable value $4a(1-a)$, where $a = np - \lfloor np \rfloor$.
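For concreteness, a small worked example with illustrative values of my own choosing ($n = 5$, $p = 0.3$, so that $np = 1.5$ and $a = 0.5$):
$$\max \mathrm{Var}(T_n) = 4n^2 p(1-p) = 4 \cdot 25 \cdot 0.21 = 21, \qquad \min \mathrm{Var}(T_n) = 4a(1-a) = 4 \cdot 0.25 = 1.$$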