Roger Koenker

Roger Koenker was born in 1947 in North Dakota, graduated from Grinnell College in 1969 and received his PhD from the University of Michigan in 1974. He began his academic career at the University of Illinois at Urbana-Champaign in 1974. From 1976 to 1983 he was a Member of the Technical Staff in the Department of Economics of Bell Laboratories. He returned to the University of Illinois in 1983, where he was Professor of Economics and Statistics until 2018. Since 2018 he has been Honorary Professor of Economics at University College London. Much of his research has focused on quantile regression methods, which were introduced in joint work with Gib Bassett in the late 1970s. He received the Emanuel and Carol Parzen Prize for Statistical Innovation in 2010. Since 2010 his research has diversified to include work on shape-constrained density estimation and total variation penalized smoothing. His most recent work has focused on nonparametric maximum likelihood methods for mixture models.

Roger will give this Medallion Lecture at the Joint Statistical Meetings in Philadelphia in August.


Some Unlikely Likelihoods

Statistics as a discipline has oscillated between the poles of parametric and distributional inference from its infancy. Before 1900, emphasis was placed on estimating means and medians without much concern about what underlying distributions might justify such estimates. Karl Pearson felt compelled to expand upon the dominant model of the (Gaussian) law of errors with his family of (momentary) densities. Fisher pointed out that inference about such densities should be based on their parameters. Modern statistical theory, having extracted most of the blood from the parametric turnip, has once again turned back toward models of distributions and nonparametrics. And yet, Fisher’s beloved method of maximum likelihood for parametric models has proven to be a vital tool for nonparametric data analysis.

Maximum likelihood has been an especially fruitful approach for shape-constrained density estimation, as foretold by Grenander. Curiously, attempts to extend maximum likelihood estimation of log-concave densities to distributions with algebraic tails proved unmanageable, and modified objectives based on Rényi entropies have proven more convenient. Closely related nonparametric maximum likelihood methods are also effective for mixture models, as suggested initially by Robbins. The nonparametric MLE of Kiefer and Wolfowitz now plays an important role in empirical Bayes compound decision theory, and random coefficient binary response models provide a novel illustration. Relying on modern convex optimization, these methods share the advantage that they are free of the pesky tuning parameters that plague other nonparametric estimation methods. Many important questions remain open, notably the asymptotic behavior of profile likelihood estimators.
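To make the mixture problem concrete, here is a minimal sketch of the Kiefer–Wolfowitz formulation in the Gaussian location mixture case; the Gaussian kernel and the grid discretization are illustrative assumptions, not details drawn from the abstract. Given observations $y_1, \dots, y_n$ from the mixture density $f(y) = \int \varphi(y - \theta)\, dF(\theta)$, with $\varphi$ the standard normal density, the NPMLE solves
$$
\hat{F} = \arg\max_{F \in \mathcal{F}} \; \sum_{i=1}^{n} \log \int \varphi(y_i - \theta)\, dF(\theta),
$$
where $\mathcal{F}$ is the set of all distribution functions on the real line. Discretizing the support of $F$ on a fixed grid $\{\theta_1, \dots, \theta_m\}$ reduces the problem to maximizing $\sum_i \log \sum_j w_j \varphi(y_i - \theta_j)$ over mixing weights $w_j \ge 0$ with $\sum_j w_j = 1$: a finite-dimensional convex program with no bandwidth or other tuning parameter to select, which is the feature alluded to above.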

Finally, some remarks may attempt to clarify the role of likelihood in the general formulation of quantile regression, noting that, in effect, appropriately weighted estimation of the quantile regression process may be regarded as a scheme for maximum likelihood estimation of the entire conditional distribution. Again, there are many interesting open problems, including those of computational implementation and adaptive efficiency.
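As a reminder of the objects involved (a standard formulation rather than anything specific to the lecture), the $\tau$-th quantile regression estimator solves
$$
\hat{\beta}(\tau) = \arg\min_{b \in \mathbb{R}^p} \; \sum_{i=1}^{n} \rho_\tau(y_i - x_i^{\top} b), \qquad \rho_\tau(u) = u\,\bigl(\tau - I(u < 0)\bigr),
$$
and the process $\{\hat{\beta}(\tau) : \tau \in (0,1)\}$ yields the estimated conditional quantile function $\hat{Q}_{Y \mid X = x}(\tau) = x^{\top} \hat{\beta}(\tau)$, whose inverse estimates the entire conditional distribution function. The weighting mentioned above would then, presumably, reweight these problems across $\tau$.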