
Pressley Warwick “Warry” Millar. Photo courtesy of Derek Millar.
Pressley Warwick (Warry) Millar, Emeritus Professor of Statistics at the University of California, Berkeley, passed away on August 27, 2024, surrounded by his family, due to complications of heart failure. He was 85 years old.
Warry was born in Beverly, Mass., on June 1, 1939, to Norval Pressley and Marian Millar. After completing his early education in Danvers, Mass., he attended Brown University, where he earned his bachelor’s degree in 1961. He then obtained a master’s degree in History of Science at Cornell University, followed by a PhD in Mathematics at the University of Illinois Urbana-Champaign in 1967. His doctoral dissertation, supported in part by a terminal-year National Science Foundation (NSF) Graduate Fellowship, was titled “Martingale Integrals” and written under the direction of Donald Lyman Burkholder.
Immediately after completing his PhD in 1967, Warry joined the UC Berkeley Department of Statistics as an Assistant Professor, where he developed a distinguished career. He received a coveted NSF Postdoctoral Fellowship (1971–72), and was promoted to Associate Professor in 1972 and to Professor in 1977. Upon retirement in 2003, he was appointed Professor Emeritus at UC Berkeley.
Warry’s scientific writings can be divided into two parts. In the first phase of his research career (roughly 1967–80), Warry wrote extensively on central topics in the theory of probability and stochastic processes. During this phase, he made important contributions to the theory of stochastic integration, to the potential theory of Markov processes, and to the analysis of the sample functions of Markov processes, including decomposition theorems for Markov processes at random times. Despite the difficult and technical nature of the work, one cannot help but notice the elegant style with which Warry explained deep ideas succinctly, each idea flowing naturally from the previous one, seemingly without the need for extraneous technical detail. One also might notice a particular affinity for the study of Lévy processes, better known at that time as “processes with stationary and independent increments.” The following beautiful result is a noteworthy example; see Millar [4, Theorem 3.1] for a precise statement.
Let X = {X_t; t ≥ 0} denote a Lévy process and let R be an arbitrary random time. Suppose that the smallest value of X during the random time interval [0, R] is achieved at a single point in (0, R), and define M to be the time at which that minimum is achieved. Then, essentially as long as M and R can be defined canonically and M < R, the random function {X_{M+u} − X_M; u ≥ 0} has a zero-one law immediately to the right of the time M.
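In rough symbols (our paraphrase; Millar’s precise hypotheses on R and M are spelled out in [4, Theorem 3.1]), the conclusion is that the germ σ-field of the post-minimum process is trivial:

% Paraphrase of Millar's zero-one law; the exact measurability conditions are in [4].
\[
  \mathbb{P}(A) \in \{0, 1\}
  \qquad \text{for every } A \in \bigcap_{u > 0} \sigma\bigl(X_{M+s} - X_M : 0 \le s \le u\bigr).
\]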
In the case that M is a “stopping time,” such a zero-one law was first proved by Hunt [2] for Brownian motion X, and by Blumenthal [1] for a general strong Markov process. In that case, the result reduces to the celebrated “Blumenthal zero-one law,” which is a cornerstone of modern stochastic analysis. Millar’s zero-one law is significant in part because M is not a stopping time, and in part because, in the same paper (§4), Millar goes on to prove the following: if X is a general strong Markov process, with R and M defined as in the above theorem, and if a zero-one law holds at time M (as he proves it does in great generality when X is a Lévy process), then, given (X_M, X_{M−}), the post-M process {X_t; t ≥ M} is a Markov process that is conditionally independent of the pre-M process {X_t; t ≤ M}.
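Loosely in symbols (again our paraphrase of the statement in [4, §4]):

% Paraphrase of the conditional-independence statement; the precise formulation is in [4, §4].
\[
  \{X_t : t \ge M\} \;\perp\!\!\!\perp\; \{X_t : t \le M\}
  \quad \text{given } (X_M, X_{M-}),
\]
with the post-M process again being Markov.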
The second phase of Warry’s scientific writing (roughly 1980–97) was primarily devoted to the study of the minimax principle and the asymptotic optimality of statistical procedures, in particular of minimum distance estimators. Warry showed that minimum distance estimators have a desirable stability property: they do not deteriorate when the actual data distribution departs somewhat from that posited by the model. In characteristic fashion, Warry established this robustness property in a very abstract form, by analyzing certain stochastic processes in a Banach space and showing that minimum distance estimators asymptotically share a very simple abstract structure, which in turn makes it possible to establish a local asymptotic minimax result [3, 5]. This abstract result can be used to show that the robustness property holds in a multitude of settings, such as regression, quantile estimation, and minimum distance estimation of M-functionals. In a series of papers with his department colleague and lifelong friend Rudolf Beran, Warry showed how random approximations, together with the then recently developed bootstrap principle, can be used to make feasible a range of previously inaccessible statistical procedures [7], such as confidence sets for a distribution given by the half-space generalization of the Kolmogorov–Smirnov statistic introduced by Wolfowitz [6]. These papers lie at the intersection of mathematical statistics, probability theory, and algorithmic feasibility.
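In broad strokes (notation ours, not taken from Millar’s papers), a minimum distance estimator picks the model element closest to the empirical distribution:

% Hedged sketch; \hat{P}_n, P_\theta, and d are generic placeholders rather than Millar's notation.
\[
  \hat{\theta}_n \in \operatorname*{arg\,min}_{\theta \in \Theta} d\bigl(\hat{P}_n, P_\theta\bigr),
\]
where \hat{P}_n is the empirical distribution of the sample, {P_\theta : \theta \in \Theta} is the posited model, and d is a distance such as the Kolmogorov–Smirnov sup-distance. Roughly speaking, the local asymptotic minimax results of [3, 5] make precise the sense in which no competing estimator can improve on such a \hat{\theta}_n uniformly over shrinking neighborhoods of the model.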
Warry was outstanding as a lecturer, as well as a mentor. He offered a rich variety of graduate courses on important topics that ranged from fine properties of Lévy processes, to advanced topics in Markov processes, to statistical decision theory, to inference in functional spaces, and so on. Much as he did in his research exposition, in his courses Warry transmitted the feeling that the material on the blackboard was somehow simple, being a natural consequence of what had been previously discussed. He had an almost magical ability to pace his lectures so that students could take effective notes, while he kept the dialogue flowing naturally. And he frequently prepared exercise sets to enrich the students’ learning experience, even when the course was offered at the most advanced graduate levels.
Beyond his excellent contributions to research, teaching, and mentoring, we remember Warry most for his generosity of spirit. According to the Mathematics Genealogy Project, https://www.mathgenealogy.org, Warry had 15 graduate students and 85 mathematical and statistical descendants. He leaves behind a rich scientific legacy that is likely to continue for a long time to come.
Written by Davar Khoshnevisan, University of Utah; Carl Mueller, University of Rochester; Maria Eulalia Vares, Universidade Federal do Rio de Janeiro; and Guenther Walther, Stanford University
References
[1] R. M. Blumenthal. 1957. “An extended Markov property.” Trans. Amer. Math. Soc. 85: 52–72. DOI 10.2307/1992961. MR0088102.
[2] G. A. Hunt. 1956. “Some theorems concerning Brownian motion.” Trans. Amer. Math. Soc. 81: 294–319. DOI 10.2307/1992918. MR0079377.
[3] P. W. Millar. 1983. “The minimax principle in asymptotic statistical theory.” Eleventh Saint-Flour Probability Summer School (Saint-Flour, 1981), Lecture Notes in Math. 976: 75–265. Springer, Berlin. DOI 10.1007/BFb0067986. MR0722983.
[4] P. W. Millar. 1977. “Zero-one laws and the minimum of a Markov process.” Trans. Amer. Math. Soc. 226: 365–391. DOI 10.2307/1997959. MR0433606.
[5] ___. 1984. “A general approach to the optimality of minimum distance estimators.” Trans. Amer. Math. Soc. 286(1): 377–418. DOI 10.2307/1999411. MR0756045.
[6] R. Beran and P. W. Millar. 1986. “Confidence sets for a multivariate distribution.” Ann. Statist. 14(2): 431–443. DOI 10.1214/aos/1176349931. MR0840507.
[7] ___. 1987. “Stochastic estimation and testing.” Ann. Statist. 15(3): 1131–1154. DOI 10.1214/aos/1176350497. MR0902250.