Peter Bickel obtained his PhD in Statistics in 1963 from the University of California, Berkeley, advised by E.L. Lehmann. He has remained there ever since and is now Professor Emeritus of Statistics and Professor of the Graduate School. His interests have focused on asymptotic methods but have otherwise ranged widely, beginning with nonparametric statistics, robust methods and sequential analysis, and then extending to semiparametric models, with forays into high-dimensional statistics and network analysis. Peter received the COPSS Presidents’ Award in 1980 and has given the Wald, Rietz and COPSS Fisher lectures. He won a MacArthur Award and was elected to the National Academy of Sciences, the American Academy of Arts and Sciences, and the Royal Netherlands Academy of Sciences. He was awarded honorary doctorates from the Hebrew University of Jerusalem (1988) and ETH Zurich (2014). Peter Bickel will give the Le Cam Lecture at JSM Nashville on Tuesday, August 5, at 10:30am.
Local Asymptotic Analysis: A general quantitative tool
Arguably, Lucien Le Cam’s most important contributions were in asymptotic analysis within decision theory. One of the earliest of these was local asymptotic normality. This grew out of his thesis, but the start of a full development appeared in a 1956 Third Berkeley Symposium paper. In it, he acknowledged that it was in part an explanation and extension of a 1943 Transactions of the American Mathematical Society paper of A. Wald. According to S. Stigler, in a private communication, “Le Cam did not grasp the relationship of Wald 1943 to his thesis at the time (1953), but by 1957 that changed dramatically.” Le Cam continued with his famous 1960 paper on locally asymptotically normal families, brought in the notion of distance between experiments, and generalized far beyond the i.i.d. case. The scope of his work can be seen in his 1989 book with Grace Yang, and fully in his great 1986 monograph, Asymptotic Methods in Statistical Decision Theory.
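As a reminder for readers, in the now-standard notation (not the form Le Cam used in 1956), local asymptotic normality in the i.i.d. case says that the local log-likelihood ratios admit a quadratic expansion,
$$\log \frac{dP^n_{\theta_0 + h/\sqrt{n}}}{dP^n_{\theta_0}} = h^\top \Delta_{n,\theta_0} - \tfrac{1}{2}\, h^\top I(\theta_0)\, h + o_{P_{\theta_0}}(1), \qquad \Delta_{n,\theta_0} \xrightarrow{d} N\bigl(0, I(\theta_0)\bigr),$$
where $h$ is the local parameter and $I(\theta_0)$ the Fisher information; locally, the experiment thus behaves like a single Gaussian shift experiment.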
The modern focus on machine learning with high-dimensional data moves away from local parametric analysis, but validatory statistics, such as causal inference, require it. Le Cam persuaded me, at least, that no one should care much about probabilities arbitrarily close to 1, which is what consistency results in inference lead to.
Even in machine learning, interpretability of results often leads to invocations of sparsity, and these in turn lead to local analysis.
Local theory can be recast in the more modern form of non-asymptotic bounds, but the constants involved in the required uniformity make such bounds unrealistically conservative. Local analysis gives guidance that typically yields useful approximations, even if these fail in the face of an adversarial nature.
I will discuss the basic work of Neyman, Wald and Le Cam, which in this light can be viewed as a deep extension of the Fisher factorization theorem. I will sketch proofs of, and connections to, older results such as the Hájek–Le Cam Theorem and Le Cam’s Third Lemma. I then hope to point briefly to some more modern issues, including local robustness, sensitivity to confounding in causal models, and semiparametric approaches in causality theory.
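For orientation, the standard textbook statement of Le Cam’s Third Lemma (again in modern notation) is: if, under $P_n$,
$$\begin{pmatrix} X_n \\[2pt] \log \dfrac{dQ_n}{dP_n} \end{pmatrix} \xrightarrow{d} N\!\left( \begin{pmatrix} \mu \\ -\tfrac{1}{2}\sigma^2 \end{pmatrix}, \begin{pmatrix} \Sigma & \tau \\ \tau^\top & \sigma^2 \end{pmatrix} \right),$$
then, under $Q_n$, $X_n \xrightarrow{d} N(\mu + \tau, \Sigma)$. This is what allows the behavior of a statistic under local alternatives to be read off from its joint limit with the log-likelihood ratio under the null.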
There are some very recent applications of these methods that I’d like to draw attention to, although I don’t intend to discuss them: notably, the work of Foygel Barber and Janson on approximate co-sufficient sampling in the Annals of Statistics (2022).