Tudor Manole is a fifth-year PhD candidate in the Department of Statistics and Data Science at Carnegie Mellon University (CMU), jointly advised by Sivaraman Balakrishnan and Larry Wasserman. Before moving to CMU, he completed a BSc in Mathematics at McGill University. He is broadly interested in nonparametric statistics and statistical machine learning. Most of his recent research focuses on developing inferential methods for the optimal transport problem. He is also interested in theoretical aspects of latent variable models and has worked on applications of statistical optimal transport to data-driven modeling in high-energy physics.

Tudor will give this talk in the Lawrence Brown PhD Student Award session at JSM Toronto.

Plugin Estimation of Smooth Optimal Transport Maps

The field of optimal transport has received a recent surge of interest as a methodological tool for statistical applications. One of the central objects arising from this theory is the notion of an optimal transport map. For any two absolutely continuous probability distributions on R^d, the optimal transport map is the unique function that maps samples from one distribution onto samples from the second and further satisfies a multivariate notion of monotonicity. Such mappings have diverse applications in statistical contexts. For example, a recent line of work has used optimal transport maps to define multivariate notions of quantiles and ranks, leading to powerful generalizations of certain classical rank-based tests for univariate observations. Optimal transport maps have also notably been used as a methodological tool in areas such as transfer learning, generative modeling, and causal inference, as well as in a variety of applications in the sciences.
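To state this object in symbols, assuming the quadratic cost and finite second moments (the setting in which the monotone characterization holds): for distributions P and Q as above, the optimal transport map T_0 is the minimizer of the Monge problem and, by Brenier's theorem, is the gradient of a convex function,

    T_0 = \operatorname*{arg\,min}_{T \,:\, T_{\#}P = Q} \int \lVert x - T(x) \rVert^2 \, dP(x),
    \qquad T_0 = \nabla \varphi_0 \ \text{ for some convex } \varphi_0 : \mathbb{R}^d \to \mathbb{R},

where T_{\#}P = Q means that T pushes P forward onto Q. The gradient-of-convex structure is exactly the multivariate notion of monotonicity referenced above; in one dimension it reduces to T_0 being a nondecreasing function.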

In each of these applications, it is typically of interest to estimate the optimal transport map between unknown distributions on the basis of i.i.d. samples. Over the past decade, a number of heuristic estimators have been developed in both the statistics and computer science literatures, but their theoretical properties have remained unknown. In a seminal paper, Hütter and Rigollet [Annals of Statistics 49 (2021)] initiated the theoretical analysis of optimal transport map estimators under smoothness assumptions. They derive the minimax L2 rate of estimating optimal transport maps over classical smoothness classes, and show that this problem shares some of the same salient features as other function estimation problems in nonparametric statistics: optimal transport maps with high smoothness can be estimated at nearly the parametric rate, while those with low smoothness suffer a curse of dimensionality. Hütter and Rigollet also derive an estimator that achieves the minimax rate but is computationally intractable. Their work thus leaves open the question of deriving practical estimators with optimal risk.
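As a point of reference for this trade-off, the classical benchmark from nonparametric regression is a useful comparison (the transport-specific exponent is as stated in Hütter and Rigollet's paper): an α-smooth regression function on R^d can be estimated in squared L2 loss at the minimax rate

    n^{-\frac{2\alpha}{2\alpha + d}},

which approaches the parametric rate n^{-1} as α grows and degrades with the dimension d when α is small; the transport-map rates display the same qualitative behavior.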

The aim of our work is to show that several natural, computationally tractable estimators of optimal transport maps are also minimax optimal. We adopt the plugin approach: our estimators are simply optimal transport maps between estimators of the underlying distributions. When the underlying map is assumed to be Lipschitz, we show that computing the optimal coupling between the empirical measures, and extending it using linear smoothers, already gives a minimax optimal estimator. When the underlying map enjoys higher regularity, we show that the optimal transport map between kernel or wavelet density estimators yields faster rates. Moving beyond the question of minimax estimation in L2, we also derive the pointwise rate of convergence of our estimators, and use this to show that they obey a pointwise central limit theorem under certain conditions. These results provide first steps towards practical statistical inference for multivariate optimal transport maps.
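To make the Lipschitz-case construction concrete, here is a minimal sketch in Python. It is illustrative only and rests on assumptions the abstract does not specify: equal sample sizes, squared-Euclidean cost, and a k-nearest-neighbor regressor standing in for the linear smoother (the smoothers analyzed in the paper may differ).

    # Illustrative plugin estimator of an optimal transport map:
    # (1) solve the discrete optimal transport (assignment) problem between
    #     the two empirical samples; (2) extend the resulting coupling to a
    #     map on all of R^d with an off-the-shelf smoother.
    # NOTE: equal sample sizes, squared-Euclidean cost, and the k-NN smoother
    # are simplifying assumptions made for this sketch.
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist
    from sklearn.neighbors import KNeighborsRegressor

    def plugin_transport_map(X, Y, n_neighbors=5):
        """X, Y: (n, d) arrays of i.i.d. samples from P and Q."""
        # Squared-Euclidean cost between every pair of sample points.
        cost = cdist(X, Y, metric="sqeuclidean")
        # The optimal coupling of two equal-size empirical measures is an
        # optimal assignment (a permutation matching X[i] to Y[cols[i]]).
        rows, cols = linear_sum_assignment(cost)
        # Extend the discrete coupling to a map on R^d by smoothing the
        # matched pairs (sources -> matched targets).
        smoother = KNeighborsRegressor(n_neighbors=n_neighbors)
        smoother.fit(X[rows], Y[cols])
        return smoother.predict  # callable: x -> estimated T(x)

    # Example: map a standard Gaussian sample toward a stretched, shifted one.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    Y = rng.normal(size=(200, 2)) @ np.diag([2.0, 0.5]) + 1.0
    T_hat = plugin_transport_map(X, Y)
    print(T_hat(np.zeros((1, 2))))  # estimated image of the origin

The assignment step here is solved exactly by scipy's linear_sum_assignment; for large samples one would typically substitute an approximate solver, and the rates discussed above are of course not tied to this particular implementation.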

This talk is based on joint work with Sivaraman Balakrishnan, Jonathan Niles-Weed, and Larry Wasserman.