Tilmann Gneiting’s IMS Medallion Lecture and David DeMets’ plenary address (both previewed below) will be delivered at the ENAR/IMS 2015 Spring Meeting in Miami, FL, March 15–18, 2015.
Tilmann Gneiting is Group Leader at the Heidelberg Institute for Theoretical Studies (HITS) and Professor of Computational Statistics at Karlsruhe Institute of Technology (KIT) in Germany. He obtained his PhD in Mathematics at Bayreuth University, then held faculty positions at the University of Washington, where he remains affiliate faculty, and at the Institute for Applied Mathematics at Heidelberg University. Tilmann’s research focuses on the theory and practice of forecasting, and on spatial and spatio-temporal statistics, with applications to meteorological, hydrologic, and economic problems, among others. Tilmann also served as Editor for Physical Science, Computing, Engineering, and the Environment at the Annals of Applied Statistics (2011–14).
Uncertainty Quantification in Complex Simulation Models Using Ensemble Copula Coupling
Critical decisions frequently rely on high-dimensional output from complex computer simulation models that show intricate cross-variable, spatial, and/or temporal dependence structures, with weather and climate predictions being key examples. There is growing recognition of the need for uncertainty quantification in such settings, for which we propose and review a general multi-stage procedure called ensemble copula coupling (ECC), which proceeds as follows.
1. Generate a raw ensemble, consisting of multiple runs of the computer model that differ in the inputs or model parameters in suitable ways.
2. Apply statistical postprocessing techniques, such as Bayesian model averaging or nonhomogeneous regression, to correct for systematic errors in the raw ensemble, to obtain calibrated and sharp predictive distributions for each univariate output variable individually.
3. Draw a sample from each postprocessed predictive distribution.
4. Rearrange the sampled values in the rank order structure of the raw ensemble, to obtain the ECC postprocessed ensemble (steps 3–4 are sketched in code after this list).
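To make the rearrangement in steps 3–4 concrete, here is a minimal sketch in Python (NumPy/SciPy). The function name `ecc_ensemble` is illustrative, and sampling via equidistant quantiles of each postprocessed marginal, rather than random draws, is one common choice (often referred to as the ECC-Q variant); it is an assumption of this sketch, not a prescription from the abstract.

```python
import numpy as np
from scipy import stats

def ecc_ensemble(raw_ensemble, marginal_dists):
    """Ensemble copula coupling sketch.

    raw_ensemble   : (m, d) array; m ensemble members, d output variables.
    marginal_dists : list of d frozen scipy.stats distributions, the
                     postprocessed predictive distribution for each variable.
    Returns an (m, d) ECC postprocessed ensemble.
    """
    m, d = raw_ensemble.shape

    # Step 3: sample from each postprocessed marginal. Here we take
    # equally spaced quantile levels (an ECC-Q-style choice) instead
    # of independent random draws.
    levels = (np.arange(1, m + 1) - 0.5) / m
    samples = np.column_stack([dist.ppf(levels) for dist in marginal_dists])

    # Step 4: rearrange the samples, variable by variable, so they inherit
    # the rank order structure (empirical copula) of the raw ensemble.
    ecc = np.empty_like(samples)
    for j in range(d):
        ranks = stats.rankdata(raw_ensemble[:, j], method="ordinal") - 1
        ecc[:, j] = np.sort(samples[:, j])[ranks]
    return ecc
```

Substituting a matrix of past observations for `raw_ensemble` would instead impose the rank structure of historical data, the alternative mentioned below.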
The use of ensembles and statistical postprocessing has become routine in weather forecasting over the past decade. We show that seemingly unrelated recent advances can be interpreted, fused, and consolidated within the framework of ECC, the common thread being the adoption of the empirical copula of the raw ensemble. In some settings, adopting the empirical copula of historical data offers an attractive alternative. In a case study, the ECC approach is applied to predictions of temperature, pressure, precipitation, and wind over Germany, based on the 50-member European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble. This is joint work with Roman Schefzik and Thordis Thorarinsdottir.
ENAR President’s Invited Speaker: David DeMets
David L. DeMets, the Max Halperin Professor of Biostatistics and former Chair of the Department of Biostatistics and Medical Informatics at the University of Wisconsin–Madison, is the 2015 ENAR Presidential Invited Speaker.
Big Data, Big Opportunities, Big Challenges
Since the 1950s, biostatisticians have been successfully engaged in biomedical research, from laboratory experiments to observational studies to randomized clinical trials. We owe some of that success to the early pioneers, especially those biostatisticians who were present at the National Institutes of Health (NIH). They created a culture of scientific collaboration, developing methodology as needed to solve biomedical research problems in design, conduct, and analysis.
Over the past five decades, we have experienced a tremendous increase in computational power, data storage capability and multidimensionality of data, or “big data”. Some of this expansion has been driven by genomics.
At present, we have the opportunity to contribute to the design and analysis of genomic data, data stored in electronic health records, and the continued needs of clinical trials for greater efficiency. With these opportunities, however, come serious challenges, starting with the need to develop new methodology for designing and analyzing these “big data” databases. The demand for quantitative scientists exceeds the supply, and there is no strategic national plan to meet these demands.
Federal funding for biomedical research has been flat and is likely to remain so for several years, impacting both the ability to train additional quantitative scientists and to provide them with research funding for new methodologies. We face new, or at least more public, scrutiny, with demands that our data and analyses be shared earlier and earlier, even while the data are still being gathered, as in ongoing clinical trials. Litigation is now part of our research environment. We will examine some of these issues and speculate on ways forward.