TITLE: Bootstrap Methods
SPEAKERS: Professor Wei-Yin Loh, University of Wisconsin, Madison
MODERATOR: Ivan S. F. Chan
Introduced exactly 40 years ago (Efron, 1979, Ann. Statist.), the bootstrap is a widely used statistical technique. Its popularity rests on two main factors: (i) in a large number of problems amenable to mathematical analysis, the bootstrap has been shown to perform as well as, if not better than, classical methods, and (ii) enabled by the widespread availability of computers, the bootstrap is broadly applicable and its solutions are useful even when no other solution exists.
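To make the idea concrete, here is a minimal sketch of the basic bootstrap in the form it most often takes in practice: a percentile confidence interval for a mean. The function name, resample count, and data are illustrative assumptions, not material from the tutorial itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def percentile_ci(sample, level=0.90, n_boot=2000):
    """Percentile bootstrap confidence interval for the mean of a 1-D sample."""
    sample = np.asarray(sample, dtype=float)
    # Resample with replacement and recompute the statistic on each resample.
    boot_means = rng.choice(sample, size=(n_boot, sample.size),
                            replace=True).mean(axis=1)
    alpha = 1.0 - level
    # Take quantiles of the bootstrap distribution as interval endpoints.
    return np.quantile(boot_means, [alpha / 2, 1 - alpha / 2])

data = rng.normal(loc=5.0, scale=2.0, size=50)
lo, hi = percentile_ci(data)
```

For the mean, classical normal-theory intervals already exist; the bootstrap's advantage appears for statistics whose sampling distribution is hard to derive analytically, where the same resampling recipe still applies.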
The first part of the tutorial reviews the motivation, underlying concepts, and fundamental assumptions that ensure the asymptotic validity of the bootstrap. The second part extends the bootstrap idea to "bootstrap calibration" (Loh, 1987, JASA; 1991, Statist. Sinica) and applies it to a previously unsolved problem of post-selection inference: assessing the statistical significance of subgroup treatment effects in randomized trials where the subgroups are obtained from complex search algorithms such as regression trees (Loh et al., 2015, 2016, 2019, Statist. Med.). To keep things simple, the main focus throughout is on confidence interval estimation, as it is by far the most important area of application of the bootstrap.
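The calibration idea can be sketched as a double bootstrap: treat the sample as the population, estimate the actual coverage of the nominal-level interval by resampling, and then adjust the nominal level until the estimated coverage matches the target. The sketch below is a hypothetical illustration for the percentile interval of a mean; the function names, level grid, and resample counts are my assumptions, not the exact recipe from Loh (1987, 1991).

```python
import numpy as np

rng = np.random.default_rng(1)

def percentile_ci_from_means(boot_means, level):
    """Percentile interval endpoints from precomputed bootstrap means."""
    a = (1.0 - level) / 2.0
    return np.quantile(boot_means, [a, 1.0 - a])

def calibrated_level(sample, target=0.90, n_outer=200, n_inner=499):
    """Pick the nominal level whose estimated true coverage reaches `target`."""
    sample = np.asarray(sample, dtype=float)
    theta = sample.mean()  # the sample plays the role of the population
    levels = np.linspace(0.80, 0.99, 20)
    hits = np.zeros(levels.size)
    for _ in range(n_outer):
        # Outer resample mimics drawing a fresh sample from the population.
        resample = rng.choice(sample, size=sample.size, replace=True)
        # Inner resamples give the bootstrap distribution for that sample.
        inner_means = rng.choice(resample, size=(n_inner, sample.size),
                                 replace=True).mean(axis=1)
        for j, lam in enumerate(levels):
            lo, hi = percentile_ci_from_means(inner_means, lam)
            hits[j] += (lo <= theta <= hi)
    coverage = hits / n_outer
    # Smallest nominal level whose estimated coverage reaches the target.
    ok = levels[coverage >= target]
    return float(ok[0]) if ok.size else float(levels[-1])

lam = calibrated_level(rng.normal(size=30))
```

If the uncalibrated interval undercovers, the calibrated nominal level comes out above the target; the final interval is then constructed at that adjusted level.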
Efron, B. (1979). Bootstrap methods: another look at the jackknife. Annals of Statistics, 7, 1-26.
Loh, W.-Y. (1987). Calibrating confidence coefficients. Journal of the American Statistical Association, 82, 155-162.
Loh, W.-Y. (1991). Bootstrap calibration for confidence interval construction and selection. Statistica Sinica, 1, 477-491.
Loh, W.-Y., Fu, H., Man, M., Champion, V. and Yu, M. (2016). Identification of subgroups with differential treatment effects for longitudinal and multiresponse variables. Statistics in Medicine, 35, 4837-4855.
Loh, W.-Y., He, X. and Man, M. (2015). A regression tree approach to identifying subgroups with differential treatment effects. Statistics in Medicine, 34, 1818-1833.
Loh, W.-Y., Man, M. and Wang, S. (2019). Subgroups from regression trees with adjustment for prognostic effects and post-selection inference. Statistics in Medicine, 38, 545-557.
Wei-Yin Loh is Professor of Statistics at the University of Wisconsin, Madison. His research interests are in bootstrap theory and methodology and algorithms for classification and regression trees. Loh is a fellow of the American Statistical Association and the Institute of Mathematical Statistics, and a consultant to government and industry. He is a recipient of the Reynolds Award for teaching, the U.S. Army Wilks Award for statistics research and application, an Outstanding Science Alumni Award from the National University of Singapore, and visiting fellowships from AbbVie, IBM and the Bureau of Labor Statistics.