SESSION G

TITLE: Bayesian Adaptive Approaches Using Historical Data
INSTRUCTOR: Brad Carlin, Counterpoint Statistical Consulting, LLC
MODERATOR: Ivan S. F. Chan

 

Abstract:

As clinical trial costs continue to rise, industry statisticians have faced increasing pressure to develop and utilize more efficient statistical techniques. Fortunately, regulators at the FDA, EMA, and elsewhere are increasingly comfortable with Bayesian statistical techniques, which permit formal borrowing of strength from expert opinion and auxiliary data and yield full probabilistic inference regarding model quantities of interest. These approaches are increasingly encouraged by the FDA through its Complex Innovative Trial Design (CID) initiative, consistent with the 21st Century Cures Act, passed by the US Congress in late 2016, and PDUFA VI. Bayesian statistical methods facilitate adaptive dose-finding and randomization, and they have a long history of success in early-phase clinical trial settings where patients and other resources are scarce and/or where reliable external information is available. However, it is often unclear when, and how much, strength to borrow from external data sources, especially if they are historical, observational, or both.

In this tutorial, after a very brief review of the Bayesian approach, we illustrate its use in simple data-combination methods, including traditional two-step approaches as well as approaches based on power priors, commensurate priors, and robust mixture priors for incorporating sensibly downweighted versions of the auxiliary information. Here, the notion of effective sample size is important for judging the relative importance and impact of the various data sources. Techniques specific to rare and pediatric diseases will be discussed, as will an approach for optimally selecting the timing of an interim look at the data. On the drug side, the use of PK/PD data to expand the range of useful auxiliary information will be explored. We also consider the problem of borrowing strength from observational data, where propensity score matching offers a way to correct for possible biases arising from the lack of randomization.
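As a rough illustration of the kind of downweighted borrowing described above (a minimal sketch, not taken from the tutorial materials), the snippet below applies a power prior with a fixed weight a0 in a conjugate beta-binomial setting; all counts and the choice of a0 are hypothetical.

```python
# Hedged sketch: power-prior borrowing of historical control data
# in a beta-binomial model with a fixed discounting weight a0.
# All numbers below are hypothetical, for illustration only.

y0, n0 = 12, 60      # historical control data: successes, sample size
y, n   = 9, 40       # current control data: successes, sample size
a0     = 0.5         # power-prior weight: 0 = ignore history, 1 = pool fully

# Start from a vague Beta(1, 1) prior; raising the historical likelihood
# to the power a0 simply downweights the historical counts in the conjugate case.
alpha_prior = 1 + a0 * y0
beta_prior  = 1 + a0 * (n0 - y0)

# Posterior after observing the current data.
alpha_post = alpha_prior + y
beta_post  = beta_prior + (n - y)

# Prior effective sample size: roughly a0 * n0 historical patients "borrowed".
ess_prior = alpha_prior + beta_prior - 2
print(f"Posterior: Beta({alpha_post}, {beta_post}); prior ESS ~ {ess_prior:.1f}")
```

With a0 = 0.5 this borrows the equivalent of about 30 of the 60 historical patients; commensurate and robust mixture priors instead let the data themselves determine how much to discount.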

Throughout, we illustrate with practical examples from the instructor’s own consulting practice, which has included both device and drug approvals. In particular, we begin with a colorectal cancer case study in which relying solely on historical control information erroneously identifies a significant treatment effect. We then catalog situations where borrowing historical information may or may not be advisable. We also consider a Bayesian adaptive platform design that uses commensurate prior methods at interim analyses to borrow adaptively from the control group of an earlier-starting trial, which we show compares favorably to an ad hoc frequentist “all-or-nothing” borrowing approach. Finally, we discuss computational tools available to help implement the adaptive Bayesian inference methods presented.

References:

Carlin, B.P. and Louis, T.A. (2009). Bayesian Methods for Data Analysis, 3rd ed. Boca Raton, FL: Chapman and Hall/CRC Press.
Berry, S.M., Carlin, B.P., Lee, J.J., and Muller, P. (2011). Bayesian Adaptive Methods for Clinical Trials. Boca Raton, FL: Chapman and Hall/CRC Press.

 

Instructor’s Bio:

Brad Carlin is a statistical researcher, methodologist, consultant, and instructor. He spent 27 years on the faculty of the Division of Biostatistics at the University of Minnesota School of Public Health, serving as division head for 7 of those years. He has also held visiting positions at Carnegie Mellon University, the Medical Research Council Biostatistics Unit, Cambridge University (UK), Medtronic Corporation, HealthPartners Research Foundation, the M.D. Anderson Cancer Center, and AbbVie Pharmaceuticals. He has published more than 185 papers in refereed books and journals and has co-authored three popular textbooks: “Bayesian Methods for Data Analysis” with Tom Louis, “Hierarchical Modeling and Analysis for Spatial Data” with Sudipto Banerjee and Alan Gelfand, and “Bayesian Adaptive Methods for Clinical Trials” with Scott Berry, J. Jack Lee, and Peter Muller. From 2006 to 2009 he served as editor-in-chief of Bayesian Analysis, the official journal of the International Society for Bayesian Analysis (ISBA). During his academic career, he served as primary dissertation adviser for 20 PhD students. Dr. Carlin has extensive experience teaching short courses and tutorials and has won both teaching and mentoring awards from the University of Minnesota. In his spare time, Brad is a musician and bandleader, providing keyboards and vocals in a variety of venues.

Website:  https://www.counterpointstat.com/
