Event Information

Date and Time

Location

G.09, Fry Building

University of Bristol

Bristol

BS8 1TH

United Kingdom

Event description
'Using bagged posteriors for robust model-based inference' - a two-part talk by Jonathan Huggins

About this Event

Introducing a new seminar series in Data Science for 2019 - 2020.

The Jean Golding Institute has teamed up with the Heilbronn Institute for Mathematical Research to showcase the latest research in Data Science - methodology rooted in Mathematics and Computer Science, with important applied implications.

To launch the Bristol Data Science Seminar Series, Jonathan Huggins (Department of Biostatistics, Harvard University) will deliver a two-part talk on ‘Using bagged posteriors for robust model-based inference’.

Coffee and cake will be available during the break.

Schedule:

1pm – 2pm: Broad interest Data Science showcase - Jonathan Huggins, ‘Using bagged posteriors for robust model-based inference’

2pm – 2.30pm: Coffee and cake in the Common Room

2.30pm – 3.30pm: Statistics seminar - Jonathan Huggins, ‘Using bagged posteriors for robust inference and model criticism’

Talk abstracts:

Broad interest Data Science showcase: ‘Using bagged posteriors for robust model-based inference’

Standard Bayesian inference is known to be sensitive to misspecification between the model and the data-generating mechanism, leading to unreliable uncertainty quantification and poor predictive performance. However, finding generally applicable and computationally feasible methods for robust Bayesian inference under misspecification has proven to be a difficult challenge. An intriguing approach is to use bagging on the Bayesian posterior (“BayesBag”); that is, to use the average of posterior distributions conditioned on bootstrapped datasets. In this talk, I describe the statistical behavior of BayesBag, propose a model–data mismatch index for diagnosing model misspecification using BayesBag, and empirically validate our BayesBag methodology on synthetic and real-world data. We find that in the presence of significant misspecification, BayesBag yields more reproducible inferences, has better predictive accuracy, and selects correct models more often than the standard Bayesian posterior; meanwhile, when the model is correctly specified, BayesBag produces superior or equally good results for parameter inference and prediction, while being slightly more conservative for model selection. Overall, our results demonstrate that BayesBag combines the attractive modeling features of standard Bayesian inference with the distributional robustness properties of frequentist methods.
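
To give a concrete picture of the "average of posterior distributions conditioned on bootstrapped datasets" described in the abstract, the minimal sketch below pools samples from bootstrap posteriors in a toy conjugate Beta-Bernoulli model. It illustrates the general idea only and is not code from the talk; the model, function name, and parameter choices are assumptions made for this example.

# Illustrative BayesBag sketch (toy example, not the speaker's code):
# average the posterior over bootstrap resamples of the data. A conjugate
# Beta-Bernoulli model is used so each bootstrap posterior has a closed form.
import numpy as np

rng = np.random.default_rng(0)

def bayesbag_beta_bernoulli(x, n_bootstrap=50, n_samples=2000, a0=1.0, b0=1.0):
    """Pool samples from Beta posteriors fit to bootstrap resamples of x.

    Drawing equally many samples from each bootstrap posterior targets their
    average, i.e. the bagged posterior.
    """
    n = len(x)
    pooled = []
    for _ in range(n_bootstrap):
        xb = rng.choice(x, size=n, replace=True)       # bootstrap dataset
        a = a0 + xb.sum()                              # conjugate update:
        b = b0 + n - xb.sum()                          # Beta(a, b) posterior
        pooled.append(rng.beta(a, b, size=n_samples))  # draws from this posterior
    return np.concatenate(pooled)

# Toy data: 100 Bernoulli(0.3) observations
x = rng.binomial(1, 0.3, size=100)
bagged = bayesbag_beta_bernoulli(x)
standard = rng.beta(1.0 + x.sum(), 1.0 + len(x) - x.sum(), size=100_000)
print("standard posterior sd:", standard.std().round(3))
print("bagged posterior sd:  ", bagged.std().round(3))

In this toy setting the bagged posterior is typically somewhat wider than the standard posterior, reflecting the extra distributional robustness discussed in the abstract.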

Statistics seminar: ‘Using bagged posteriors for robust inference and model criticism’

In the second talk of the seminar, Jonathan Huggins will present a deeper dive into the asymptotic theory of BayesBag, looking in more detail at model criticism. This talk will be a more technical exploration of the concepts outlined in the general talk before the break.

Registration is not required for this event, but please do sign up here so that we can plan catering.

Join our mailing list

To keep up to date with events like this, as well as funding, news, blogs and more, join our mailing list.
