Early Career* Bayes Seminar Series: Bayesian hierarchical stacking—All models are wrong, but some are somewhere useful

  • 23 Jun 2021
  • 9:30 AM - 11:00 AM
  • Via Zoom and in person

Presented virtually and in person by Yuling Yao (Columbia) on

Wednesday, 23 June 2021, 9:30 AM – 11:00 AM AEST

*Early Career: We want to highlight great research from early career researchers, as opportunities for ECRs to present their work to broad audiences have declined.

Stacking is a widely used model averaging technique. Like many other ensemble methods, stacking is more effective when model predictive performance varies across inputs, in which case we can further improve the stacked mixture with a hierarchical model. In this talk Yuling will focus on the recent development of Bayesian hierarchical stacking: an approach that aggregates models locally. The weight is a function of the data, partially pooled, inferred using Bayesian inference, and can further incorporate other structured priors and complex data. The presenter will also discuss some theoretical bounds: when and why model averaging is useful, and which model dissimilarity metric is relevant to Bayesian ensembles.
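To illustrate the idea of locally aggregating models, here is a minimal sketch (not the speaker's implementation) of input-dependent stacking weights: each model's weight is a softmax of a hypothetical linear function of the input, so the mixture can favour different models in different regions of the input space. The parameters `alpha` and `beta` are assumptions for illustration; in Bayesian hierarchical stacking they would be partially pooled and inferred with Bayesian inference.

```python
import numpy as np

def stacking_weights(x, alpha, beta):
    """Input-dependent weights for K models: softmax(alpha + beta * x).

    Unlike classical stacking, the weights change with the input x,
    so the ensemble can lean on whichever model predicts well locally.
    """
    logits = alpha + beta * x       # shape (K,), varies with x
    logits = logits - logits.max()  # subtract max for numerical stability
    w = np.exp(logits)
    return w / w.sum()

def stacked_prediction(x, preds, alpha, beta):
    """Combine K model predictions at input x with local weights."""
    w = stacking_weights(x, alpha, beta)
    return float(w @ preds)

# Two hypothetical models whose weights shift with x.
alpha = np.array([0.0, 0.0])
beta = np.array([1.0, -1.0])
preds = np.array([2.0, 4.0])

print(stacking_weights(3.0, alpha, beta))   # model 1 dominates here
print(stacking_weights(-3.0, alpha, beta))  # model 2 dominates here
print(stacked_prediction(0.0, preds, alpha, beta))
```

At `x = 0` the two models receive equal weight, so the stacked prediction is the plain average of the two predictions; as `x` moves away from zero, the mixture smoothly shifts toward one model.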

Many thanks to our sponsors:

QUT Centre for Data Science

ARC Centre of Excellence for Mathematical and Statistical Frontiers (ACEMS)

Statistical Society of Australia (Queensland Branch)

by Joshua Bon

Register here.