Bayesian generalized linear mixed models with brms

  • 3 Nov 2022 11:17 AM

    November 7, 2022, 8pm (Brisbane)

    Zoom link; Further details here

    Abstract: Linguistics is undergoing a rapid shift away from significance tests towards approaches emphasizing parameter estimation, such as linear mixed effects models. Alongside this shift, another revolution is underway: away from using p-values as part of a “null ritual” (Gigerenzer, 2004) towards Bayesian models. Both shifts can nicely be handled with the ‘brms’ package (Bürkner, 2017). After briefly reviewing why we shouldn’t blindly follow the “null ritual” of significance testing, I will demonstrate how easy it is to fit quite complex models using this package. I will also talk about how mixed models are used in different subfields of linguistics (Winter & Grice, 2021), and why established practices such as dropping random slopes for non-converging models are a further reason to go Bayesian. Finally, I will briefly touch on issues relating to prior specification, especially the importance of weakly informative priors to prevent overfitting.
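    As a flavour of what the talk covers, the kind of model described above can be sketched in a few lines of R with brms. This is an illustrative example only, not material from the talk: the dataset `d` and the variables `rt`, `condition`, `subject`, and `item` are hypothetical, and the weakly informative prior is one plausible choice among many.

    ```r
    # Hypothetical example: a Bayesian mixed model with random intercepts
    # and random slopes, fit with brms. Assumes a data frame 'd' with a
    # reaction-time outcome (rt), a fixed effect (condition), and grouping
    # factors (subject, item) -- all placeholder names.
    library(brms)

    fit <- brm(
      rt ~ condition + (1 + condition | subject) + (1 + condition | item),
      data   = d,
      family = lognormal(),
      # Weakly informative prior on the fixed effects to prevent overfitting
      prior  = prior(normal(0, 1), class = b),
      chains = 4, cores = 4
    )
    summary(fit)
    ```

    Unlike frequentist mixed-model software, a model like this does not need its random slopes dropped when estimation is difficult; the priors help regularize the fit instead.
    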

    About Bodo: Bodo Winter is a Senior Lecturer in the Department of Linguistics at the University of Birmingham, a UKRI Future Leaders Fellow, a Fellow of the Institute for Interdisciplinary Data Science and AI, and Editor-in-Chief of the journal Language and Cognition. Dr. Winter received his PhD in Cognitive and Information Sciences from the University of California, Merced. His research focuses on multimodality, sound symbolism, gesture, and metaphor.
