Closed for registrations.
The Statistical Society of Australia and Macquarie University are pleased to announce the following workshop:
Smooth modelling in R: splines, GAMs and related models
with presenter Professor Simon Wood of the University of Bristol, UK.
Simon Wood is the author of the recommended R package ‘mgcv’ for generalized additive modelling, and of the textbook ‘Generalized Additive Models: An Introduction with R’ (2006; second edition early 2017). He is Professor of Statistical Science at the University of Bristol. http://www.maths.bris.ac.uk/~sw15190/
About the Workshop
From one-day-ahead prediction of electricity consumption across France, to modelling the spatial distribution of extreme rainfall events, pollution-related human deaths, or the distribution of fish in the sea, many regression models can conveniently be specified in terms of smooth functions of covariates. This course is about how to do that using the ‘mgcv’ package in R. Starting with the basics of how to represent smooth functions using basis functions and penalties, the course goes on to look at the practical use of generalized additive models, including extensions beyond the usual exponential family distributions. For example, survival analysis, functional data, location-scale models and random effects components will all be covered, while the smoothers include tensor product smooths and spatial smoothers such as Gaussian processes, Gaussian Markov random fields and splines on the sphere. The focus is practical and hands-on, with theory presented to support understanding of the practical use of the methods.
* A half-hour introduction to smooth models in mgcv.
* Penalized regression smoothers: the basics.
– Bases, penalties, estimation.
– Reduced rank splines.
– The Bayesian, mixed model connection.
– Effective degrees of freedom.
– Smoothness selection (GCV, REML etc.).
* Computer lab: build your own smoother.
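As a rough flavour of the "build your own smoother" lab (a hypothetical sketch in base R plus the splines package, not the lab's actual code), a reduced rank penalized regression smoother can be fitted by penalized least squares:

```r
## Minimal P-spline-style smoother: B-spline basis plus a
## second-order difference penalty, fitted by penalized least squares.
library(splines)

set.seed(1)
x <- seq(0, 1, length.out = 200)
y <- sin(2 * pi * x) + rnorm(200, sd = 0.3)

B <- bs(x, df = 20)                        # reduced rank (20) B-spline basis
D <- diff(diag(ncol(B)), differences = 2)  # second differences penalize wiggliness
lambda <- 1                                # smoothing parameter, fixed by hand here

## minimize ||y - B beta||^2 + lambda ||D beta||^2
beta <- solve(crossprod(B) + lambda * crossprod(D), crossprod(B, y))
fhat <- drop(B %*% beta)                   # the fitted smooth function
```

In practice the smoothing parameter lambda would be selected by GCV or REML rather than fixed, which is one of the topics above.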
* Extending the model: from 1D smoothing to GAMs and beyond.
– Several smoothers in a model, and identifiability constraints.
– Linear functionals of smooths (e.g. signal regression, varying coefficient models).
– Generalization to exponential family response variables and beyond: coefficient and smoothing parameter estimation.
* Computer lab: GAM data analysis with mgcv.
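A first GAM fit in the lab might look something like the following sketch (the formula and data are purely illustrative; gamSim is mgcv's built-in example data generator):

```r
library(mgcv)   # a recommended package, shipped with R

set.seed(2)
dat <- gamSim(1, n = 400, verbose = FALSE)  # simulate from standard test functions

## each s() term is a penalized regression smoother; REML estimates the
## smoothing parameters along with the coefficients
m <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat, method = "REML")

summary(m)          # effective degrees of freedom and approximate p-values per smooth
plot(m, pages = 1)  # estimated smooth effects with credible intervals
```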
* A toolkit of smoothers
– Zero dimensional smooths: simple Gaussian random effects.
– One dimensional smoothers, including cyclic and adaptive.
– Isotropic smooths of several variables: thin plate splines, more general Duchon splines, splines on the sphere, Gaussian Markov random fields and finite region smoothing.
– Smooth interactions via tensor product smoothing (including spatio-temporal smoothing).
– Smooth-factor interactions.
* Computer lab: More advanced GAM data analysis with mgcv.
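In mgcv the smoother class is selected through the bs argument of s() and te(). A small sketch (simulated data and model purely illustrative):

```r
library(mgcv)

set.seed(3)
dat <- data.frame(x = runif(300),
                  g = factor(sample(letters[1:5], 300, replace = TRUE)))
dat$y <- sin(2 * pi * dat$x) + rnorm(5)[as.integer(dat$g)] + rnorm(300, sd = 0.3)

## bs = "cc" cyclic, "ad" adaptive, "tp" thin plate, "sos" sphere,
## "mrf" Markov random field, "re" simple Gaussian random effect;
## te() builds tensor product smooths of several covariates
m <- gam(y ~ s(x, bs = "cr") + s(g, bs = "re"), data = dat, method = "REML")
summary(m)
```

Here the s(g, bs = "re") term is exactly the "zero dimensional smooth" of the outline: a simple Gaussian random effect for the factor g.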
* Model Checking, selection and posterior simulation
– Residual checking, as GLMs.
– More checking: basis dimension, concurvity, partial residuals.
– Selection tools: generalized AIC, approximate p-values, approximate GLRT.
– Backwards/forwards selection.
– Selection by further penalization.
– Simulating from the approximate posterior.
– Other packages: INLA, BayesX, gam, gamlss, gss, assist etc.
* Computer lab: More data analysis with mgcv.
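The checking and selection tools above can be sketched as follows (an illustrative example, not the lab material itself):

```r
library(mgcv)

set.seed(4)
dat <- gamSim(1, n = 400, verbose = FALSE)
m <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat, method = "REML")

gam.check(m)   # residual plots plus a check that each basis dimension k is adequate
AIC(m)         # generalized AIC, usable for model comparison

## selection by further penalization: extra penalties on the smooth
## null spaces allow terms to be shrunk out of the model entirely
m2 <- gam(y ~ s(x0) + s(x1) + s(x2) + s(x3), data = dat,
          method = "REML", select = TRUE)

## simulate coefficient vectors from the approximate Bayesian posterior
betas <- rmvn(1000, coef(m), vcov(m))
```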
* Optional, given time and participant interests:
– GAMs for big data sets (the UK black smoke monitoring network).
– Or, another topic that comes up as being of general interest.
Who Should Attend
Applied statisticians and other numerate scientists, engineers and business people with some experience of regression modelling and an interest in learning (more) about modelling with smooth functions, including some modern extensions of the standard generalized additive model. Participants are welcome to bring their own data to work on, in addition to (or instead of) the example data sets that will be provided.
Some prior experience with R is useful. If you have not used R before, it would be worth installing it from cran.r-project.org and working through “An Introduction to R” from https://cran.r-project.org/manuals.html. Chapter 3 of http://www.maths.bris.ac.uk/~sw15190/core-statistics.pdf covers similar ground more concisely – if you are familiar with what is covered there, then you have more than enough background.
Learning Outcomes
* To acquire a sound understanding of basis penalty smoothing, and how it can be used to formulate regression models where the targets of inference are smooth functions of covariates.
* To gain an understanding of the associated inferential theory needed for practical data analysis.
* To be able to work with a wide variety of smooth models using R package mgcv, including models beyond the original exponential family generalized additive model.
Registration Fees
Early Bird (until 25 October 2016)
SSA Members – $550
SSA Student Members** – $300
Non-SSA Members – $600
Non-SSA Student Members** – $325
From 26 October 2016
SSA Members – $700
SSA Student Members** – $300
Non-SSA Members – $750
Non-SSA Student Members** – $325
** Proof of Valid University ID required
Registrations close strictly on 17 November 2016. Should places be available after this date, late registrations may be accepted. Late registrations will incur a $100 late registration fee.
The course fee includes morning and afternoon tea as well as lunch on both days.
Occasionally workshops have to be cancelled due to a lack of subscription; early registration helps ensure that this does not happen. Please contact the SSA Office before making any travel arrangements to confirm that the workshop will go ahead, because the Society will not be held responsible for any travel or accommodation expenses incurred due to a workshop cancellation.
Cancellations received prior to Thursday, 17 November 2016 will be refunded, minus a $20 administration fee.
From 18 November 2016 no part of the registration fee will be refunded. However, registrations are transferable within the same organisation. Please advise any changes to firstname.lastname@example.org.
When: 25/11/2016 – 26/11/2016
Time: 9:00 am – 4:00 pm
Cost: from $300 (students)
Location: Australian Hearing Hub, Lecture Theatre, Level 1, 16 University Avenue,