Instructor

Richard Morey is a Senior Lecturer in the School of Psychology at Cardiff University. In 2008, he earned a PhD in Cognition and Neuroscience and a master's degree in Statistics from the University of Missouri. He is the author of over 50 articles and book chapters, and in 2011 he was awarded a Veni Research Talent grant from the Netherlands Organisation for Scientific Research's Innovational Research Incentives Scheme for work in cognitive psychology. His work spans cognitive science, where he develops and critiques statistical models of cognitive phenomena; statistics, where he is interested in the philosophy of statistical inference and the development of new statistical tools for research use; and the practical side of science, where he is interested in increasing openness in scientific methodology.

Course content
In recent decades, there has been an explosion of interest in Bayesian methodologies in the sciences. There are two main reasons for this interest: first, Bayesian methods often yield easier-to-interpret answers to statistical questions than classical methods; and second, Bayesian methods are applicable in situations where classical methods are difficult or impossible to implement. In this course, you will learn the basics of practical Bayesian data analysis.

Course objectives
The course will begin with the theory behind Bayesian data analysis and move toward simple, common models in the social sciences, such as t tests, ANOVA, and regression. From there, we will learn about more complicated models and how these may be fit to data. Special attention will be given to Markov Chain Monte Carlo (MCMC) methods, which give Bayesian methods their immense flexibility and power. With modern software, the power of MCMC methods is available to researchers who are not specialists in Bayesian methods. This class will give you the tools to fit a wide variety of models easily through the use of the WinBUGS software.

Course Prerequisites
A working knowledge of probability theory is assumed for this class. In addition, knowledge of common statistical models used in the social sciences is necessary, including t tests, ANOVA, and regression. Familiarity with more complicated models, such as logistic regression, will also prove helpful. Finally, a basic knowledge of the R statistical environment, which will be used extensively in the course, will be very helpful. For many methods, we will use WinBUGS or JAGS to fit models.

Background reading
Gelman, Carlin, Stern, and Rubin, Bayesian Data Analysis
Kruschke, Doing Bayesian Data Analysis
Lee, Introductory Bayesian Statistics

Required text
Jackman, S. Bayesian Analysis for the Social Sciences (Wiley, 2009). This will be provided on arrival as the primary text for the course.

Lecture 1
Introduction to the course and a brief review of basic concepts in probability and statistics; subjective versus frequentist probability; Bayes' theorem; frequentist methods from a Bayesian perspective

Reading: Jackman, Intro and Ch 1
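
By way of a preview, Bayes' theorem for a simple discrete hypothesis can be computed directly in R; the probabilities below are invented for illustration:

    # Bayes' theorem: P(H|D) = P(D|H) * P(H) / P(D), with invented numbers
    prior <- 0.01                    # P(H): prior probability of the hypothesis
    sens  <- 0.95                    # P(D|H): probability of the data given H
    fpr   <- 0.10                    # P(D|not H): false positive rate
    evid  <- sens * prior + fpr * (1 - prior)  # P(D), by total probability
    posterior <- sens * prior / evid
    posterior                        # about 0.088: the updated belief in H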

Lecture 2
Basic Bayesian inference: parameter estimation for the binomial and the normal models; conjugacy; parameter estimation versus model comparison; Bayes factors; asymptotic properties of Bayesian estimates and Bayes factors

Reading: Jackman, Ch 2
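
To preview the conjugacy material, here is a minimal R sketch of conjugate updating for the binomial model: with a Beta(a, b) prior and y successes in n trials, the posterior is Beta(a + y, b + n - y). The prior and the data below are invented for illustration:

    # Conjugate beta-binomial updating
    a <- 1; b <- 1                  # uniform Beta(1, 1) prior (an assumption)
    y <- 7; n <- 10                 # hypothetical data: 7 successes in 10 trials
    a_post <- a + y                 # posterior shape parameters
    b_post <- b + n - y
    curve(dbeta(x, a_post, b_post), 0, 1,
          xlab = "theta", ylab = "Posterior density")
    qbeta(c(.025, .975), a_post, b_post)   # 95% central credible interval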

Lecture 3
Common statistical techniques with a Bayesian spin: t tests, ANOVA, and regression; numerical integration

Reading: Rouder, Speckman, Sun, Morey, and Iverson (2009): Bayesian t tests for accepting and rejecting the null hypothesis
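
As a taste of the numerical integration involved, the sketch below computes the JZS Bayes factor in favour of the null for a one-sample t test, following the integral form in Rouder et al. (2009); the t value and sample size are hypothetical:

    # JZS Bayes factor (null over alternative) for a one-sample t test
    jzs_bf01 <- function(t, N) {
      nu <- N - 1
      numer <- (1 + t^2 / nu)^(-(nu + 1) / 2)
      # Marginal likelihood under the alternative: integrate over g, which
      # has an inverse-chi-square(1) prior (a Cauchy prior on effect size)
      denom <- integrate(function(g)
        (1 + N * g)^(-1 / 2) *
          (1 + t^2 / ((1 + N * g) * nu))^(-(nu + 1) / 2) *
          (2 * pi)^(-1 / 2) * g^(-3 / 2) * exp(-1 / (2 * g)),
        lower = 0, upper = Inf)$value
      numer / denom
    }
    jzs_bf01(t = 2.2, N = 30)   # hypothetical t statistic and sample size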

Lecture 4
Monte Carlo methods; more numerical integration; sampling algorithms; properties of Monte Carlo estimates

Reading: Jackman, Ch 3
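
The core idea can be previewed in a few lines of R: a Monte Carlo estimate of an expectation, together with its Monte Carlo standard error (the integrand here is chosen arbitrarily for illustration):

    # Monte Carlo estimate of E[theta^2] for theta ~ N(0, 1); true value is 1
    set.seed(1)
    M <- 1e5
    theta <- rnorm(M)
    est <- mean(theta^2)             # Monte Carlo estimate
    se  <- sd(theta^2) / sqrt(M)     # Monte Carlo standard error
    c(estimate = est, std.error = se)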

Lecture 5
Markov Chain Monte Carlo methods I: definition and properties of Markov chains

Reading: Jackman, Ch 4
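
A minimal R illustration of the property that MCMC exploits: iterating a well-behaved Markov chain converges to its stationary distribution regardless of the starting state. The transition probabilities below are invented:

    # A two-state Markov chain; rows of P are transition probabilities
    P <- matrix(c(0.9, 0.1,
                  0.3, 0.7), nrow = 2, byrow = TRUE)
    p <- c(1, 0)                  # start in state 1 with certainty
    for (i in 1:50) p <- p %*% P  # propagate the state distribution
    p                             # converges to the stationary (0.75, 0.25)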

Lecture 6
Markov Chain Monte Carlo methods II: applied MCMC; Metropolis-Hastings; the Gibbs sampler; full conditional distributions; building a Gibbs sampler in R; data augmentation and missing data

Reading: Jackman, Ch 5
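
As a preview of building a Gibbs sampler in R, the sketch below samples from a standard bivariate normal with correlation rho by alternately drawing from the two full conditional distributions, each of which is univariate normal:

    # Gibbs sampler for a bivariate normal (illustrative; rho is assumed known)
    set.seed(1)
    rho <- 0.8; M <- 5000
    x <- y <- numeric(M)
    for (i in 2:M) {
      x[i] <- rnorm(1, mean = rho * y[i - 1], sd = sqrt(1 - rho^2))
      y[i] <- rnorm(1, mean = rho * x[i],     sd = sqrt(1 - rho^2))
    }
    cor(x, y)   # should be close to rho after convergence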

Lecture 7
Markov Chain Monte Carlo methods III: making MCMC easy; the BUGS language; using WinBUGS; calling JAGS from R; assessing convergence of Markov chains; reparameterization and overparameterization

Reading: Jackman, Ch 6
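
To give a flavour of calling JAGS from R, here is a minimal sketch using the rjags package (this assumes JAGS and rjags are installed); it fits the binomial model from the Lecture 2 sketch, so the result can be checked against the Beta(8, 4) conjugate posterior:

    # Calling JAGS from R via rjags
    library(rjags)
    model_string <- "
    model {
      y ~ dbin(theta, n)      # binomial likelihood, in the BUGS language
      theta ~ dbeta(1, 1)     # uniform prior on the success probability
    }"
    m <- jags.model(textConnection(model_string),
                    data = list(y = 7, n = 10))
    samp <- coda.samples(m, variable.names = "theta", n.iter = 10000)
    summary(samp)             # compare with the Beta(8, 4) posterior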

Lecture 8
Hierarchical/multilevel modelling I: the basics; exchangeability; hyperparameters; hierarchical pooling; a simple normal random effects model

Reading: Jackman, Ch 7
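
A simple normal random-effects model of the kind introduced here is only a few lines in the BUGS language. The sketch below is a model string to be passed to WinBUGS or JAGS; it assumes the within-group variances sigma2[j] are known, as in the classic normal-means setup:

    # Hierarchical (random-effects) normal model in the BUGS language
    hier_model <- "
    model {
      for (j in 1:J) {
        prec[j] <- 1 / sigma2[j]           # known within-group precision
        y[j] ~ dnorm(theta[j], prec[j])    # group-level observations
        theta[j] ~ dnorm(mu, tau)          # exchangeable group effects
      }
      mu  ~ dnorm(0, 1.0E-6)               # vague hyperprior on the grand mean
      tau ~ dgamma(0.001, 0.001)           # vague hyperprior on the precision
    }"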

Lecture 9
Hierarchical/multilevel modelling II: models for discrete data; binomial and multinomial models; probit models; logit models

Reading: Jackman, Ch 8
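
As a closing illustration, a basic logit model for binomial counts is similarly compact in BUGS; the covariate x and the vague priors below are placeholders for illustration:

    # Bayesian logit model for binomial counts in the BUGS language
    logit_model <- "
    model {
      for (i in 1:N) {
        y[i] ~ dbin(p[i], n[i])                # binomial likelihood
        logit(p[i]) <- beta0 + beta1 * x[i]    # logit link on the probability
      }
      beta0 ~ dnorm(0, 1.0E-4)                 # vague priors on coefficients
      beta1 ~ dnorm(0, 1.0E-4)
    }"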