Modules

Week Learning Objectives

By the end of this module, you will be able to

  • Navigate the course website and Brightspace site
  • Identify the Slack channels relevant for the course
  • Describe the historical origin of Bayesian statistics
  • Identify components in research papers involving Bayesian analyses
  • Render a simple Quarto (.qmd) file
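
If you prefer to render from the R console rather than the Render button in RStudio, a minimal sketch looks like this (the file name hello.qmd is just a placeholder, and the quarto R package plus the Quarto CLI are assumed to be installed):

```r
# install.packages("quarto")   # one-time setup, if needed
library(quarto)

# Render hello.qmd to its default output format (HTML unless the YAML
# header says otherwise); equivalent to clicking "Render" in RStudio.
quarto_render("hello.qmd")
```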

Task List

  1. Review the syllabus
  2. Review the resources (slides and notes)
  3. Install/Update R and RStudio on your computer
  4. Attend the Tuesday and Thursday class meetings
  5. Complete the assigned readings
  6. Introduce yourself on the #introduction Slack channel (as part of HW 1)
  7. Complete Homework 1 (see the instructions on Brightspace)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Describe the subjectivist interpretation of probability, and contrast it with the frequentist interpretation
  • Compute probability density using simulations
  • Compute joint, marginal, and conditional probabilities with two variables
  • Apply Bayes’ rule to obtain the posterior from the prior and the data
  • Explain what data-order invariance and exchangeability are
  • Use grid approximation to obtain the posterior for a Bernoulli model
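
As a preview of the last objective, here is a minimal sketch of grid approximation in R; the data (6 successes in 10 trials) and the flat prior are made up for illustration:

```r
theta_grid <- seq(0, 1, length.out = 1000)             # candidate values of theta
prior      <- rep(1, length(theta_grid))               # flat prior on [0, 1]
likelihood <- dbinom(6, size = 10, prob = theta_grid)  # P(data | theta)
posterior  <- prior * likelihood
posterior  <- posterior / sum(posterior)               # normalize to sum to 1

# Plot the discretized posterior
plot(theta_grid, posterior, type = "l",
     xlab = expression(theta), ylab = "Posterior probability")
```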

Task List

  1. Review the resources (slides and notes)
  2. Attend the Tuesday and Thursday class meetings
  3. Complete the assigned readings
  4. Complete Homework 2 (due in two weeks; see the instructions on Brightspace)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Apply the Bayesian workflow to analyze real data with a Bernoulli model
  • Explain the idea of a conjugate prior
  • Summarize the posterior distribution using simulations
  • Apply Bayesian terminology in summarizing the posterior
  • Use R to perform prior and posterior predictive checks
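
As a quick illustration of conjugacy and posterior summaries, here is a minimal sketch in R. The numbers (a Beta(1, 1) prior, 6 successes in 10 trials) are made up, and the predictive check shown is a simple simulation-based one rather than the full workflow from the notes:

```r
a <- 1; b <- 1      # Beta(a, b) prior
z <- 6; N <- 10     # observed successes and trials (made up)

# Conjugacy: the posterior is Beta(a + z, b + N - z)
post_draws <- rbeta(4000, a + z, b + N - z)

# Summarize the posterior with simulations
mean(post_draws)                      # posterior mean
quantile(post_draws, c(.055, .945))   # 89% credible interval

# Posterior predictive check: simulate replicated data and compare to z
z_rep <- rbinom(4000, size = N, prob = post_draws)
hist(z_rep)
abline(v = z, lwd = 2)
```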

Task List

  1. Review the resources (slides and notes)
  2. Attend the Tuesday and Thursday class meetings
  3. Complete the assigned readings
  4. Complete Homework 2 (see the instructions on Brightspace)

Lecture Videos

Check your learning
If we do not use a conjugate prior for the Bernoulli model, what is the posterior distribution of \(\theta\)?



Check your learning
In the posterior distribution above, the interval bounding the shaded area is



Check your learning
A researcher asks a participant 10 true/false questions to assess their statistics knowledge. A posterior predictive distribution in this case would be




Check your learning
Which of the following is the correct way to declare a data variable of responses on a 5-point scale from N participants?




Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Explain the logic of a hierarchical model
  • Apply the binomial distribution to describe the sum of multiple Bernoulli trials
  • Program a hierarchical binomial model in Stan (see the sketch after this list)
  • Analyze secondary data using a hierarchical normal model (i.e., random-effects meta-analysis)
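
For a concrete starting point, below is a hedged sketch of a hierarchical binomial model fitted with brms, which writes and compiles the Stan program for you; the hand-coded Stan version is covered in the notes. The data frame and its columns (person, z, n) are hypothetical:

```r
library(brms)

# Hypothetical data: 8 people, each answering n = 10 questions
df <- data.frame(
  person = factor(1:8),
  z = c(3, 5, 7, 2, 6, 4, 8, 5),   # successes per person (made up)
  n = 10                           # trials per person
)

fit <- brm(
  z | trials(n) ~ 1 + (1 | person),  # person-specific intercepts (partial pooling)
  family = binomial("logit"),
  data = df,
  seed = 1234
)
summary(fit)
```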

Task List

  1. Review the resources (slides and notes)
  2. Watch the lecture videos below (to be posted)
  3. Complete the assigned readings
  4. Start working on Homework 3 (see the instructions on Brightspace)

Lecture Videos

Check your learning
In terms of statistical inference, a binomial variable \(Z\) with \(N = 10\) trials and success probability \(\theta\) is equivalent to




Check your learning

Based on the notation in the slides, a Beta2(0.5, 6) distribution is the same as




Check your learning
A Gamma distribution is handy as a prior for \(\kappa\) because



The following videos are from an older class, so the slides may look slightly different.

For the first video, please skip to 04:12.

Check your learning
With shrinkage, the posterior distribution of individual-specific \(\theta_j\) is



The following videos are from an older class, so the slides may look slightly different.

Check your learning
In the hierarchical normal model discussed in the lecture, the treatment effect estimate in each study



Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Interpret the coefficients in a linear regression model
  • Obtain posterior predictive distributions and checks
  • Explain how the assumptions of regression are coded in the model equations
  • Perform Bayesian regression with the R package brms (see the sketch after this list)
  • Interpret results from an interaction model using plots and posterior predictions
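
A minimal sketch of a brms regression with an interaction follows; the data frame my_data and the variables y, x, and group are hypothetical placeholders:

```r
library(brms)

fit <- brm(
  y ~ x * group,     # main effects of x and group plus their interaction
  data = my_data,    # replace with your own data frame
  seed = 1234
)
summary(fit)                         # posterior summaries of the coefficients
pp_check(fit)                        # posterior predictive check
conditional_effects(fit, "x:group")  # plot the interaction
```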

Task List

  1. Review the resources (slides and notes)
  2. Attend the Tuesday and Thursday class meetings
  3. Complete the assigned readings
    • McElreath ch. 4, 5, 7, 8
  4. Complete Homework 3 (see the instructions on Brightspace)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Explain how information criteria approximate the out-of-sample divergence from the “true” model
  • Use WAIC and LOO-IC to compare models
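
As a sketch of what the comparison looks like in R (assuming fit1 and fit2 are brmsfit objects for competing models of the same outcome):

```r
library(brms)

waic(fit1)                  # WAIC for each model
waic(fit2)

loo1 <- loo(fit1)           # PSIS-LOO cross-validation
loo2 <- loo(fit2)
loo_compare(loo1, loo2)     # difference in expected log predictive density
```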

Task List

  1. Review the resources (slides and notes)
  2. Attend the Tuesday and Thursday class meetings
  3. Complete Homework 4 (see the instructions on Brightspace)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Draw a directed acyclic graph (DAG) to represent causal assumptions
  • Use a DAG to guide analyses for obtaining causal effects
  • Describe how randomization can remove potential confounders
  • Explain how the back-door criterion can be used to identify a set of adjustment variables with nonexperimental data (see the sketch after this list)
  • Perform a mediation analysis and interpret the results
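
As a small illustration of the back-door logic, here is a sketch using the dagitty package; the variables (X = treatment, Y = outcome, Z = a common cause) are hypothetical:

```r
library(dagitty)

# Encode the causal assumptions as a DAG
dag <- dagitty("dag {
  X -> Y
  Z -> X
  Z -> Y
}")

# Minimal sufficient adjustment sets for the causal effect of X on Y
adjustmentSets(dag, exposure = "X", outcome = "Y")
#> { Z }
```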

Task List

  1. Complete the assigned readings
    • McElreath ch 6
  2. Review the resources (slides and notes)
  3. Attend the Tuesday and Thursday class meetings
  4. Complete the Project Prospectus (see the instructions on Brightspace)
  5. Schedule a meeting with the instructor to discuss your prospectus (a sign-up link will be posted on Slack)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Explain what is unique about samples obtained using Markov chain Monte Carlo (MCMC)
  • Explain why we need MCMC to approximate the posterior
  • Describe when MCMC samples are representative and accurate for approximating the posterior
  • Use R to perform convergence diagnostics for MCMC samples
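
For reference, a sketch of common convergence checks in R (assuming fit is a fitted brms model); R-hat values close to 1 and large effective sample sizes suggest the chains have mixed well:

```r
library(brms)

summary(fit)                     # reports Rhat, Bulk_ESS, and Tail_ESS
rhat(fit)                        # R-hat for each parameter
neff_ratio(fit)                  # effective sample size relative to total draws
plot(fit)                        # trace and density plots for visual inspection
mcmc_plot(fit, type = "trace")   # trace plots only
```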

Task List

  1. Complete the assigned readings
    • McElreath ch 9
  2. Review the resources (slides and notes)
  3. Attend the Tuesday and Thursday class meetings
  4. Complete Homework 6 (see the instructions on Brightspace)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Describe the three components of the generalized linear model (GLM)
  • Name examples of the GLM (e.g., linear regression, Poisson regression)
  • Obtain posterior predictive distributions and checks
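
One GLM example, a Poisson regression in brms with a posterior predictive check, is sketched below; the data frame my_data and the variables count and x are hypothetical. The three GLM components appear as the Poisson family, the log link, and the linear predictor on the right-hand side of the formula:

```r
library(brms)

fit <- brm(
  count ~ x,                 # linear predictor
  family = poisson("log"),   # distributional family and link function
  data = my_data,            # replace with your own data frame
  seed = 1234
)
pp_check(fit)                # compare replicated counts to the observed data
```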

Task List

  1. Complete the assigned readings
    • McElreath ch 10, 11
  2. Review the resources (slides and notes)
  3. Attend the Tuesday and Thursday class meetings
  4. Complete Homework 7 (see the instructions on Brightspace)

Slides

Link to HTML slides

Week Learning Objectives

By the end of this module, you will be able to

  • Provide examples of clustered data
  • Fit Bayesian multilevel models (MLMs; see the sketch after this list)
  • Name advantages of MLMs
  • Interpret coefficients in MLMs
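
A sketch of a multilevel model in brms with varying intercepts and slopes follows; the data frame my_data and the variables y, x, and school are hypothetical:

```r
library(brms)

fit <- brm(
  y ~ x + (x | school),   # intercept and slope of x vary across schools
  data = my_data,         # replace with your own clustered data frame
  seed = 1234
)
summary(fit)   # population-level effects plus group-level SDs and correlation
```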

Task List

  1. Complete the assigned readings
    • McElreath ch 13, 14.1, 14.2
  2. Review the resources (slides and notes)
  3. Attend the Tuesday class meeting
  4. Continue to work on final project

Slides

Link to HTML slides

P.S.: If you’d like to print the slides to PDF, follow the instructions at https://quarto.org/docs/presentations/revealjs/presenting.html#print-to-pdf