Probability and Bayesian Modeling
1st Edition

ISBN 9781138492561
Published December 18, 2019 by Chapman and Hall/CRC
552 Pages

USD $99.95


Book Description

Probability and Bayesian Modeling is an introduction to probability and Bayesian thinking for undergraduate students with a calculus background. The first part of the book provides a broad view of probability including foundations, conditional probability, discrete and continuous distributions, and joint distributions. Statistical inference is presented completely from a Bayesian perspective. The text introduces inference and prediction for a single proportion and a single mean from Normal sampling. After fundamentals of Markov Chain Monte Carlo algorithms are introduced, Bayesian inference is described for hierarchical and regression models including logistic regression. The book presents several case studies motivated by some historical Bayesian studies and the authors’ research.
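The single-proportion inference described above starts from Bayes' rule with a discrete prior, as in the book's spinner and dining-preference examples. The following is a minimal sketch of that update in Python (the book itself uses R); the candidate proportions, prior weights, and data below are made up for illustration:

```python
# Bayes' rule with a discrete prior for a proportion p (illustrative values).
from math import comb

p_values = [0.1, 0.3, 0.5, 0.7, 0.9]   # candidate values of the proportion
prior    = [0.2, 0.2, 0.2, 0.2, 0.2]   # uniform discrete prior

# Hypothetical data: 12 successes in 20 Binomial trials.
n, y = 20, 12
likelihood = [comb(n, y) * p**y * (1 - p)**(n - y) for p in p_values]

# Posterior is proportional to prior times likelihood, then normalized.
unnorm = [pr * lk for pr, lk in zip(prior, likelihood)]
posterior = [u / sum(unnorm) for u in unnorm]

for p, post in zip(p_values, posterior):
    print(f"p = {p:.1f}: posterior = {post:.3f}")
```

With 12 successes in 20 trials, the posterior concentrates on the candidate values nearest the observed rate of 0.6.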

This text reflects modern Bayesian statistical practice. Simulation is introduced in all of the probability chapters and used extensively in the Bayesian material to simulate from the posterior and predictive distributions. One chapter describes the basic tenets of the Metropolis and Gibbs sampling algorithms; several chapters, however, first develop Bayesian inference with conjugate priors to deepen understanding. Strategies for constructing prior distributions are described both for situations in which one has substantial prior information and for cases in which prior knowledge is weak. One chapter introduces hierarchical Bayesian modeling as a practical way of combining data from different groups. There is an extensive discussion of Bayesian regression models, including the construction of informative priors, inference about functions of the parameters of interest, prediction, and model selection.
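The conjugate updating and posterior simulation described above can be illustrated for the Beta-Binomial case. This is a hedged Python sketch (not the book's R/ProbBayes code), and the prior parameters and data are hypothetical:

```python
# Conjugate Beta-Binomial updating: a Beta(a, b) prior combined with
# Binomial data (y successes in n trials) gives a Beta(a + y, b + n - y)
# posterior, so simulation from the posterior is direct.
import random

a, b = 3, 3        # hypothetical Beta prior
n, y = 20, 12      # hypothetical data: 12 successes in 20 trials
a_post, b_post = a + y, b + n - y    # exact conjugate update

# Simulate from the posterior (the same workflow the book later
# automates with MCMC for non-conjugate models).
random.seed(1)
draws = [random.betavariate(a_post, b_post) for _ in range(10_000)]
post_mean = sum(draws) / len(draws)
print(f"Exact posterior mean:     {a_post / (a_post + b_post):.3f}")
print(f"Simulated posterior mean: {post_mean:.3f}")

# Simulate the posterior predictive distribution of the number of
# successes in a future set of n trials.
pred = [sum(random.random() < p for p in [p] * n) for p in draws]
pred_mean = sum(pred) / len(pred)
print(f"Posterior predictive mean successes out of {n}: {pred_mean:.1f}")
```

Each predictive draw first samples a proportion from the posterior and then a Binomial outcome given that proportion, which is how predictive checking is carried out throughout the Bayesian chapters.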

The text uses JAGS (Just Another Gibbs Sampler) as a general-purpose computational method for simulating from the posterior distributions of a variety of Bayesian models. An accompanying R package, ProbBayes, contains all of the book's datasets along with special functions for illustrating concepts from the book.

A complete solutions manual, available in the Additional Resources section, is provided to instructors who adopt the book.

Table of Contents

  1. Probability: A Measurement of Uncertainty

    Introduction

    The Classical View of a Probability

    The Frequency View of a Probability

    The Subjective View of a Probability

    The Sample Space

    Assigning Probabilities

    Events and Event Operations

    The Three Probability Axioms

    The Complement and Addition Properties


  2. Counting Methods

    Introduction: Rolling Dice, Yahtzee, and Roulette

    Equally Likely Outcomes

    The Multiplication Counting Rule



    Arrangements of Non-Distinct Objects

    Playing Yahtzee


  3. Conditional Probability

    Introduction: The Three Card Problem

    In Everyday Life

    In a Two-Way Table

    Definition and the Multiplication Rule

    The Multiplication Rule Under Independence

    Learning Using Bayes' Rule

    R Example: Learning About a Spinner


  4. Discrete Distributions

    Introduction: The Hat Check Problem

    Random Variable and Probability Distribution

    Summarizing a Probability Distribution

    Standard Deviation of a Probability Distribution

    Coin-Tossing Distributions

    Binomial probabilities

    Binomial computations

    Mean and standard deviation of a Binomial

    Negative Binomial Experiments


  5. Continuous Distributions

    Introduction: A Baseball Spinner Game

    The Uniform Distribution

    Probability Density: Waiting for a Bus

    The Cumulative Distribution Function

    Summarizing a Continuous Random Variable

    Normal Distribution

    Binomial Probabilities and the Normal Curve

    Sampling Distribution of the Mean


  6. Joint Probability Distributions

    Introduction

    Joint Probability Mass Function: Sampling From a Box

    Multinomial Experiments

    Joint Density Functions

    Independence and Measuring Association

    Flipping a Random Coin: The Beta-Binomial Distribution

    Bivariate Normal Distribution


  7. Learning About a Binomial Probability

    Introduction: Thinking About a Proportion Subjectively

    Bayesian Inference with Discrete Priors

    Example: students' dining preference

    Discrete prior distributions for proportion p

    Likelihood of proportion p

    Posterior distribution for proportion p

    Inference: students' dining preference

    Discussion: using a discrete prior

    Continuous Priors

    The Beta distribution and probabilities

    Choosing a Beta density curve to represent prior opinion

    Updating the Beta Prior

    Bayes' rule calculation

    From Beta prior to Beta posterior: conjugate priors

    Bayesian Inferences with Continuous Priors

    Bayesian hypothesis testing

    Bayesian credible intervals

    Bayesian prediction

    Predictive Checking


  8. Modeling Measurement and Count Data

    Introduction

    Modeling Measurements


    The general approach

    Outline of chapter

    Bayesian Inference with Discrete Priors

    Example: Roger Federer's time-to-serve

    Simplification of the likelihood

    Inference: Federer's time-to-serve

    Continuous Priors

The Normal prior for mean μ

    Choosing a Normal prior

    Updating the Normal Prior


A quick peek at the update procedure

    Bayes' rule calculation

    Conjugate Normal prior

    Bayesian Inferences for Continuous Normal Mean

    Bayesian hypothesis testing and credible interval

    Bayesian prediction

    Posterior Predictive Checking

    Modeling Count Data


    The Poisson distribution

    Bayesian inferences

    Case study: Learning about website counts


  9. Simulation by Markov Chain Monte Carlo

    Introduction

    The Bayesian computation problem

    Choosing a prior

    The two-parameter Normal problem

    Overview of the chapter

    Markov Chains


    Some properties

    Simulating a Markov chain

    The Metropolis Algorithm

    Example: Walking on a number line

    The general algorithm

    A general function for the Metropolis algorithm

    Example: Cauchy-Normal problem

    Choice of starting value and proposal region

    Collecting the simulated draws

    Gibbs Sampling

    Bivariate discrete distribution

    Beta-Binomial sampling

Normal sampling – both parameters unknown

    MCMC Inputs and Diagnostics

    Burn-in, starting values, and multiple chains


    Graphs and summaries

    Using JAGS

    Normal sampling model

    Multiple chains

    Posterior predictive checking

    Comparing two proportions


  10. Bayesian Hierarchical Modeling

    Introduction

    Observations in groups

    Example: standardized test scores

    Separate estimates?

    Combined estimates?

    A two-stage prior leading to compromise estimates

    Hierarchical Normal Modeling

    Example: ratings of animation movies

A hierarchical Normal model with random σ

    Inference through MCMC

    Hierarchical Beta-Binomial Modeling

    Example: Deaths after heart attack

    A hierarchical Beta-Binomial model

    Inference through MCMC


  11. Simple Linear Regression

    Introduction

    Example: Prices and Areas of House Sales

    A Simple Linear Regression Model

    A Weakly Informative Prior

    Posterior Analysis

    Inference through MCMC

    Bayesian Inferences with Simple Linear Regression

    Simulate fits from the regression model

    Learning about the expected response

    Prediction of future response

    Posterior predictive model checking

    Informative Prior


    Prior distributions

    Posterior Analysis

    A Conditional Means Prior


  12. Bayesian Multiple Regression and Logistic Models

    Introduction

    Bayesian Multiple Linear Regression

    Example: expenditures of US households

    A multiple linear regression model

    Weakly informative priors and inference through MCMC


    Comparing Regression Models

    Bayesian Logistic Regression

    Example: US women labor participation

    A logistic regression model

    Conditional means priors and inference through MCMC



  13. Case Studies


    Federalist Papers Study

    Data on word use

    Poisson density sampling

    Negative Binomial sampling

    Comparison of rates for two authors

    Which words distinguish the two authors?

    Career Trajectories

    Measuring hitting performance in baseball

    A hitter's career trajectory

    Estimating a single trajectory

    Estimating many trajectories by a hierarchical model

    Latent Class Modeling

    Two classes of test takers

    A latent class model with two classes

    Disputed authorship of the Federalist Papers


  Appendix A: The constant in the Beta posterior

  Appendix B: The posterior predictive distribution

  Appendix C: Comparing Bayesian models




Jim Albert is a Distinguished University Professor of Statistics at Bowling Green State University. His research interests include Bayesian modeling and applications of statistical thinking in sports. He has authored or coauthored several books including Ordinal Data Modeling, Bayesian Computation with R, and Workshop Statistics: Discovery with Data, A Bayesian Approach.

Jingchen (Monika) Hu is an Assistant Professor of Mathematics and Statistics at Vassar College. She teaches an undergraduate Bayesian Statistics course at Vassar that is shared online across several liberal arts colleges. Her research focuses on addressing data privacy concerns through the release of synthetic data.


"The book can be used by upper undergraduate and graduate students as well as researchers and practitioners in statistics and data science from all disciplines…A background of calculus is required for the reader but no experience in programming is needed. The writing style of the book is extremely reader friendly. It provides numerous illustrative examples, valuable resources, a rich collection of materials, and a memorable learning experience."

"Over many years, I have wondered about the following: Should a first undergraduate course in statistics be a Bayesian course? After reading this book, I have come to the conclusion that the answer is…yes!... this is very well written textbook that can also be used as self-learning material for practitioners. It presents a clear, accessible, and entertaining account of the interplay of probability, computations, and statistical inference from the Bayesian perspective."
~ISCB News