
Probability and Bayesian Modeling

1st Edition

By Jim Albert, Jingchen Hu

Chapman and Hall/CRC

550 pages

Purchasing Options ($ = USD)
Hardback: ISBN 9781138492561
Publication date: 2019-12-17
Pre-order; ships after 17 December 2019
List price: $99.95 (pre-order price: $79.96)

Description

Probability and Bayesian Modeling is an introduction to probability and Bayesian thinking for undergraduate students with a calculus background. The first part of the book provides a broad view of probability, including foundations, conditional probability, discrete and continuous distributions, and joint distributions. Statistical inference is presented entirely from a Bayesian perspective. The text introduces inference and prediction for a single proportion and for a single mean from Normal sampling. After the fundamentals of Markov Chain Monte Carlo algorithms are introduced, Bayesian inference is described for hierarchical and regression models, including logistic regression. The book presents several case studies motivated by historical Bayesian studies and the authors' research.
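
The single-proportion analysis mentioned above can be sketched in a few lines of R. The numbers below are invented for illustration and are not an example from the book: a Beta prior is updated to a Beta posterior, and simulation gives a credible interval and posterior predictive draws.

    # Minimal sketch (made-up numbers, not a book example) of conjugate
    # Bayesian inference for a single proportion p
    a <- 3; b <- 7            # hypothetical Beta(3, 7) prior for p
    y <- 12; n <- 20          # hypothetical data: 12 successes in 20 trials

    # Conjugacy: Beta(a, b) prior + Binomial data -> Beta(a + y, b + n - y) posterior
    post_a <- a + y
    post_b <- b + n - y

    # 90% equal-tail credible interval for p
    qbeta(c(0.05, 0.95), post_a, post_b)

    # Posterior predictive simulation: number of successes in 10 future trials
    p_draws <- rbeta(5000, post_a, post_b)
    y_new   <- rbinom(5000, size = 10, prob = p_draws)
    table(y_new) / 5000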

This text reflects modern Bayesian statistical practice. Simulation is introduced in all of the probability chapters and is used extensively in the Bayesian material to simulate from the posterior and predictive distributions. One chapter describes the basic tenets of the Metropolis and Gibbs sampling algorithms; however, several chapters introduce the fundamentals of Bayesian inference with conjugate priors to deepen understanding. Strategies for constructing prior distributions are described both for situations where one has substantial prior information and for cases where one has only weak prior knowledge. One chapter introduces hierarchical Bayesian modeling as a practical way of combining data from different groups. There is an extensive discussion of Bayesian regression models, including the construction of informative priors, inference about functions of the parameters of interest, prediction, and model selection.
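
As a rough illustration of the sampling algorithms mentioned above, the sketch below is a bare-bones random-walk Metropolis sampler written in base R. The target density (a standard Cauchy), the tuning constant, and the function name are chosen for this sketch only and do not reproduce the book's own Metropolis function.

    # Generic random-walk Metropolis sampler (a sketch, not the book's code),
    # targeting a standard Cauchy density on the log scale
    log_post <- function(theta) dcauchy(theta, location = 0, scale = 1, log = TRUE)

    metropolis <- function(log_post, start, scale, n_iter) {
      draws <- numeric(n_iter)
      current <- start
      for (i in seq_len(n_iter)) {
        proposal <- current + runif(1, -scale, scale)       # symmetric proposal
        log_ratio <- log_post(proposal) - log_post(current)
        if (log(runif(1)) < log_ratio) current <- proposal  # accept, else stay
        draws[i] <- current
      }
      draws
    }

    set.seed(123)
    draws <- metropolis(log_post, start = 0, scale = 2, n_iter = 5000)
    quantile(draws, c(0.025, 0.5, 0.975))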

The text uses JAGS (Just Another Gibbs Sampler) as a general-purpose tool for simulating from the posterior distributions of a variety of Bayesian models. An accompanying R package, ProbBayes, contains all of the book's datasets along with special functions for illustrating concepts from the book.
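
To give a flavor of the JAGS workflow, the sketch below fits a small Normal sampling model through the standard rjags interface. The model string, priors, and data are invented for illustration; the book may drive JAGS differently, and no ProbBayes functions are assumed here.

    # Hedged sketch of calling JAGS from R via rjags; model and data are
    # invented for illustration and are not taken from the book
    library(rjags)

    model_string <- "
    model {
      for (i in 1:n) {
        y[i] ~ dnorm(mu, phi)      # JAGS parameterizes the Normal by precision
      }
      mu  ~ dnorm(0, 0.0001)       # weakly informative prior on the mean
      phi ~ dgamma(0.001, 0.001)   # weakly informative prior on the precision
      sigma <- 1 / sqrt(phi)
    }"

    y <- c(15.1, 14.8, 16.2, 15.5, 14.9)    # hypothetical measurements
    jm <- jags.model(textConnection(model_string),
                     data = list(y = y, n = length(y)),
                     n.chains = 2)
    update(jm, 1000)                         # burn-in
    post <- coda.samples(jm, variable.names = c("mu", "sigma"), n.iter = 5000)
    summary(post)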

 

Table of Contents

  1. Probability: A Measurement of Uncertainty

    Introduction

    The Classical View of a Probability

    The Frequency View of a Probability

    The Subjective View of a Probability

    The Sample Space

    Assigning Probabilities

    Events and Event Operations

    The Three Probability Axioms

    The Complement and Addition Properties

    Exercises

  2. Counting Methods

    Introduction: Rolling Dice, Yahtzee, and Roulette

    Equally Likely Outcomes

    The Multiplication Counting Rule

    Permutations

    Combinations

    Arrangements of Non-Distinct Objects

    Playing Yahtzee

    Exercises

  3. Conditional Probability

    Introduction: The Three Card Problem

    In Everyday Life

    In a Two-Way Table

    Definition and the Multiplication Rule

    The Multiplication Rule Under Independence

    Learning Using Bayes' Rule

    R Example: Learning About a Spinner

    Exercises

  4. Discrete Distributions

    Introduction: The Hat Check Problem

    Random Variable and Probability Distribution

    Summarizing a Probability Distribution

    Standard Deviation of a Probability Distribution

    Coin-Tossing Distributions

    Binomial probabilities

    Binomial computations

    Mean and standard deviation of a Binomial

    Negative Binomial Experiments

    Exercises

  5. Continuous Distributions

    Introduction: A Baseball Spinner Game

    The Uniform Distribution

    Probability Density: Waiting for a Bus

    The Cumulative Distribution Function

    Summarizing a Continuous Random Variable

    Normal Distribution

    Binomial Probabilities and the Normal Curve

    Sampling Distribution of the Mean

    Exercises

  6. Joint Probability Distributions

    Introduction

    Joint Probability Mass Function: Sampling From a Box

    Multinomial Experiments

    Joint Density Functions

    Independence and Measuring Association

    Flipping a Random Coin: The Beta-Binomial Distribution

    Bivariate Normal Distribution

    Exercises

  7. Learning About a Binomial Probability

    Introduction: Thinking About a Proportion Subjectively

    Bayesian Inference with Discrete Priors

    Example: students' dining preference

    Discrete prior distributions for proportion p

    Likelihood of proportion p

    Posterior distribution for proportion p

    Inference: students' dining preference

    Discussion: using a discrete prior

    Continuous Priors

    The Beta distribution and probabilities

    Choosing a Beta density curve to represent prior opinion

    Updating the Beta Prior

    Bayes' rule calculation

    From Beta prior to Beta posterior: conjugate priors

    Bayesian Inferences with Continuous Priors

    Bayesian hypothesis testing

    Bayesian credible intervals

    Bayesian prediction

    Predictive Checking

    Exercises

  8. Modeling Measurement and Count Data

    Introduction

    Modeling Measurements

    Examples

    The general approach

    Outline of chapter

    Bayesian Inference with Discrete Priors

    Example: Roger Federer's time-to-serve

    Simplification of the likelihood

    Inference: Federer's time-to-serve

    Continuous Priors

    The Normal prior for mean μ

    Choosing a Normal prior

    Updating the Normal Prior

    Introduction

    A quick peek at the update procedure

    Bayes' rule calculation

    Conjugate Normal prior

    Bayesian Inferences for Continuous Normal Mean

    Bayesian hypothesis testing and credible interval

    Bayesian prediction

    Posterior Predictive Checking

    Modeling Count Data

    Examples

    The Poisson distribution

    Bayesian inferences

    Case study: Learning about website counts

    Exercises

  9. Simulation by Markov Chain Monte Carlo

    Introduction

    The Bayesian computation problem

    Choosing a prior

    The two-parameter Normal problem

    Overview of the chapter

    Markov Chains

    Definition

    Some properties

    Simulating a Markov chain

    The Metropolis Algorithm

    Example: Walking on a number line

    The general algorithm

    A general function for the Metropolis algorithm

    Example: Cauchy-Normal problem

    Choice of starting value and proposal region

    Collecting the simulated draws

    Gibbs Sampling

    Bivariate discrete distribution

    Beta-Binomial sampling

    Normal sampling: both parameters unknown

    MCMC Inputs and Diagnostics

    Burn-in, starting values, and multiple chains

    Diagnostics

    Graphs and summaries

    Using JAGS

    Normal sampling model

    Multiple chains

    Posterior predictive checking

    Comparing two proportions

    Exercises

  10. Bayesian Hierarchical Modeling

    Introduction

    Observations in groups

    Example: standardized test scores

    Separate estimates?

    Combined estimates?

    A two-stage prior leading to compromise estimates

    Hierarchical Normal Modeling

    Example: ratings of animation movies

    A hierarchical Normal model with random σ

    Inference through MCMC

    Hierarchical Beta-Binomial Modeling

    Example: Deaths after heart attack

    A hierarchical Beta-Binomial model

    Inference through MCMC

    Exercises

  11. Simple Linear Regression

    Introduction

    Example: Prices and Areas of House Sales

    A Simple Linear Regression Model

    A Weakly Informative Prior

    Posterior Analysis

    Inference through MCMC

    Bayesian Inferences with Simple Linear Regression

    Simulate fits from the regression model

    Learning about the expected response

    Prediction of future response

    Posterior predictive model checking

    Informative Prior

    Standardization

    Prior distributions

    Posterior Analysis

    A Conditional Means Prior

    Exercises

  12. Bayesian Multiple Regression and Logistic Models

    Introduction

    Bayesian Multiple Linear Regression

    Example: expenditures of US households

    A multiple linear regression model

    Weakly informative priors and inference through MCMC

    Prediction

    Comparing Regression Models

    Bayesian Logistic Regression

    Example: US women labor participation

    A logistic regression model

    Conditional means priors and inference through MCMC

    Prediction

    Exercises

  13. Case Studies

    Introduction

    Federalist Papers Study

    Introduction

    Data on word use

    Poisson density sampling

    Negative Binomial sampling

    Comparison of rates for two authors

    Which words distinguish the two authors?

    Career Trajectories

    Introduction

    Measuring hitting performance in baseball

    A hitter's career trajectory

    Estimating a single trajectory

    Estimating many trajectories by a hierarchical model

    Latent Class Modeling

    Two classes of test takers

    A latent class model with two classes

    Disputed authorship of the Federalist Papers

    Exercises

Appendix

Appendix A: The constant in the Beta posterior

Appendix B: The posterior predictive distribution

Appendix C: Comparing Bayesian models

About the Authors

Jim Albert is a Distinguished University Professor of Statistics at Bowling Green State University. His research interests include Bayesian modeling and applications of statistical thinking in sports. He has authored or coauthored several books including Ordinal Data Modeling, Bayesian Computation with R, and Workshop Statistics: Discovery with Data, A Bayesian Approach.

Jingchen (Monika) Hu is an Assistant Professor of Mathematics and Statistics at Vassar College. She teaches an undergraduate-level Bayesian Statistics course at Vassar that is shared online across several liberal arts colleges. Her research focuses on addressing data privacy concerns through the release of synthetic data.

About the Series

Chapman & Hall/CRC Texts in Statistical Science


Subject Categories

BISAC Subject Codes/Headings:
MAT029000
MATHEMATICS / Probability & Statistics / General