Bayesian Statistical Methods

1st Edition

By Brian J. Reich, Sujit K. Ghosh

Chapman and Hall/CRC

274 pages

Hardback: ISBN 9780815378648
Published: June 4, 2019
Price: $89.95 (USD)

Description

Bayesian Statistical Methods provides data scientists with the foundational and computational tools needed to carry out a Bayesian analysis. This book focuses on Bayesian methods applied routinely in practice, including multiple linear regression, mixed effects models, and generalized linear models (GLMs). The authors include many examples with complete R code and comparisons with analogous frequentist procedures.

In addition to the basic concepts of Bayesian inferential methods, the book covers many general topics:

  • Advice on selecting prior distributions
  • Computational methods including Markov chain Monte Carlo (MCMC)
  • Model comparison and goodness-of-fit measures, including sensitivity to priors
  • Frequentist properties of Bayesian methods

Case studies covering advanced topics illustrate the flexibility of the Bayesian approach:

  • Semiparametric regression
  • Handling of missing data using predictive distributions
  • Priors for high-dimensional regression models
  • Computational techniques for large datasets
  • Spatial data analysis

The advanced topics are presented with sufficient conceptual depth that readers will be able to carry out such analyses and argue the relative merits of Bayesian and classical methods. A repository of R code, motivating datasets, and complete data analyses is available on the book's website.
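The book's examples use R; as a rough Python sketch of the kind of conjugate analysis covered in Chapter 2 (the beta-binomial model for a proportion), the posterior update and its frequentist comparison look like this. The prior parameters and data below are made up for illustration, not taken from the book.

```python
# Hypothetical sketch of a beta-binomial conjugate analysis (Chapter 2 topic).
# Prior: p ~ Beta(a, b); data: y successes in n trials.
# Conjugacy gives the posterior in closed form: p | y ~ Beta(a + y, b + n - y).

def beta_binomial_posterior(a, b, y, n):
    """Return the posterior Beta(a', b') parameters after observing y successes in n trials."""
    return a + y, b + (n - y)

# Uniform prior Beta(1, 1) and invented data: 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)

posterior_mean = a_post / (a_post + b_post)  # Bayesian point estimate
mle = 7 / 10                                 # analogous frequentist estimate

print(a_post, b_post)            # 8 4
print(round(posterior_mean, 3))  # 0.667
print(mle)                       # 0.7
```

The posterior mean is pulled slightly toward the prior mean of 0.5 relative to the maximum likelihood estimate, illustrating the Bayesian-versus-frequentist comparisons the description advertises.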

Brian J. Reich, Associate Professor of Statistics at North Carolina State University, is currently the editor-in-chief of the Journal of Agricultural, Biological, and Environmental Statistics and was awarded the LeRoy & Elva Martin Teaching Award.

Sujit K. Ghosh, Professor of Statistics at North Carolina State University, has over 22 years of research and teaching experience in conducting Bayesian analyses, received the Cavell Brownie mentoring award, and served as the Deputy Director at the Statistical and Applied Mathematical Sciences Institute.


Reviews

"A book that gives a comprehensive coverage of Bayesian inference for a diverse background of scientific practitioners is needed. The book Bayesian Statistical Methods seems to be a good candidate for this purpose, which aims at a balanced treatment between theory and computation. The authors are leading researchers and experts in Bayesian statistics. I believe this book is likely to be an excellent text book for an introductory course targeting at first-year graduate students or undergraduate statistics majors…This new book is more focused on the most fundamental components of Bayesian methods. Moreover, this book contains many simulated examples and real-data applications, with computer code provided to demonstrate the implementations."

~Qing Zhou, UCLA

"The book gives an overview of Bayesian statistical modeling with a focus on the building blocks for fitting and analyzing hierarchical models. The book uses a number of interesting and realistic examples to illustrate the methods. The computational focus is in the use of JAGS, as a tool to perform Bayesian inference using Markov chain Monte Carlo methods…It can be targeted as a textbook for upper-division undergraduate students in statistics and some areas of science, engineering and social sciences with an interest in a reasonably formal development of data analytic methods and uncertainty quantification. It could also be used for a Master’s class in statistical modeling."

~Bruno Sansó, University of California Santa Cruz

"The given manuscript sample is technically correct, clearly written, and at an appropriate level of difficulty… I enjoyed the real-life problems in the Chapter 1 exercises. I especially like the problem on the Federalist Papers, because the students can revisit this problem and perform more powerful inferences using the advanced Bayesian methods that they will learn later in the textbook… I would seriously consider adopting the book as a required textbook. This text provides more details, R codes, and illuminating visualizations compared to competing books, and more quickly introduces a broad scope of regression models that are important in practical applications."

~Arman Sabbaghi, Purdue University


Table of Contents

1. Basics of Bayesian Inference

Probability background

Univariate distributions

Discrete distributions

Continuous distributions

Multivariate distributions

Marginal and conditional distributions

Bayes' Rule

Discrete example of Bayes' Rule

Continuous example of Bayes' Rule

Introduction to Bayesian inference

Summarizing the posterior

Point estimation

Univariate posteriors

Multivariate posteriors

The posterior predictive distribution

Exercises

2. From Prior Information to Posterior Inference

Conjugate priors

Beta-binomial model for a proportion

Poisson-gamma model for a rate

Normal-normal model for a mean

Normal-inverse gamma model for a variance

Natural conjugate priors

Normal-normal model for a mean vector

Normal-inverse Wishart model for a covariance matrix

Mixtures of conjugate priors

Improper priors

Objective priors

Jeffreys prior

Reference priors

Maximum entropy priors

Empirical Bayes

Penalized complexity priors

Exercises

3. Computational approaches

Deterministic methods

Maximum a posteriori estimation

Numerical integration

Bayesian Central Limit Theorem (CLT)

Markov chain Monte Carlo (MCMC) methods

Gibbs sampling

Metropolis-Hastings (MH) sampling

MCMC software options in R

Diagnosing and improving convergence

Selecting initial values

Convergence diagnostics

Improving convergence

Dealing with large datasets

Exercises

4. Linear models

Analysis of normal means

One-sample/paired analysis

Comparison of two normal means

Linear regression

Jeffreys prior

Gaussian prior

Continuous shrinkage priors

Predictions

Example: Factors that affect a home's microbiome

Generalized linear models

Binary data

Count data

Example: Logistic regression for NBA clutch free throws

Example: Beta regression for microbiome data

Random effects

Flexible linear models

Nonparametric regression

Heteroskedastic models

Non-Gaussian error models

Linear models with correlated data

Exercises

5. Model selection and diagnostics

Cross validation

Hypothesis testing and Bayes factors

Stochastic search variable selection

Bayesian model averaging

Model selection criteria

Goodness-of-fit checks

Exercises

6. Case studies using hierarchical modeling

Overview of hierarchical modeling

Case study: Species distribution mapping via data fusion

Case study: Tyrannosaurid growth curves

Case study: Marathon analysis with missing data

7. Statistical properties of Bayesian methods

Decision theory

Frequentist properties

Bias-variance tradeoff

Asymptotics

Simulation studies

Exercises

Appendices

Probability distributions

Univariate discrete

Multivariate discrete

Univariate continuous

Multivariate continuous

List of conjugacy pairs

Derivations

Normal-normal model for a mean

Normal-normal model for a mean vector

Normal-inverse Wishart model for a covariance matrix

Jeffreys prior for a normal model

Jeffreys prior for multiple linear regression

Convergence of the Gibbs sampler

Marginal distribution of a normal mean under Jeffreys prior

Marginal posterior of the regression coefficients under Jeffreys prior

Proof of posterior consistency

Computational algorithms

Integrated nested Laplace approximation (INLA)

Metropolis-adjusted Langevin algorithm

Hamiltonian Monte Carlo (HMC)

Delayed Rejection and Adaptive Metropolis

Slice sampling

Software comparison

Example - Simple linear regression

Example - Random slopes model


About the Series

Chapman & Hall/CRC Texts in Statistical Science


Subject Categories

BISAC Subject Codes/Headings:
MAT029000
MATHEMATICS / Probability & Statistics / General