2nd Edition

# Statistical Theory: A Concise Introduction

By Felix Abramovich and Ya'acov Ritov. Copyright 2023.
236 Pages 27 B/W Illustrations
by Chapman & Hall


Designed for a one-semester advanced undergraduate or graduate statistical theory course, Statistical Theory: A Concise Introduction, Second Edition clearly explains the underlying ideas, mathematics, and principles of major statistical concepts, including parameter estimation, confidence intervals, hypothesis testing, asymptotic analysis, Bayesian inference, linear models, nonparametric statistics, and elements of decision theory. It introduces these topics on a clear intuitive level using illustrative examples in addition to the formal definitions, theorems, and proofs.

Based on the authors’ lecture notes, the book is self-contained and maintains a proper balance between clarity and rigor of exposition. In a few cases, the authors present a "sketched" version of a proof, explaining its main ideas rather than giving detailed technical mathematical and probabilistic arguments.

Features:

• The second edition adds a new chapter on Nonparametric Estimation, significantly updates the chapter on Statistical Decision Theory, and includes other updates throughout
• Requires no heavy calculus; simple questions throughout the text help students check their understanding of the material
• Each chapter also includes a set of exercises that range in level of difficulty
• Self-contained, so students can use it to study the theory on their own
• Chapters and sections marked by asterisks contain more advanced topics and may be omitted
• Special chapters on linear models and nonparametric statistics show how the main theoretical concepts can be applied to well-known and frequently used statistical tools

The primary audience for the book is advanced undergraduate and graduate students who want to understand the theoretical basis of mathematical statistics. It will also be an excellent reference for researchers in statistics and other quantitative disciplines.

### Contents

1. Introduction. 1.1. Preamble. 1.2. Likelihood. 1.3. Sufficiency. 1.4. Minimal sufficiency. 1.5. Completeness. 1.6. Exponential family of distributions. 1.7. Exercises.
2. Point Estimation. 2.1. Introduction. 2.2. Maximum likelihood estimation. 2.3. Method of moments. 2.4. Method of least squares. 2.5. M-estimators. 2.6. Goodness-of-estimation. Mean squared error. 2.7. Unbiased estimation. 2.8. Exercises.
3. Confidence Intervals, Bounds, and Regions. 3.1. Introduction. 3.2. Quoting the estimation error. 3.3. Confidence intervals. 3.4. Confidence bounds. 3.5. Confidence regions. 3.6. Exercises.
4. Hypothesis Testing. 4.1. Introduction. 4.2. Simple hypotheses. 4.3. Composite hypotheses. 4.4. Duality between hypothesis testing and confidence intervals (regions). 4.5. Sequential testing. 4.6. Multiple testing. 4.7. Exercises.
5. Asymptotic Analysis. 5.1. Introduction. 5.2. Convergence and consistency in MSE. 5.3. Convergence and consistency in probability. 5.4. Convergence in distribution. 5.5. The central limit theorem. 5.6. Asymptotically normal consistency. 5.7. Asymptotic confidence intervals. 5.8. Asymptotic properties of MLEs, Wald confidence intervals, and tests. 5.9. Multiparameter case. 5.10. Asymptotic properties of M-estimators. 5.11. Score (Rao) asymptotic tests and confidence regions. 5.12. Asymptotic distribution of the GLRT, Wilks’ theorem. 5.13. Exercises.
6. Bayesian Inference. 6.1. Introduction. 6.2. Choice of priors. 6.3. Point estimation. 6.4. Interval estimation. Credible sets. 6.5. Hypothesis testing. 6.6. Asymptotic properties of the posterior distribution. 6.7. Exercises.
7. Elements of Statistical Decision Theory. 7.1. Introduction and notations. 7.2. Risk function and admissibility. 7.3. Minimax risk and minimax rules. 7.4. Bayes risk and Bayes rules. 7.5. Posterior expected loss and Bayes actions. 7.6. Admissibility of Bayes rules. 7.7. Minimaxity and Bayes rules. 7.8. Exercises.
8. Linear Models. 8.1. Introduction. 8.2. Definition and examples. 8.3. Estimation of regression coefficients. 8.4. Residuals. Estimation of the variance. 8.5. Examples. 8.6. Goodness-of-fit. Multiple correlation coefficient. 8.7. Confidence intervals and regions for the coefficients. 8.8. Hypothesis testing in linear models. 8.9. Predictions. 8.10. Analysis of variance.
9. Nonparametric Estimation. 9.1. Introduction. 9.2. The empirical distribution function and the histogram. 9.3. Kernel density estimation. 9.4. The minimax rate. 9.5. Nonparametric kernel regression. 9.6. Nonparametric estimation by orthonormal series. 9.7. Spline smoothing. 9.8. Choice of the smoothing parameter.

A. Probabilistic Review. A.1. Introduction. A.2. Basic probabilistic laws. A.3. Random variables. A.4. Special families of distributions.
B. Solutions of Selected Exercises.

Index.

### Biography

Felix Abramovich is a professor at the Department of Statistics and Operations Research at Tel Aviv University.

Ya’acov Ritov is a professor in the Department of Statistics at the University of Michigan, Ann Arbor. He is a professor emeritus of Statistics at the Hebrew University of Jerusalem.