Chapman and Hall/CRC
This gracefully organized text presents the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.
Beginning with a review of the basic concepts and methods of probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of distributions, and standard probability inequalities. It develops the Helmert transformation for normal distributions, introduces the notions of convergence, and spotlights the central limit theorems. Coverage highlights sampling distributions, Basu's theorem, Rao-Blackwellization, and the Cramér-Rao inequality. The text also provides in-depth coverage of the Lehmann-Scheffé theorems, focuses on tests of hypotheses, describes Bayesian methods and the Bayes estimator, and develops large-sample inference. The author provides a historical context for statistics and statistical discoveries, as well as answers to a majority of the end-of-chapter exercises.
Designed primarily for a one-semester, first-year graduate course in probability and statistical inference, this text serves readers from varied backgrounds, ranging from engineering, economics, agriculture, and bioscience to finance, financial mathematics, operations and information management, and psychology.
Review of Probability and Related Concepts
Sufficiency, Completeness, and Ancillarity
Point Estimation
Tests of Hypotheses
Confidence Interval Estimation
Bayesian Methods
Likelihood Ratio and Other Tests
Large-Sample Inference
Sample Size Determination: Two-Stage Procedures
Regression Analysis: Fitting a Straight Line
Nonparametric Methods
Bootstrap Methods
Appendix
References