1st Edition

Handbook of Bayesian, Fiducial, and Frequentist Inference

Edited by James Berger, Xiao-Li Meng, Nancy Reid, and Min-ge Xie. Copyright 2024
    420 Pages, 30 Color & 31 B/W Illustrations
    by Chapman & Hall

    The emergence of data science in recent decades has magnified the need for efficient methodology for analyzing data and highlighted the importance of statistical inference. Despite the tremendous progress that has been made, statistical science is still a young discipline and continues to have several different and competing paths in its approaches and its foundations. While the emergence of competing approaches is a natural progression of any scientific discipline, differences in the foundations of statistical inference can lead to different interpretations and conclusions from the same dataset. The increased interest in the foundations of statistical inference has led to many publications, and recent vibrant research activities in statistics, applied mathematics, philosophy, and other fields of science reflect the importance of this development. The BFF (Bayesian, fiducial, and frequentist) approaches not only bridge foundations and scientific learning, but also facilitate objective and replicable scientific research and provide scalable computing methodologies for the analysis of big data. Most of the published work focuses on a single topic or theme, and the body of work is scattered across different journals. This handbook provides a comprehensive introduction and broad overview of the key developments in the BFF schools of inference. It is intended for researchers and students who wish to gain an overview of the foundations of inference from the BFF perspective, and it serves as a general reference for BFF inference.

    Key Features:

    • Provides a comprehensive introduction to the key developments in the BFF schools of inference
    • Gives an overview of modern inferential methods, allowing scientists in other fields to expand their knowledge
    • Is accessible for readers with different perspectives and backgrounds

    1. Risky Business
    Stephen Stigler

    2. Empirical Bayes: Concepts and Methods
    Bradley Efron

    3. Distributions for Parameters
    Nancy Reid

    4. Objective Bayesian Inference and its Relationship to Frequentism
    James Berger, Jose Bernardo and Dongchu Sun

    5. Fiducial Inference, Then and Now
    Alexander Philip Dawid

    6. Bridging Bayesian, frequentist and fiducial inferences using confidence distributions
    Suzanne Thornton and Min-ge Xie

    7. Objective Bayesian Testing and Model Uncertainty
    James Berger, Gonzalo García-Donato, Elias Moreno and Luis Pericchi

    8. "A BFFer’s Exploration with Nuisance Constructs: Bayesian p-value, H likelihood, and Cauchyanity"
    Xiao-Li Meng

    9. Bayesian neural networks and dimensionality reduction
    Deborshee Sen, Theodore Papamarkou and David Dunson

    10. The Tangent Exponential Model
    Anthony Davison and Nancy Reid

    11. Data Integration and Model Fusion in the Bayesian and Frequentist Frameworks
    Emily C. Hector, Lu Tang, Ling Zhou and Peter X.K. Song

    12. How the game-theoretic foundation for probability resolves the Bayesian vs. frequentist standoff
    Glenn Shafer

    13. "Introduction to Generalized Fiducial Inference"
    Alexander Murph, Jan Hannig and Jonathan P. Williams

    14. "Dempster-Shafer Theory for Statistical Inference"
    Ruobin Gong

    15. Slicing and Dicing a Path Through the Fiducial Forest
    Joseph B. Lang

    16. Inferential models and possibility measures
    Chuanhai Liu and Ryan Martin

    17. Conformal predictive distributions: an approach to nonparametric fiducial prediction
    Vladimir Vovk

    18. Fiducial Inference and Decision Theory
    Gunnar Taraldsen and Bo Henry Lindqvist

    Index

    Biography

    James Berger, PhD is the Arts and Sciences Distinguished Professor Emeritus of Statistics at Duke University. Dr. Berger received his PhD in mathematics from Cornell University in 1974. Among his awards and honors, Dr. Berger has received Guggenheim and Sloan Fellowships, the COPSS Presidents' Award in 1985, the Sigma Xi Research Award at Purdue University for contribution of the year to science in 1993, and the Wilks Award from the ASA in 2015; he was also the COPSS Fisher Lecturer in 2001 and the IMS Wald Lecturer in 2007. He was elected a foreign member of the Spanish Real Academia de Ciencias in 2002, was elected to the US National Academy of Sciences in 2003, was awarded an honorary Doctor of Science degree from Purdue University in 2004, and became an Honorary Professor at East China Normal University in 2011.

    Xiao-Li Meng, PhD is the Whipple V. N. Jones Professor of Statistics at Harvard University. Dr. Meng received his PhD in statistics from Harvard University. He is the Founding Editor-in-Chief of Harvard Data Science Review. In 2020 he was elected to the American Academy of Arts and Sciences. His interests range from the theoretical foundations of statistical inferences to statistical methods and computation.

    Nancy Reid, PhD is a University Professor of Statistical Sciences at the University of Toronto. Dr. Reid received her PhD in statistics from Stanford University, and is a Fellow of the Royal Society, the Royal Society of Canada, the Royal Society of Edinburgh, and a Foreign Associate of the National Academy of Sciences. In 2015 she was appointed Officer of the Order of Canada. Her research interests include the foundations and theory of statistical inference.

    Min-ge Xie, PhD is a Distinguished Professor at Rutgers, The State University of New Jersey. Dr. Xie received his PhD in Statistics from the University of Illinois at Urbana-Champaign (UIUC). He is the current Editor of The American Statistician and a co-founding Editor-in-Chief of The New England Journal of Statistics in Data Science. His research work on confidence distributions was described as a “grounding process with energy and insight.” His research interests include statistical inference, foundations of data science, fusion learning, and interdisciplinary research.