
Linear Algebra and Matrix Analysis for Statistics
Book Description
Linear Algebra and Matrix Analysis for Statistics offers a gradual exposition of linear algebra without sacrificing the rigor of the subject. It presents both the vector space approach and the canonical forms in matrix theory. The book is as self-contained as possible, assuming no prior knowledge of linear algebra.
The authors first address the rudimentary mechanics of linear systems using Gaussian elimination and the resulting decompositions. They introduce Euclidean vector spaces using less abstract concepts and make connections to systems of linear equations wherever possible. After illustrating the importance of the rank of a matrix, they discuss complementary subspaces, oblique projectors, orthogonality, orthogonal projections and projectors, and orthogonal reduction.
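As a quick illustration of the kind of computation these early chapters develop, here is a minimal NumPy/SciPy sketch (my own, not taken from the text; the matrices are arbitrary examples) that solves a small system through an LU factorization and forms an orthogonal projector from a QR factorization.

```python
# A minimal sketch (not from the book): solve a linear system via the LU
# factorization produced by Gaussian elimination, and build an orthogonal
# projector onto a column space from a QR factorization.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[4., 3., 0.],
              [6., 3., 1.],
              [0., 2., 5.]])
b = np.array([7., 10., 7.])

# PA = LU (Gaussian elimination with partial pivoting), then solve Ax = b.
lu, piv = lu_factor(A)
x = lu_solve((lu, piv), b)
assert np.allclose(A @ x, b)

# Orthogonal projector onto the span of A's first two columns: P = Q Q^T,
# where Q comes from the reduced QR factorization of those columns.
Q, R = np.linalg.qr(A[:, :2])
P = Q @ Q.T
assert np.allclose(P @ P, P)   # idempotent
assert np.allclose(P, P.T)     # symmetric, hence an orthogonal projector
```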
The text then shows how the theoretical concepts developed are useful for analyzing the solutions of linear systems. The authors also explain how determinants can be used to characterize matrices and linear systems and to derive their properties. They then cover eigenvalues, eigenvectors, singular value decomposition, Jordan decomposition (including a proof), quadratic forms, and Kronecker and Hadamard products. The book concludes with accessible treatments of advanced topics, such as linear iterative systems, convergence of matrices, more general vector spaces, linear transformations, and Hilbert spaces.
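The later chapters are equally computational in flavor. The following short sketch (again my own, not from the book, with hypothetical example matrices) touches three of the topics named above: the spectral decomposition of a real symmetric matrix, the SVD and numerical rank, and the Kronecker product's mixed-product property.

```python
# Another small sketch (not from the book): spectral decomposition, SVD and
# rank, and a Kronecker product identity.
import numpy as np

S = np.array([[2., 1.],
              [1., 2.]])

# Spectral decomposition S = V diag(w) V^T for a real symmetric matrix.
w, V = np.linalg.eigh(S)
assert np.allclose(V @ np.diag(w) @ V.T, S)

# SVD of a rank-deficient matrix; the number of nonzero singular values
# equals the rank.
B = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # second row is twice the first, so rank 1
U, s, Vt = np.linalg.svd(B)
rank = int(np.sum(s > 1e-12))
assert rank == np.linalg.matrix_rank(B) == 1

# Kronecker product and the mixed-product property (A ⊗ B)(C ⊗ D) = AC ⊗ BD.
rng = np.random.default_rng(0)
A2, B2, C2, D2 = (rng.standard_normal((2, 2)) for _ in range(4))
assert np.allclose(np.kron(A2, B2) @ np.kron(C2, D2),
                   np.kron(A2 @ C2, B2 @ D2))
```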
Table of Contents
Matrices, Vectors, and Their Operations
Basic definitions and notations
Matrix addition and scalar-matrix multiplication
Matrix multiplication
Partitioned matrices
The "trace" of a square matrix
Some special matrices
Systems of Linear Equations
Introduction
Gaussian elimination
Gauss-Jordan elimination
Elementary matrices
Homogeneous linear systems
The inverse of a matrix
More on Linear Equations
The LU decomposition
Crout’s algorithm
LU decomposition with row interchanges
The LDU and Cholesky factorizations
Inverse of partitioned matrices
The LDU decomposition for partitioned matrices
The Sherman-Woodbury-Morrison formula
Euclidean Spaces
Introduction
Vector addition and scalar multiplication
Linear spaces and subspaces
Intersection and sum of subspaces
Linear combinations and spans
Four fundamental subspaces
Linear independence
Basis and dimension
The Rank of a Matrix
Rank and nullity of a matrix
Bases for the four fundamental subspaces
Rank and inverse
Rank factorization
The rank-normal form
Rank of a partitioned matrix
Bases for the fundamental subspaces using the rank-normal form
Complementary Subspaces
Sum of subspaces
The dimension of the sum of subspaces
Direct sums and complements
Projectors
Orthogonality, Orthogonal Subspaces, and Projections
Inner product, norms, and orthogonality
Row rank = column rank: A proof using orthogonality
Orthogonal projections
Gram-Schmidt orthogonalization
Orthocomplementary subspaces
The fundamental theorem of linear algebra
More on Orthogonality
Orthogonal matrices
The QR decomposition
Orthogonal projection and projector
Orthogonal projector: Alternative derivations
Sum of orthogonal projectors
Orthogonal triangularization
Revisiting Linear Equations
Introduction
Null spaces and the general solution of linear systems
Rank and linear systems
Generalized inverse of a matrix
Generalized inverses and linear systems
The Moore-Penrose inverse
Determinants
Definitions
Some basic properties of determinants
Determinant of products
Computing determinants
The determinant of the transpose of a matrix – revisited
Determinants of partitioned matrices
Cofactors and expansion theorems
The minor and the rank of a matrix
The Cauchy-Binet formula
The Laplace expansion
Eigenvalues and Eigenvectors
Characteristic polynomial and its roots
Spectral decomposition of real symmetric matrices
Spectral decomposition of Hermitian and normal matrices
Further results on eigenvalues
Singular value decomposition
Singular Value and Jordan Decompositions
Singular value decomposition (SVD)
The SVD and the four fundamental subspaces
SVD and linear systems
SVD, data compression and principal components
Computing the SVD
The Jordan canonical form
Implications of the Jordan canonical form
Quadratic Forms
Introduction
Quadratic forms
Matrices in quadratic forms
Positive and nonnegative definite matrices
Congruence and Sylvester’s law of inertia
Nonnegative definite matrices and minors
Extrema of quadratic forms
Simultaneous diagonalization
The Kronecker Product and Related Operations
Bilinear interpolation and the Kronecker product
Basic properties of Kronecker products
Inverses, rank and nonsingularity of Kronecker products
Matrix factorizations for Kronecker products
Eigenvalues and determinant
The vec and commutator operators
Linear systems involving Kronecker products
Sylvester’s equation and the Kronecker sum
The Hadamard product
Linear Iterative Systems, Norms, and Convergence
Linear iterative systems and convergence of matrix powers
Vector norms
Spectral radius and matrix convergence
Matrix norms and the Gerschgorin circles
SVD – revisited
Web page ranking and Markov chains
Iterative algorithms for solving linear equations
Abstract Linear Algebra
General vector spaces
General inner products
Linear transformations, adjoint and rank
The four fundamental subspaces – revisited
Inverses of linear transformations
Linear transformations and matrices
Change of bases, equivalence and similar matrices
Hilbert spaces
References
Exercises appear at the end of each chapter.
Reviews
"… a unique and remarkable book … has much to offer that is not found elsewhere. … In Linear Algebra and Matrix Analysis for Statistics, Sudipto Bannerjee and Anindya Roy have raised the bar for textbooks in this genre. For me, this book will be an invaluable resource for my teaching and research. … an outstanding choice for research-oriented statisticians who want a comprehensive theoretical treatment of the subject that will take them well beyond the prerequisites for the study of linear models."
—Journal of the American Statistical Association, Vol. 110, 2015

"The sixteen chapters cover the full range of topics … Topics are presented in a logical order and at a reasonable pace. The book is compactly written and the approach throughout is rigorous, yet well readable. … an excellent introduction to linear algebra."
—Zentralblatt MATH 1309

"This would be a reasonable candidate for use in a standard linear algebra course, even at institutions with no statistics majors. … The proofs are very detailed and the authors bind the argument together with clear text that flows beautifully. … Some linear algebra courses put a greater emphasis on concrete applications or on using software to get computations done. Other texts treat linear algebra as a branch of abstract algebra and allow spaces over arbitrary fields. This book is a strong contender for the vast majority of linear algebra courses that fall between those two extremes."
—MAA Reviews, October 2014

"This beautifully written text is unlike any other in statistical science. It starts at the level of a first undergraduate course in linear algebra, and takes the student all the way up to the graduate level, including Hilbert spaces. It is extremely well crafted and proceeds up through that theory at a very good pace. The book is compactly written and mathematically rigorous, yet the style is lively as well as engaging. This elegant, sophisticated work will serve upper-level and graduate statistics education well. All in all, a book I wish I could have written."
—Jim Zidek, University of British Columbia, Vancouver, Canada