Chapman and Hall/CRC
320 pages | 97 B/W Illus.
The rapid advancement in the theoretical understanding of statistical and machine learning methods for semisupervised learning has made it difficult for nonspecialists to keep up to date in the field. Providing a broad, accessible treatment of the theory as well as linguistic applications, Semisupervised Learning for Computational Linguistics offers self-contained coverage of semisupervised methods that includes background material on supervised and unsupervised learning.
The book presents a brief history of semisupervised learning and its place in the spectrum of learning methods before moving on to discuss well-known natural language processing methods, such as self-training and co-training. It then turns to machine learning techniques, including the boundary-oriented methods of perceptrons, boosting, support vector machines (SVMs), and the null-category noise model. In addition, the book covers clustering, the expectation-maximization (EM) algorithm, related generative methods, and agreement methods. It concludes with the graph-based method of label propagation as well as a detailed discussion of spectral methods.
Taking an intuitive approach to the material, this lucid book facilitates the application of semisupervised learning methods to natural language processing and provides the framework and motivation for a more systematic study of machine learning.
"…I would have loved to have had this book when I started working as a computational linguist … The book is well laid out, enjoyable to read, and the formulae aesthetically presented … The book does a very admirable job of being self-contained given the number of subjects and size of the book. I would recommend this book to mathematicians, statisticians, and libraries alike."
—CHOICE, February 2009
"However, when it works, it works well, and while the book provides great breadth but little depth, it will be a useful springboard for the beginning student."
—Chris J.C. Burges, Microsoft Research, in Journal of the American Statistical Association, June 2009, Vol. 104, No. 486
A brief history
Organization and assumptions
SELF-TRAINING AND CO-TRAINING
APPLICATIONS OF SELF-TRAINING AND CO-TRAINING
Two simple classifiers
Evaluating detectors and classifiers that abstain
Binary classifiers and ECOC
MATHEMATICS FOR BOUNDARY-ORIENTED METHODS
Support vector machines (SVMs)
Null-category noise model
Cluster and label
The EM algorithm
Computing the solution
Graph mincuts revisited
MATHEMATICS FOR SPECTRAL METHODS
Some basic concepts
Eigenvalues and eigenvectors
Eigenvalues and the scaling effects of a matrix
Simple harmonic motion
Spectra of matrices and graphs
Spectral methods for semisupervised learning