Anita C Faul
I did my undergraduate degree in Germany and at Cambridge, UK. This was followed by a PhD on the Faul-Powell Algorithm for Radial Basis Function Interpolation under the supervision of Mike Powell. I then worked on the Relevance Vector Machine with Mike Tipping at Microsoft Research Cambridge. Ten years in industry followed. In my books "A Concise Introduction to Numerical Analysis" and "A Concise Introduction to Machine Learning" I bring out the underlying mathematics from first principles.
Biography
Anita Faul came to Cambridge after studying for two years in Germany. She did Part II and Part III Mathematics at Churchill College, Cambridge. Since these amount to only two years, and three years are necessary for a first degree, she does not hold one. She nevertheless went on to a PhD on the Faul-Powell Algorithm for Radial Basis Function Interpolation under the supervision of Professor Mike Powell. She then worked on the Relevance Vector Machine with Mike Tipping at Microsoft Research Cambridge. Ten years in industry followed, during which she worked on algorithms for mobile phone networks, image processing and data visualization. After six years as a Teaching Associate at the University of Cambridge, which included being a Fellow, Director of Studies for Mathematics and Graduate Tutor at Selwyn College, she now works as a Data Scientist at the British Antarctic Survey. In her books "A Concise Introduction to Numerical Analysis" and "A Concise Introduction to Machine Learning" she brings out the underlying mathematics from first principles. A Moodle site accompanying her books is at acfaul.gnomio.com. Please contact her if you require access.
Areas of Research / Professional Expertise

Numerical Analysis, Machine Learning
Articles
Relevance Vector Machines with Uncertainty Measure for Seismic Bayesian Compressive Sensing and Survey Design
Published: Dec 18, 2016 by 15th IEEE International Conference on Machine Learning and Applications (ICMLA)
Authors: Georgios Pilikos, A.C. Faul
Relevance Vector Machines with Uncertainty Measure for Seismic Bayesian Compressive Sensing and Survey Design
The model is simple, until proven otherwise - how to cope in an ever-changing world
Published: Sep 01, 2016 by Data for Policy 2016, Frontiers of Data Science for Government
Authors: A.C. Faul, Georgios Pilikos
The model is simple, until proven otherwise - how to cope in an ever-changing world
A Krylov subspace algorithm for multiquadric interpolation in many dimensions
Published: Jan 01, 2005 by IMA JOURNAL OF NUMERICAL ANALYSIS
Authors: A.C. Faul, G. Goodsell, M.J.D. Powell
Subjects:
Mathematics
Convergence is guaranteed by the inclusion of a Krylov subspace technique that employs the native seminorm of multiquadric functions. An algorithm is specified, its convergence is proven, and careful attention is given to the choice of the operator that defines the Krylov subspace, which is analogous to preconditioning in the conjugate gradient method.
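The motivation for the iterative approach is that radial basis function interpolation matrices are completely dense. A minimal 1-D sketch (hypothetical function name, multiquadric kernel without a polynomial tail, solved directly; the published algorithm instead iterates in a Krylov subspace measured by the native seminorm):

```python
import numpy as np

def multiquadric_interpolant(x, f, c=1.0):
    """Illustration of the linear system behind multiquadric interpolation
    in 1-D: phi(r) = sqrt(r^2 + c^2).  The N x N matrix below has no useful
    sparsity, which is what motivates Krylov subspace iterations when the
    number of centres is large."""
    A = np.sqrt((x[:, None] - x[None, :])**2 + c**2)  # dense interpolation matrix
    lam = np.linalg.solve(A, f)                       # interpolation coefficients
    # Return a callable evaluating the interpolant at new points y
    return lambda y: np.sqrt((y[:, None] - x[None, :])**2 + c**2) @ lam
```

By construction the interpolant reproduces the data values at the centres; the cost of the direct solve grows cubically, which is where iterative Krylov methods take over.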
Fast Marginal Likelihood Maximisation for Sparse Bayesian Models
Published: Sep 01, 2002 by Proceedings of the Ninth International Workshop on Artificial Intelligence and Statistics
Authors: M.E. Tipping, A.C. Faul
Subjects:
Mathematics
The 'sparse Bayesian' modelling approach, as exemplified by the 'relevance vector machine', enables sparse classification and regression functions to be obtained by linearly weighting a small number of fixed basis functions from a large dictionary of potential candidates. We describe a new and highly accelerated algorithm which exploits recently elucidated properties of the marginal likelihood function to enable maximisation via sequential addition and deletion of candidate basis functions.
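The sequential addition and deletion can be sketched in a naive, unoptimised form (hypothetical names; the published algorithm maintains the per-basis quantities s_i and q_i incrementally rather than rebuilding the covariance at every step, and also re-estimates the noise and tests convergence, all omitted here):

```python
import numpy as np

def sequential_rvm(Phi, t, sigma2=0.01, n_iter=50):
    """Sketch of sequential sparse Bayesian learning: cycle through candidate
    basis functions, re-estimating each hyperparameter alpha_i in closed form.
    A finite alpha_i keeps (or adds) the basis function; an infinite alpha_i
    deletes it from the model."""
    N, M = Phi.shape
    alpha = np.full(M, np.inf)          # all basis functions start pruned
    for it in range(n_iter):
        i = it % M                      # cycle through candidates
        active = np.isfinite(alpha)
        active[i] = False               # covariance with basis i excluded
        C = sigma2 * np.eye(N)
        if active.any():
            Pa = Phi[:, active]
            C = C + Pa @ np.diag(1.0 / alpha[active]) @ Pa.T
        Ci = np.linalg.inv(C)
        s = Phi[:, i] @ Ci @ Phi[:, i]  # "sparsity" factor
        q = Phi[:, i] @ Ci @ t          # "quality" factor
        alpha[i] = s**2 / (q**2 - s) if q**2 > s else np.inf
    return alpha                        # finite entries = retained basis functions
```

Rebuilding and inverting C from scratch costs O(N^3) per step; the acceleration in the paper comes precisely from avoiding that.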
Analysis of Sparse Bayesian Learning
Published: Dec 01, 2001 by Advances in Neural Information Processing Systems
Authors: A.C. Faul, M.E. Tipping
Using a particular form of Gaussian parameter prior, `learning' is the maximisation, with respect to hyperparameters, of the marginal likelihood of the data. This paper studies the properties of that objective function, and demonstrates that conditioned on an individual hyperparameter, the marginal likelihood has a unique maximum which is computable in closed form.
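The closed-form maximum can be stated compactly: with a "sparsity" factor s_i and a "quality" factor q_i computed from the covariance with basis function i excluded, the maximising hyperparameter is finite exactly when q_i^2 > s_i. A small sketch (hypothetical helper name and argument layout):

```python
import numpy as np

def optimal_alpha(phi_i, C_minus_i_inv, t):
    """Closed-form maximiser of the marginal likelihood with respect to a
    single hyperparameter alpha_i.  C_minus_i_inv is the inverse of the data
    covariance built from all other basis functions; phi_i is the candidate
    basis column and t the target vector."""
    s = phi_i @ C_minus_i_inv @ phi_i   # sparsity factor
    q = phi_i @ C_minus_i_inv @ t       # quality factor
    if q**2 > s:
        return s**2 / (q**2 - s)        # finite maximum: basis function retained
    return np.inf                       # maximum at infinity: basis function pruned
```

This dichotomy is what makes the model sparse: hyperparameters driven to infinity switch their basis functions off entirely.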
A Variational Approach to Robust Regression
Published: Jan 01, 2001 by Artificial Neural Networks  ICANN 2001
Authors: A.C. Faul, M.E. Tipping
Subjects:
Mathematics
We consider the problem of regression estimation within a Bayesian framework for models linear in the parameters and where the target variables are contaminated by 'outliers'. We introduce an explicit distribution to explain outlying observations, and utilise a variational approximation to realise a practical inference strategy.
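The modelling idea, an explicit distribution for outlying observations, can be illustrated with a simple maximum-likelihood analogue (hypothetical names; the paper itself treats the problem with a variational Bayesian approximation, not the EM fit below):

```python
import numpy as np

def mixture_noise_regression(X, t, sig_in=0.5, sig_out=3.0, p_out=0.1, n_iter=30):
    """Linear regression whose noise is a two-component Gaussian mixture:
    a narrow 'inlier' component and a broad, explicit 'outlier' component.
    Fitted by EM with the mixing proportion p_out held fixed for simplicity."""
    w = np.linalg.lstsq(X, t, rcond=None)[0]      # ordinary least-squares start
    for _ in range(n_iter):
        r = t - X @ w
        # E-step: posterior responsibility of the inlier component
        g_in = (1 - p_out) * np.exp(-0.5 * (r / sig_in)**2) / sig_in
        g_out = p_out * np.exp(-0.5 * (r / sig_out)**2) / sig_out
        gamma = g_in / (g_in + g_out)
        # M-step: weighted least squares, down-weighting likely outliers
        W = gamma / sig_in**2 + (1 - gamma) / sig_out**2
        w = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * t))
    return w
```

Because a contaminated observation is explained by the broad component, its effective weight in the M-step collapses, so a single gross outlier barely moves the fitted line.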
Proof of convergence of an iterative technique for thin plate spline interpolation
Published: Oct 01, 1999 by ADVANCES IN COMPUTATIONAL MATHEMATICS
Authors: A.C. Faul, M.J.D. Powell
Subjects:
Mathematics
Thin plate spline methods provide an interpolant to values of a real function. The need for iterative procedures arises, since hardly any sparsity occurs in the linear system of interpolation equations. A proof of convergence of this method is given. All the changes to the thin plate spline coefficients reduce a seminorm of the difference between the required interpolant and the current approximation.
Krylov subspace methods for radial function interpolation
Published: Aug 01, 1999 by 18th Biennial Conference on Numerical Analysis
Authors: A.C. Faul, M.J.D. Powell
Subjects:
Mathematics
The kth iteration calculates the element in a k-dimensional linear subspace of radial functions that is closest to the required interpolant, the subspaces being generated by a Krylov construction that employs a self-adjoint operator A. Distances between functions are measured by the seminorm that is induced by the well-known conditional positive or negative definite properties of the matrix of the interpolation problem.
Videos
Contribution to PhiWeek 2020
Published: Feb 16, 2021
Rework expert seminar
Published: Jun 03, 2020
Talk given at the London Machine Learning MeetUp
Published: May 22, 2019