Pattern Recognition and Machine Learning
This textbook reflects recent developments in the field while providing a comprehensive introduction to pattern recognition and machine learning. It is aimed at advanced undergraduates and first-year PhD students, as well as researchers and practitioners. No previous knowledge of pattern recognition or machine learning concepts is assumed. Familiarity with multivariate calculus and basic linear algebra is required, and some experience with probabilities would be helpful though not essential, as the book includes a self-contained introduction to basic probability theory. The book is suitable for courses on machine learning, statistics, computer science, signal processing, computer vision, data mining, and bioinformatics. Extensive support is provided for course instructors, including more than 400 exercises, graded according to difficulty. Example solutions for a subset of the exercises are available from the book web site, while solutions for the remainder can be obtained by instructors from the publisher. The book is supported by a great deal of additional material, and the reader is encouraged to visit the book web site for the latest information.

Coming soon:

* For students, worked solutions to a subset of exercises available on a public web site (for exercises marked "www" in the text)
* For instructors, worked solutions to remaining exercises from the Springer web site
* Lecture slides to accompany each chapter
* Data sets available for download

The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition
This major new edition features many topics not covered in the original, including graphical models, random forests, ensemble methods, least angle regression and path algorithms for the lasso, non-negative matrix factorization, and spectral clustering. There is also a chapter on methods for "wide" data (p bigger than n), including multiple testing and false discovery rates. Trevor Hastie, Robert Tibshirani, and Jerome Friedman are professors of statistics at Stanford University. They are prominent researchers in this area: Hastie and Tibshirani developed generalized additive models and wrote a popular book of that title. Hastie co-developed much of the statistical modeling software and environment in R/S-PLUS and invented principal curves and surfaces. Tibshirani proposed the lasso and is co-author of the very successful An Introduction to the Bootstrap. Friedman is the co-inventor of many data-mining tools, including CART, MARS, projection pursuit, and gradient boosting.
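Tibshirani's lasso, one of the topics above, penalizes least squares with an L1 term. As a rough illustration (not code from the book), here is a minimal cyclic coordinate-descent sketch in NumPy; the soft-thresholding update is the key step, and the synthetic data, penalty level, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the closed-form solution of the
    one-dimensional lasso problem."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_coordinate_descent(X, y, lam, n_iters=200):
    """Minimize (1/2n)||y - Xb||^2 + lam * ||b||_1 by cyclic
    coordinate descent. Assumes the columns of X are standardized."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norms = (X ** 2).sum(axis=0) / n
    for _ in range(n_iters):
        for j in range(p):
            # Partial residual excluding feature j's current contribution.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_norms[j]
    return beta

# Illustrative usage on synthetic data (assumed, not from the book).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
X = (X - X.mean(axis=0)) / X.std(axis=0)
true_beta = np.zeros(10)
true_beta[:3] = [3.0, -2.0, 1.5]
y = X @ true_beta + 0.1 * rng.standard_normal(100)
y = y - y.mean()
print(np.round(lasso_coordinate_descent(X, y, lam=0.1), 2))
```

With standardized columns, each coordinate update has a closed form, which is what makes the soft-thresholding operator reappear throughout the lasso path algorithms the book discusses.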
Information Theory, Inference & Learning Algorithms

Gaussian Processes for Machine Learning
The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective. Many connections to other well-known techniques from machine learning and statistics are discussed, including support-vector machines, neural networks, splines, regularization networks, relevance vector machines, and others. Theoretical issues including learning curves and the PAC-Bayesian framework are treated, and several approximation methods for learning with large datasets are discussed. The book contains illustrative examples and exercises, and code and datasets are available on the Web. Appendixes provide mathematical background and a discussion of Gaussian Markov processes.
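To give a flavor of the regression setting the book develops, here is a minimal sketch of Gaussian-process regression with a squared-exponential covariance function, using a standard Cholesky-based posterior computation; the length-scale, noise variance, and toy data are illustrative assumptions, not values from the book.

```python
import numpy as np

def sq_exp_kernel(X1, X2, length_scale=1.0, signal_var=1.0):
    """Squared-exponential (RBF) covariance function for 1-D inputs."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return signal_var * np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X_train, y_train, X_test, noise_var=0.1):
    """Posterior mean and variance of GP regression with Gaussian noise."""
    K = sq_exp_kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = sq_exp_kernel(X_train, X_test)
    K_ss = sq_exp_kernel(X_test, X_test)
    # Solve via Cholesky factorization for numerical stability.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - (v ** 2).sum(axis=0)
    return mean, var

# Toy 1-D example (assumed data, purely illustrative).
X_train = np.array([-2.0, -1.0, 0.0, 1.5])
y_train = np.sin(X_train)
X_test = np.linspace(-3, 3, 7)
mean, var = gp_posterior(X_train, y_train, X_test)
for x, m, s in zip(X_test, mean, np.sqrt(var)):
    print(f"x={x:+.1f}  mean={m:+.3f}  std={s:.3f}")
```

Note how the predictive variance grows away from the training points; the covariance function alone controls that behavior, which is why the book devotes so much attention to kernel choice.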
Generalized Principal Component Analysis

All of Statistics: A Concise Course in Statistical Inference
This book covers a much wider range of topics than a typical introductory text on mathematical statistics. It includes modern topics like nonparametric curve estimation, bootstrapping, and classification, topics that are usually relegated to follow-up courses. The reader is assumed to know calculus and a little linear algebra. No previous knowledge of probability and statistics is required. The text can be used at the advanced undergraduate and graduate level. Larry Wasserman is Professor of Statistics at Carnegie Mellon University. He is also a member of the Center for Automated Learning and Discovery in the School of Computer Science. His research areas include nonparametric inference, asymptotic theory, causality, and applications to astrophysics, bioinformatics, and genetics. He is the 1999 winner of the Committee of Presidents of Statistical Societies Presidents' Award and the 2002 winner of the Centre de recherches mathématiques de Montréal-Statistical Society of Canada Prize in Statistics. He is Associate Editor of the Journal of the American Statistical Association and The Annals of Statistics. He is a fellow of the American Statistical Association and of the Institute of Mathematical Statistics.
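Of the modern topics the blurb lists, the bootstrap is simple enough to sketch inline. The following percentile-interval example shows the resampling idea; the synthetic data, statistic, and 95% level are illustrative assumptions, not material from the book.

```python
import numpy as np

def bootstrap_ci(data, stat=np.median, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval: resample the data with
    replacement, recompute the statistic, and take empirical quantiles."""
    rng = np.random.default_rng(seed)
    n = len(data)
    reps = np.array([stat(rng.choice(data, size=n, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2, 1 - alpha / 2])

# Illustrative usage on synthetic skewed data (assumed, not from the book).
rng = np.random.default_rng(1)
sample = rng.exponential(scale=2.0, size=80)
lo, hi = bootstrap_ci(sample)
print(f"median = {np.median(sample):.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```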