MATH 774: Statistical Learning Theory (Fall 2007)

Instructor: Michael Nussbaum

Meeting Time & Room

Prerequisite: Basic mathematical statistics (MATH 674 or equivalent) and measure-theoretic probability (MATH 671 or equivalent), or permission of instructor.

Required Textbook: The Elements of Statistical Learning: Data Mining, Inference, and Prediction by T. Hastie, R. Tibshirani, and J. H. Friedman, Springer, 2001.

The course presents the developing interface between machine learning theory and statistics. Topics include an introduction to classification and pattern recognition; the connection to nonparametric regression is emphasized throughout. Some classical statistical methodology will be reviewed, such as discriminant analysis and logistic regression, as well as the perceptron, which played a key role in the development of machine learning theory. The empirical risk minimization principle will be introduced, along with its justification via Vapnik-Chervonenkis bounds. Basic principles of constructing classifiers will be treated in detail, including support vector machines, kernelization, neural networks, and tree methods. The course will conclude with an outline of bagging and boosting, among the most active research areas in learning theory today.