Wednesday 17 August 2016

This MIT course was one of the many machine learning resources I worked through to learn the ins and outs of ML.


Machine Learning Course Description

This introductory course gives an overview of many concepts, techniques, and algorithms in machine learning, beginning with topics such as classification and linear regression and ending up with more recent topics such as boosting, support vector machines, hidden Markov models, and Bayesian networks. The course will give the student the basic ideas and intuition behind modern machine learning methods as well as a bit more formal understanding of how, why, and when they work. The underlying theme in the course is statistical inference as it provides the foundation for most of the methods covered.
http://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-867-machine-learning-fall-2006/lecture-notes/

LEC # | TOPICS
1. Introduction, linear classification, perceptron update rule (PDF)
2. Perceptron convergence, generalization (PDF)
3. Maximum margin classification (PDF)
4. Classification errors, regularization, logistic regression (PDF)
5. Linear regression, estimator bias and variance, active learning (PDF)
6. Active learning (cont.), non-linear predictions, kernels (PDF)
7. Kernel regression, kernels (PDF)
8. Support vector machine (SVM) and kernels, kernel optimization (PDF)
9. Model selection (PDF)
10. Model selection criteria (PDF)
11. Description length, feature selection (PDF)
12. Combining classifiers, boosting (PDF)
13. Boosting, margin, and complexity (PDF)
14. Margin and generalization, mixture models (PDF)
15. Mixtures and the expectation maximization (EM) algorithm (PDF)
16. EM, regularization, clustering (PDF)
17. Clustering (PDF)
18. Spectral clustering, Markov models (PDF)
19. Hidden Markov models (HMMs) (PDF)
20. HMMs (cont.) (PDF)
21. Bayesian networks (PDF)
22. Learning Bayesian networks (PDF)
23. Probabilistic inference; guest lecture on collaborative filtering (PDF)
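As a quick refresher on the very first topic in the list, here is a minimal sketch of the perceptron update rule: whenever an example is misclassified (y * w·x <= 0), nudge the weights toward it with w += y * x. This is my own toy illustration with made-up data, not code from the course notes.

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """Train a linear classifier with the perceptron update rule.
    X: (n, d) feature matrix; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:  # misclassified (or on the boundary)
                w += yi * xi             # the perceptron update rule
    return w

# Toy linearly separable data; last column is a constant bias feature
X = np.array([[ 2.0,  1.0, 1.0],
              [ 1.0,  3.0, 1.0],
              [-1.0, -2.0, 1.0],
              [-2.0, -1.0, 1.0]])
y = np.array([1, 1, -1, -1])

w = perceptron(X, y)
preds = np.sign(X @ w)  # all four points end up correctly classified
```

If the data is linearly separable, the updates are guaranteed to converge (lecture 2 covers the convergence proof); otherwise the loop simply stops after the fixed number of epochs.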
