Machine Learning (May-August 2009)

Instructor:  Dr. Debrup Chakraborty  (debrup(AT)cs.cinvestav.mx)

References:  No specific textbook for the course; we shall refer to multiple sources, and we shall extensively use the course notes of Prof. A. Ng. Additionally, we may refer to the following texts:
                                 1) Machine Learning, Tom M. Mitchell, McGraw-Hill International Edition, 1997.
                                 2) Pattern Classification, Duda, Hart and Stork, Wiley, 2000.

Classes:  Monday and Wednesday  from 10:00-12:00

Grading Policies:   40% on homework, 30% on tests, and 30% on a project.

Homeworks:     Homework 1: due 1st June 2009
               Homework 2: due 24th June 2009
               Homework 3: due 20th July 2009

Schedule (tentative; we shall fill in the details as we progress)

11th May

Introduction: Intuitive introduction to the process of learning. Supervised, unsupervised and reinforcement learning. Function approximation and classification. Model selection and feature selection.

 

13th May

Linear Regression: Linear regression, online and batch gradient descent, probabilistic viewpoint, maximum likelihood estimation, logistic regression and parameter estimation for logistic regression.

Notes by Prof. Ng
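
For reference, a minimal NumPy sketch of batch gradient descent for least-squares linear regression (the learning rate, iteration count, and toy data below are invented for illustration):

import numpy as np

def batch_gradient_descent(X, y, alpha=0.1, iters=2000):
    """Fit least-squares linear regression by batch gradient descent."""
    X = np.hstack([np.ones((X.shape[0], 1)), X])        # add an intercept column
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        grad = X.T @ (X @ theta - y) / len(y)            # gradient of (1/2m)||X theta - y||^2
        theta -= alpha * grad
    return theta

# Toy usage: data generated from y = 2 + 3x, so theta should approach [2, 3]
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([2.0, 5.0, 8.0, 11.0])
print(batch_gradient_descent(X, y))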

18th May.

Bayesian Learning: Conditional probabilities and the Bayes rule. The Bayes classifier. Normal density. Discriminant functions. Class Boundaries for Bayesian discriminant functions.

Read Chapter 2 of Duda, Hart and Stork
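
A minimal NumPy sketch of a Bayesian discriminant g_i(x) = ln p(x | class i) + ln P(class i) with multivariate normal class-conditional densities (the means, covariances and priors below are made up for illustration):

import numpy as np

def gaussian_discriminant(x, mean, cov, prior):
    """Log of the class posterior up to a constant: ln p(x | class) + ln P(class)."""
    d = len(mean)
    diff = x - mean
    quad = diff @ np.linalg.solve(cov, diff)             # Mahalanobis term (x-mu)' Sigma^{-1} (x-mu)
    log_density = -0.5 * (quad + d * np.log(2 * np.pi) + np.log(np.linalg.det(cov)))
    return log_density + np.log(prior)

# Two hypothetical classes in 2-D; assign x to the class with the larger discriminant
x = np.array([0.5, 0.5])
g1 = gaussian_discriminant(x, np.array([0.0, 0.0]), np.eye(2), prior=0.6)
g2 = gaussian_discriminant(x, np.array([2.0, 2.0]), np.eye(2), prior=0.4)
print("class 1" if g1 > g2 else "class 2")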

20th May.

Bayesian Learning: Maximum likelihood estimation, estimation of the parameters of the multivariate normal distribution

 

25th May.

Bayesian Learning: Naïve Bayes classifier: the case of classifying emails as spam or non-spam
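
A toy sketch of a naive Bayes spam filter over binary word-occurrence features, with Laplace smoothing of the per-word probabilities (the three-word vocabulary and four training documents are invented):

import numpy as np

# Documents as binary word-occurrence vectors over a 3-word vocabulary; 1 = spam, 0 = non-spam
X = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1], [0, 0, 1]])
y = np.array([1, 1, 0, 0])

def train_naive_bayes(X, y):
    phi_y = y.mean()                                               # P(spam)
    phi_spam = (X[y == 1].sum(axis=0) + 1) / (np.sum(y == 1) + 2)  # Laplace-smoothed P(word | spam)
    phi_ham = (X[y == 0].sum(axis=0) + 1) / (np.sum(y == 0) + 2)   # Laplace-smoothed P(word | non-spam)
    return phi_y, phi_spam, phi_ham

def predict(x, phi_y, phi_spam, phi_ham):
    log_spam = np.log(phi_y) + np.sum(x * np.log(phi_spam) + (1 - x) * np.log(1 - phi_spam))
    log_ham = np.log(1 - phi_y) + np.sum(x * np.log(phi_ham) + (1 - x) * np.log(1 - phi_ham))
    return int(log_spam > log_ham)

params = train_naive_bayes(X, y)
print(predict(np.array([1, 1, 0]), *params))                       # 1, i.e. classified as spam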

 

1st June

Bayesian Learning: Discussion on homework 1, correction to the gradient descent algorithm given in class on 13th May, the multinomial event model for text classification, Laplace smoothing

HW 1 due

3rd June

Non-parametric methods: The k-nearest neighbor classifier, locally weighted regression
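
A short sketch of the k-nearest-neighbor classifier using Euclidean distance and a majority vote (k = 3 and the toy points are arbitrary):

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1], [1.2, 0.8]])
y_train = np.array([0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([1.0, 0.9])))         # 1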

8th  June

Neural Networks: The biological neural network, its analogy with artificial neural networks, the model of a neuron, the perceptron
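
A minimal sketch of the perceptron learning rule with labels in {-1, +1} (the AND-like toy data and epoch count are illustrative; convergence is guaranteed only for linearly separable data):

import numpy as np

def train_perceptron(X, y, epochs=20):
    """On each mistake, move the weights toward (or away from) the misclassified point."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:           # misclassified or on the boundary
                w += yi * xi
                b += yi
    return w, b

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([-1, -1, -1, 1])                     # linearly separable (AND-like) labels
w, b = train_perceptron(X, y)
print(np.sign(X @ w + b))                         # matches y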

 

10th  June

Neural Networks: The multilayered perceptron, the backpropagation algorithm
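
A compact sketch of a one-hidden-layer network trained by backpropagation on XOR, with sigmoid units and squared error (the 2-4-1 architecture, learning rate, seed and iteration count are arbitrary choices):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)           # XOR targets

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)             # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)             # hidden -> output
lr = 0.5

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)                              # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)                   # backward pass: output error signal
    d_h = (d_out @ W2.T) * h * (1 - h)                    # error signal propagated to the hidden layer
    W2 -= lr * h.T @ d_out                                # gradient-descent updates
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))                                       # typically close to [[0], [1], [1], [0]]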

 

15th June

Neural Networks: Discussion on HW2, the radial basis function network, the k-means algorithm for selecting centers for basis functions.
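
A minimal sketch of Lloyd's k-means iteration, as one might use it to choose RBF centers (k, the blob data and the iteration count are illustrative):

import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Alternate between assigning points to the nearest center and recomputing the centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(5.0, 1.0, (20, 2))])
centers, _ = kmeans(X, k=2)
print(centers)                                    # roughly one center near (0, 0) and one near (5, 5)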

17th June

Support Vector Machines: Functional and geometric margins. Formulating the optimization problem for the SVM.

Notes by Prof. Ng
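
For reference, the hard-margin primal problem (in the notation of Prof. Ng's notes, with the functional margin scaled to 1):

\[
\min_{w,\,b}\ \tfrac{1}{2}\lVert w\rVert^2
\quad \text{subject to} \quad
y^{(i)}\bigl(w^{\top}x^{(i)} + b\bigr) \ge 1, \qquad i = 1,\dots,m.
\]

Minimizing \(\lVert w\rVert\) under these constraints is equivalent to maximizing the geometric margin \(1/\lVert w\rVert\).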

22nd June

Support Vector Machines: Lagrange duality. The primal and dual formulations of the SVM problem

 

24th June

Support Vector Machines: Mercer kernels, the non-separable case with regularization
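
For quick reference, two standard Mercer kernels as NumPy functions (the polynomial degree, offset c and Gaussian width sigma are arbitrary hyperparameters):

import numpy as np

def polynomial_kernel(x, z, degree=3, c=1.0):
    """K(x, z) = (x'z + c)^degree."""
    return (np.dot(x, z) + c) ** degree

def gaussian_kernel(x, z, sigma=1.0):
    """K(x, z) = exp(-||x - z||^2 / (2 sigma^2)), the RBF kernel."""
    return np.exp(-np.sum((x - z) ** 2) / (2 * sigma ** 2))

x, z = np.array([1.0, 2.0]), np.array([2.0, 0.5])
print(polynomial_kernel(x, z), gaussian_kernel(x, z))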

 

29th June

Review

6th  July

Test 1

 

15th July

Support Vector Machines: The SMO Algorithm

Simplified SMO by Prof. Ng

Platt's paper

20th July

Feature Selection and Dimensionality Reduction

Notes by Prof. Ng

22nd July

Principal Component Analysis

Notes by Prof. Ng
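
A minimal PCA sketch via the eigendecomposition of the sample covariance matrix (the random data and the choice of two components are illustrative):

import numpy as np

def pca(X, n_components=2):
    """Project centered data onto the top principal components (leading covariance eigenvectors)."""
    X_centered = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(X_centered, rowvar=False))   # ascending eigenvalues
    top = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return X_centered @ top, top

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z, components = pca(X, n_components=2)
print(Z.shape, components.shape)                  # (100, 2) (5, 2)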

27th July

PCA for face recognition

Eigenfaces

29th July

Ensemble Methods

Bagging
Boosting
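
A small sketch of bagging with decision stumps as the base learner: each stump is trained on a bootstrap resample and the ensemble predicts by majority vote (the stump learner and the 1-D toy data are invented for illustration):

import numpy as np

def fit_stump(X, y):
    """Pick the (feature, threshold, sign) stump with the lowest 0/1 training error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                err = np.mean(np.where(sign * (X[:, j] - t) > 0, 1, -1) != y)
                if best is None or err < best[0]:
                    best = (err, j, t, sign)
    return best[1:]

def predict_stump(stump, X):
    j, t, sign = stump
    return np.where(sign * (X[:, j] - t) > 0, 1, -1)

def bagging(X, y, n_estimators=25, seed=0):
    """Train one stump per bootstrap resample (sampling with replacement)."""
    rng = np.random.default_rng(seed)
    return [fit_stump(X[idx], y[idx])
            for idx in (rng.integers(0, len(X), size=len(X)) for _ in range(n_estimators))]

X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
stumps = bagging(X, y)
votes = np.sum([predict_stump(s, X) for s in stumps], axis=0)
print(np.sign(votes))                             # majority vote agrees with y on this toy set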

30th July

More Unsupervised Techniques: The EM algorithm

Notes by Prof. Ng
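
A compact sketch of EM for a two-component 1-D Gaussian mixture: the E-step computes responsibilities, the M-step re-estimates the weights, means and variances (the initialization and simulated data are arbitrary):

import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def em_gmm(x, iters=100):
    pi, mu1, mu2, s1, s2 = 0.5, x.min(), x.max(), x.std(), x.std()     # crude initialization
    for _ in range(iters):
        p1 = pi * normal_pdf(x, mu1, s1)                               # E-step: responsibilities
        p2 = (1 - pi) * normal_pdf(x, mu2, s2)
        r = p1 / (p1 + p2)
        pi = r.mean()                                                  # M-step: weighted ML updates
        mu1, mu2 = np.sum(r * x) / r.sum(), np.sum((1 - r) * x) / (1 - r).sum()
        s1 = np.sqrt(np.sum(r * (x - mu1) ** 2) / r.sum())
        s2 = np.sqrt(np.sum((1 - r) * (x - mu2) ** 2) / (1 - r).sum())
    return pi, mu1, mu2, s1, s2

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(6.0, 1.0, 200)])
print(em_gmm(x))                                  # mixing weight near 0.5, means near 0 and 6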

3rd Aug

Learning Theory

Notes by Prof. Ng

5th Aug

Learning Theory

 

10th Aug

Review

 

12th Aug

Test 2

 

24th Aug

Final Project Submission