CAP 6610: Machine Learning
Schedule: T 7th Period, R 7-8th Periods
Location: CSE E107
Texts:
  1. Required: Pattern Recognition and Machine Learning, Christopher M. Bishop, Publisher: Springer, 2007.
  2. Recommended: Pattern Classification, Richard O. Duda, Peter E. Hart and David G. Stork, Publisher: Wiley Interscience, second edition, 2000.
  3. Additional: Statistical Learning Theory, Vladimir N. Vapnik, Publisher: John Wiley and Sons, New York, 1998.
  4. Other Material: Notes and papers from the research literature.
Instructor: Prof. Anand Rangarajan, CSE E352. Phone: 352 392 1507, Fax: 352 392 1220, email: anand@cise.ufl.edu

Office hours: Anand, T 8-9th Periods and F 7th Period or by appointment. 

Grading:

  1. Homeworks: 20%.
  2. Two Midterms: 30% each.
  3. Project: 20%.
Homeworks, Projects and other Announcements

Notes:
  1. Prerequisites: Familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, the chain rule, integration. Linear algebra: matrix multiplication, the inverse, the pseudo-inverse. Probability theory: conditional probability, Bayes rule, conditional expectations. (A short illustrative sketch of this material appears after these notes.) Although AI is listed as a prerequisite, any aspect of AI that turns out to be required will be taught in class in order to make the course self-contained.
  2. Homeworks/programs will be assigned biweekly. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs.
  3. The first midterm will be held on Wednesday, March 5th, 2008, from 6 PM to 12 midnight, and the second on Wednesday, April 23rd, 2008, from 6 PM to 12 midnight.
  4. The project is due at the end of the semester. Depending on the number of students, the project will be done either in teams of two or individually.
  5. A set of informal notes, which will evolve with the course, can be found here.
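
As a rough illustration of the prerequisite level (a hypothetical sketch, not a course assignment; the variables and numbers are made up), the following MATLAB fragment applies Bayes rule and solves a least-squares problem with the pseudo-inverse:

    % Bayes rule: P(A|B) = P(B|A) * P(A) / P(B)
    P_A   = 0.3;               % prior P(A) (made-up value)
    P_BgA = 0.8;               % likelihood P(B|A)
    P_B   = 0.5;               % evidence P(B)
    P_AgB = P_BgA * P_A / P_B  % posterior P(A|B); evaluates to 0.48

    % Least-squares solution of an overdetermined linear system
    % via the pseudo-inverse: x minimizes norm(A*x - b)
    A = [1 2; 3 4; 5 6];       % 3x2 coefficient matrix
    b = [1; 2; 3];             % right-hand side
    x = pinv(A) * b            % pseudo-inverse solution

If you can follow what each line computes, your background should be adequate for the course.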


Syllabus
  1. Probability, decision and information theory review.
  2. Linear regression and classification.
  3. Multi-layer perceptrons and neural networks.
  4. Kernel methods.
  5. Graphical models.
  6. Mixture models and Expectation-Maximization (EM).
  7. Sampling methods and Markov Chain Monte Carlo (MCMC).
  8. Special topics such as hidden Markov models, products of experts, etc.