CIS 6930: Advanced Machine Learning
Schedule: T 7th Period, R 7-8th Periods
Location: CSE E220
Texts:
  1. Required: Pattern Recognition and Machine Learning, Christopher M. Bishop, Publisher: Springer, 2006.
  2. Recommended: Pattern Classification, Richard O. Duda, Peter E. Hart and David G. Stork, Publisher: Wiley Interscience, second edition, 2000.
  3. Additional: Statistical Learning Theory, Vladimir N. Vapnik, Publisher: John Wiley and Sons, New York, 1998.
  4. Other Material: Notes and papers from the research literature.
Instructor: Prof. Anand Rangarajan, CSE E352. Phone: 352 392 1507, Fax: 352 392 1220, email: anand@cise.ufl.edu

Office hours: T 8-9th Periods and R 9th Period, or by appointment.

Grading:

  1. Homeworks: 20%.
  2. Midterm: 40%.
  3. Project: 40%.
Homeworks, Projects and other Announcements

Notes:
  1. Prerequisites: Familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, the chain rule, integration. Linear algebra: matrix multiplication, inverses, pseudo-inverses. Probability theory: conditional probability, Bayes rule, conditional expectations (see the refresher after this list). While Machine Learning (CAP6610) is obviously a useful precursor to this course, every attempt will be made to keep this course self-contained. Still, it would be enormously helpful if you had a good background in supervised and unsupervised learning, density estimation, and the basics of information theory.
  2. Homeworks/programs will be assigned on an ad hoc basis. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs (a small sample program appears at the end of this page).
  3. The midterm will be scheduled in the second half of the semester (probably in early November).
  4. The project is due at the end of the semester. Depending on the number of students, the project will be done either in teams of two or individually.
  5. A set of informal notes, which will evolve with the course, can be found here.
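
As a quick self-test of the probability prerequisites, here are Bayes rule and a conditional expectation written in LaTeX (the symbols are generic placeholders chosen for illustration):

    p(\theta \mid x) = \frac{p(x \mid \theta)\, p(\theta)}{p(x)},
    \qquad
    \mathbb{E}[Y \mid X = x] = \int y \, p(y \mid x)\, dy .

If these expressions look familiar, you have the right background for the course.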


Syllabus
  1. Specialized supervised learning methods such as AdaBoost.
  2. Manifold learning.
  3. Density estimation on non-vectorial data (matrices).
  4. Fisher information, Fisher kernels and geodesics.
  5. Sampling methods and Markov chain Monte Carlo (MCMC); see the sample program below.
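
To give a flavor of the kind of program the homeworks might involve, here is a minimal MATLAB sketch of a random-walk Metropolis sampler (one of the MCMC methods in topic 5). The standard normal target, the unit proposal step, and the chain length are illustrative assumptions, not an assignment.

    % Minimal random-walk Metropolis sampler for a standard normal target.
    % The target density, proposal step size, and chain length below are
    % arbitrary illustrative choices, not course material.
    rng(0);                          % fix the random seed for reproducibility
    n    = 5000;                     % chain length
    x    = zeros(n, 1);              % storage for the samples
    logp = @(z) -0.5 * z.^2;         % log N(0,1) density, up to a constant
    for t = 2:n
        prop = x(t-1) + randn;       % propose a unit-variance random-walk step
        % accept with probability min(1, p(prop)/p(current))
        if log(rand) < logp(prop) - logp(x(t-1))
            x(t) = prop;
        else
            x(t) = x(t-1);           % reject: stay at the current state
        end
    end
    hist(x, 50);                     % histogram should approximate N(0,1)

The same accept/reject pattern carries over to any target whose unnormalized log-density you can evaluate, which is why MCMC is useful when the normalizing constant p(x) is intractable.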