CAP 6615: Neural Networks for Computing
Schedule: MWF, 8th Period
Location: CSE E122

Texts:

  1. Recommended: Christopher M. Bishop, Neural Networks for Pattern Recognition, Oxford University Press, 1995.
  2. Recommended: Simon Haykin, Neural Networks: A Comprehensive Foundation, second edition, Prentice Hall, 1998.
  3. Recommended: Richard O. Duda, Peter E. Hart, and David G. Stork, Pattern Classification, second edition, Wiley Interscience, 2000.
  4. Additional: Vladimir N. Vapnik, Statistical Learning Theory, John Wiley and Sons, New York, 1998.
  5. Other material: notes and papers from the journals Neural Computation, IEEE Transactions on Neural Networks, and Neural Networks.

Instructor: Prof. Anand Rangarajan, CSE E352. Phone: (352) 392-1507, Fax: (352) 392-1220, Email: anand@cise.ufl.edu

Teaching Assistant: Mingxi Wu, Office location: TBA, email: mwu@cise.ufl.edu

Office hours: Anand: MW 4-5 PM and F 2-3 PM, or by appointment. Mingxi: TR 4-5 PM and F 1-2 PM.

Grading:

  1. Homeworks: 20%.
  2. Two Midterms: 20% each.
  3. Two Projects: 20% each.

Notes:
  1. Prerequisites: familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, the chain rule, integration. Linear algebra: matrix multiplication, inverses, the pseudo-inverse. Probability theory: conditional probability, Bayes rule, conditional expectations. (A short worked illustration of these facts appears after this list.) While AI is listed as a prerequisite, any aspect of AI that turns out to be required will be taught in class to keep the course self-contained.
  2. Homeworks/programs will be assigned every two weeks. If you do not have prior numerical computing experience, I suggest using MATLAB for the programs.
  3. The first midterm will be held on October 17th, 2005, from 6 PM to midnight in CSE E404; the second will be held on December 7th, 2005, from 6 PM to midnight.
  4. The two projects will be the same for all students. The first project is due November 2nd, 2005, and the second is due December 2nd, 2005. Depending on enrollment, the projects will be done either individually or in teams of two.
  5. A set of informal notes, which will evolve with the course, can be found here.
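
For concreteness, the kind of fact assumed from each prerequisite area is illustrated below. This is a minimal sketch in standard LaTeX notation, not an exhaustive list of what you need to know.

    \frac{d}{dx} f(g(x)) = f'(g(x)) \, g'(x)          % calculus: the chain rule
    A^{+} = (A^{\top} A)^{-1} A^{\top}                % linear algebra: pseudo-inverse of a full-column-rank A
    P(A \mid B) = \frac{P(B \mid A) \, P(A)}{P(B)}    % probability: Bayes rule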


Syllabus
Supervised Learning: linear discriminants, the perceptron, backpropagation, multi-layer perceptrons, radial basis functions, learning and generalization theory, support vector machines.
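
As a small taste of this unit, here is a minimal sketch of the classical perceptron learning rule in Python/NumPy. It is illustrative only: the function name, toy data, learning rate, and epoch count are our own assumptions, and MATLAB (as suggested above) would serve equally well.

    import numpy as np

    def perceptron(X, y, epochs=100, lr=1.0):
        """Classical perceptron rule: X is (n, d), y holds labels in {-1, +1}."""
        w = np.zeros(X.shape[1])  # weight vector
        b = 0.0                   # bias term
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ w + b) <= 0:   # update only on misclassified points
                    w += lr * yi * xi
                    b += lr * yi
        return w, b

    # Toy linearly separable data (illustrative)
    X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
    y = np.array([1, 1, -1, -1])
    w, b = perceptron(X, y)
    print(w, b)  # parameters of a separating hyperplane w.x + b = 0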
Density Estimation: finite Gaussian mixtures, the expectation-maximization (EM) algorithm.
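
As a preview of this unit, the standard EM updates for a K-component Gaussian mixture fit to data x_1, ..., x_n are shown below in LaTeX; the notation is generic and not tied to any one of the texts above.

    % E-step: responsibility of component k for point x_i
    \gamma_{ik} = \frac{\pi_k \, \mathcal{N}(x_i \mid \mu_k, \Sigma_k)}
                       {\sum_{j=1}^{K} \pi_j \, \mathcal{N}(x_i \mid \mu_j, \Sigma_j)}
    % M-step: re-estimate mixture weights, means, and covariances
    \pi_k = \frac{1}{n} \sum_{i=1}^{n} \gamma_{ik}, \qquad
    \mu_k = \frac{\sum_i \gamma_{ik} \, x_i}{\sum_i \gamma_{ik}}, \qquad
    \Sigma_k = \frac{\sum_i \gamma_{ik} \, (x_i - \mu_k)(x_i - \mu_k)^{\top}}{\sum_i \gamma_{ik}}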
Unsupervised Learning: competitive networks, clustering, Kohonen self-organizing feature maps, principal and independent component analysis (PCA and ICA), locally linear embedding (LLE), ISOMAP.
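
Likewise, a minimal sketch of PCA via the singular value decomposition, under the same illustrative assumptions (Python/NumPy, with a hypothetical function name and random data):

    import numpy as np

    def pca(X, k):
        """Project (n, d) data onto its top-k principal components via SVD."""
        Xc = X - X.mean(axis=0)      # center the data first
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        W = Vt[:k].T                 # (d, k) matrix of principal directions
        return Xc @ W, W             # projections and components

    # Illustrative use on random data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    Z, W = pca(X, k=2)
    print(Z.shape, W.shape)          # (100, 2) (5, 2)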