CAP 6615: Neural Networks for Computing
Schedule: MWF, 8th Period
Location: CSE E107

Texts:

  1. Recommended: Neural Networks for Pattern Recognition, Chris Bishop, Publisher: Oxford University Press, 1995.
  2. Recommended: Neural Networks: A Comprehensive Foundation, Simon Haykin, Publisher: Prentice Hall, second edition, 1998.
  3. Recommended: Pattern Classification, Duda, Hart, and Stork, Publisher: Wiley-Interscience, second edition, 2000.
  4. Additional: Statistical Learning Theory, Vladimir N. Vapnik, Publisher: John Wiley and Sons, New York, 1998.
  5. Other Material: Notes and papers from the following journals: Neural Computation, IEEE Transactions on Neural Networks, and Neural Networks.

Instructor: Prof. Anand Rangarajan, CSE E352. Phone: (352) 392-1507, Fax: (352) 392-1220, Email: anand@cise.ufl.edu

Office hours: MW 4-5PM and F 2-3PM or by appointment.

Grading:

  1. Homeworks: 25%.
  2. Two Midterms: 25% each.
  3. Project: 25%.

Notes:
  1. Prerequisites: Familiarity with basic concepts in calculus, linear algebra, and probability theory. A partial list of basic requirements follows. Calculus: differentiation, chain rule, integration. Linear algebra: matrix multiplication, inverse, pseudo-inverse. Probability theory: conditional probability, Bayes rule, conditional expectations. (The key identities are recalled briefly after these notes.)
  2. Homeworks/programs will be assigned every two weeks. If you do not have any prior numerical computing experience, I suggest you use MATLAB for the programs.
  3. The first midterm will be given around the middle of the semester; the second will be held on December 8, 2004.
  4. The project will be the same for all students. A project demonstration is due November 23, 2004, and will be graded competitively. Depending on enrollment, the project will be done either in teams of two or individually.
  5. A set of informal notes, which will evolve with the course, can be found here.
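
As a quick recap of the prerequisite identities named in note 1 (the notation here is mine, not from the course notes; the pseudo-inverse form assumes A has full column rank):

  \frac{d}{dx} f(g(x)) = f'(g(x))\, g'(x)            (chain rule)
  A^{+} = (A^{\top} A)^{-1} A^{\top}                 (pseudo-inverse)
  P(H \mid D) = \frac{P(D \mid H)\, P(H)}{P(D)}      (Bayes rule)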


Syllabus
Supervised Learning: linear discriminants, the perceptron (a minimal sketch follows this list), backpropagation, multi-layer perceptrons, radial basis functions, learning and generalization theory, support vector machines.
Density Estimation: finite mixtures, the expectation-maximization (EM) algorithm.
Unsupervised Learning: competitive networks, clustering, Kohonen self-organizing feature maps, principal and independent component analysis (PCA and ICA), kernel methods, local linear embeddings (LLE).
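
To give a flavor of the first supervised-learning topic, here is a minimal sketch of the perceptron learning rule in Python with NumPy; the function, data, and parameter names are illustrative only, and MATLAB (as suggested in the notes above) works just as well:

import numpy as np

def perceptron_train(X, y, epochs=100):
    # Perceptron learning rule: whenever a sample is misclassified,
    # nudge the weights toward the correct side of the hyperplane.
    # X: (n_samples, n_features); y: labels in {-1, +1}.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:  # misclassified or on the boundary
                w += yi * xi                   # update: w <- w + y*x
                b += yi
                errors += 1
        if errors == 0:                        # no mistakes: data separated, stop
            break
    return w, b

# Toy linearly separable data (illustrative).
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))  # matches y once training converges

On linearly separable data this loop is guaranteed to converge; what to do when the data are not separable is where multi-layer perceptrons, backpropagation, and support vector machines enter the course.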