11/06/2000: Kohonen SOFM notes will be handed out in class today. Also, I've downloaded and placed .ps and .pdf versions of the Generative Topographic Mapping (GTM) paper here and here, respectively.

10/31/00: Homework #4 has been placed on the egroups website. Please download it from there. Let me know if you experience any problems with the download. I'm trying it out to see if it works.

10/24/00: An addendum to Homework #3 is available here (in Postscript) and here (in pdf). The addendum concerns testing the performance of the backpropagation algorithm. I've also simplified the mixtures problem so that more of you will be able to get it working.

10/10/2000: Solutions to Homework #2 are available here (in Postscript) and here (in pdf).

10/09/2000: Homework #3 is due on October 23rd. It is available here (in Postscript) and here (in pdf).

09/22/2000: Homework #2 is due on October 6th. It is available here (in Postscript) and here (in pdf).

09/19/2000: Solutions to Homework #1. The solutions to Bishop 3.2, 3.4 and 3.8 are available here (in Postscript format) and here (in pdf format).

09/01/2000: Homework #1. Bishop 3.2, 3.4, 3.8 and 3.9. This homework is due on Sept. 15th. That is the day when the solutions will be posted on the web. Note that 3.9 is a programming assignment, so please give yourself enough time to complete it. I expect 3.4 to be a bit difficult for some people, so please see me if you don't think you have the right prerequisites for tackling this problem.

08/30/2000: I've set up an egroup for the course at http://www.egroups.com/group/fall2000_cap6615
Please take a moment to subscribe to this egroup if you're interested in participating in and learning from informal (and perhaps some formal) discussions on neural computation.

08/27/2000: The cost function for the perceptron can be difficult to understand, especially because the summation runs over the set of misclassified patterns; note that this set changes during the training process. Try writing out the cost function (Bishop 3.67) explicitly in terms of the individual components of the weight (w) and feature (x) vectors. If the unknown function confuses you, set it to the identity. The superscript (n) indexes the current pattern being trained on. Also, remember that x_0 = 1 and that there is an extra "bias" weight w_0 associated with x_0.
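To make the "summation over misclassified patterns" concrete, here is a minimal sketch of the perceptron criterion in NumPy. The data and the function name are hypothetical; it assumes targets coded as +/-1, feature vectors with x_0 = 1 prepended (so w[0] plays the role of the bias weight w_0), and the fixed transformation taken to be the identity, as suggested above.

```python
import numpy as np

def perceptron_criterion(w, X, t):
    """Perceptron cost (cf. Bishop 3.67): minus the sum of w^T x^n t^n
    over the misclassified patterns. Rows of X already contain the
    bias input x_0 = 1; w[0] is the bias weight w_0; t entries are +/-1."""
    scores = (X @ w) * t           # w^T x^n t^n for every pattern n
    misclassified = scores < 0     # the set M; it changes as w changes
    return -scores[misclassified].sum()

# Hypothetical toy data: two 1-D patterns with the bias input prepended.
X = np.array([[1.0,  2.0],        # x_0 = 1, x_1 =  2
              [1.0, -1.0]])       # x_0 = 1, x_1 = -1
t = np.array([1.0, -1.0])
w = np.array([0.0, -1.0])         # this w misclassifies both patterns

print(perceptron_criterion(w, X, t))
```

Evaluating the criterion at different weight vectors this way makes it easy to see that the cost is zero exactly when no pattern is misclassified, and positive otherwise.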

08/25/2000: So far, we've examined linear discriminants and classifiers for both the two-class and multiple-class problems. This topic is a subset of supervised learning for pattern recognition. We won't talk much about linear regression, the regression counterpart to linear discriminants.
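As a quick illustration of the two-class case discussed above, here is a minimal sketch (hypothetical names and data) of a linear discriminant y(x) = w^T x + w_0, with the class decided by the sign of y(x):

```python
import numpy as np

def classify(x, w, w0):
    """Two-class linear discriminant: assign class 1 if
    y(x) = w^T x + w_0 > 0, otherwise class 2."""
    return 1 if w @ x + w0 > 0 else 2

# Hypothetical decision boundary x_1 + x_2 = 1 (i.e. w = (1, 1), w_0 = -1).
w, w0 = np.array([1.0, 1.0]), -1.0
print(classify(np.array([1.0, 1.0]), w, w0))   # point above the boundary
print(classify(np.array([-2.0, 0.0]), w, w0))  # point below the boundary
```

The multiple-class case extends this by keeping one discriminant y_k(x) per class and assigning x to the class with the largest y_k(x).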

08/23/2000: Introduction to Neural Networks. Material mostly taken from Bishop 1.1, 1.2 and 1.5. Please browse through http://ai.about.com/compute/ai/msubneuralnet.htm and the Statsoft page for much more information on NNs, their origins, purpose and applications. For specific fields such as data mining and medicine, please see the data mining entry in http://ai.about.com/compute/ai/msubneuralnet.htm and Richard Dybowski's paper on NNs in medicine, available as a PDF file entitled "Neural Computation in Medicine" on his webpage.