Fall 2013: Notes
- Week 1: Introduction, probability, and Bayes (week1)
- Week 2: Parametric density estimation (09/03/2013); and MAP estimation, mixture densities, EM (09/05/2013)
- Week 3: Non-parametric density estimation (09/10/2013); and linear discriminants (09/12/2013)
- Week 4: Separability (09/17/2013); and MSE (09/19/2013)
- Week 5: Loss functions, gradient descent (09/24/2013); and linear discriminant analysis (09/26/2013)
- Week 6 (10/1): Function representation with one, two, or three layers
- Week 7 (10/8,10/10): Error back-propagation, Levenberg-Marquardt (10/10/2013)
- Week 8 (10/15,10/17): Radial Basis Functions (Reading: chapter 5 and Smola and Schoelkopf, 10/15/2013), Probabilistic interpretations of L2 error (10/17/2013)
- Week 9 (10/22,10/24): Cross-entropy error (Reading: chapter 7 and Blahut, 1974, 10/22/2013), PCA and auto-encoders (chapter 9, PCA Tutorial)
- Week 10 (10/29,10/31): Kernel PCA, Representer Theorem (10/29). Deep auto-encoders, restricted Boltzmann machines (Reading: chapter 9, Deep Learning Tutorial on RBM, 10/31/2013)
- Week 11 (11/5,11/7): Lecture notes (11/05/2013 and 11/07/2013) and papers on deep belief networks (Mohamed et al 2009), convolutional nets (Krizhevsky et al 2012 and Waibel et al 1989), and deep architectures (Bengio & LeCun 2007)
- Week 12: 11/12 proposal due in class, 11/14 proposal comments returned, Bayesian methods.
- Week 13: 11/20 exam review, 11/21 exam
- Week 14 (12/3,12/5): Bayesian methods.
- Week 15 (12/10,12/12): Final project presentations.