Computing and Information Technology Interactive Digital Educational Library

 


Please use this identifier to cite or link to this item: http://hdl.handle.net/10117/5196

Title: Mathematical Foundations of Artificial Intelligence
Authors: Computer Science at Rochester
Issue Date: 
Publisher: Computer Science at Rochester
Citation: http://www.cs.rochester.edu/~gildea/2006_Spring/
Abstract: Computer Science 246/446: Mathematical Foundations of Artificial Intelligence, Spring 2006

Instructor: Dan Gildea (office hours 1-2pm Wed)
TA: Ding Liu (office hours 9:30-11am Tu/Th)
Location: Tu/Th 11:05am-12:20pm, CSB 601

Homeworks

There is no required text, but the following are useful references in addition to the reading material assigned for each class:
* Stuart Russell and Peter Norvig, Artificial Intelligence: A Modern Approach.
* David J. C. MacKay, Information Theory, Inference, and Learning Algorithms.
* Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning: Data Mining, Inference, and Prediction.
* Dana Ballard, Natural Computation.
* Christopher Bishop, Neural Networks for Pattern Recognition.

Readings for each class are available in the box in the CS mailroom.

Syllabus

On each date below we will cover the listed topic, which means that after class you will understand the listed concepts, if before class you have read the listed material.

Date | Topic | Concepts | Reading
1/19 | Probability Theory | independence, Bayes rule | Wasserman ch 1, 2, 3
1/24 | Information Theory | entropy, KL-distance, coding | MacKay ch 2
1/26 | Probabilistic Inference | priors: Bayesian reasoning, MAP | Heckerman
1/31 | Probabilistic Inference | priors on continuous variables | MacKay ch 24
2/2 | Minimum Description Length | decision trees | Mitchell
2/7 | Probabilistic Inference | polytree | MacKay ch 26
2/9 | Expectation Maximization | latent variable clustering | Bilmes §1-3
2/14 | Independent Component Analysis | source separation | MacKay ch 34
2/16 | Learning Theory | probably approximately correct | Kearns & Vazirani ch 1
2/21 | Learning Theory | VC dimension | Kearns & Vazirani ch 2, 3
2/23 | Eigenvectors | least squares, PCA | Bishop 310-314, appendix E
2/28 | Nonlinear Dimensionality Reduction | isomap, locally linear embedding | Roweis; Tenenbaum
3/2 | Optimization | conjugate gradient | Bishop 274-282
3/7 | Optimization | Gibbs sampling, MCMC | MacKay ch 29
3/9 | Review | |
3/21 | Midterm | |
3/23 | Midterm Solutions | |
3/28 | MCMC, Gibbs Sampling | continued from before the midterm |
3/30 | Perceptron, Backpropagation | the chain rule | MacKay ch 38, 39; Bishop 140-148
4/4 | Support Vectors | the Wolfe dual | Burges §3-4
4/6 | Support Vectors | the kernel trick |
4/11 | Hidden Markov Models | forward-backward | Bilmes §4
4/13 | Reinforcement Learning | Q-learning | Ballard ch 11
4/18 | Reinforcement Learning | partial observability | Ballard ch 11
4/20 | Games | Nash equilibrium |
4/25 | Games | learning to cooperate | Hauert; Zhu
4/27 | Something Fun | |
5/2 | Review | come to class with questions! |

Final Exam: Friday, May 12, 8:30am.

Grading
* Final exam: 35%
* Homeworks: 35%
* Midterm: 25%
* Class participation: 5%

gildea @ cs rochester edu
February 7, 2006
URI: http://www.citidel.org/handle/10117/5196
Appears in Collections:Syllabus

Files in This Item:

File: 40-2006_Spring (6 Kb, format unknown)

