# AI Prelim

(Last update: June 2016)

There are two parts: (i) an exam focusing on foundations, and (ii) coursework breadth requirements.

## Exam Syllabus

### (1) General Foundations

• Linear algebra: SVD and eigenvectors -- Strang, Linear Algebra and Its Applications, 4th ed., 5.1-5.3, 6.3
• Optimization basics: gradient descent, Newton's method -- Boyd & Vandenberghe, Convex Optimization, 9.1-9.5
• Multivariate Gaussians -- Jordan (J.) 13
• Kalman Filtering and Smoothing -- J. 15
• Information theory basics -- Cover and Thomas 2.1-2.6
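
As a quick illustration of the optimization entry above, the sketch below compares gradient descent and Newton's method on a simple 1-D function; the function, step size, and iteration counts are invented for illustration and are not from the readings.

```python
# Illustrative sketch (toy example, not from the syllabus readings):
# gradient descent vs. Newton's method on f(x) = (x - 2)^2 + 1,
# which is minimized at x = 2.

def grad(x):
    return 2 * (x - 2)        # f'(x)

def hess(x):
    return 2.0                # f''(x), constant for a quadratic

def gradient_descent(x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)     # step in the negative gradient direction
    return x

def newton(x0, steps=10):
    x = x0
    for _ in range(steps):
        x -= grad(x) / hess(x)  # Newton step: x - f'(x) / f''(x)
    return x

print(gradient_descent(10.0))   # approaches 2
print(newton(10.0))             # exact after one step on a quadratic
```

On a quadratic, Newton's method lands on the minimizer in a single step, while gradient descent converges geometrically at a rate set by the step size.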

### (2) CS188++

• Uninformed Search -- Russell & Norvig 3rd ed. (R&N) 3.1-3.4
• A* Search and Heuristics -- R&N 3.5-3.6
• CSPs I, II -- R&N 6.1-6.5
• Local Search -- R&N 4.1-4.2
• Logic and Planning -- R&N 7.1-7.7 (omitting 7.5.2), 8.1-8.3.3, 9.1-9.4, 10.1-10.2, 12.1-12.3
• Game trees -- R&N 5.2-5.5
• Utilities / decision theory -- R&N 16.1-16.3
• MDPs and RL -- R&N 17.1-17.4, 21.1-21.6
• Decision theory / VPI -- R&N 16.5-16.6
• Graphical models -- R&N 14.1-14.5, J. 2 (independence/factorization), J. 3 (elimination), J. 4 (propagation-factor-graphs)
• HMMs -- R&N 15.1-15.5, J. 12 (HMMs)
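
As a pointer to what the MDP readings cover, here is a minimal sketch of value iteration; the two-state MDP, its transition table, and the discount factor below are made up for illustration.

```python
# Toy sketch (invented MDP, not from R&N): value iteration, repeatedly
# applying the Bellman optimality update
#   V(s) <- max_a sum_{s'} P(s'|s,a) [R(s,a,s') + gamma * V(s')]

GAMMA = 0.9

# states: 0, 1; actions: 'stay', 'go'
# transitions[s][a] = list of (probability, next_state, reward)
transitions = {
    0: {'stay': [(1.0, 0, 0.0)],
        'go':   [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
    1: {'stay': [(1.0, 1, 1.0)],
        'go':   [(1.0, 0, 0.0)]},
}

def value_iteration(eps=1e-6):
    V = {s: 0.0 for s in transitions}
    while True:
        delta = 0.0
        for s in transitions:
            # Q-value of each action, then back up the best one.
            q = {a: sum(p * (r + GAMMA * V[s2]) for p, s2, r in outs)
                 for a, outs in transitions[s].items()}
            best = max(q.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < eps:          # stop when values have converged
            return V

V = value_iteration()
print(V)
```

The same transition table can drive policy extraction (pick the argmax action in each state) or Q-learning, which estimates the same quantities from sampled transitions instead of the known model.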

### (3) CS189++

• Statistical concepts -- J. 5 (Statistical Concepts)
• Linear regression -- J. 6 (Linear Regression and the LMS Algorithm)
• Logistic regression -- J. 7 (Linear Classification)
• SVMs, kernel methods -- R&N 18.9; Ng pp. 1-8
• Nearest neighbor -- R&N 18.8-18.8.3; Hastie, Tibshirani, Friedman (HTF) 13.3.0
• Decision trees -- R&N 18.3; HTF 9.2.1-9.2.3
• Neural nets -- R&N 18.7
• Clustering, k-means, mixture of Gaussians -- J. 10 (Mixtures and Conditional Mixtures)
• PCA -- HTF 14.5.1
• EM -- J. 11 (EM Algorithm)
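
The clustering entry above pairs naturally with a small worked example: the sketch below implements Lloyd's algorithm for k-means, the hard-assignment counterpart of the EM-for-mixtures readings. The toy data and parameters are invented for illustration.

```python
# Illustrative sketch (toy data invented here): Lloyd's algorithm for
# k-means, alternating assignment and mean-update steps -- the
# hard-assignment analogue of EM for a mixture of Gaussians.
import random

def kmeans(points, k, iters=50, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)      # initialize centers at k data points
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda j: (p - centers[j]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to its cluster mean
        # (keep the old center if a cluster went empty).
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]   # two obvious clusters near 1 and 10
print(kmeans(data, 2))                     # roughly [1.0, 10.0]
```

Replacing the hard assignments with posterior responsibilities, and the means with responsibility-weighted means and covariances, turns this loop into EM for a Gaussian mixture (J. 10-11).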