====== Course Outline ======

The course outline is a guide to the topics that will be discussed in the course, and when they will be discussed:

===== Weeks 1 and 2 =====

**Machine Learning** (basic concepts)

**Math Background**: probabilities; Bayes' theorem; statistics; estimation; regression; hypothesis testing; entropy; mutual information; decision trees; optimization theory and convex optimization; matrix decomposition

===== Week 3 =====

**Data, Features and Models**: feature engineering; feature extraction (PCA, LDA, etc.); data visualization

===== Week 4 =====

**Machine Learning (ML) Basics**: learnability; basic ML concepts; Bayesian decision theory

===== Week 5 =====

**Generative Models (I)**: model estimation (maximum likelihood, Bayesian learning); the multivariate Gaussian, multinomial, and Markov chain models, etc.

===== Week 6 =====

**Discriminative Models (I)**: linear discriminant functions; support vector machines (SVM); large-margin classifiers; sparse kernel machines

===== Week 7 =====

**Discriminative Models (II)**: neural networks and deep learning

===== Week 8 =====

**Generative Models (II)**: the EM algorithm; finite mixture models, e.g. the Gaussian mixture model

===== Week 9 =====

**Generative Models (III)**: graphical models; directed vs. undirected graphical models; exact inference; approximate inference (loopy belief propagation, variational inference, Monte Carlo methods)

===== Week 10 =====

**Selected Advanced Topics**: Hidden Markov Models (HMM); HMM concepts; three algorithms: forward-backward, Viterbi decoding, and Baum-Welch learning

===== Weeks 11-12 =====

Student presentations