The course outline is a guide to the topics that will be covered in the course and when they will be discussed:
Machine Learning (basic concepts)
Math Background: probability; Bayes' theorem; statistics; estimation; regression; hypothesis testing; entropy; mutual information; decision trees; optimization theory and convex optimization; matrix decomposition
Data, Features and Models: Feature Engineering; Feature Extraction (PCA, LDA, etc.); Data Visualization
Machine Learning (ML) Basics: learnability; basic ML concepts; Bayesian decision theory
Generative Models (I): model estimation via maximum likelihood and Bayesian learning; multivariate Gaussian, multinomial, and Markov chain models, etc.
Discriminative Models (I): linear discriminant functions; support vector machines (SVM); large-margin classifiers; sparse kernel machines
Discriminative Models (II): neural networks and deep learning
Generative Models (II): EM algorithm; finite mixture models, e.g., the Gaussian mixture model
Generative Models (III): graphical models; directed vs. undirected graphical models; exact inference; approximate inference (loopy belief propagation, variational inference, Monte Carlo methods)
Selected Advanced Topics: Hidden Markov Models (HMMs); HMM concepts; three algorithms: forward-backward, Viterbi decoding, and Baum-Welch learning
Student presentations