Weeks 1-2: A. Machine Learning (basic concepts); B. Math Foundations: probability and statistics, Bayes' theorem, entropy, mutual information, decision trees, optimization, matrix factorization. (Weekly Reading: W2) (A useful online manual on Matrix Calculus)
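As a warm-up for the entropy and mutual information topics, the two quantities can be sketched in a few lines of plain Python (a toy illustration with made-up distributions, not course code):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), for a joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# A fair coin carries 1 bit of entropy.
print(entropy([0.5, 0.5]))                        # 1.0
# Perfectly correlated bits: I(X;Y) = H(X) = 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # 1.0
```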
Week 3: Feature Extraction in Machine Learning: PCA, LDA, manifold learning (MDS, SNE, LLE, Isomap), data visualization. (Weekly reading: a tutorial paper on PCA, a paper on ML)
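As a companion to the PCA reading, a minimal PCA via eigendecomposition of the sample covariance matrix might look like this (a sketch on made-up data; sizes and seed are arbitrary):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                 # center the data
    cov = Xc.T @ Xc / (len(X) - 1)          # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    components = eigvecs[:, ::-1][:, :k]    # top-k eigenvectors
    return Xc @ components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)
print(Z.shape)  # (100, 2)
```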
Weeks 4-5: Discriminative Models: statistical learning theory; perceptron; linear regression; minimum classification error; support vector machines (SVM); ridge regression; LASSO; compressed sensing
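Among the models listed, ridge regression has a closed-form solution, w = (XᵀX + λI)⁻¹Xᵀy, that is easy to sketch (a toy example with made-up data; the regularization strength and the omission of a bias term are illustrative choices):

```python
import numpy as np

def ridge(X, y, lam=1.0):
    """Closed-form ridge regression: solve (X^T X + lam*I) w = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=50)  # targets with slight noise

w = ridge(X, y, lam=0.1)
print(np.round(w, 2))  # close to w_true, slightly shrunk toward zero
```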
Week 6: Artificial Neural Networks and Deep Learning: artificial neural networks (model structure and learning criteria); error back-propagation algorithm; fine-tuning tricks; convolutional neural networks; recurrent neural networks. (Extra reading assignment: other gradient descent algorithms for deep learning)
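The error back-propagation algorithm above can be sketched for a tiny two-layer network, with the analytic gradient verified against a finite-difference check (all sizes and data are made up; this is an illustration, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(16, 3))             # toy batch of inputs
y = rng.normal(size=(16, 1))             # toy regression targets
W1 = rng.normal(scale=0.5, size=(3, 8))  # input-to-hidden weights
W2 = rng.normal(scale=0.5, size=(8, 1))  # hidden-to-output weights

def loss(W1, W2):
    h = np.tanh(X @ W1)       # hidden activations
    err = h @ W2 - y          # residuals
    return (err ** 2).mean()  # mean squared error

# Forward pass, then backward pass via the chain rule.
h = np.tanh(X @ W1)
err = h @ W2 - y
g_pred = 2 * err / len(X)                          # dL/d(prediction)
g_W2 = h.T @ g_pred                                # dL/dW2
g_W1 = X.T @ ((g_pred @ W2.T) * (1 - h ** 2))      # dL/dW1; tanh' = 1 - tanh^2

# Finite-difference check of one entry of dL/dW1.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
numeric = (loss(W1p, W2) - loss(W1, W2)) / eps
print(abs(numeric - g_W1[0, 0]) < 1e-4)  # True
```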
Week 7: Bayesian Decision Rule: MAP decision rule; the Bayes error; statistical data modelling; generative vs. discriminative models
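The MAP decision rule picks the class maximizing the posterior, p(y|x) ∝ p(x|y)p(y). A 1-D two-class Gaussian sketch (the means, variances, and priors are made-up numbers):

```python
import math

def gaussian_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def map_decide(x, priors, params):
    """Return the class index maximizing p(x|y) * p(y)."""
    scores = [p * gaussian_pdf(x, mu, s) for p, (mu, s) in zip(priors, params)]
    return max(range(len(scores)), key=scores.__getitem__)

priors = [0.5, 0.5]
params = [(-1.0, 1.0), (1.0, 1.0)]  # (mean, std) per class

# With equal priors and variances, the boundary sits at x = 0.
print(map_decide(-2.0, priors, params))  # 0
print(map_decide(0.5, priors, params))   # 1
```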
Week 8: Generative Models and Parameter Estimation: generative models in general; maximum likelihood estimation; Bayesian learning; Gaussian, logistic regression, multinomial, Markov chains, GMM.
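For the simplest of the listed models, the Gaussian, maximum likelihood estimation has a closed form: the MLE of the mean is the sample mean, and the MLE of the variance is the sample variance with a 1/N divisor (a toy sketch on synthetic data):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=2.0, size=10_000)  # samples from N(3, 4)

mu_hat = x.mean()                     # MLE of the mean
var_hat = ((x - mu_hat) ** 2).mean()  # MLE of the variance (divides by N)

print(round(mu_hat, 1), round(var_hat, 1))  # close to 3.0 and 4.0
```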
Week 9: Graphical Models: Bayesian networks vs. Markov random fields; conditional independence; inference in graphical models; belief propagation; variational inference