lecture_notes
- Weeks 1-2: A. Machine learning (basic concepts); B. Math foundations: probability, Bayes' theorem, statistics, entropy, mutual information, decision trees, optimization, matrix decomposition. (Weekly reading: W2; a useful online manual on matrix calculus)
- Week 3: Feature extraction in machine learning: PCA, LDA, manifold learning (MDS, SNE, LLE, Isomap), data visualization. (Weekly reading: a tutorial paper on PCA, a paper on ML)
- Week 4: Bayesian Decision Rule: MAP decision rule; The Bayes Error; Statistical Data Modelling; Generative vs. Discriminative models
- Weeks 5-6: Discriminative models: statistical learning theory; Perceptron; linear regression; minimum classification error; Support Vector Machines (SVM); ridge regression; LASSO; compressed sensing
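The entropy and mutual information topics from Weeks 1-2 can be illustrated numerically. The function names below are my own choices for this sketch, not from the course materials:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability entries contribute 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def mutual_information(pxy):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), computed from a joint probability table."""
    pxy = np.asarray(pxy, dtype=float)
    px = pxy.sum(axis=1)   # marginal of X (rows)
    py = pxy.sum(axis=0)   # marginal of Y (columns)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

# A fair coin carries exactly 1 bit of entropy; a uniform joint table
# factorizes into its marginals, so the mutual information is 0.
print(entropy([0.5, 0.5]))                        # 1.0
print(mutual_information([[0.25, 0.25],
                          [0.25, 0.25]]))         # 0.0
```

The identity I(X;Y) = H(X) + H(Y) - H(X,Y) avoids dividing by marginals, so zero-probability cells need no special casing beyond the convention 0 log 0 = 0.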
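PCA from Week 3 reduces to an SVD of the centered data matrix. A minimal sketch (the `pca` helper and the random data are illustrative assumptions, not course code):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components.

    Returns (Z, components): Z is the k-dimensional projection and
    components holds the top-k covariance eigenvectors as rows.
    """
    Xc = X - X.mean(axis=0)                       # center the data
    # Right singular vectors of the centered data are the eigenvectors
    # of the sample covariance matrix.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]
    return Xc @ components.T, components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z, W = pca(X, 2)
print(Z.shape)    # (100, 2)
```

Using the SVD rather than forming the covariance matrix explicitly is the numerically preferred route, since it avoids squaring the condition number of the data.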
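The MAP decision rule from Week 4 picks the class maximizing log p(x|c) + log P(c). A one-dimensional Gaussian sketch, with hypothetical parameters chosen for illustration:

```python
import numpy as np

def map_decision(x, priors, means, variances):
    """MAP rule with 1-D Gaussian class-conditionals:
    return argmax_c  log p(x|c) + log P(c)."""
    x = float(x)
    log_post = [np.log(pc) - 0.5 * np.log(2 * np.pi * v) - (x - m) ** 2 / (2 * v)
                for pc, m, v in zip(priors, means, variances)]
    return int(np.argmax(log_post))

# Two equal-prior classes centered at -1 and +1 with unit variance:
# the decision boundary sits at x = 0.
print(map_decision(0.8, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0]))   # 1
print(map_decision(-0.3, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0]))  # 0
```

With equal priors and equal variances, MAP reduces to nearest-mean classification; unequal priors shift the boundary toward the less probable class.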
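Ridge regression from Weeks 5-6 has a closed-form solution, while LASSO's L1 penalty requires an iterative solver built on soft-thresholding. A sketch with synthetic data (all names and parameters here are illustrative assumptions):

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge regression: w = (X^T X + lam I)^{-1} X^T y."""
    n_features = X.shape[1]
    A = X.T @ X + lam * np.eye(n_features)
    return np.linalg.solve(A, X.T @ y)

def soft_threshold(z, t):
    """Proximal operator of the L1 penalty, the core step of LASSO
    coordinate-descent solvers; it sets small coefficients exactly to 0."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
w_true = np.array([2.0, 0.0, -1.0])
y = X @ w_true + 0.01 * rng.normal(size=50)
print(ridge(X, y, 0.1))   # close to [2, 0, -1]
```

The contrast in the bullet above is visible here: ridge only shrinks coefficients toward zero, whereas the soft-threshold step lets LASSO zero them out entirely, which is what produces sparse solutions.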
lecture_notes.1476903461.txt.gz · Last modified: 2016/10/19 18:57 by hj