course_outline

Differences

This shows you the differences between two versions of the page.

course_outline [2016/08/29 19:19] hj
course_outline [2016/09/12 21:11] (current) hj
Line 4:
  
 ===== Weeks 1 and 2 =====
+
+**Machine Learning** (basic concepts)
  
 **Math Background**: probabilities; Bayes theorem; statistics; estimation; regression; hypothesis testing; entropy; mutual information; decision tree; optimization theory and convex optimization; matrix decomposition
Line 32/34:
 **Generative Models (II)**: EM algorithm; Finite Mixture models, e.g. Gaussian mixture model
  
-===== Week 9 =====
+===== Week 9 =====
  
-**Generative Models (III)**:
+**Generative Models (III)**: graphical models, directed vs. undirected graphical models, exact inference, approximate inference (loopy belief propagation, variational inference, Monte Carlo methods)
  
 ===== Week 10 =====
  
-**Hidden Markov Model (HMM)**: HMM vs. Markov chains; HMM concepts; three algorithms: forward-backward; Viterbi decoding; Baum-Welch learning.
+**Selected Advanced Topics**: Hidden Markov Model (HMM): HMM concepts; three algorithms: forward-backward; Viterbi decoding; Baum-Welch learning.
    
- +===== Weeks 11-12 =====
- +
-midterm presentation +
- +
-===== Week 8 ===== +
- +
-**Automatic Speech Recognition (ASR) (I)**: Acoustic and Language Modeling:  +
-HMM for ASR;  ASR as an example of pattern classification; +
-Acoustic modeling: HMM learning (ML, MAP, DT); parameter tying (decision tree based state tying); n-gram models: smoothing, learning, perplexity, class-based. +
- +
-===== Week 9 ===== +
- +
-**Automatic Speech Recognition (ASR) (II)**: Search - why search; Viterbi decoding in a large HMM; beam search; tree-based lexicon; dynamic decoding; static decoding; weighted finite state transducer (WFST) +
- +
-===== Week 10 ===== +
- +
-**Spoken Language Processing (I)**: text categorization +
-classify text documents: call/email routing, topic detection, etc. +
-vector-based approach, Naïve Bayes classifier; Bayesian networks, etc. +
-(2) HMM applications: Statistical Part-of-Speech (POS) tagging; +
- Language understanding: hidden concept models. +
- +
-===== Week 11 ===== +
- +
-**Spoken Language Processing (II)**: statistical machine translation +
- IBM’s models for machine translation: lexicon model, alignment model, language model +
- training process, generation & search +
- +
- +
-===== Week 12 =====+
  
 student presentation
course_outline.1472498372.txt.gz · Last modified: 2016/08/29 19:19 by hj