course_outline
The course outline is a guide to the topics that will be discussed in the course, and when they will be discussed:
===== Weeks 1 and 2 =====
**Machine Learning** (basic concepts)

**Math Background**
===== Week 3 =====
**Data, Feature and Model**
===== Week 4 =====
**Machine Learning (ML) Basics**
===== Week 5 =====
**Generative Models**
===== Week 6 =====
**Discriminative Models**
===== Week 7 =====
**Discriminative Models**
===== Week 8 =====
**Generative Models**
===== Week 9 =====
**Generative Models**
===== Week 10 =====
**Selected Advanced Topics**: Hidden Markov Model (HMM); HMM concepts; three algorithms: forward-backward, Viterbi decoding, Baum-Welch learning.
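The three HMM algorithms named above share the same trellis recursion over time steps. As a minimal illustration (not course material), here is a sketch of the forward algorithm on a made-up two-state HMM; the state names, probabilities, and the `forward` helper are all illustrative assumptions:

```python
import numpy as np

# Hypothetical 2-state HMM (all numbers are illustrative):
# states: 0 = Rainy, 1 = Sunny; observation symbols indexed 0..1
pi = np.array([0.6, 0.4])          # initial state distribution
A = np.array([[0.7, 0.3],          # A[i, j] = P(state j at t+1 | state i at t)
              [0.4, 0.6]])
B = np.array([[0.1, 0.9],          # B[i, k] = P(observation k | state i)
              [0.6, 0.4]])

def forward(obs):
    """Forward algorithm: P(observation sequence), summed over all state paths."""
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # alpha_{t+1}(j) = sum_i alpha_t(i) * a_ij * b_j(o)
    return alpha.sum()

print(forward([0, 1, 0]))  # likelihood of the observation sequence 0, 1, 0
```

The Viterbi algorithm replaces the sum over previous states with a max (plus back-pointers), and Baum-Welch wraps the forward and backward passes inside an EM loop to re-estimate `pi`, `A`, and `B`.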
===== Weeks 11-12 =====
student presentation
course_outline · Last modified: 2016/09/12 21:11 by hj