lecture_notes
  * Week 1: background introduction;
  * Week 2: Math foundation: probabilities;
  * Week 3: Bayesian
  * {{:cse6328-f12-w4.pdf|Week 4}}: Generative Models; model estimation; maximum likelihood, EM algorithm; multivariate Gaussian, Gaussian mixture model, Multinomial,
  * {{:cse6328-f12-w5.pdf|Week 5}}
  * Week 8: Automatic Speech Recognition (ASR) (I): ASR introduction;
  * Week 9: Automatic Speech Recognition (ASR) (II): Language Modelling (LM); N-gram models: smoothing, learning, perplexity, class-based (a short illustrative perplexity sketch follows this list).
  * Week 10: Automatic Speech Recognition (ASR) (III): Search - why search; search space in n-gram LM; Viterbi decoding in a large HMM; beam search; tree-based lexicon; dynamic decoding; static decoding; weighted finite state transducer (WFST).
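The Week 9 topics (n-gram language models and perplexity) are concrete enough that a tiny worked example may help. The following is a minimal sketch only, assuming a toy one-sentence training corpus and add-one smoothing; it is not taken from the lecture slides, and the corpus, smoothing choice, and all variable names are invented purely for illustration.

<code python>
# Illustrative sketch (not from the course slides): perplexity of a bigram
# language model with add-one (Laplace) smoothing on a toy corpus.
import math
from collections import Counter

train = "the cat sat on the mat".split()
test  = "the cat sat on the hat".split()

vocab = set(train) | set(test)
V = len(vocab)                               # vocabulary size used for smoothing

unigram_counts = Counter(train)
bigram_counts  = Counter(zip(train, train[1:]))

def p_bigram(w1, w2):
    # Add-one smoothing: P(w2 | w1) = (c(w1, w2) + 1) / (c(w1) + V)
    return (bigram_counts[(w1, w2)] + 1) / (unigram_counts[w1] + V)

# Perplexity over the N bigram predictions in the test sentence:
# PP = 2 ** ( -(1/N) * sum_i log2 P(w_i | w_{i-1}) )
log2_prob = sum(math.log2(p_bigram(w1, w2)) for w1, w2 in zip(test, test[1:]))
N = len(test) - 1
print("bigram perplexity:", round(2 ** (-log2_prob / N), 2))
</code>

Lower perplexity means the model assigns higher probability to the held-out text; practical language models use the better smoothing methods discussed in Week 9 rather than add-one.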