Lecture Notes
  • Week 1: Background and introduction; speech and spoken language (Weekly Reading: W1)
  • Week 2: Math foundations: probabilities; Bayes' theorem; statistics; entropy; mutual information; decision trees; optimization (Weekly Reading: W2) (A useful online manual on Matrix Calculus) (a worked Bayes' theorem example appears after this list)
  • Week 3: Bayesian decision theory; discriminative models: linear discriminant functions, support vector machines (SVM) (Weekly Reading: W3_1 (Part I), W3_2 (Part II), W3_3 (Part III))
  • Week 4: Generative models; model estimation; maximum likelihood; the EM algorithm; multivariate Gaussian, Gaussian mixture, multinomial, and Markov chain models (Weekly Reading: W4) (an EM sketch for a Gaussian mixture appears after this list)
  • Week 5: Discriminative learning; Bayesian learning; pattern verification
  • Week 6: Hidden Markov Models (HMM): HMMs vs. Markov chains; HMM concepts; the three classic algorithms (forward-backward, Viterbi decoding, Baum-Welch learning) (Weekly Reading: W6; the full version of this paper is not required but is provided for reference) (a Viterbi decoding sketch appears after this list)
  • Week 8: Automatic Speech Recognition (ASR) (I): ASR introduction; ASR as an example of pattern classification; acoustic modeling: parameter tying (decision-tree-based state tying) (Weekly Reading: W8)
  • Week 9: Automatic Speech Recognition (ASR) (II): Language modeling (LM); N-gram models: smoothing, learning, perplexity, class-based models (a bigram perplexity sketch appears after this list)
  • Week 10: Automatic Speech Recognition (ASR) (III): Search: why search is needed; the search space under an n-gram LM; Viterbi decoding in a large HMM; beam search; tree-based lexicons; dynamic decoding; static decoding; weighted finite-state transducers (WFST) (Additional slides for WFST)
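
A minimal worked example of Bayes' theorem from Week 2, phrased as the classic diagnostic-test calculation; all the rates below are made-up numbers for illustration, not course data:

<code python>
# Bayes' theorem: posterior P(disease | positive test).
# All rates are invented illustration numbers.
p_d = 0.01          # prior P(disease)
p_pos_d = 0.95      # likelihood P(positive | disease)
p_pos_nd = 0.05     # false-positive rate P(positive | no disease)

p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)  # total probability
posterior = p_pos_d * p_d / p_pos             # Bayes' theorem
print(posterior)  # about 0.16: a positive test alone is weak evidence
</code>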
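
A minimal sketch of the EM algorithm for a one-dimensional Gaussian mixture model from Week 4. The component count, initialization, and toy data are arbitrary choices for illustration, not prescribed by the course:

<code python>
import numpy as np

def em_gmm_1d(x, K=2, iters=50):
    """EM for a 1-D Gaussian mixture: alternate E- and M-steps."""
    n = len(x)
    w = np.full(K, 1.0 / K)                        # mixture weights
    mu = np.percentile(x, np.linspace(25, 75, K))  # spread-out initial means
    var = np.full(K, x.var())                      # initial variances
    for _ in range(iters):
        # E-step: responsibility r[i, k] = P(component k | x_i).
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the weighted counts.
        nk = r.sum(axis=0)
        w = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Two well-separated clusters; EM should recover means near 0 and 5.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 200), rng.normal(5, 1, 300)])
print(em_gmm_1d(x))
</code>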
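
A minimal sketch of Viterbi decoding for a discrete-observation HMM (Week 6; the same algorithm reappears in the Week 10 search material). The toy transition and emission values are illustrative only:

<code python>
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely state sequence for `obs` under an HMM (log-space)."""
    N, T = len(pi), len(obs)
    logA, logB = np.log(A), np.log(B)
    delta = np.log(pi) + logB[:, obs[0]]  # best log-prob ending in each state
    psi = np.zeros((T, N), dtype=int)     # backpointers
    for t in range(1, T):
        scores = delta[:, None] + logA    # scores[i, j]: from state i to j
        psi[t] = scores.argmax(axis=0)
        delta = scores.max(axis=0) + logB[:, obs[t]]
    path = [int(delta.argmax())]          # backtrace from the best final state
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Toy 2-state, 2-symbol HMM.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi(pi, A, B, [0, 0, 1, 1, 0]))
</code>

Working in log space avoids the numerical underflow that products of many small probabilities would cause on long observation sequences.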
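
A minimal sketch of a bigram language model with add-one (Laplace) smoothing and its perplexity on held-out text (Week 9). The tiny corpus is invented for illustration; real systems use the more refined smoothing methods covered in the lecture:

<code python>
import math
from collections import Counter

def train_bigram(tokens):
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    return unigrams, bigrams, len(set(tokens))  # V = vocabulary size

def bigram_prob(w1, w2, unigrams, bigrams, V):
    # Add-one smoothing: every bigram gets a pseudo-count of 1.
    return (bigrams[(w1, w2)] + 1) / (unigrams[w1] + V)

def perplexity(tokens, unigrams, bigrams, V):
    logp = sum(math.log(bigram_prob(w1, w2, unigrams, bigrams, V))
               for w1, w2 in zip(tokens, tokens[1:]))
    return math.exp(-logp / (len(tokens) - 1))  # per-bigram perplexity

train = "the cat sat on the mat the cat ate".split()
unigrams, bigrams, V = train_bigram(train)
print(perplexity("the cat sat on the mat".split(), unigrams, bigrams, V))
</code>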