Lecture Notes

  * {{:cse6328-f12-w3.pdf|Week 3}}: Bayesian decision theory; Discriminative Models: linear discriminant functions, support vector machine (SVM) (Weekly Reading: [[http://www.cse.yorku.ca/course_archive/2011-12/W/6328/Bayes_decision_rule.pdf|W3_1]] (part I), {{:svmtutorial.pdf|W3_2}} (part II), {{:nn-bishop.pdf|W3_3}} (part III)) (see the linear SVM sketch after this list)
  
  * {{:cse6328-f12-w4.pdf|Week 4}}: Generative Models; model estimation: maximum likelihood, EM algorithm; multivariate Gaussian, Gaussian mixture model, multinomial, Markov chain model (Weekly Reading: [[http://www.cse.yorku.ca/course_archive/2011-12/W/6328/EM_tutorial.pdf|W4]]) (see the EM sketch after this list)
  
  * {{:cse6328-f12-w5.pdf|Week 5}}: Discriminative Learning; Bayesian Learning; Pattern Verification
  
  * {{:cse6328-f12-w6.pdf|Week 6}}: Hidden Markov Model (HMM): HMM vs. Markov chains; HMM concepts; three algorithms (forward-backward, Viterbi decoding, Baum-Welch learning) (Weekly Reading: {{:rabiner1_hmm_asr_part1.pdf|W6}}; the full version of this paper is not required but is {{:rabiner1_hmm_asr.pdf|here}} for your reference) (see the Viterbi sketch after this list)
  
  * {{:cse6328-f12-w8.pdf|Week 8}}: Automatic Speech Recognition (ASR) (I): ASR introduction; ASR as an example of pattern classification; acoustic modeling: parameter tying (decision-tree-based state tying) (Weekly Reading: {{:htkbook31_part1.pdf|W8}})
  
  * {{:cse6328_f12_w9.pdf|Week 9}}: Automatic Speech Recognition (ASR) (II): Language Modelling (LM); N-gram models: smoothing, learning, perplexity, class-based models (see the bigram perplexity sketch after this list)
  
  * Week 10: Automatic Speech Recognition (ASR) (III): Search: why search; search space in an n-gram LM; Viterbi decoding in a large HMM; beam search; tree-based lexicon; dynamic decoding; static decoding; weighted finite state transducer (WFST) (Additional slides for WFST)
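
The short Python sketches below accompany the algorithm-heavy weeks. They are illustrative only: the function names, hyperparameters, and toy data are assumptions made for this page, not material from the course slides or readings.

For Week 3, a minimal linear SVM trained with a Pegasos-style stochastic subgradient method on the regularized hinge loss; this is one standard way to fit a linear SVM, not necessarily the formulation used in lecture:

<code python>
# Linear SVM via Pegasos-style stochastic subgradient descent on the
# regularized hinge loss (illustrative sketch; labels must be +1/-1).
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    rng = np.random.default_rng(seed)
    w, b, t = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)            # decaying step size
            if y[i] * (X[i] @ w + b) < 1:    # margin violated: hinge term active
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:                            # only the regularizer contributes
                w = (1 - eta * lam) * w
    return w, b

# Toy usage on a linearly separable set.
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, b = train_linear_svm(X, y)
print(np.sign(X @ w + b))                    # expected: [ 1.  1. -1. -1.]
</code>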
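
For Week 4, a minimal EM loop for a one-dimensional Gaussian mixture model; the initialization scheme, iteration count, and toy data are arbitrary choices for illustration:

<code python>
# EM for a k-component 1-D Gaussian mixture (illustrative sketch only).
import numpy as np

def em_gmm(x, k=2, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    w = np.full(k, 1.0 / k)                    # mixture weights
    mu = rng.choice(x, size=k, replace=False)  # means initialized from data
    var = np.full(k, x.var())                  # variances start at global variance
    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Toy usage: two well-separated clusters.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 1.0, 300)])
print(em_gmm(x))
</code>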
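
For Week 6, one of the three HMM algorithms: Viterbi decoding for a discrete-observation HMM, done in log space to avoid numerical underflow; the toy model parameters at the bottom are invented:

<code python>
# Viterbi decoding for a discrete-observation HMM (illustrative sketch only).
import numpy as np

def viterbi(obs, pi, A, B):
    """obs: observation indices; pi: initial probs (N,);
    A: transitions (N, N); B: emissions (N, M)."""
    N, T = len(pi), len(obs)
    with np.errstate(divide="ignore"):           # allow log(0) = -inf
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    delta = np.zeros((T, N))                     # best log-score ending in each state
    psi = np.zeros((T, N), dtype=int)            # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # scores[i, j]: transition i -> j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(delta[-1].argmax())]             # backtrace the best state sequence
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

# Toy usage: 2 hidden states, 3 observation symbols (parameters invented).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2, 2], pi, A, B))
</code>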
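
For Week 9, a bigram model with add-one (Laplace) smoothing and its perplexity on held-out text. This is a deliberately minimal sketch: real LMs handle unknown words and use stronger smoothing (e.g. Kneser-Ney), and the tiny corpus here is made up:

<code python>
# Bigram LM with add-one smoothing and perplexity (illustrative sketch only).
import math
from collections import Counter

def train_bigram(tokens):
    # Unigram counts (for conditioning contexts) and bigram counts.
    return Counter(tokens), Counter(zip(tokens, tokens[1:]))

def perplexity(tokens, unigrams, bigrams, vocab_size):
    log_prob, n = 0.0, 0
    for prev, cur in zip(tokens, tokens[1:]):
        # Add-one (Laplace) smoothing: unseen bigrams still get nonzero mass.
        p = (bigrams[(prev, cur)] + 1) / (unigrams[prev] + vocab_size)
        log_prob += math.log(p)
        n += 1
    return math.exp(-log_prob / n)               # perplexity = exp(-avg log-prob)

train = "the cat sat on the mat".split()
test = "the cat sat on the cat".split()
uni, bi = train_bigram(train)
print(perplexity(test, uni, bi, vocab_size=len(uni)))
</code>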