lecture_notes

  
  * {{:slp-f12-w1.pdf|Week 1}}: background introduction; speech and spoken language (Weekly Reading: [[http://www.cse.yorku.ca/course_archive/2011-12/W/6328/Makhoul_BBN.pdf|W1]])
  
  * {{:cse6328-f12-w2.pdf|Week 2}}: Math foundation: probabilities; Bayes theorem; statistics; entropy; mutual information; decision tree; optimization (Weekly Reading: [[http://www.cse.yorku.ca/course_archive/2011-12/W/6328/Duda_AppMath.pdf|W2]]) (A useful online manual on [[http://www.ee.ic.ac.uk/hp/staff/dmb/matrix/calculus.html|Matrix Calculus]])
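
The entropy and mutual-information definitions covered this week can be sketched in a few lines of Python. This is an illustrative snippet, not part of the course materials; the function names and toy distributions are invented for the example:

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ].

    `joint` is a 2-D list with joint[x][y] = p(x, y).
    """
    px = [sum(row) for row in joint]                 # marginal p(x)
    py = [sum(col) for col in zip(*joint)]           # marginal p(y)
    mi = 0.0
    for x, row in enumerate(joint):
        for y, pxy in enumerate(row):
            if pxy > 0:
                mi += pxy * math.log2(pxy / (px[x] * py[y]))
    return mi

# A fair coin has exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))                               # 1.0
# If Y is a copy of X, I(X;Y) = H(X) = 1 bit.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))      # 1.0
# If X and Y are independent, I(X;Y) = 0.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```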
  
  * {{:cse6328-f12-w3.pdf|Week 3}}: Bayesian decision theory; Discriminative Models: linear discriminant functions, support vector machine (SVM) (Weekly Reading: [[http://www.cse.yorku.ca/course_archive/2011-12/W/6328/Bayes_decision_rule.pdf|W3_1]] (part I), {{:svmtutorial.pdf|W3_2}} (part II), {{:nn-bishop.pdf|W3_3}} (part III))
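
The minimum-error-rate Bayes decision rule from this week reduces to picking the class that maximizes prior times likelihood. Below is a toy Python sketch; the class names and Gaussian parameters are invented for illustration and are not from the lecture slides:

```python
import math

def bayes_decide(x, priors, likelihoods):
    """Minimum-error-rate Bayes decision rule: argmax_c P(c) * p(x | c)."""
    return max(priors, key=lambda c: priors[c] * likelihoods[c](x))

def gaussian(mu, var):
    """Return a 1-D Gaussian density function with mean mu and variance var."""
    return lambda x: math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical two-class problem: equal priors, equal-variance Gaussians
# centred at -1 and +1, so the decision boundary sits at x = 0.
priors = {"A": 0.5, "B": 0.5}
like = {"A": gaussian(-1.0, 1.0), "B": gaussian(+1.0, 1.0)}
print(bayes_decide(-0.3, priors, like))  # "A" (left of the boundary)
print(bayes_decide(0.8, priors, like))   # "B" (right of the boundary)
```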
  
  * {{:cse6328-f12-w4.pdf|Week 4}}: Generative Models; model estimation; maximum likelihood; EM algorithm; multivariate Gaussian, Gaussian mixture model, multinomial, Markov chain model (Weekly Reading: [[http://www.cse.yorku.ca/course_archive/2011-12/W/6328/EM_tutorial.pdf|W4]])
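
The EM algorithm for a Gaussian mixture model can be sketched for the 1-D, two-component case. This is a minimal illustration, not the course's code; the initialisation scheme and synthetic data are assumptions made for the example:

```python
import math, random

def em_gmm_1d(data, n_iter=50):
    """EM for a two-component 1-D Gaussian mixture; returns (weights, means, variances)."""
    m = sum(data) / len(data)
    w, mu, var = [0.5, 0.5], [m - 1.0, m + 1.0], [1.0, 1.0]  # crude init around the mean
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means, and variances from responsibilities.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # variance floor to avoid collapse
    return w, mu, var

# Synthetic data: two well-separated clusters at -2 and 3.
random.seed(0)
data = ([random.gauss(-2.0, 0.5) for _ in range(200)]
        + [random.gauss(3.0, 0.5) for _ in range(200)])
w, mu, var = em_gmm_1d(data)
print(sorted(mu))  # estimated means, near -2 and 3
```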
  
  * {{:cse6328-f12-w5.pdf|Week 5}}: Discriminative Learning; Bayesian Learning; Pattern Verification
  
  * {{:cse6328-f12-w6.pdf|Week 6}}: Hidden Markov Model (HMM): HMM vs. Markov chains; HMM concepts; three algorithms (forward-backward, Viterbi decoding, Baum-Welch learning) (Weekly Reading: {{:rabiner1_hmm_asr_part1.pdf|W6}}; the full version of this paper is not required but is {{:rabiner1_hmm_asr.pdf|here}} for your reference)
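
Of the three HMM algorithms this week, Viterbi decoding is the easiest to sketch. Below is a small Python version run on the classic weather/activity toy HMM; the state names and probabilities are the standard textbook illustration, not course data:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Viterbi decoding: most likely state sequence for an observation sequence.

    start_p[s], trans_p[s][s2], and emit_p[s][o] are plain dicts of probabilities.
    """
    # delta[t][s] = probability of the best path ending in state s at time t
    delta = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    backptr = [{}]
    for t in range(1, len(obs)):
        delta.append({})
        backptr.append({})
        for s in states:
            prev = max(states, key=lambda sp: delta[t - 1][sp] * trans_p[sp][s])
            delta[t][s] = delta[t - 1][prev] * trans_p[prev][s] * emit_p[s][obs[t]]
            backptr[t][s] = prev
    # Backtrace from the best final state.
    last = max(states, key=lambda s: delta[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(backptr[t][path[-1]])
    return list(reversed(path))

# Toy HMM: hidden weather states, observed activities.
states = ["Rainy", "Sunny"]
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(viterbi(["walk", "shop", "clean"], states, start, trans, emit))
# ['Sunny', 'Rainy', 'Rainy']
```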
  
  * {{:cse6328-f12-w8.pdf|Week 8}}: Automatic Speech Recognition (ASR) (I): ASR introduction; ASR as an example of pattern classification; acoustic modeling: parameter tying (decision-tree-based state tying) (Weekly Reading: {{:htkbook31_part1.pdf|W8}})
  
  * {{:cse6328_f12_w9.pdf|Week 9}}: Automatic Speech Recognition (ASR) (II): Language Modelling (LM); N-gram models: smoothing, learning, perplexity, class-based.
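
An N-gram model with the simplest smoothing scheme (add-one/Laplace) and its perplexity can be sketched as follows. The tiny corpus and function names are invented for illustration; real LMs use larger corpora and better smoothing (e.g. Kneser-Ney):

```python
import math
from collections import Counter

def train_bigram(corpus):
    """Add-one (Laplace) smoothed bigram model from a list of tokenised sentences."""
    unigrams, bigrams, vocab = Counter(), Counter(), set()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        vocab.update(toks)
        unigrams.update(toks[:-1])            # context counts (</s> is never a context)
        bigrams.update(zip(toks, toks[1:]))
    V = len(vocab)
    def prob(w_prev, w):
        # P(w | w_prev) = (c(w_prev, w) + 1) / (c(w_prev) + V)
        return (bigrams[(w_prev, w)] + 1) / (unigrams[w_prev] + V)
    return prob

def perplexity(prob, corpus):
    """PP = 2^(-(1/N) * sum_i log2 P(w_i | w_{i-1})) over all transitions."""
    log_sum, n = 0.0, 0
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        for w_prev, w in zip(toks, toks[1:]):
            log_sum += math.log2(prob(w_prev, w))
            n += 1
    return 2 ** (-log_sum / n)

train = [["the", "cat", "sat"], ["the", "dog", "sat"]]
prob = train_bigram(train)
print(perplexity(prob, train))  # low on the training data itself
```

Lower perplexity means the model is less "surprised" by the data; evaluating on held-out text rather than the training set is what matters in practice.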

  * {{:cse6328_f12_w10.pdf|Week 10}}: Automatic Speech Recognition (ASR) (III): Search - why search; search space in n-gram LM; Viterbi decoding in a large HMM; beam search; tree-based lexicon; dynamic decoding; static decoding; weighted finite state transducer (WFST) ({{:wfst-tutorial.pdf|Additional slides for WFST}})
lecture_notes.txt · Last modified: 2012/11/19 16:00 by hj