High-density discrete HMM with the use of scalar quantization indexing

Brian Mak, Siu Kei Au Yeung, Yiu Pong Lai, Manhung Siu

Research output: Contribution to conference › Paper › peer-review


Abstract

With advances in semiconductor memory and the availability of very large speech corpora (hundreds to thousands of hours of speech), we would like to revisit the use of the discrete hidden Markov model (DHMM) in automatic speech recognition. To estimate the discrete density in a DHMM state, the acoustic space is divided into bins and one simply counts the relative number of observations falling into each bin. With a very large speech corpus, we believe the number of bins can be greatly increased to obtain a much higher density than before; we call the new model the high-density discrete hidden Markov model (HDDHMM). Our HDDHMM differs from the traditional DHMM in two aspects: first, the codebook has a size in the thousands or even tens of thousands; second, we propose a method based on scalar quantization indexing so that, for a d-dimensional acoustic vector, the discrete codeword can be determined in O(d) time. During recognition, the state probability computation is reduced to an O(1) table lookup. The new HDDHMM was tested on WSJ0 with a 5K vocabulary. Compared with a baseline 4-stream continuous-density HMM system with a WER of 9.71%, a 4-stream HDDHMM system converted from it achieves a WER of 11.60%, with no distance or Gaussian computation.
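As a rough illustration of the indexing idea described in the abstract, the Python sketch below quantizes each dimension of an acoustic vector independently with a uniform scalar quantizer and packs the d scalar bin indices into a single codeword by mixed-radix combination, so the codeword is found in O(d) time and the state probability becomes one table lookup. The uniform-bin assumption and all names here (sq_index, lo, width, bins_per_dim, state_logprob) are illustrative only; the paper's actual quantizer design and index map may differ.

def sq_index(x, lo, width, bins_per_dim):
    # Quantize each of the d dimensions with its own uniform scalar
    # quantizer (O(1) per dimension), then pack the d scalar bin indices
    # into one discrete codeword by mixed-radix (base-B) packing: O(d).
    idx = 0
    for j in range(len(x)):
        b = int((x[j] - lo[j]) / width[j])     # scalar bin in dimension j
        b = min(max(b, 0), bins_per_dim - 1)   # clamp out-of-range values
        idx = idx * bins_per_dim + b
    return idx

# During recognition, the state output probability then reduces to a
# single O(1) table lookup on the precomputed codeword, e.g.:
#   logp = state_logprob[state][sq_index(x_t, lo, width, bins_per_dim)]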

Original language: English
Pages: 2121-2124
Number of pages: 4
Publication status: Published - 2005
Externally published: Yes
Event: 9th European Conference on Speech Communication and Technology - Lisbon, Portugal
Duration: 4 Sept 2005 - 8 Sept 2005

Conference

Conference: 9th European Conference on Speech Communication and Technology
Country/Territory: Portugal
City: Lisbon
Period: 4/09/05 - 8/09/05
