Matches in SemOpenAlex for { <https://semopenalex.org/work/W2461497102> ?p ?o ?g. }
Showing items 1 to 63 of 63, with 100 items per page.
- W2461497102 abstract "This work considers the problem of Automatic Speech Recognition, i.e., converting an acoustic speech signal into its text representation. To this end, a modular, research-oriented speech recognition system was built in Java. Special focus was given to language modeling, efficient search (decoding) algorithms, and the implementation of a large vocabulary decoder. Language modeling is considered first. The common point of all large vocabulary continuous speech recognition systems is the use of an n-gram language model, in which the probability of the next word is assumed to depend only on the previous n-1 words. For the usual trigram model and typical vocabularies of over 20,000 words, most contexts are never observed and accurate probability estimation is difficult. To deal with this problem, many smoothing techniques have been proposed, and their performance is compared experimentally. In this work we analyze smoothing algorithms from a universal-compression perspective. We show that universal compression bounds can explain the empirical performance of several smoothing methods. We also describe a new Interpolated Additive smoothing algorithm and show that it has lower training complexity and better compression performance than existing smoothing techniques on Wall Street Journal data. We then consider efficient search strategies for finding the most likely transcription of an utterance. The task of a decoder is to find the most likely sequence of words among all possible sequences, combining all available knowledge sources: acoustic, pronunciation, and language models. Dynamic programming can save search effort for sequences sharing the same prefix, but the number of possible paths is still enormous, and most of the lower-scoring paths must be discarded efficiently. A simple pruning algorithm, which we term Estimated Rank Pruning, is presented, and its advantages over traditional rank-pruning approaches such as histogram pruning are demonstrated. Finally, traditional approaches to decoder design are addressed, and a new architecture with different data structures for handling the inherently large language and pronunciation models is proposed. It combines efficient algorithms for traversing the active hypotheses with smaller data structures for describing their contextual information. The new decoder architecture was also evaluated on WSJ data." @default.
- W2461497102 created "2016-07-22" @default.
- W2461497102 creator A5012034611 @default.
- W2461497102 creator A5039224734 @default.
- W2461497102 date "2005-01-01" @default.
- W2461497102 modified "2023-09-23" @default.
- W2461497102 title "Estimation and modeling techniques for speech recognition" @default.
- W2461497102 hasPublicationYear "2005" @default.
- W2461497102 type Work @default.
- W2461497102 sameAs 2461497102 @default.
- W2461497102 citedByCount "0" @default.
- W2461497102 crossrefType "journal-article" @default.
- W2461497102 hasAuthorship W2461497102A5012034611 @default.
- W2461497102 hasAuthorship W2461497102A5039224734 @default.
- W2461497102 hasConcept C137293760 @default.
- W2461497102 hasConcept C137546455 @default.
- W2461497102 hasConcept C138885662 @default.
- W2461497102 hasConcept C154945302 @default.
- W2461497102 hasConcept C23224414 @default.
- W2461497102 hasConcept C2777601683 @default.
- W2461497102 hasConcept C28490314 @default.
- W2461497102 hasConcept C31972630 @default.
- W2461497102 hasConcept C3770464 @default.
- W2461497102 hasConcept C41008148 @default.
- W2461497102 hasConcept C41895202 @default.
- W2461497102 hasConcept C90805587 @default.
- W2461497102 hasConceptScore W2461497102C137293760 @default.
- W2461497102 hasConceptScore W2461497102C137546455 @default.
- W2461497102 hasConceptScore W2461497102C138885662 @default.
- W2461497102 hasConceptScore W2461497102C154945302 @default.
- W2461497102 hasConceptScore W2461497102C23224414 @default.
- W2461497102 hasConceptScore W2461497102C2777601683 @default.
- W2461497102 hasConceptScore W2461497102C28490314 @default.
- W2461497102 hasConceptScore W2461497102C31972630 @default.
- W2461497102 hasConceptScore W2461497102C3770464 @default.
- W2461497102 hasConceptScore W2461497102C41008148 @default.
- W2461497102 hasConceptScore W2461497102C41895202 @default.
- W2461497102 hasConceptScore W2461497102C90805587 @default.
- W2461497102 hasOpenAccess W2461497102 @default.
- W2461497102 hasRelatedWork W1975040318 @default.
- W2461497102 hasRelatedWork W2017602591 @default.
- W2461497102 hasRelatedWork W2259271473 @default.
- W2461497102 hasRelatedWork W2268975860 @default.
- W2461497102 hasRelatedWork W2295078202 @default.
- W2461497102 hasRelatedWork W2488962029 @default.
- W2461497102 hasRelatedWork W2573116950 @default.
- W2461497102 hasRelatedWork W2778459472 @default.
- W2461497102 hasRelatedWork W2897462945 @default.
- W2461497102 hasRelatedWork W2905417435 @default.
- W2461497102 hasRelatedWork W2909288227 @default.
- W2461497102 hasRelatedWork W2914277910 @default.
- W2461497102 hasRelatedWork W2928413680 @default.
- W2461497102 hasRelatedWork W2946026089 @default.
- W2461497102 hasRelatedWork W2963972328 @default.
- W2461497102 hasRelatedWork W3091975789 @default.
- W2461497102 hasRelatedWork W3104216863 @default.
- W2461497102 hasRelatedWork W3156639177 @default.
- W2461497102 hasRelatedWork W3171601240 @default.
- W2461497102 hasRelatedWork W3118389281 @default.
- W2461497102 isParatext "false" @default.
- W2461497102 isRetracted "false" @default.
- W2461497102 magId "2461497102" @default.
- W2461497102 workType "article" @default.
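
The abstract above describes the standard n-gram assumption (the probability of the next word depends only on the previous n-1 words) and a new Interpolated Additive smoothing algorithm. The record does not give the exact formulation of that algorithm, so the Java sketch below is only a minimal illustration of a trigram model with add-delta (additive) smoothing interpolated against the bigram estimate; the class and method names (`TrigramModel`, `addSentence`, `probability`) and the parameters `delta`, `lambda`, and `vocabularySize` are assumptions for illustration, not the thesis's actual API.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal illustrative trigram language model: the next word is conditioned
 * only on the previous two words (the Markov assumption described in the
 * abstract). Smoothing here is a simple add-delta estimate interpolated with
 * the lower-order bigram estimate; the thesis's actual Interpolated Additive
 * algorithm may differ in detail.
 */
public class TrigramModel {
    private final Map<String, Integer> trigramCounts = new HashMap<>();
    private final Map<String, Integer> bigramCounts  = new HashMap<>();
    private final Map<String, Integer> unigramCounts = new HashMap<>();

    private final double delta;       // additive constant
    private final double lambda;      // interpolation weight for the trigram term
    private final int vocabularySize; // assumed known (e.g. over 20,000 words, as in the abstract)

    public TrigramModel(double delta, double lambda, int vocabularySize) {
        this.delta = delta;
        this.lambda = lambda;
        this.vocabularySize = vocabularySize;
    }

    /** Accumulate unigram, bigram, and trigram counts from one training sentence. */
    public void addSentence(String[] words) {
        for (int i = 0; i < words.length; i++) {
            unigramCounts.merge(words[i], 1, Integer::sum);
            if (i >= 1) {
                bigramCounts.merge(words[i - 1] + " " + words[i], 1, Integer::sum);
            }
            if (i >= 2) {
                trigramCounts.merge(words[i - 2] + " " + words[i - 1] + " " + words[i], 1, Integer::sum);
            }
        }
    }

    /** P(w | w1 w2): add-delta trigram estimate interpolated with the bigram estimate. */
    public double probability(String w1, String w2, String w) {
        double triNum = trigramCounts.getOrDefault(w1 + " " + w2 + " " + w, 0) + delta;
        double triDen = bigramCounts.getOrDefault(w1 + " " + w2, 0) + delta * vocabularySize;
        double biNum  = bigramCounts.getOrDefault(w2 + " " + w, 0) + delta;
        double biDen  = unigramCounts.getOrDefault(w2, 0) + delta * vocabularySize;
        return lambda * (triNum / triDen) + (1.0 - lambda) * (biNum / biDen);
    }
}
```

The additive terms keep the estimate nonzero for the many contexts that are never observed in training, which is exactly the sparsity problem the abstract motivates smoothing with.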
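The abstract also contrasts the proposed Estimated Rank Pruning with histogram pruning, the common baseline of keeping at most a fixed number of best-scoring active hypotheses per frame. Estimated Rank Pruning itself is not described in this record, so the sketch below only illustrates the histogram-pruning baseline; the names `HistogramPruning`, `Hypothesis`, `prune`, and `maxActive` are assumed for illustration.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

/**
 * Illustrative histogram (max-hypothesis) pruning: after each frame the
 * decoder keeps only the maxActive best-scoring hypotheses and discards the
 * rest. A production decoder typically bins the scores into a histogram to
 * find the cutoff without a full sort; the simple sort here gives the same
 * surviving set. This is the baseline the abstract compares Estimated Rank
 * Pruning against, not the thesis's own algorithm.
 */
public class HistogramPruning {

    /** A decoding hypothesis with its accumulated log score (assumed representation). */
    public record Hypothesis(String partialTranscription, double logScore) {}

    /** Return the maxActive best hypotheses by log score (higher is better). */
    public static List<Hypothesis> prune(List<Hypothesis> active, int maxActive) {
        if (active.size() <= maxActive) {
            return active;
        }
        List<Hypothesis> sorted = new ArrayList<>(active);
        sorted.sort(Comparator.comparingDouble(Hypothesis::logScore).reversed());
        return new ArrayList<>(sorted.subList(0, maxActive));
    }
}
```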