Publication Details
Recurrent Neural Network based Language Modeling in Meeting Recognition
Mikolov Tomáš, Ing. (DCGM FIT BUT)
Karafiát Martin, Ing., Ph.D. (DCGM FIT BUT)
Burget Lukáš, doc. Ing., Ph.D. (DCGM FIT BUT)
automatic speech recognition, language modeling, recurrent neural networks, rescoring, adaptation
In this paper we recommend the use of RNN language models as an easy means of improving an existing LVCSR system, either by improving n-gram models using data sampled from an RNN, or by performing the proposed rescoring and adaptation post-processing steps.
We use recurrent neural network (RNN) based language models to improve the BUT English meeting recognizer. On the baseline setup using the original language models, we decrease word error rate (WER) by more than 1% absolute through n-best list rescoring and language model adaptation. When n-gram language models are trained on the same moderately sized data set as the RNN models, the improvements are higher, yielding a system that performs comparably to the baseline. A noticeable improvement was observed with unsupervised adaptation of the RNN models. Furthermore, we examine the influence of word history on WER and show how to speed up rescoring by caching common prefix strings.
@INPROCEEDINGS{FITPUB9760,
  author    = "Stefan Kombrink and Tom\'{a}\v{s} Mikolov and Martin Karafi\'{a}t and Luk\'{a}\v{s} Burget",
  title     = "Recurrent Neural Network based Language Modeling in Meeting Recognition",
  pages     = "2877--2880",
  booktitle = "Proceedings of Interspeech 2011",
  journal   = "Proceedings of Interspeech - on-line",
  volume    = 2011,
  number    = 8,
  year      = 2011,
  location  = "Florence, IT",
  publisher = "International Speech Communication Association",
  ISBN      = "978-1-61839-270-1",
  ISSN      = "1990-9772",
  language  = "english",
  url       = "https://www.fit.vut.cz/research/publication/9760"
}