Publication Details
How Does Pre-Trained Wav2Vec 2.0 Perform on Domain-Shifted ASR? an Extensive Benchmark on Air Traffic Control Communications
Zuluaga-Gomez Juan (IDIAP)
Prasad Amrutha (DCGM FIT BUT)
Nigmatulina Iuliia (IDIAP)
Sarfjoo Seyyed Saeed (IDIAP)
Motlíček Petr, doc. Ing., Ph.D. (DCGM FIT BUT)
Kleinert Matthias (DLR)
Helmke Hartmut (DLR)
Ohneiser Oliver (DLR)
Zhan Qingran (IDIAP)
Automatic speech recognition, Wav2Vec 2.0, self-supervised pre-training, air traffic control communications.
Recent work on self-supervised pre-training focuses on leveraging large-scale unlabeled speech data to build robust end-to-end (E2E) acoustic models (AMs) that can later be fine-tuned on downstream tasks, e.g., automatic speech recognition (ASR). Yet, few works have investigated the impact on performance when the data properties substantially differ between the pre-training and fine-tuning phases, termed domain shift. We target this scenario by analyzing the robustness of Wav2Vec 2.0 and XLS-R models on downstream ASR for a completely unseen domain: air traffic control (ATC) communications. We benchmark these two models on several open-source and challenging ATC databases with signal-to-noise ratios between 5 and 20 dB. Relative word error rate (WER) reductions between 20% and 40% are obtained in comparison to hybrid-based ASR baselines by only fine-tuning E2E acoustic models with a smaller fraction of labeled data. We also analyze WERs in the low-resource scenario and the gender bias carried by one ATC dataset.
@INPROCEEDINGS{FITPUB13047,
   author = "Juan Zuluaga-Gomez and Amrutha Prasad and Iuliia Nigmatulina and Saeed Seyyed Sarfjoo and Petr Motl\'{i}\v{c}ek and Matthias Kleinert and Hartmut Helmke and Oliver Ohneiser and Qingran Zhan",
   title = "How Does Pre-Trained Wav2Vec 2.0 Perform on Domain-Shifted ASR? an Extensive Benchmark on Air Traffic Control Communications",
   pages = "205--212",
   booktitle = "IEEE Spoken Language Technology Workshop, SLT 2022 - Proceedings",
   year = 2023,
   location = "Doha, QA",
   publisher = "IEEE Signal Processing Society",
   ISBN = "978-1-6654-7189-3",
   doi = "10.1109/SLT54892.2023.10022724",
   language = "english",
   url = "https://www.fit.vut.cz/research/publication/13047"
}