Publication Details
KInITVeraAI at SemEval-2023 Task 3: Simple yet Powerful Multilingual Fine-Tuning for Persuasion Techniques Detection
Hromádka Timo ()
Smoleň Timotej ()
Remiš Tomáš ()
Pecher Branislav, Ing. (DCGM FIT BUT)
Srba Ivan ()
multilingual persuasion technique detection, fine-tuning, SemEval
This paper presents the best-performing solution to Subtask 3 of SemEval-2023 Task 3, dedicated to persuasion techniques detection. Due to the highly multilingual character of the input data and the large number of 23 predicted labels (causing a lack of labelled data for some language-label combinations), we opted for fine-tuning pre-trained transformer-based language models. Through multiple experiments, we identified the best configuration, which consists of a large multilingual model (XLM-RoBERTa large) trained jointly on all input data, with confidence thresholds calibrated separately for seen and surprise languages. Our final system performed best on 6 out of 9 languages (including the two surprise languages) and achieved highly competitive results on the remaining three.
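The abstract describes the approach only at a high level. The following minimal Python sketch (not the authors' released code) illustrates the general idea under stated assumptions: XLM-RoBERTa large fine-tuned as a multi-label classifier over the 23 persuasion-technique labels, with a confidence threshold calibrated on held-out data, which can be done separately for seen and surprise languages. The helper names, the micro-F1 criterion, and the threshold grid are illustrative assumptions, and the fine-tuning loop itself is omitted.

# A minimal sketch of the described setup, not the authors' released code.
import numpy as np
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

NUM_LABELS = 23  # persuasion techniques in SemEval-2023 Task 3, Subtask 3

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large")
model = AutoModelForSequenceClassification.from_pretrained(
    "xlm-roberta-large",
    num_labels=NUM_LABELS,
    problem_type="multi_label_classification",  # sigmoid outputs, BCE loss per label
)

def predict_probs(texts: list[str]) -> np.ndarray:
    """Return per-label sigmoid probabilities for a batch of paragraphs."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**batch).logits
    return torch.sigmoid(logits).cpu().numpy()

def calibrate_threshold(probs: np.ndarray, gold: np.ndarray,
                        candidates=np.arange(0.05, 0.95, 0.05)) -> float:
    """Pick the confidence threshold maximising micro-F1 on a validation split.

    Calling this separately on validation data for seen languages and for
    surprise languages yields the two thresholds mentioned in the abstract."""
    def micro_f1(th: float) -> float:
        pred = probs >= th
        g = gold.astype(bool)
        tp = np.sum(pred & g)
        fp = np.sum(pred & ~g)
        fn = np.sum(~pred & g)
        return 2 * tp / (2 * tp + fp + fn + 1e-9)
    return float(max(candidates, key=micro_f1))

After fine-tuning (e.g. with the standard Hugging Face Trainer, omitted here), predictions for a language are obtained by thresholding predict_probs with the threshold calibrated for that language group.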
@INPROCEEDINGS{FITPUB12968,
  author = "Timo Hrom\'{a}dka and Timotej Smole\v{n} and Tom\'{a}\v{s} Remi\v{s} and Branislav Pecher and Ivan Srba",
  title = "KInITVeraAI at SemEval-2023 Task 3: Simple yet Powerful Multilingual Fine-Tuning for Persuasion Techniques Detection",
  pages = "629--637",
  booktitle = "17th International Workshop on Semantic Evaluation, SemEval 2023 - Proceedings of the Workshop",
  year = 2023,
  location = "Toronto, CA",
  publisher = "Association for Computational Linguistics",
  ISBN = "978-1-959429-99-9",
  doi = "10.18653/v1/2023.semeval-1.86",
  language = "english",
  url = "https://www.fit.vut.cz/research/publication/12968"
}