Publication Details
Learning Document Embeddings Along With Their Uncertainties
Kesiraju Santosh (DCGM FIT BUT)
Plchot Oldřich, Ing., Ph.D. (DCGM FIT BUT)
Burget Lukáš, doc. Ing., Ph.D. (DCGM FIT BUT)
Gangashetty Suryakanth V (IIIT)
Bayesian methods, embeddings, topic identification.
Most text modeling techniques yield only point estimates of document embeddings and fail to capture the uncertainty of those estimates. This uncertainty indicates how well an embedding represents a document. We present the Bayesian subspace multinomial model (Bayesian SMM), a generative log-linear model that learns to represent documents as Gaussian distributions, thereby encoding the uncertainty in their covariance. Additionally, in the proposed Bayesian SMM, we address the intractability that commonly arises during variational inference in mixed-logit models. We also present a generative Gaussian linear classifier for topic identification that exploits the uncertainty in document embeddings. Our intrinsic evaluation using the perplexity measure shows that the proposed Bayesian SMM fits unseen test data better than the state-of-the-art neural variational document model on speech (Fisher) and text (20Newsgroups) corpora. Our topic identification experiments show that the proposed systems are robust to over-fitting on unseen test data. The topic ID results show that the proposed model outperforms state-of-the-art unsupervised topic models and achieves results comparable to state-of-the-art fully supervised discriminative models.
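The core idea from the abstract — a log-linear model whose document embedding is a Gaussian distribution rather than a point estimate — can be illustrated with a minimal NumPy sketch. All names and sizes below (T, m, mu, log_var, the diagonal covariance, the number of samples) are illustrative stand-ins with random values, not the paper's actual implementation or learned parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: V = vocabulary size, K = embedding dimension.
V, K = 1000, 50

# Subspace parameters (random stand-ins for learned values):
# T maps a K-dim document embedding to V unnormalised log word scores,
# m is a shared offset, as in log-linear subspace models.
T = rng.normal(scale=0.1, size=(V, K))
m = rng.normal(scale=0.1, size=V)

# A Bayesian treatment represents each document as a Gaussian
# q(w) = N(mu, diag(exp(log_var))) instead of a single point w.
mu = rng.normal(size=K)
log_var = rng.normal(scale=0.1, size=K)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def expected_word_dist(mu, log_var, n_samples=8):
    """Monte-Carlo estimate of the word distribution under q(w),
    using the reparameterisation w = mu + sigma * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal((n_samples, K))
    w = mu + np.exp(0.5 * log_var) * eps
    return np.stack([softmax(T @ wi + m) for wi in w]).mean(axis=0)

p = expected_word_dist(mu, log_var)
```

A wide covariance (large `log_var`) spreads the sampled embeddings, signalling that the model is uncertain about how well `mu` represents the document — the quantity the paper's downstream Gaussian linear classifier exploits.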
@ARTICLE{FITPUB12343,
  author   = "Santosh Kesiraju and Old\v{r}ich Plchot and Luk\'{a}\v{s} Burget and V Suryakanth Gangashetty",
  title    = "Learning Document Embeddings Along With Their Uncertainties",
  pages    = "2319--2332",
  journal  = "IEEE/ACM TRANSACTIONS ON AUDIO, SPEECH AND LANGUAGE PROCESSING",
  volume   = 28,
  year     = 2020,
  ISSN     = "2329-9290",
  doi      = "10.1109/TASLP.2020.3012062",
  language = "english",
  url      = "https://www.fit.vut.cz/research/publication/12343"
}