AAAI 2019 paper: "Document Informed Neural Autoregressive Topic Models with Distributional Prior" (a Contextualized Neural Topic Model with Word Embeddings). Models: DocNADE, iDocNADE, DocNADEe and iDocNADEe
Thank you for sharing the source code. I am trying to reproduce the iDocNADEe results on the 20NSshort dataset by running the train_20NSshort_docnade.sh script, but I get a much lower PPL (18.21) than the number reported in the paper (633). Am I missing some important configuration?
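For reference, the perplexity reported in DocNADE-style papers is usually the exponent of the average per-word negative log-likelihood over documents. Below is a minimal sketch of that computation (the function name and inputs are illustrative, not from the repo); a value as low as 18.21 might indicate that the script is printing a different quantity (e.g. a raw loss or a log value) rather than this PPL.

```python
import numpy as np

def perplexity(doc_log_likelihoods, doc_lengths):
    # doc_log_likelihoods: log p(v_d) for each document d
    # doc_lengths: number of words |v_d| in each document
    # PPL = exp( -(1/D) * sum_d log p(v_d) / |v_d| )
    lls = np.asarray(doc_log_likelihoods, dtype=float)
    lens = np.asarray(doc_lengths, dtype=float)
    return float(np.exp(-np.mean(lls / lens)))
```

For example, a corpus whose average per-word log-likelihood is -log(633) would yield a PPL of exactly 633 under this definition.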
Sorry, I just read your paper, but I have several questions. For example, from Eqs. 2 and 3 we can compute the log-likelihood of a document; however, how do we obtain the topic-word distribution and the document-topic distribution?
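In DocNADE-style models, topics are commonly read off the learned word-representation matrix: each hidden dimension is treated as a topic, and the words with the largest weights along that dimension are its top words, while the document's hidden vector serves as its topic representation. A minimal sketch, assuming a matrix `W` of shape `(vocab_size, hidden_size)` (the names and shapes here are assumptions, not taken from the released code):

```python
import numpy as np

def top_words_per_topic(W, vocab, k=5):
    # W: (vocab_size, hidden_size) learned word-embedding matrix.
    # Each column j is interpreted as topic j; its top-k words are
    # the vocabulary entries with the largest weights in that column.
    topics = []
    for j in range(W.shape[1]):
        top_idx = np.argsort(-W[:, j])[:k]
        topics.append([vocab[i] for i in top_idx])
    return topics
```

This is only one conventional way to inspect topics in such models; the authors may extract them differently.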