Paper Title

Learning Probabilistic Sentence Representations from Paraphrases

Paper Authors

Mingda Chen, Kevin Gimpel

Paper Abstract

Probabilistic word embeddings have shown effectiveness in capturing notions of generality and entailment, but there is very little work on doing the analogous type of investigation for sentences. In this paper we define probabilistic models that produce distributions for sentences. Our best-performing model treats each word as a linear transformation operator applied to a multivariate Gaussian distribution. We train our models on paraphrases and demonstrate that they naturally capture sentence specificity. While our proposed model achieves the best performance overall, we also show that specificity is represented by simpler architectures via the norm of the sentence vectors. Qualitative analysis shows that our probabilistic model captures sentential entailment and provides ways to analyze the specificity and preciseness of individual words.
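The abstract's core idea, treating each word as a linear transformation operator applied to a multivariate Gaussian, can be illustrated with a minimal sketch. The snippet below is only an assumed, simplified reading of that idea: the names (`vocab`, `compose_sentence`, `A_w`, `b_w`), the sequential left-to-right composition, and the determinant-based specificity proxy are illustrative choices, not the paper's actual parameterization or training objective.

```python
# Hypothetical sketch: build a Gaussian sentence distribution by applying one
# affine operator per word. All parameter names and shapes are assumptions
# made for illustration; the paper's model may differ in detail.
import numpy as np

d = 4  # embedding dimensionality (illustrative)
rng = np.random.default_rng(0)

# Each word is parameterized by a matrix A_w and offset b_w (assumed form).
vocab = {w: (rng.normal(scale=0.1, size=(d, d)) + np.eye(d),
             rng.normal(scale=0.1, size=d))
         for w in ["the", "dog", "barked"]}

def compose_sentence(words, vocab, d):
    """Apply each word's affine operator to a standard Gaussian prior.

    If x ~ N(mu, Sigma), then A x + b ~ N(A mu + b, A Sigma A^T),
    so the sentence distribution stays Gaussian after every word.
    """
    mu, sigma = np.zeros(d), np.eye(d)
    for w in words:
        A, b = vocab[w]
        mu = A @ mu + b
        sigma = A @ sigma @ A.T
    return mu, sigma

mu, sigma = compose_sentence(["the", "dog", "barked"], vocab, d)
print("sentence mean:", mu)
# A tighter (lower-volume) covariance could be read as a more specific
# sentence; the determinant is one rough proxy for that volume.
print("covariance determinant:", np.linalg.det(sigma))
```

Under this reading, the paper's observation that simpler vector models encode specificity in the sentence-vector norm would correspond to the mean/covariance of the Gaussian carrying that information instead.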
