Paper Title
Function-words Enhanced Attention Networks for Few-Shot Inverse Relation Classification
Paper Authors
Paper Abstract
Relation classification identifies the semantic relation between two entities in a given text. While existing models classify inverse relations well when trained on large datasets, their performance drops significantly in the few-shot setting. In this paper, we propose a function-words adaptively enhanced attention framework (FAEA) for few-shot inverse relation classification, in which a hybrid attention model is designed to attend to class-related function words based on meta-learning. Because involving function words introduces significant intra-class redundancy, an adaptive message passing mechanism is introduced to capture and transfer inter-class differences. We mathematically analyze the negative impact of function words on dot-product measurement, which explains why the message passing mechanism effectively reduces that impact. Our experimental results show that FAEA outperforms strong baselines; in particular, inverse relation accuracy is improved by 14.33% under the 1-shot setting on FewRel 1.0.
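The abstract's point about dot-product measurement can be made concrete with a toy sketch (not the paper's implementation; all vectors below are illustrative assumptions): a function-word component shared by two class prototypes dominates their dot product, masking inter-class differences, while removing the shared component (a stand-in for what the message passing mechanism achieves) restores separability.

```python
import numpy as np

def cosine(u, v):
    """Normalized dot-product similarity."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Content components of two inverse-relation class prototypes (illustrative).
content_a = np.array([1.0, 0.0, 0.0, 0.0])
content_b = np.array([0.0, 1.0, 0.0, 0.0])

# A large function-word component shared by both classes.
shared = np.array([0.0, 0.0, 3.0, 3.0])

proto_a = content_a + shared
proto_b = content_b + shared

# Without the shared component the class contents are orthogonal ...
print(cosine(content_a, content_b))  # 0.0

# ... but with it, dot-product similarity is dominated by the shared part,
# so the two classes look nearly identical under this measurement.
print(cosine(proto_a, proto_b))      # ~0.947

# Subtracting the prototype mean (a crude stand-in for transferring
# inter-class differences) removes the shared component entirely.
mean = (proto_a + proto_b) / 2
print(cosine(proto_a - mean, proto_b - mean))  # -1.0
```

This illustrates why intra-class redundancy from function words hurts dot-product-based few-shot classifiers and why a mechanism that isolates inter-class differences helps.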