Paper Title


Mimic and Conquer: Heterogeneous Tree Structure Distillation for Syntactic NLP

Authors

Hao Fei, Yafeng Ren, Donghong Ji

Abstract


Syntax has been shown to be useful for various NLP tasks, while existing work mostly encodes a singleton syntactic tree using one hierarchical neural network. In this paper, we investigate a simple and effective method, knowledge distillation, to integrate heterogeneous structure knowledge into a unified sequential LSTM encoder. Experimental results on four typical syntax-dependent tasks show that our method outperforms tree encoders by effectively integrating rich heterogeneous structural syntax while reducing error propagation, and also outperforms ensemble methods in terms of both efficiency and accuracy.
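The core idea of knowledge distillation described above can be illustrated with a minimal sketch. Assuming each heterogeneous tree encoder (e.g. a constituency-tree or dependency-tree teacher) and the sequential LSTM student each produce class logits for the same input, the student can be trained to mimic the teachers' temperature-softened output distributions via a KL-divergence loss. All function names here are hypothetical, not from the paper's released code:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher temperature yields softer
    # target distributions, as is standard in knowledge distillation.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits_list, temperature=2.0):
    """Average KL(teacher || student) over several heterogeneous teachers.

    Hypothetical sketch: each entry of `teacher_logits_list` would come
    from a different tree-structured encoder, and `student_logits` from
    the unified sequential LSTM that mimics them all.
    """
    q = softmax(student_logits, temperature)
    loss = 0.0
    for teacher_logits in teacher_logits_list:
        p = softmax(teacher_logits, temperature)
        loss += sum(pi * math.log(pi / qi)
                    for pi, qi in zip(p, q) if pi > 0)
    return loss / len(teacher_logits_list)

# The loss is zero when the student exactly matches a teacher,
# and positive otherwise.
matched = distillation_loss([1.0, 2.0, 3.0], [[1.0, 2.0, 3.0]])
mismatched = distillation_loss([1.0, 2.0, 3.0], [[3.0, 2.0, 1.0]])
```

In a full training setup this distillation term would typically be interpolated with the standard cross-entropy loss on gold labels; the abstract does not specify the exact weighting, so that detail is left out here.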
