Paper Title

ERNIE-Gram: Pre-Training with Explicitly N-Gram Masked Language Modeling for Natural Language Understanding

Authors

Dongling Xiao, Yu-Kun Li, Han Zhang, Yu Sun, Hao Tian, Hua Wu, Haifeng Wang

Abstract

Coarse-grained linguistic information, such as named entities or phrases, facilitates adequate representation learning in pre-training. Previous works mainly focus on extending the objective of BERT's Masked Language Modeling (MLM) from masking individual tokens to contiguous sequences of n tokens. We argue that such a contiguous masking method neglects to model the intra-dependencies and inter-relations of coarse-grained linguistic information. As an alternative, we propose ERNIE-Gram, an explicit n-gram masking method that enhances the integration of coarse-grained information into pre-training. In ERNIE-Gram, n-grams are masked and predicted directly using explicit n-gram identities rather than contiguous sequences of n tokens. Furthermore, ERNIE-Gram employs a generator model to sample plausible n-gram identities as optional n-gram masks and predicts them in both coarse-grained and fine-grained manners to enable comprehensive n-gram prediction and relation modeling. We pre-train ERNIE-Gram on English and Chinese text corpora and fine-tune it on 19 downstream tasks. Experimental results show that ERNIE-Gram outperforms previous pre-training models like XLNet and RoBERTa by a large margin, and achieves results comparable to state-of-the-art methods. The source code and pre-trained models have been released at https://github.com/PaddlePaddle/ERNIE.
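To make the contrast described in the abstract concrete, below is a minimal Python sketch of contiguous token masking (BERT-style span masking) versus explicit n-gram masking, where a whole n-gram is replaced by a single [MASK] slot and predicted as one identity from an n-gram vocabulary. This is an illustration under stated assumptions, not the authors' implementation: the NGRAM_VOCAB table, function names, and token ids are hypothetical, and the generator model that samples plausible n-gram identities as alternative masks is omitted for brevity.

```python
import random

# Hypothetical tiny n-gram vocabulary mapping surface n-grams to single ids.
# ERNIE-Gram extracts such a vocabulary from the pre-training corpus;
# these entries are made up purely for illustration.
NGRAM_VOCAB = {("new", "york"): 0, ("machine", "learning"): 1}
MASK = "[MASK]"

def contiguous_token_masking(tokens, start, n):
    """BERT-style span masking: replace each of the n tokens with [MASK].
    The model must predict n separate token targets, one per position."""
    masked = list(tokens)
    targets = []
    for i in range(start, start + n):
        targets.append((i, masked[i]))  # fine-grained, per-token targets
        masked[i] = MASK
    return masked, targets

def explicit_ngram_masking(tokens, start, n):
    """ERNIE-Gram-style masking (sketch): the whole n-gram occupies a
    single [MASK] slot and is predicted as one explicit identity drawn
    from the n-gram vocabulary, a coarse-grained target."""
    ngram = tuple(tokens[start:start + n])
    ngram_id = NGRAM_VOCAB[ngram]        # one coarse-grained target
    masked = list(tokens)
    masked[start:start + n] = [MASK]     # one mask slot for the whole n-gram
    return masked, (start, ngram_id)

tokens = ["i", "love", "new", "york", "pizza"]
print(contiguous_token_masking(tokens, 2, 2))
# (['i', 'love', '[MASK]', '[MASK]', 'pizza'], [(2, 'new'), (3, 'york')])
print(explicit_ngram_masking(tokens, 2, 2))
# (['i', 'love', '[MASK]', 'pizza'], (2, 0))
```

The key difference the sketch highlights: with explicit n-gram identities, the dependency inside the n-gram ("new" and "york" forming one unit) is captured by a single prediction rather than being split across independently predicted token positions.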
