Paper Title

MAD-X: An Adapter-Based Framework for Multi-Task Cross-Lingual Transfer

Paper Authors

Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder

Paper Abstract

The main goal behind state-of-the-art pre-trained multilingual models such as multilingual BERT and XLM-R is enabling and bootstrapping NLP applications in low-resource languages through zero-shot or few-shot cross-lingual transfer. However, due to limited model capacity, their transfer performance is the weakest exactly on such low-resource languages and languages unseen during pre-training. We propose MAD-X, an adapter-based framework that enables high portability and parameter-efficient transfer to arbitrary tasks and languages by learning modular language and task representations. In addition, we introduce a novel invertible adapter architecture and a strong baseline method for adapting a pre-trained multilingual model to a new language. MAD-X outperforms the state of the art in cross-lingual transfer across a representative set of typologically diverse languages on named entity recognition and causal commonsense reasoning, and achieves competitive results on question answering. Our code and adapters are available at AdapterHub.ml.
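As a rough illustration of the modular design the abstract describes, here is a minimal PyTorch sketch of a bottleneck adapter and the per-layer MAD-X composition: a frozen language adapter stacked under a trainable task adapter. This is not the authors' implementation; the reduction factor, module names, and hidden size are illustrative assumptions, and the invertible adapters applied at the embedding layer are omitted here.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual.
    Sizes are illustrative, not the paper's exact configuration."""
    def __init__(self, hidden_dim: int, reduction: int = 16):
        super().__init__()
        bottleneck = hidden_dim // reduction
        self.down = nn.Linear(hidden_dim, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        return hidden + self.up(self.act(self.down(hidden)))

class MadXLayerBlock(nn.Module):
    """Per-layer MAD-X composition: the language adapter's output feeds the
    task adapter. During task training only the task adapter is updated; at
    inference the source-language adapter is swapped for the target one."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.lang_adapter = BottleneckAdapter(hidden_dim)
        self.task_adapter = BottleneckAdapter(hidden_dim)
        # Freeze the language adapter (in practice the whole base model is
        # frozen as well when the task adapter is trained).
        for p in self.lang_adapter.parameters():
            p.requires_grad = False

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        return self.task_adapter(self.lang_adapter(hidden))

# Toy usage: batch of 2 sequences, length 8, hidden size 768 (XLM-R base).
block = MadXLayerBlock(hidden_dim=768)
out = block(torch.randn(2, 8, 768))
print(out.shape)  # torch.Size([2, 8, 768])
```

Because the base model and language adapters stay frozen, switching languages only requires swapping one small adapter module, which is what makes the framework portable and parameter-efficient.

Since the abstract points to AdapterHub.ml, zero-shot transfer with pre-trained MAD-X adapters can also be sketched with the adapter-transformers library distributed there. The adapter identifiers ("en/wiki@ukp", "sw/wiki@ukp"), the head name, and the label count below are assumptions for the example; consult AdapterHub.ml for the exact names.

```python
# pip install adapter-transformers
from transformers import AutoModelWithHeads
from transformers.adapters.composition import Stack

model = AutoModelWithHeads.from_pretrained("xlm-roberta-base")

# Load pre-trained MAD-X language adapters; load_adapter returns the
# registered adapter name. Identifiers are assumed, check AdapterHub.ml.
en = model.load_adapter("en/wiki@ukp")  # source language
sw = model.load_adapter("sw/wiki@ukp")  # target language

# Add a task adapter and head, then train only the task adapter while it
# is stacked on the (frozen) source-language adapter.
model.add_adapter("ner")
model.add_tagging_head("ner", num_labels=9)
model.train_adapter("ner")
model.active_adapters = Stack(en, "ner")
# ... fine-tune on source-language (English) NER data here ...

# Zero-shot transfer: swap in the target-language adapter at inference,
# keeping the trained task adapter in place.
model.active_adapters = Stack(sw, "ner")
```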
