Paper Title
Low Resource Style Transfer via Domain Adaptive Meta Learning
Paper Authors
Paper Abstract
Text style transfer (TST) without parallel data has achieved some practical success. However, most existing unsupervised text style transfer methods suffer from (i) requiring massive amounts of non-parallel data to guide the transfer of different text styles, and (ii) severe performance degradation when the model is fine-tuned on new domains. In this work, we propose DAML-ATM (Domain Adaptive Meta-Learning with Adversarial Transfer Model), which consists of two parts: DAML and ATM. DAML is a domain adaptive meta-learning approach that learns general knowledge across multiple heterogeneous source domains and can adapt to new, unseen domains with a small amount of data. Moreover, we propose a new unsupervised TST approach, the Adversarial Transfer Model (ATM), which combines a sequence-to-sequence pre-trained language model with adversarial style training for better content preservation and style transfer. Results on multi-domain datasets demonstrate that our approach generalizes well to unseen low-resource domains, achieving state-of-the-art results against ten strong baselines.
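To make the meta-learning idea behind DAML concrete, here is a minimal, illustrative sketch of a first-order meta-learning loop (Reptile-style) on toy synthetic "domains". This is NOT the paper's DAML algorithm (which operates on text with a pre-trained sequence-to-sequence model); the linear-regression tasks, learning rates, and slopes below are all invented for illustration. It shows the core mechanism the abstract describes: meta-train across heterogeneous source domains, then adapt to an unseen domain with only a handful of samples.

```python
import numpy as np

# Illustrative sketch only, not the paper's DAML: each "domain" is a toy
# 1-D linear regression task y = slope * x, standing in for a text-style domain.
rng = np.random.default_rng(0)

def make_domain(slope, n=20):
    """Sample a small noisy dataset from one synthetic source domain."""
    x = rng.uniform(-1, 1, size=n)
    y = slope * x + rng.normal(0, 0.05, size=n)
    return x, y

def inner_adapt(w, x, y, lr=0.1, steps=5):
    """Inner loop: a few gradient steps of squared-error loss on one domain."""
    for _ in range(steps):
        grad = 2 * np.mean((w * x - y) * x)
        w = w - lr * grad
    return w

# Meta-training over heterogeneous source domains (slopes 1, 2, 3).
w_meta = 0.0
for _ in range(200):
    slope = rng.choice([1.0, 2.0, 3.0])
    x, y = make_domain(slope)
    w_task = inner_adapt(w_meta, x, y)
    w_meta += 0.1 * (w_task - w_meta)  # Reptile-style outer update

# Adapt to an unseen low-resource "domain" (slope 2.5) with only 5 samples.
x_new = rng.uniform(-1, 1, size=5)
y_new = 2.5 * x_new
w_adapted = inner_adapt(w_meta, x_new, y_new, steps=50)
```

The meta-learned initialization `w_meta` sits near the center of the source domains, so a few inner-loop steps on five samples suffice to reach the unseen domain; training from scratch on so little data would be far less reliable, which is the low-resource advantage the abstract claims for DAML.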