Paper Title

IDPG: An Instance-Dependent Prompt Generation Method

Authors

Zhuofeng Wu, Sinong Wang, Jiatao Gu, Rui Hou, Yuxiao Dong, V. G. Vinod Vydiswaran, Hao Ma

Abstract

Prompt tuning is a new, efficient NLP transfer learning paradigm that adds a task-specific prompt in each input instance during the model training stage. It freezes the pre-trained language model and only optimizes a few task-specific prompts. In this paper, we propose a conditional prompt generation method to generate prompts for each input instance, referred to as the Instance-Dependent Prompt Generation (IDPG). Unlike traditional prompt tuning methods that use a fixed prompt, IDPG introduces a lightweight and trainable component to generate prompts based on each input sentence. Extensive experiments on ten natural language understanding (NLU) tasks show that the proposed strategy consistently outperforms various prompt tuning baselines and is on par with other efficient transfer learning methods such as Compacter while tuning far fewer model parameters.
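
To make the core idea concrete, below is a minimal PyTorch sketch of an instance-dependent prompt generator: a small trainable bottleneck network maps a pooled representation of the input sentence to a sequence of prompt embeddings, which are prepended to the token embeddings before the frozen language model runs. The class name `InstanceDependentPromptGenerator`, the mean-pooled sentence representation, and the specific layer sizes are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class InstanceDependentPromptGenerator(nn.Module):
    """Lightweight, trainable prompt generator (a bottleneck MLP sketch).

    Maps a sentence representation of shape (batch, hidden_dim) to
    instance-specific prompt embeddings of shape
    (batch, prompt_len, hidden_dim). Only this module is trained; the
    pre-trained language model stays frozen.
    """

    def __init__(self, hidden_dim: int, prompt_len: int, bottleneck_dim: int = 64):
        super().__init__()
        self.prompt_len = prompt_len
        self.hidden_dim = hidden_dim
        # Down-project then up-project: keeps the number of trainable
        # parameters small relative to the frozen language model.
        self.generator = nn.Sequential(
            nn.Linear(hidden_dim, bottleneck_dim),
            nn.ReLU(),
            nn.Linear(bottleneck_dim, prompt_len * hidden_dim),
        )

    def forward(self, sentence_repr: torch.Tensor) -> torch.Tensor:
        # sentence_repr: (batch, hidden_dim), e.g. a pooled encoding of
        # the input sentence.
        prompts = self.generator(sentence_repr)
        return prompts.view(-1, self.prompt_len, self.hidden_dim)

# Usage sketch: generate prompts from each input instance and prepend
# them to the token embeddings of the frozen model.
batch, seq_len, hidden = 8, 32, 768
token_embeds = torch.randn(batch, seq_len, hidden)   # from the frozen embedding layer
sent_repr = token_embeds.mean(dim=1)                 # simple pooled sentence representation
gen = InstanceDependentPromptGenerator(hidden, prompt_len=5)
prompt_embeds = gen(sent_repr)                       # (8, 5, 768), differs per instance
model_input = torch.cat([prompt_embeds, token_embeds], dim=1)  # (8, 37, 768)
```

Unlike standard prompt tuning, where the 5 prompt vectors would be a fixed learned parameter shared across all inputs, here they are a function of each input sentence, which is what makes the prompts instance-dependent.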
