Paper Title
SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling
Paper Authors
Paper Abstract
Slot filling and intent detection are two main tasks in spoken language understanding (SLU) systems. In this paper, we propose a novel non-autoregressive model named SlotRefine for joint intent detection and slot filling. In addition, we design a novel two-pass iteration mechanism to handle the uncoordinated-slots problem caused by the conditional independence of the non-autoregressive model. Experiments demonstrate that our model significantly outperforms previous models on the slot filling task while considerably speeding up decoding (up to 10.77x). In-depth analyses show that 1) pretraining schemes can further enhance our model; and 2) the two-pass mechanism indeed remedies uncoordinated slots.
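The sketch below illustrates, in simplified form, what a two-pass refinement of a non-autoregressive tagger could look like: the first pass decodes every position independently, and the second pass re-decodes while conditioning on the B- tags found in the first pass, so that multi-token slots come out internally consistent. This is a minimal illustration under my own assumptions, not the authors' implementation; `Tagger`, `two_pass_decode`, and `toy_model` are hypothetical names.

```python
# Minimal sketch of two-pass non-autoregressive slot decoding.
# Assumption: `model` is any parallel tagger mapping (tokens, per-token slot hints)
# to (intent, slot tags) in one step; the interface is illustrative, not SlotRefine's API.
from typing import Callable, List, Tuple

Tagger = Callable[[List[str], List[str]], Tuple[str, List[str]]]


def two_pass_decode(model: Tagger, tokens: List[str]) -> Tuple[str, List[str]]:
    """Run the tagger once, then feed its first-pass B- tags back as hints so the
    second pass can align I- tags with their slot beginnings (the 'uncoordinated
    slots' problem described in the abstract)."""
    # First pass: no hints, every position predicted independently.
    no_hints = ["O"] * len(tokens)
    _, first_slots = model(tokens, no_hints)

    # Keep only the B- tags as hints; I- tags are re-predicted in the second pass.
    hints = [t if t.startswith("B-") else "O" for t in first_slots]

    # Second pass: conditioning on the B- hints lets the model produce
    # consistent inside (I-) tags for each multi-token slot.
    intent, second_slots = model(tokens, hints)
    return intent, second_slots


if __name__ == "__main__":
    # Toy stand-in model: without a hint it labels "york" inconsistently,
    # mimicking how independent per-token decisions produce uncoordinated slots.
    def toy_model(tokens: List[str], hints: List[str]) -> Tuple[str, List[str]]:
        slots = ["O"] * len(tokens)
        for i, tok in enumerate(tokens):
            if tok == "new":
                slots[i] = "B-city"
            elif tok == "york":
                slots[i] = "I-city" if i > 0 and hints[i - 1] == "B-city" else "I-state"
        return "book_flight", slots

    # First pass alone would yield ... B-city I-state; the second pass repairs it.
    print(two_pass_decode(toy_model, ["fly", "to", "new", "york"]))
```

Because both passes run the same parallel tagger, this refinement keeps decoding non-autoregressive (roughly twice the cost of a single pass), which is consistent with the large speedup reported relative to autoregressive decoding.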