Paper Title

NLU for Game-based Learning in Real: Initial Evaluations

Paper Authors

Eda Okur, Saurav Sahay, Lama Nachman

Paper Abstract

Intelligent systems designed for play-based interactions should be contextually aware of the users and their surroundings. Spoken Dialogue Systems (SDS) are critical for these interactive agents to carry out effective goal-oriented communication with users in real-time. For the real-world (i.e., in-the-wild) deployment of such conversational agents, improving the Natural Language Understanding (NLU) module of the goal-oriented SDS pipeline is crucial, especially with limited task-specific datasets. This study explores the potential benefits of a recently proposed transformer-based multi-task NLU architecture, mainly to perform Intent Recognition on small-size domain-specific educational game datasets. The evaluation datasets were collected from children practicing basic math concepts via play-based interactions in game-based learning settings. We investigate the NLU performances on the initial proof-of-concept game datasets versus the real-world deployment datasets and observe anticipated performance drops in-the-wild. We have shown that compared to the more straightforward baseline approaches, Dual Intent and Entity Transformer (DIET) architecture is robust enough to handle real-world data to a large extent for the Intent Recognition task on these domain-specific in-the-wild game datasets.
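For context, the DIET classifier evaluated in the paper is the multi-task transformer available in the open-source Rasa framework, which jointly learns intent classification and entity recognition. The sketch below is only an illustration of the kind of "more straightforward baseline approach" the abstract contrasts it with: a bag-of-words intent classifier built with scikit-learn. The utterances and intent labels are hypothetical stand-ins and do not come from the paper's datasets.

```python
# Minimal sketch of a straightforward intent-recognition baseline
# (TF-IDF features + logistic regression), assuming scikit-learn.
# Utterances and intent labels below are hypothetical examples of
# children's math-game dialogue, not the paper's actual data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import f1_score

train_texts = [
    "I think the answer is twelve",
    "can you repeat the question",
    "yes that's right",
]
train_intents = ["answer_attempt", "ask_repeat", "affirm"]

test_texts = ["umm maybe it is twelve?", "say it again please"]
test_intents = ["answer_attempt", "ask_repeat"]

# Word n-gram features feeding a linear classifier.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(train_texts, train_intents)

pred = clf.predict(test_texts)
print("micro-F1:", f1_score(test_intents, pred, average="micro"))
```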
