Paper Title

GCNet: Graph Completion Network for Incomplete Multimodal Learning in Conversation

Paper Authors

Zheng Lian, Lan Chen, Licai Sun, Bin Liu, Jianhua Tao

Paper Abstract

Conversations have become a critical data format on social media platforms. Understanding conversations in terms of emotion, content, and other aspects also attracts increasing attention from researchers due to its widespread applications in human-computer interaction. In real-world environments, we often encounter the problem of incomplete modalities, which has become a core issue of conversation understanding. To address this problem, researchers have proposed various methods. However, existing approaches are mainly designed for individual utterances rather than conversational data, and thus cannot fully exploit temporal and speaker information in conversations. To this end, we propose a novel framework for incomplete multimodal learning in conversations, called "Graph Completion Network (GCNet)", filling the gap left by existing works. Our GCNet contains two well-designed graph neural network-based modules, "Speaker GNN" and "Temporal GNN", to capture temporal and speaker dependencies. To make full use of complete and incomplete data, we jointly optimize classification and reconstruction tasks in an end-to-end manner. To verify the effectiveness of our method, we conduct experiments on three benchmark conversational datasets. Experimental results demonstrate that our GCNet is superior to existing state-of-the-art approaches in incomplete multimodal learning. Code is available at https://github.com/zeroQiaoba/GCNet.
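The abstract states that classification and reconstruction tasks are jointly optimized end to end. A minimal NumPy sketch of such a combined objective is given below; the function name `joint_loss`, the weighting factor `lam`, and the masked-MSE reconstruction term are illustrative assumptions and not the paper's exact formulation:

```python
import numpy as np

def joint_loss(logits, labels, recon, target, mask, lam=1.0):
    """Combined objective: cross-entropy classification plus reconstruction
    of missing modality features. Hypothetical sketch of joint optimization,
    not GCNet's exact loss definition."""
    # Softmax cross-entropy over utterance-level predictions.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    cls_loss = -log_probs[np.arange(len(labels)), labels].mean()
    # Reconstruction error, computed only where modality features
    # are marked missing (mask == 1).
    diff = (recon - target) ** 2
    rec_loss = (diff * mask).sum() / np.maximum(mask.sum(), 1.0)
    return cls_loss + lam * rec_loss
```

In an end-to-end setup, the gradient of this single scalar would flow back through both the classifier head and the reconstruction decoder, so the shared encoder is shaped by both tasks at once.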
