Paper Title
An EEG-Based Multi-Modal Emotion Database with Both Posed and Authentic Facial Actions for Emotion Analysis
Paper Authors
Paper Abstract
Emotion is an experience associated with a particular pattern of physiological activity along with different physiological, behavioral, and cognitive changes. One behavioral change is facial expression, which has been studied extensively over the past few decades. Facial behavior varies with a person's emotion and differs across culture, personality, age, context, and environment. In recent years, physiological activities have been used to study emotional responses. A typical signal is the electroencephalogram (EEG), which measures brain activity. Most existing EEG-based emotion analysis has overlooked the role of facial expression changes. There exists little research on the relationship between facial behavior and brain signals due to the lack of datasets that measure both EEG and facial action signals simultaneously. To address this problem, we propose to develop a new database by collecting facial expressions, action units, and EEG simultaneously. We recorded the EEG and face videos of both posed facial actions and spontaneous expressions from 29 participants of different ages, genders, and ethnic backgrounds. Differing from existing approaches, we designed a protocol to capture the EEG signals by explicitly evoking participants' individual action units. We also investigated the relation between the EEG signals and facial action units. As a baseline, the database has been evaluated through experiments on both posed and spontaneous emotion recognition with images alone, EEG alone, and EEG fused with images. The database will be released to the research community to advance the state of the art in automatic emotion recognition.
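To make the fused baseline described above concrete, the sketch below illustrates one simple feature-level fusion setup: per-trial EEG features and per-trial image features are concatenated and fed to a classifier. This is a minimal, assumed pipeline, not the authors' released code; the feature dimensions, the six emotion categories, and the SVM classifier are hypothetical placeholders, and the random arrays stand in for real extracted features.

```python
# Minimal feature-level fusion sketch (assumed pipeline, not the paper's code).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_trials = 200                                   # hypothetical number of trials
eeg_feats = rng.normal(size=(n_trials, 160))     # placeholder: e.g. band power per EEG channel
img_feats = rng.normal(size=(n_trials, 512))     # placeholder: e.g. CNN embedding of a face frame
labels = rng.integers(0, 6, size=n_trials)       # placeholder: six emotion categories

# Feature-level fusion: concatenate the two modalities per trial.
fused = np.concatenate([eeg_feats, img_feats], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.25, random_state=0, stratify=labels
)

# Standardize the fused features, then classify with an RBF-kernel SVM.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)
print("fused-baseline accuracy:", clf.score(X_test, y_test))
```

With random placeholder features and labels, the printed accuracy is only chance level; the point of the sketch is the fusion structure (concatenation before classification), which can be compared against single-modality baselines by training the same classifier on eeg_feats or img_feats alone.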