Paper Title
Context-Guided BERT for Targeted Aspect-Based Sentiment Analysis
Paper Authors
Paper Abstract
Aspect-based sentiment analysis (ABSA) and targeted ABSA (TABSA) allow finer-grained inferences about sentiment to be drawn from the same text, depending on context. For example, a given text can have different targets (e.g., neighborhoods) and different aspects (e.g., price or safety), with different sentiment associated with each target-aspect pair. In this paper, we investigate whether adding context to self-attention models improves performance on (T)ABSA. We propose two variants of Context-Guided BERT (CG-BERT) that learn to distribute attention under different contexts. We first adapt a context-aware Transformer to produce a CG-BERT that uses context-guided softmax attention. Next, we propose an improved Quasi-Attention CG-BERT (QACG-BERT) model that learns a compositional attention that supports subtractive attention. We train both models with pretrained BERT on two (T)ABSA datasets: SentiHood and SemEval-2014 (Task 4). Both models achieve new state-of-the-art results, with our QACG-BERT model having the best performance. Furthermore, we provide analyses of the impact of context in our proposed models. Our work provides more evidence for the utility of adding context dependencies to pretrained self-attention-based language models for context-based natural language tasks.
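To make the "subtractive attention" idea in the abstract concrete, below is a minimal, illustrative PyTorch sketch of a quasi-attention head guided by a (target, aspect) context vector. It is not the authors' implementation: the module name, the `ctx` argument, the `ctx_q`/`ctx_k` projections, and the gating layer are hypothetical choices made only to show how a softmax attention map can be combined with a context-guided term whose gate is negative, so that the resulting weights can subtract contributions from some tokens rather than only re-weight them.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class QuasiAttentionSketch(nn.Module):
    """Illustrative single-head sketch of context-guided quasi-attention.

    Final attention = softmax attention + lambda * quasi attention,
    with lambda gated into [-1, 0], so combined weights can go negative
    (subtractive attention). Details are assumptions, not the paper's code.
    """

    def __init__(self, hidden: int):
        super().__init__()
        self.q = nn.Linear(hidden, hidden)
        self.k = nn.Linear(hidden, hidden)
        self.v = nn.Linear(hidden, hidden)
        # Hypothetical projections injecting the (target, aspect) context.
        self.ctx_q = nn.Linear(hidden, hidden)
        self.ctx_k = nn.Linear(hidden, hidden)
        # Hypothetical gate producing lambda from the context vector.
        self.gate = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor, ctx: torch.Tensor) -> torch.Tensor:
        # x:   (batch, seq, hidden) token states from the encoder layer
        # ctx: (batch, hidden)      embedding of the target-aspect context
        d = x.size(-1)
        q, k, v = self.q(x), self.k(x), self.v(x)

        # Standard scaled-dot-product softmax attention over tokens.
        attn_soft = F.softmax(q @ k.transpose(-1, -2) / d ** 0.5, dim=-1)

        # Context-guided "quasi" attention scores, squashed to [0, 1].
        cq = self.ctx_q(ctx).unsqueeze(1)            # (batch, 1, hidden)
        ck = self.ctx_k(ctx).unsqueeze(1)            # (batch, 1, hidden)
        attn_quasi = torch.sigmoid(
            (q + cq) @ (k + ck).transpose(-1, -2) / d ** 0.5
        )                                            # (batch, seq, seq)

        # Gate lambda in [-1, 0]: negative values subtract the quasi term
        # from the softmax attention instead of adding to it.
        lam = -torch.sigmoid(self.gate(ctx)).unsqueeze(-1)  # (batch, 1, 1)

        attn = attn_soft + lam * attn_quasi          # weights may be negative
        return attn @ v                              # (batch, seq, hidden)
```

The design point this sketch illustrates is that a softmax map alone can only redistribute non-negative mass, whereas the gated quasi-attention term lets a context (e.g., a particular neighborhood and the "safety" aspect) actively suppress tokens that are irrelevant or contradictory for that target-aspect pair.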