Paper Title

Sequential Cross Attention Based Multi-task Learning

Paper Authors

Sunkyung Kim, Hyesong Choi, Dongbo Min

Abstract

In multi-task learning (MTL) for visual scene understanding, it is crucial to transfer useful information between multiple tasks with minimal interference. In this paper, we propose a novel architecture that effectively transfers informative features by applying the attention mechanism to the multi-scale features of the tasks. Since applying the attention module directly to all possible features across scales and tasks incurs high computational complexity, we propose to apply the attention module sequentially, first over tasks and then over scales. The cross-task attention module (CTAM) is applied first to facilitate the exchange of relevant information between the features of multiple tasks at the same scale. The cross-scale attention module (CSAM) then aggregates useful information from feature maps at different resolutions within the same task. We also capture long-range dependencies through a self-attention module in the feature extraction network. Extensive experiments demonstrate that our method achieves state-of-the-art performance on the NYUD-v2 and PASCAL-Context datasets.
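As a rough illustration of the sequential design described in the abstract, below is a minimal PyTorch sketch of a CTAM-like cross-task attention step followed by a CSAM-like cross-scale aggregation step. The layer composition, single-head attention, residual/norm placement, and dimensions are assumptions made for brevity, not the authors' published architecture.

```python
# Minimal sketch of sequential cross-task then cross-scale attention.
# Module designs here are hypothetical; the paper's exact CTAM/CSAM
# internals are not specified in the abstract.
import torch
import torch.nn as nn


class CrossTaskAttention(nn.Module):
    """CTAM-like block: one task's features attend to another task's
    features at the same scale (single head for brevity)."""

    def __init__(self, dim):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, feat_a, feat_b):
        # feat_a, feat_b: (B, C, H, W) features of two tasks at one scale.
        b, c, h, w = feat_a.shape
        qa = feat_a.flatten(2).transpose(1, 2)  # (B, HW, C)
        kb = feat_b.flatten(2).transpose(1, 2)
        out, _ = self.attn(qa, kb, kb)          # task A queries task B
        out = self.norm(out + qa)               # residual + layer norm
        return out.transpose(1, 2).reshape(b, c, h, w)


class CrossScaleAttention(nn.Module):
    """CSAM-like block: aggregate a coarser-scale feature map into a
    finer one within the same task by upsampling and attending."""

    def __init__(self, dim):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=1, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, fine, coarse):
        # fine: (B, C, H, W); coarse: (B, C, H', W') with H' < H, W' < W.
        b, c, h, w = fine.shape
        coarse_up = nn.functional.interpolate(
            coarse, size=(h, w), mode="bilinear", align_corners=False)
        q = fine.flatten(2).transpose(1, 2)
        k = coarse_up.flatten(2).transpose(1, 2)
        out, _ = self.attn(q, k, k)             # fine scale queries coarse
        out = self.norm(out + q)
        return out.transpose(1, 2).reshape(b, c, h, w)


# Sequential application: cross-task first, then cross-scale.
ctam, csam = CrossTaskAttention(64), CrossScaleAttention(64)
seg_f = torch.randn(1, 64, 16, 16)  # e.g. segmentation features, fine scale
dep_f = torch.randn(1, 64, 16, 16)  # e.g. depth features, same scale
seg_c = torch.randn(1, 64, 8, 8)    # segmentation features, coarser scale
fused = csam(ctam(seg_f, dep_f), seg_c)
print(fused.shape)  # torch.Size([1, 64, 16, 16])
```

Applying attention per task pair at a fixed scale, then per scale pair within a task, keeps the number of attention computations linear in tasks plus scales rather than quadratic over all task-scale feature combinations, which is the complexity argument the abstract makes for the sequential ordering.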
