Paper Title
MS$^2$L: Multi-Task Self-Supervised Learning for Skeleton Based Action Recognition
Paper Authors
Paper Abstract
In this paper, we address self-supervised representation learning from human skeletons for action recognition. Previous methods, which usually learn feature representations from a single reconstruction task, may suffer from overfitting, and the resulting features do not generalize well for action recognition. Instead, we propose to integrate multiple tasks to learn more general representations in a self-supervised manner. To realize this goal, we integrate motion prediction, jigsaw puzzle recognition, and contrastive learning to learn skeleton features from different aspects. Skeleton dynamics are modeled through motion prediction by predicting the future sequence, and temporal patterns, which are critical for action recognition, are learned by solving jigsaw puzzles. We further regularize the feature space through contrastive learning. In addition, we explore different training strategies to utilize the knowledge from self-supervised tasks for action recognition. We evaluate our multi-task self-supervised learning approach with action classifiers trained under different configurations, including unsupervised, semi-supervised, and fully-supervised settings. Our experiments on the NW-UCLA, NTU RGB+D, and PKUMMD datasets show remarkable performance for action recognition, demonstrating the superiority of our method in learning more discriminative and general features. Our project website is available at https://langlandslin.github.io/projects/MSL/.
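For concreteness, the sketch below shows one way the three pretext objectives described in the abstract (motion prediction, jigsaw puzzle recognition, and contrastive learning) could be combined into a single multi-task loss. It is a minimal PyTorch-style illustration, not the authors' implementation: the GRU encoder, the head names (`motion_head`, `jigsaw_head`, `projection_head`), the feature dimensions, the contrastive temperature, and the equal loss weighting are all assumptions made for the example.

```python
# Hypothetical sketch of a multi-task self-supervised skeleton model.
# All module names and hyperparameters below are illustrative assumptions,
# not the configuration used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskSSL(nn.Module):
    def __init__(self, num_joints=25, coords=3, feat_dim=256, num_permutations=64):
        super().__init__()
        # Shared skeleton encoder over flattened joint coordinates per frame.
        self.encoder = nn.GRU(num_joints * coords, feat_dim, batch_first=True)
        # Motion prediction head: regresses the next-frame joint coordinates.
        self.motion_head = nn.Linear(feat_dim, num_joints * coords)
        # Jigsaw head: classifies which temporal permutation was applied.
        self.jigsaw_head = nn.Linear(feat_dim, num_permutations)
        # Projection head for contrastive learning.
        self.projection_head = nn.Linear(feat_dim, 128)

    def encode(self, x):
        # x: (batch, time, num_joints * coords) -> (batch, feat_dim)
        _, h = self.encoder(x)
        return h.squeeze(0)

    def forward(self, seq, shuffled_seq, perm_label, augmented_seq):
        feat = self.encode(seq[:, :-1])  # encode all frames except the last

        # 1) Motion prediction: predict the held-out final frame.
        pred_next = self.motion_head(feat)
        loss_motion = F.mse_loss(pred_next, seq[:, -1])

        # 2) Jigsaw puzzle recognition: classify the temporal permutation.
        logits = self.jigsaw_head(self.encode(shuffled_seq))
        loss_jigsaw = F.cross_entropy(logits, perm_label)

        # 3) Contrastive learning: InfoNCE between two views of the same clip.
        z1 = F.normalize(self.projection_head(feat), dim=1)
        z2 = F.normalize(self.projection_head(self.encode(augmented_seq[:, :-1])), dim=1)
        sim = z1 @ z2.t() / 0.07  # temperature-scaled similarity matrix
        targets = torch.arange(sim.size(0), device=sim.device)
        loss_contrast = F.cross_entropy(sim, targets)

        # Equal weighting of the task losses is a placeholder design choice.
        return loss_motion + loss_jigsaw + loss_contrast
```

In such a setup, the shared encoder would afterwards be reused for downstream action classification under the unsupervised, semi-supervised, or fully-supervised settings mentioned in the abstract.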