Paper Title

From Actions to Events: A Transfer Learning Approach Using Improved Deep Belief Networks

Paper Authors

Mateus Roder, Jurandy Almeida, Gustavo H. de Rosa, Leandro A. Passos, André L. D. Rossi, João P. Papa

Paper Abstract

In the last decade, exponential data growth supplied machine learning-based algorithms' capacity and enabled their usage in daily-life activities. Additionally, such an improvement is partially explained by the advent of deep learning techniques, i.e., stacks of simple architectures that end up in more complex models. Although both factors produce outstanding results, they also pose drawbacks regarding the learning process, since training complex models over large datasets is expensive and time-consuming. Such a problem is even more evident when dealing with video analysis. Some works have considered transfer learning or domain adaptation, i.e., approaches that map the knowledge from one domain to another, to ease the training burden, yet most of them operate over individual or small blocks of frames. This paper proposes a novel approach to map the knowledge from action recognition to event recognition using an energy-based model, denoted as Spectral Deep Belief Network. Such a model can process all frames simultaneously, carrying spatial and temporal information through the learning process. The experimental results conducted over two public video datasets, the HMDB-51 and the UCF-101, depict the effectiveness of the proposed model and its reduced computational burden when compared to traditional energy-based models, such as Restricted Boltzmann Machines and Deep Belief Networks.
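
For readers less familiar with the energy-based models mentioned in the abstract, the standard Restricted Boltzmann Machine energy function is given below as general background; it is not the paper's own formulation, since the abstract does not detail the Spectral Deep Belief Network's energy.

E(\mathbf{v}, \mathbf{h}) = -\sum_{i} a_i v_i - \sum_{j} b_j h_j - \sum_{i,j} v_i w_{ij} h_j

Here \mathbf{v} and \mathbf{h} are the visible and hidden units, a_i and b_j their biases, and w_{ij} the connection weights; a Deep Belief Network stacks several such RBMs and trains them greedily, layer by layer.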
