Paper Title

Reconstructing high-order sequence features of dynamic functional connectivity networks based on diversified covert attention patterns for Alzheimer's disease classification

Paper Authors

Zhixiang Zhang, Biao Jie, Zhengdong Wang, Jie Zhou, Yang Yang

Paper Abstract

Recent studies have applied deep learning methods such as convolutional recurrent neural networks (CRNs) and Transformers to the classification of brain diseases such as Alzheimer's disease (AD) based on dynamic functional connectivity networks (dFCNs), achieving better performance than traditional machine learning methods. However, in CRNs, the consecutive convolution operations used to obtain high-order aggregation features may overlook non-linear correlations between different brain regions, because convolution is essentially a linear weighted sum of local elements. Inspired by modern neuroscience research on covert attention in the nervous system, we introduce the self-attention mechanism, the core module of Transformers, to model diversified covert attention patterns, and apply these patterns to reconstruct high-order sequence features of dFCNs in order to learn complex dynamic changes in brain information flow. We therefore propose DCA-CRN, a novel CRN method based on diversified covert attention patterns, which combines the advantages of CRNs in capturing local spatio-temporal features and sequence change patterns with those of Transformers in learning global and high-order correlation features. Experimental results on the ADNI and ADHD-200 datasets demonstrate the prediction performance and generalization ability of the proposed method.
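For intuition, the sketch below illustrates the core idea described in the abstract: multi-head self-attention applied over a temporal sequence of sliding-window dFCN features (each head loosely corresponding to one "covert attention pattern"), followed by a recurrent stage that captures sequence change patterns. This is a minimal, hypothetical reconstruction, not the authors' actual DCA-CRN implementation; all module names, shapes, and hyperparameters (e.g. 90 ROIs, 4 heads) are illustrative assumptions.

```python
# Hypothetical sketch of the DCA-CRN idea from the abstract (not the
# authors' code): self-attention over a sequence of dFCN windows to
# model global, non-linear correlations, then a recurrent stage for
# temporal dynamics. All shapes and names are illustrative only.
import torch
import torch.nn as nn


class DCACRNSketch(nn.Module):
    def __init__(self, n_rois=90, d_model=128, n_heads=4,
                 hidden=64, n_classes=2):
        super().__init__()
        # Embed each window's flattened FC matrix (a real pipeline
        # would likely vectorize only the upper triangle).
        self.embed = nn.Linear(n_rois * n_rois, d_model)
        # Multi-head self-attention: each head learns a distinct
        # attention pattern ("diversified covert attention").
        self.attn = nn.MultiheadAttention(d_model, n_heads,
                                          batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Recurrent stage standing in for the CRN part of the paper.
        self.gru = nn.GRU(d_model, hidden, batch_first=True)
        self.cls = nn.Linear(hidden, n_classes)

    def forward(self, x):
        # x: (batch, T windows, n_rois, n_rois) sliding-window dFCNs
        b, t, r, _ = x.shape
        h = self.embed(x.reshape(b, t, r * r))   # (b, t, d_model)
        a, _ = self.attn(h, h, h)                # global correlations
        h = self.norm(h + a)                     # residual + norm
        _, last = self.gru(h)                    # temporal dynamics
        return self.cls(last.squeeze(0))         # (b, n_classes)


# Toy usage: 8 subjects, 20 sliding windows, 90 ROIs.
model = DCACRNSketch()
logits = model(torch.randn(8, 20, 90, 90))
print(logits.shape)  # torch.Size([8, 2])
```

In this reading, the attention output is added back to the window embeddings (residual connection), so the reconstructed high-order sequence features retain local information while gaining globally re-weighted correlations before the recurrent stage.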
