Paper Title
TTTFlow: Unsupervised Test-Time Training with Normalizing Flow
Paper Authors
Paper Abstract
A major problem of deep neural networks for image classification is their vulnerability to domain changes at test time. Recent methods have proposed to address this problem with test-time training (TTT), where a two-branch model is trained to learn a main classification task alongside a self-supervised task used to perform test-time adaptation. However, these techniques require defining a proxy task specific to the target application. To tackle this limitation, we propose TTTFlow: a Y-shaped architecture with an unsupervised head based on Normalizing Flows that learns the distribution of latent features and detects domain shifts in test examples. At inference, keeping the unsupervised head fixed, we adapt the model to domain-shifted examples by maximizing the log-likelihood of the Normalizing Flow. Our results show that our method significantly improves accuracy over previous works.
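The adaptation step described in the abstract — freezing a flow trained on source features and updating the encoder by gradient ascent on the flow's log-likelihood — can be illustrated with a minimal sketch. This is a hypothetical toy setting, not the paper's implementation: a 1-D "feature", an identity flow to a standard normal prior, and a single adaptable bias `b` standing in for the feature extractor's weights.

```python
import math
import random

random.seed(0)

# Fixed "flow": identity map to a standard normal prior, so
# log p(h) = -0.5*h^2 - 0.5*log(2*pi). In TTTFlow the flow is trained on
# source-domain features and then kept frozen at test time.
def log_likelihood(h):
    return -0.5 * h * h - 0.5 * math.log(2.0 * math.pi)

# Domain-shifted test batch: source features were ~N(0,1); the shift adds +2.
test_batch = [random.gauss(2.0, 1.0) for _ in range(512)]

def mean_ll(b):
    """Mean flow log-likelihood of the (adapted) test features."""
    return sum(log_likelihood(x + b) for x in test_batch) / len(test_batch)

# Test-time adaptation: gradient ascent on the log-likelihood w.r.t. the
# encoder parameter b, with the flow held fixed.
b, lr = 0.0, 0.1
initial_ll = mean_ll(b)
for _ in range(100):
    # d/db mean[-0.5*(x+b)^2] = -mean(x + b)
    grad = -sum(x + b for x in test_batch) / len(test_batch)
    b += lr * grad
final_ll = mean_ll(b)
# b converges toward -mean(test_batch), undoing the domain shift and
# raising the likelihood of the test features under the source-trained flow.
```

In the actual method the same principle applies in high dimensions: the encoder's early layers are updated so that test features regain high likelihood under the flow fitted to source features, while the classifier head and flow stay fixed.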