Test time Adaptation through Perturbation Robustness

Published in NeurIPS 2021 Workshop on Distribution Shifts, 2021

Download paper here.

Abstract

Data samples generated by many real-world processes are dynamic in nature, i.e., their characteristics vary with time. Thus, it is not possible to anticipate and tackle every distribution shift between training and inference with the host of transfer learning methods in the literature. In this paper, we address the problem of adapting to domain shift at inference time, i.e., we do not change the training process, but quickly adapt the model at test time to handle any domain shift. To this end, we propose to enforce consistency of predictions for data sampled in the vicinity of the test sample on the image manifold. On a range of test scenarios, such as dealing with corruptions (CIFAR-10-C and CIFAR-100-C) and domain adaptation (VisDA-C), our method is on par with or significantly outperforms previous methods.
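To make the idea concrete, below is a minimal sketch (not the paper's exact implementation) of test-time adaptation via prediction consistency under perturbations. It assumes a PyTorch classifier `model` and uses simple Gaussian noise as a stand-in for sampling in the vicinity of the test sample on the image manifold; the function name `adapt_on_batch` and all hyperparameters are illustrative assumptions.

```python
# Hedged sketch: adapt a classifier at test time by encouraging its predictions
# on perturbed copies of each test batch to agree with its prediction on the
# clean batch. Gaussian noise is used here as a simple proxy perturbation.
import torch
import torch.nn.functional as F

def adapt_on_batch(model, x, num_perturbations=4, noise_std=0.05, lr=1e-4, steps=1):
    """Update `model` in place so predictions agree across perturbed views of x."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(steps):
        probs_clean = F.softmax(model(x), dim=1)          # predictions on clean input
        loss = 0.0
        for _ in range(num_perturbations):
            x_pert = x + noise_std * torch.randn_like(x)  # perturbed view of the batch
            log_probs_pert = F.log_softmax(model(x_pert), dim=1)
            # KL divergence between perturbed and (detached) clean predictions
            loss = loss + F.kl_div(log_probs_pert, probs_clean.detach(),
                                   reduction="batchmean")
        loss = loss / num_perturbations
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    model.eval()
    return model(x).argmax(dim=1)  # adapted predictions for this batch
```

In this sketch the consistency loss is optimized per incoming test batch without touching the training pipeline, which is the general test-time adaptation setting described in the abstract; the choice of perturbation and loss here is only one plausible instantiation.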

Code

The code for the experiments has been released here.