Abstract
Adapting to a continuously evolving environment is a safety-critical challenge inevitably faced by all autonomous-driving systems. Existing image- and video-based driving datasets, however, fall short of capturing the mutable nature of the real world. In this paper, we introduce SHIFT, the largest synthetic dataset for autonomous driving. It features discrete and continuous shifts in cloudiness, rain and fog intensity, time of day, and vehicle and pedestrian density. Together with a comprehensive sensor suite and annotations for several mainstream perception tasks, SHIFT makes it possible to investigate how a perception system's performance degrades at increasing levels of domain shift, fostering both the development of continuous adaptation strategies that mitigate this problem and the assessment of a model's robustness and generality. Our dataset and benchmark toolkit are publicly available at https://vis.xyz/shift.
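As a minimal illustration of the kind of experiment SHIFT enables, the sketch below evaluates a fixed perception model on subsets of increasing fog intensity and reports the relative performance drop. All names here (`degradation_curve`, the fog-level labels, the dummy evaluator) are hypothetical placeholders for this sketch, not the actual devkit API.

```python
# Hypothetical sketch: measure performance degradation under increasing
# domain shift (here, fog intensity). Names are placeholders, not the
# real shift-dev API.
from typing import Callable, Iterable


def degradation_curve(
    evaluate: Callable[[str], float],   # returns e.g. mAP on one subset
    fog_levels: Iterable[str],          # ordered from clear to dense fog
) -> dict[str, float]:
    """Evaluate a fixed model on subsets of increasing fog intensity
    and normalize each score by the clear-weather baseline."""
    scores = {level: evaluate(level) for level in fog_levels}
    baseline = next(iter(scores.values()))
    return {level: score / baseline for level, score in scores.items()}


if __name__ == "__main__":
    # Dummy evaluator standing in for a real detection/segmentation eval.
    dummy = {"clear": 0.60, "light_fog": 0.52,
             "medium_fog": 0.41, "dense_fog": 0.27}.get
    curve = degradation_curve(
        dummy, ["clear", "light_fog", "medium_fog", "dense_fog"])
    for level, rel in curve.items():
        print(f"{level:>11}: {rel:.0%} of clear-weather performance")
```

Because SHIFT provides both discrete and continuous shift sequences, the same loop can be run over any of the varied attributes (cloudiness, rain, time of day, crowd density) to chart a model's degradation profile.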
Paper
Tao Sun, Mattia Segu, Janis Postels, Yuxuan Wang, Luc Van Gool, Bernt Schiele, Federico Tombari, Fisher Yu
SHIFT: A Synthetic Driving Dataset for Continuous Multi-Task Domain Adaptation
CVPR 2022
Code

github.com/SysCV/shift-dev
Citation
```bibtex
@InProceedings{shift,
  author    = {Sun, Tao and Segu, Mattia and Postels, Janis and Wang, Yuxuan and Van Gool, Luc and Schiele, Bernt and Tombari, Federico and Yu, Fisher},
  title     = {{SHIFT:} A Synthetic Driving Dataset for Continuous Multi-Task Domain Adaptation},
  booktitle = {Computer Vision and Pattern Recognition},
  year      = {2022}
}
```