Using synthetic data for person tracking under adverse weather conditions

Kerim, A. and Celikcan, U. and Erdem, E. and Erdem, A. (2021) Using synthetic data for person tracking under adverse weather conditions. Image and Vision Computing, 111: 104187. ISSN 0262-8856

Text (NOVA__Adverse_Weather_Conditions__IVC_Journal)
NOVA_Adverse_Weather_Conditions_IVC_Journal.pdf - Accepted Version
Available under a Creative Commons Attribution-NonCommercial-NoDerivatives license.

Download (20MB)

Abstract

Robust visual tracking plays a vital role in many areas such as autonomous cars, surveillance and robotics. Recent trackers have been shown to achieve adequate results under normal tracking scenarios with clear weather, standard camera setups and good lighting conditions. Yet the performance of these trackers, whether correlation filter-based or learning-based, degrades under adverse weather conditions. The lack of videos with such weather conditions in the available visual object tracking datasets is the prime cause of the low performance of learning-based tracking algorithms. In this work, we provide a new person tracking dataset of real-world sequences (PTAW172Real) captured under foggy, rainy and snowy weather conditions to assess the performance of current trackers. We also introduce a novel person tracking dataset of synthetic sequences (PTAW217Synth), procedurally generated by our NOVA framework, spanning the same weather conditions in varying severity, to mitigate the problem of data scarcity. Our experimental results demonstrate that the performance of state-of-the-art deep trackers under adverse weather conditions can be boosted when the available real training sequences are complemented with our synthetically generated dataset during training.
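The training strategy the abstract describes amounts to pooling the synthetic PTAW217Synth sequences with the available real training sequences before fine-tuning a deep tracker. Below is a minimal, hypothetical sketch of that data-mixing step, assuming PyTorch-style sequence datasets; the dataset class, directory names and sequence counts are placeholders for illustration, not the authors' actual code.

```python
# Minimal sketch (not the authors' code) of complementing real person-tracking
# sequences with synthetic ones during training. Only ConcatDataset/DataLoader
# are real PyTorch APIs; everything else is a hypothetical placeholder.
from torch.utils.data import Dataset, ConcatDataset, DataLoader


class SequenceDataset(Dataset):
    """Hypothetical wrapper: each item stands for one tracking sequence."""

    def __init__(self, sequence_paths):
        self.sequence_paths = list(sequence_paths)

    def __len__(self):
        return len(self.sequence_paths)

    def __getitem__(self, idx):
        # A real implementation would load frames and bounding-box
        # annotations here; we just return the sequence identifier.
        return self.sequence_paths[idx]


# Placeholder lists standing in for the PTAW172Real and PTAW217Synth sequences.
real_sequences = SequenceDataset(f"ptaw172real/seq_{i:03d}" for i in range(172))
synthetic_sequences = SequenceDataset(f"ptaw217synth/seq_{i:03d}" for i in range(217))

# Complement the real training data with the synthetic sequences: the tracker's
# existing training loop simply iterates over the combined pool.
combined = ConcatDataset([real_sequences, synthetic_sequences])
loader = DataLoader(combined, batch_size=1, shuffle=True)

for batch in loader:
    pass  # feed each sampled sequence to the tracker's fine-tuning step
```

In practice one might weight or oversample one of the two subsets; a plain ConcatDataset is simply the most direct way to expose both real and synthetic sequences to the same training loop.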

Item Type:
Journal Article
Journal or Publication Title:
Image and Vision Computing
Additional Information:
This is the author’s version of a work that was accepted for publication in Image and Vision Computing. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. A definitive version was subsequently published in Image and Vision Computing, 111 (2021), DOI: 10.1016/j.imavis.2021.104187.
Uncontrolled Keywords:
/dk/atira/pure/subjectarea/asjc/2200/2208
Subjects:
person tracking; procedural generation; rendering; synthetic data; meteorology; correlation filters; lighting conditions; real-world sequences; state of the art; synthetic sequence; tracking algorithm; training sequences; visual object tracking; object tracking; electrical and electronic engineering
ID Code:
155303
Deposited By:
Deposited On:
25 May 2021 14:25
Refereed?:
Yes
Published?:
Published
Last Modified:
12 Feb 2024 00:40