Luo, Fei and Li, Anna and He, Jiguang and Yu, Zitong and Wu, Kaishun and Jiang, Bin and Wang, Lu (2025) Improved Multi-Task Radar Sensing via Attention-based Feature Distillation and Contrastive Learning. IEEE Transactions on Information Forensics and Security, 20. pp. 8251-8265. ISSN 1556-6013
Full text not available from this repository.

Abstract
Radar sensing is gaining increasing attention due to its unique advantages, including being device-free, privacy-preserving, and capable of penetrating obstacles. It has been extensively studied in various applications such as human activity recognition, vital sign monitoring, and person identification. However, most existing research focuses on a single application, and there remains a lack of studies or datasets dedicated to multi-task radar sensing. In this paper, we collected a dataset for two sensing tasks, gesture recognition and person identification, via a miniature mm-wave radar. The raw radar signals were processed using micro-Doppler and range-Doppler techniques to extract spectral and spatial representations. We propose an improved multi-task radar sensing framework (MT-DualFormer) that incorporates attention-based cross-task feature distillation and contrastive learning to maximize task performance. MT-DualFormer consists of dual branches with CNN and Transformer modules, capturing both spatial and temporal dependencies in radar data. Attention-based cross-task feature distillation enables knowledge transfer between the gesture recognition and person identification tasks, while contrastive learning ensures embedding-space separability, facilitating robust task-specific classification. In our evaluation, MT-DualFormer achieves accuracy rates of 98.87% for gesture recognition and 97.96% for person identification, surpassing five representative multi-task approaches and ten state-of-the-art models. This study underscores the importance of leveraging task correlations to enhance the performance of radar-based sensing systems.
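The abstract's two core ideas, attention-based cross-task feature distillation and contrastive separation of the embedding space, can be illustrated with a minimal numpy sketch. The function names, feature shapes, and loss forms below are illustrative assumptions, not the paper's actual implementation (the full text is not available from this record): the distillation term lets one task branch attend over the other branch's features via scaled dot-product attention, and the contrastive term is a generic supervised contrastive loss over embeddings.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_task_attention(queries, keys, values):
    """Scaled dot-product attention: features of one task branch (queries)
    attend over features of the other task branch (keys/values)."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)      # (Nq, Nk) similarity scores
    weights = softmax(scores, axis=-1)          # attention weights per query
    return weights @ values                     # distilled features, (Nq, d)

def distillation_loss(student, teacher):
    """MSE between a branch's features and the attention-distilled features
    from the other branch (one common form of feature distillation)."""
    distilled = cross_task_attention(student, teacher, teacher)
    return np.mean((student - distilled) ** 2)

def supervised_contrastive_loss(z, labels, tau=0.1):
    """Generic supervised contrastive loss: pulls together L2-normalized
    embeddings sharing a label, pushes apart the rest (temperature tau)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / tau
    n = len(labels)
    total = 0.0
    for i in range(n):
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        denom = sum(np.exp(sim[i, j]) for j in range(n) if j != i)
        total += -np.mean([np.log(np.exp(sim[i, j]) / denom) for j in pos])
    return total / n
```

In a multi-task setup like the one described, losses of these two kinds would typically be added (with weighting hyperparameters) to the per-task classification losses during joint training.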