Wang, Q., Shen, H., Wang, L., Yuan, L., Ren, Y., Jia, X., Sun, S. and Meng, W. (2026) FBAO: backdoor attack against object detection via frequency noise injection. Applied Intelligence, 56 (5): 134. ISSN 0924-669X
sn-article.pdf - Accepted Version
Available under License Creative Commons Attribution.
Abstract
Object detection, a fundamental task in computer vision, has been widely deployed in numerous machine learning applications. Nevertheless, object detectors are susceptible to various attacks and pose significant security concerns in practice. Among the most insidious of these is the backdoor attack, which embeds a hidden backdoor into the object detector and can lead to misleading results. However, most existing research on backdoor attacks uses a single pattern in the spatial domain of the image as a trigger, which inevitably destroys the pixel-level semantics of the benign image. To address this, we propose a novel Backdoor Attack against Object Detection via Frequency Noise Injection, i.e., FBAO. We employ a Gaussian random noise function to generate a noise image, which is then injected into the benign image by linearly combining the amplitude spectra of the noise image and the benign image. By preserving the pixel-level semantics of benign images during trigger injection, FBAO ensures that the generated triggers remain invisible. Furthermore, we design two object-based evaluation metrics, the Object-based Attack Success Rate (OASR) and the Object-based Miss-triggering Rate (OMR), which incorporate bounding-box predictions to comprehensively assess the effectiveness of backdoor attacks against object detection. Experimental results show that our method consistently outperforms other baselines across different object detection models and datasets.
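The amplitude-blending mechanism the abstract describes can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function name, the blending ratio `alpha`, the noise scale `sigma`, and the choice to keep the benign phase spectrum unchanged are all assumptions made for illustration.

```python
import numpy as np

def inject_frequency_trigger(benign, alpha=0.15, sigma=0.1, seed=0):
    """Illustrative sketch of frequency-domain trigger injection:
    blend the amplitude spectrum of a Gaussian noise image into a
    benign image while keeping the benign phase spectrum intact.
    `alpha`, `sigma`, and this signature are assumptions, not the
    paper's exact parameters. `benign` is a float array in [0, 1]."""
    rng = np.random.default_rng(seed)
    # Gaussian random noise image, same shape as the benign image
    noise = rng.normal(0.0, sigma, size=benign.shape)

    # Per-channel 2-D FFT of both images
    f_benign = np.fft.fft2(benign, axes=(0, 1))
    f_noise = np.fft.fft2(noise, axes=(0, 1))

    # Linearly combine the amplitude spectra; reuse the benign phase,
    # which is what preserves the pixel-level structure of the image
    amplitude = (1.0 - alpha) * np.abs(f_benign) + alpha * np.abs(f_noise)
    phase = np.angle(f_benign)

    # Reassemble the spectrum and invert the transform
    poisoned = np.fft.ifft2(amplitude * np.exp(1j * phase), axes=(0, 1)).real
    return np.clip(poisoned, 0.0, 1.0)
```

With a small `alpha`, the poisoned image stays visually close to the benign one, which matches the abstract's claim that the trigger remains invisible while still perturbing the frequency content.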