Weakly Supervised Co-training with Swapping Assignments for Semantic Segmentation

Yang, Xinyu and Rahmani, Hossein and Black, Sue and Williams, Bryan M. (2024) Weakly Supervised Co-training with Swapping Assignments for Semantic Segmentation. Other. arXiv.

Text (2402.17891v1)
2402.17891v1.pdf - Published Version
Available under License Creative Commons Attribution.

Download (50MB)

Abstract

Class activation maps (CAMs) are commonly employed in weakly supervised semantic segmentation (WSSS) to produce pseudo-labels. Due to incomplete or excessive class activation, existing studies often resort to offline CAM refinement, introducing additional stages or proposing offline modules. This can cause optimization difficulties for single-stage methods and limit generalizability. In this study, we aim to reduce the observed CAM inconsistency and error to mitigate reliance on refinement processes. We propose an end-to-end WSSS model incorporating guided CAMs, wherein our segmentation model is trained while concurrently optimizing CAMs online. Our method, Co-training with Swapping Assignments (CoSA), leverages a dual-stream framework, where one sub-network learns from the swapped assignments generated by the other. We introduce three techniques: i) soft perplexity-based regularization to penalize uncertain regions; ii) a threshold-searching approach to dynamically revise the confidence threshold; and iii) contrastive separation to address the coexistence problem. CoSA demonstrates exceptional performance, achieving mIoU of 76.2% and 51.0% on the VOC and COCO validation datasets, respectively, surpassing existing baselines by a substantial margin. Notably, CoSA is the first single-stage approach to outperform all existing multi-stage methods, including those with additional supervision. Code is available at https://github.com/youshyee/CoSA.
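To make the swapped-assignment idea from the abstract concrete, below is a minimal sketch of a dual-stream co-training loss in PyTorch, under the assumption that each stream produces segmentation logits and online CAMs of shape [N, C, H, W]. All names here (swapped_assignment_loss, the threshold value, the ignore index) are illustrative assumptions, not taken from the official CoSA implementation linked above; the actual method additionally uses perplexity-based regularization, dynamic threshold searching, and contrastive separation.

import torch
import torch.nn.functional as F

def swapped_assignment_loss(logits_a, logits_b, cams_a, cams_b, threshold=0.5):
    # Each stream is supervised by pseudo-labels derived from the OTHER
    # stream's CAMs: the "swapped assignments".
    with torch.no_grad():
        # Turn each stream's CAMs into hard pseudo-labels; pixels whose peak
        # activation falls below the confidence threshold are ignored (255).
        conf_b, labels_b = cams_b.max(dim=1)
        labels_b[conf_b < threshold] = 255
        conf_a, labels_a = cams_a.max(dim=1)
        labels_a[conf_a < threshold] = 255

    # Stream A learns from B's assignments and vice versa.
    loss_a = F.cross_entropy(logits_a, labels_b, ignore_index=255)
    loss_b = F.cross_entropy(logits_b, labels_a, ignore_index=255)
    return 0.5 * (loss_a + loss_b)

In this sketch the pseudo-labels are recomputed online at every step, consistent with the abstract's claim that CAMs are optimized concurrently with the segmentation model rather than refined offline.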

Item Type:
Monograph (Other)
Subjects:
cs.CV (Computer Vision and Pattern Recognition)
ID Code:
217789
Deposited By:
Deposited On:
30 Apr 2024 15:55
Refereed?:
No
Published?:
Published
Last Modified:
05 May 2024 23:17