Lin, Shen and Zhang, Xiaoyu and Susilo, Willy and Chen, Xiaofeng and Liu, Jun (2024) GDR-GMA: Machine Unlearning via Direction-Rectified and Magnitude-Adjusted Gradients. In: Proceedings of the 32nd ACM International Conference on Multimedia. ACM, New York, pp. 9087-9095. ISBN 9798400706868.
Available under License Creative Commons Attribution.
Abstract
As concerns over privacy protection grow and relevant laws come into effect, machine unlearning (MU) has emerged as a pivotal research area. Owing to the complexity of the forgetting data distribution, sample-wise MU remains an open challenge. Gradient ascent, as the inverse of gradient descent, is a natural fit for machine unlearning, which is itself the inverse process of machine learning. However, the straightforward gradient ascent MU method suffers from a trade-off among effectiveness, fidelity, and efficiency. In this work, we analyze the gradient ascent MU process from a multi-task learning (MTL) view. This perspective reveals two problems that cause the trade-off: the gradient direction problem and the gradient dominant problem. To address these problems, we propose a novel MU method, GDR-GMA, consisting of Gradient Direction Rectification (GDR) and Gradient Magnitude Adjustment (GMA). For the gradient direction problem, GDR rectifies the direction between conflicting gradients by projecting one gradient onto the orthonormal plane of the conflicting gradient. For the gradient dominant problem, GMA dynamically adjusts the magnitudes of the update gradients by assigning each a dynamic magnitude weight parameter. Furthermore, we evaluate GDR-GMA against several baseline methods in three sample-wise MU scenarios: random data forgetting, sub-class forgetting, and class forgetting. Extensive experimental results demonstrate the superior performance of GDR-GMA in effectiveness, fidelity, and efficiency.
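The direction-rectification step described in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function name `rectify_direction` and the cosine-similarity conflict test are illustrative assumptions, showing only the generic idea of projecting one gradient onto the plane orthogonal to a conflicting gradient (as in standard MTL gradient-surgery techniques).

```python
import numpy as np

def rectify_direction(g_a: np.ndarray, g_b: np.ndarray) -> np.ndarray:
    """Sketch of GDR-style rectification (illustrative, not the paper's exact method).

    If g_a conflicts with g_b (negative inner product), remove from g_a its
    component along g_b, i.e. project g_a onto the plane orthogonal to g_b.
    """
    dot = float(np.dot(g_a, g_b))
    if dot < 0.0:  # gradients point in conflicting directions
        g_a = g_a - (dot / float(np.dot(g_b, g_b))) * g_b
    return g_a

# Conflicting pair: the rectified gradient is orthogonal to g_b.
g_a = np.array([1.0, 0.0])
g_b = np.array([-1.0, 1.0])
rectified = rectify_direction(g_a, g_b)
```

After rectification, the update along `rectified` no longer opposes `g_b`, since their inner product is zero; non-conflicting pairs pass through unchanged.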