Zhou, Zhuo and Yin, Rui and Wang, Wei and Ni, Qiang and Zarakovitis, Charilaos C. (2026) Wireless Deep Mutual Learning: Challenges and Opportunities. IEEE Communications Magazine. pp. 1-7. ISSN 0163-6804
Author_final_version.pdf - Accepted Version
Available under License Creative Commons Attribution-NonCommercial.
Abstract
The rapid proliferation of intelligent mobile devices has spurred increasing interest in collaborative learning (CL). Existing CL methods rely on direct algebraic averaging of model parameters for knowledge transfer across devices, which faces communication bottlenecks, scales poorly to distributed settings, and struggles with model and data heterogeneity. To address these challenges, building upon single-device deep mutual learning (DML), we propose a novel communication framework that interconnects multiple devices to form a natively distributed DML system. Unlike classic CL, multi-device DML utilizes a distillation loss term that enables models to influence each other mutually and indirectly, sharing knowledge by identifying a common optimum across devices. This mechanism enhances distributed scalability, fully leverages on-device communication and computation resources, and effectively addresses model and data heterogeneity. We explore the integration of our proposed multi-device DML into a wireless system, termed wireless DML (WDML). Since knowledge sharing is hampered by communication bottlenecks, we analyze the corresponding challenges and opportunities for enhancing learning efficiency. Through a case study on a device-to-device based synchronous peer-to-peer system, we validate the advantages of WDML in energy efficiency and generalization. We conclude by discussing open issues that guide future research towards a more efficient, lower-latency, more flexible WDML system.
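The distillation-based knowledge sharing described in the abstract can be illustrated with a minimal sketch. In standard deep mutual learning (Zhang et al.), each model's objective is its supervised cross-entropy loss plus the average KL divergence from each peer's predicted distribution; the paper's WDML framework extends this idea across wireless devices. The function names and NumPy formulation below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    # KL(p || q) per sample; eps guards against log(0).
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

def dml_loss(logits_self, peer_logits_list, labels, eps=1e-12):
    """Per-device DML objective (illustrative): local cross-entropy plus
    the average KL divergence from each peer's predictions.

    Only the peers' output distributions are exchanged, not their model
    parameters, which is what relaxes the communication bottleneck."""
    p_self = softmax(logits_self)
    n = logits_self.shape[0]
    ce = -np.mean(np.log(p_self[np.arange(n), labels] + eps))
    distill = np.mean(
        [kl_div(softmax(pl), p_self).mean() for pl in peer_logits_list]
    )
    return ce + distill
```

Because the coupling is through predictions rather than parameter averaging, each device may run a different model architecture, which is how this formulation accommodates model heterogeneity.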