Shi, Ning and Xu, Lei and Zhu, Tianqing and Zhou, WanLei and Meng, Weizhi and Tan, Yu-an (2026) Exact constrained-training neural networks for confidential 8-bit arithmetic primitives in code obfuscation. Journal of Systems Architecture, 173: 103729. ISSN 1383-7621
Exact_Constrained_Training.pdf - Accepted Version
Available under License Creative Commons Attribution.
Abstract
Neural networks are rarely used for exact arithmetic computation; we leverage them as exact computing units for code obfuscation. The effectiveness of traditional code obfuscation methods is increasingly undermined by advanced reverse-engineering tools, which exploit the semantic transparency of arithmetic operations. To address this, we propose constrained-training neural networks tailored to 8-bit integer addition and multiplication, the operations most common in security-critical software. For addition, we introduce a constrained training algorithm that combines weight clipping, linearity-enforcing loss terms, and boundary-case oversampling, enabling convergence to 100% accuracy across the entire input domain. For multiplication, we design the adaptive symbol-gated NALU (ASG-NALU), an improved 4-bit multiplier that achieves exact results with reduced complexity; combined with a cascade decomposition strategy, it extends to 8-bit multiplication with guaranteed correctness. Experiments confirm 100% in-domain accuracy, while out-of-domain inputs trigger catastrophic failures that act as natural traps, providing hidden security checks against dynamic analysis. These results establish exact constrained-training neural networks as confidential arithmetic primitives and position neural arithmetic as a promising approach for advancing code obfuscation techniques.
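The cascade decomposition mentioned in the abstract can be illustrated with standard nibble arithmetic: an 8-bit product is assembled from four 4-bit partial products. The sketch below is illustrative only; `mul4` is a hypothetical stand-in for the paper's trained ASG-NALU 4-bit multiplier (here an exact built-in product, since only exactness matters to the recombination step), and the function names are not taken from the paper.

```python
def mul4(a, b):
    # Placeholder for an exact 4-bit multiplier (the role the paper
    # assigns to ASG-NALU); inputs are nibbles in [0, 15].
    assert 0 <= a < 16 and 0 <= b < 16
    return a * b

def mul8_cascade(x, y):
    # Split each 8-bit operand into high and low nibbles.
    xh, xl = x >> 4, x & 0xF
    yh, yl = y >> 4, y & 0xF
    # Recombine four 4-bit partial products:
    #   x*y = 256*(xh*yh) + 16*(xh*yl + xl*yh) + xl*yl
    return (mul4(xh, yh) << 8) + ((mul4(xh, yl) + mul4(xl, yh)) << 4) + mul4(xl, yl)

# Exhaustive check over the full 8-bit domain (256 x 256 inputs).
assert all(mul8_cascade(x, y) == x * y for x in range(256) for y in range(256))
```

Because the recombination is a fixed linear combination of the partial products, exactness of the 4-bit unit carries over to guaranteed correctness of the 8-bit result, which is the property the paper's decomposition strategy relies on.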