3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors

Lonsdale, David and Zhang, Li and Jiang, Richard (2021) 3D Printed Brain-Controlled Robot-Arm Prosthetic via Embedded Deep Learning from sEMG Sensors. In: 2020 International Conference on Machine Learning and Cybernetics (ICMLC). Proceedings - International Conference on Machine Learning and Cybernetics. IEEE, pp. 247-253. ISBN 9780738124261

Full text not available from this repository.

Abstract

In this paper, we present our work on developing a robot-arm prosthetic via deep learning. We propose to use transfer learning, applied to the Google Inception model, to retrain the final layer for surface electromyography (sEMG) classification. Data were collected using the Thalmic Labs Myo Armband and used to generate graph images comprising 8 subplots per image, with 40 data points captured per sensor, corresponding to the array of 8 sEMG sensors in the armband. The captured data were then classified into four categories (Fist, Thumbs Up, Open Hand, Rest) using a deep learning model, Inception-v3, retrained via transfer learning to accurately predict each gesture in real time from new input data. The trained model was then deployed to an ARM-processor-based embedded system to enable the brain-controlled robot-arm prosthetic manufactured on our 3D printer. To test the functionality of the method, a robotic arm was produced using a 3D printer together with off-the-shelf hardware to control it. SSH communication protocols are employed to execute Python files hosted on an embedded Raspberry Pi with an ARM processor, triggering movement of the predicted gesture on the robot arm.
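The image-generation step described above (8 subplots per image, 40 sEMG samples per sensor) can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the function name `window_to_image`, the figure size, and the synthetic random sensor values are all assumptions; a real pipeline would read a 40-sample window per channel from the Myo Armband.

```python
# Hedged sketch: render one window of 8-channel sEMG data as the kind of
# 8-subplot graph image the abstract describes, ready for classification
# by a retrained Inception-v3 model. Sensor values here are synthetic.
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless rendering, e.g. on an embedded board
import matplotlib.pyplot as plt

SENSORS, SAMPLES = 8, 40  # Myo Armband: 8 sEMG channels, 40 points per window

def window_to_image(window, path="gesture.png"):
    """Plot an (8, 40) sEMG window as one image with 8 subplots,
    one per sensor, and save it to `path` for the classifier."""
    fig, axes = plt.subplots(SENSORS, 1, figsize=(4, 8), sharex=True)
    for channel, ax in enumerate(axes):
        ax.plot(window[channel])
        ax.set_yticks([])  # the classifier uses waveform shape, not tick labels
    fig.savefig(path)
    plt.close(fig)
    return path

# Demo with placeholder data in the int8 range reported by the Myo EMG stream.
rng = np.random.default_rng(0)
demo_window = rng.integers(-128, 128, size=(SENSORS, SAMPLES))
out_path = window_to_image(demo_window)
```

Saving each window as an ordinary image file is what lets an off-the-shelf image classifier such as Inception-v3 be reused for sEMG data via transfer learning, since only the final layer needs retraining on these plots.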

Item Type:
Contribution in Book/Report/Proceedings
ID Code:
174230
Deposited By:
Deposited On:
17 Nov 2022 15:00
Refereed?:
Yes
Published?:
Published
Last Modified:
09 Oct 2024 12:14