Fei, Haolin and Ma, Songlin and Du, Guanglong and Yadollahi, Elmira and Lam, Hak-Keung and Faragasso, Angela and Montazeri, Allahyar and Wang, Ziwei (2025) Large-Language-Model-Aided Assistive Robot for Single-Operator Bimanual Teleoperation: Introduction and Validation of a Flexible Assistance System. IEEE Robotics & Automation Magazine. pp. 2-12. ISSN 1070-9932
Available under License Creative Commons Attribution.
Abstract
Bimanual teleoperation tasks are highly demanding for human operators, who must control two robotic arms simultaneously while managing complex coordination and a high cognitive load. Current approaches to this challenge often rely on rigid control schemes or task-specific automation that adapts poorly to dynamic environments and varied operator needs. This article presents a novel large language model (LLM)-aided bimanual teleoperation assistant (BTLA) that helps operators control dual-arm robots through an intuitive voice-command interface and variable autonomy. BTLA enables a hybrid control paradigm by combining natural language interaction for an assistive robot arm with direct teleoperation of the dominant robotic arm. The system implements six core manipulation skills at varying levels of autonomy, ranging from direct mirroring to autonomous object manipulation. BTLA leverages the LLM to interpret natural language commands and select an appropriate assistance mode based on task requirements and operator preferences. Experimental validation on bimanual object-manipulation tasks demonstrates that BTLA yields a 240.8% increase in success rate over solo teleoperation and a 69.9% increase over dyadic teleoperation, while significantly reducing operator mental workload. In addition, we validate the approach on a physical dual-arm UR3e robot system, achieving a 90% success rate on challenging soft-bottle handling and box-transportation tasks.
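The abstract describes a pipeline in which an LLM maps an operator's voice command to one of six assistance modes for the non-dominant arm. The paper's actual prompt, mode set, and LLM interface are not given here, so the following is only a minimal Python sketch of that general pattern; the llm_complete callable, the mode names other than mirroring and autonomous manipulation, and the fallback rule are all illustrative assumptions, not the authors' implementation.

```python
from enum import Enum, auto

# Hypothetical assistance modes. Only mirroring and autonomous object
# manipulation are named in the abstract; the others are placeholders.
class AssistMode(Enum):
    MIRROR = auto()             # assistive arm mirrors the teleoperated arm
    HOLD = auto()               # placeholder: hold an object in place
    HANDOVER = auto()           # placeholder: pass an object between arms
    AUTON_MANIPULATE = auto()   # autonomous object manipulation

SYSTEM_PROMPT = (
    "You select an assistance mode for the non-dominant robot arm. "
    "Reply with exactly one of: " + ", ".join(m.name for m in AssistMode) + "."
)

def select_mode(voice_command: str, llm_complete) -> AssistMode:
    """Map a transcribed voice command to an assistance mode.

    llm_complete(system, user) -> str is a stand-in for whatever
    LLM/chat-completion API is actually used; it is not specified here.
    """
    reply = llm_complete(SYSTEM_PROMPT, voice_command).strip().upper()
    try:
        return AssistMode[reply]
    except KeyError:
        # Assumed fallback: drop to the lowest-autonomy mode on an
        # unparseable reply.
        return AssistMode.MIRROR

# Usage with a trivial stand-in "LLM":
if __name__ == "__main__":
    fake_llm = lambda system, user: "MIRROR"
    print(select_mode("follow my left hand", fake_llm))  # AssistMode.MIRROR
```

Constraining the LLM to a fixed vocabulary of mode names and validating its reply against that set is one simple way to keep language-driven mode selection safe for a real robot; whether the paper does this is not stated in the abstract.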