STREAMS: An Assistive Multimodal AI Framework for Empowering Biosignal Based Robotic Controls

📅 2024-10-04
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Electromyographic (EMG) and electroencephalographic (EEG)-driven robotic end-effectors suffer from trajectory jitter, unstable control, and grasp failures due to inherent biological signal noise. Method: This paper proposes an end-to-end self-training multimodal shared autonomy framework. It innovatively integrates environmental perception with synthetically generated user intent to realize a self-training deep Q-network (DQN) that requires neither manual annotation nor pre-collected datasets, enabling zero-shot simulation-to-reality transfer. The approach unifies multimodal signal fusion, self-supervised trajectory generation, and a shared autonomy control architecture. Results: Experiments demonstrate a 98% dynamic target capture rate in simulation; on real users, task success improves to 83% (vs. 44% under manual control), with significant gains in trajectory smoothness, robustness against signal noise, and user satisfaction.
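The self-training idea above — learning a control policy from synthetically generated, deliberately noisy user intent instead of labeled data — can be illustrated with a toy sketch. This is a hypothetical simplification, not the paper's code: it uses tabular Q-learning on a 1-D workspace in place of the paper's DQN, and a random bit-flip in place of real biosignal noise; all names (`synthetic_intent`, `rollout`, the noise rate) are invented for illustration.

```python
import random

# Toy self-training sketch (hypothetical, not the paper's implementation):
# tabular Q-learning where the "user intent" is a synthetically corrupted
# direction toward the target, so no labels or pre-collected data are needed.
N = 9                      # discrete workspace positions 0..8
ACTIONS = [-1, 0, 1]       # move left, stay, move right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1

Q = {}                     # Q[(position, intent)] -> list of action values

def synthetic_intent(pos, target, noise=0.3):
    """Noisy proxy for a biosignal command: true direction, sometimes corrupted."""
    true_dir = 0 if pos == target else (1 if target > pos else -1)
    if random.random() < noise:
        return random.choice(ACTIONS)   # simulated signal noise
    return true_dir

def step(pos, action, target):
    new_pos = max(0, min(N - 1, pos + ACTIONS[action]))
    reward = 1.0 if new_pos == target else -0.05
    return new_pos, reward, new_pos == target

def train(episodes=3000):
    for _ in range(episodes):
        target, pos = random.randrange(N), random.randrange(N)
        for _ in range(40):
            intent = synthetic_intent(pos, target)
            q = Q.setdefault((pos, intent), [0.0] * len(ACTIONS))
            a = random.randrange(len(ACTIONS)) if random.random() < EPS \
                else max(range(len(ACTIONS)), key=lambda i: q[i])
            pos2, r, done = step(pos, a, target)
            intent2 = synthetic_intent(pos2, target)
            q2 = Q.setdefault((pos2, intent2), [0.0] * len(ACTIONS))
            q[a] += ALPHA * (r + GAMMA * max(q2) - q[a])
            pos = pos2
            if done:
                break

def rollout(pos, target, max_steps=20):
    """Greedy policy; True if the (possibly noisy) commands reach the target."""
    for _ in range(max_steps):
        if pos == target:
            return True
        intent = synthetic_intent(pos, target)
        q = Q.get((pos, intent), [0.0] * len(ACTIONS))
        a = max(range(len(ACTIONS)), key=lambda i: q[i])
        pos, _, done = step(pos, a, target)
        if done:
            return True
    return False

random.seed(0)
train()
success = sum(rollout(random.randrange(N), random.randrange(N)) for _ in range(200))
print(success)  # most rollouts succeed despite the corrupted intent signal
```

Even in this toy setting, the learned policy filters the noisy intent well enough to reach most targets, which is the intuition behind the paper's reported jump from manual to assisted success rates.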

📝 Abstract
End-effector based assistive robots face persistent challenges in generating smooth and robust trajectories when controlled by humans' noisy and unreliable biosignals such as muscle activity and brainwaves. The resulting endpoint trajectories are often jerky and too imprecise for complex tasks such as stable robotic grasping. We propose STREAMS (Self-Training Robotic End-to-end Adaptive Multimodal Shared autonomy), a novel framework that leverages deep reinforcement learning to tackle this challenge in biosignal-based robotic control systems. STREAMS blends environmental information and synthetic user input in a Deep Q-Network (DQN) pipeline, an interactive, end-to-end, self-training mechanism that produces smooth trajectories for controlling end-effector based robots. The proposed framework achieved a 98% success rate in simulation with dynamic target estimation and acquisition, without any pre-existing datasets. In a zero-shot sim-to-real user study with five participants controlling a physical robotic arm through noisy head movements, STREAMS (as an assistive mode) delivered significant improvements in trajectory stabilization, user satisfaction, and task performance, achieving a success rate of 83% compared to 44% in manual mode without task support. STREAMS seeks to improve biosignal-based assistive robotic control by offering an interactive, end-to-end solution that stabilizes end-effector trajectories, enhancing task performance and accuracy.
Problem

Research questions and friction points this paper is trying to address.

Improving biosignal-based robotic control for smooth trajectories
Enhancing task performance with noisy muscle and brainwave inputs
Stabilizing end-effector movements in assistive robotics systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep reinforcement learning for biosignal control
Interactive end-to-end self-training DQN mechanism
Multimodal fusion of environmental and user inputs
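The shared-autonomy aspect listed above is commonly realized as an arbitration rule that mixes the user's raw command with the autonomous policy's suggestion. The sketch below shows one standard linear-blending formulation; it is an illustrative assumption, not necessarily the paper's exact arbitration scheme, and `blend` and its arguments are hypothetical names.

```python
def blend(user_cmd, policy_cmd, confidence):
    """Linear shared-autonomy arbitration (illustrative, not the paper's exact rule):
    weight the autonomous suggestion more heavily as goal confidence rises."""
    return [(1.0 - confidence) * u + confidence * p
            for u, p in zip(user_cmd, policy_cmd)]

# A jittery user velocity command pulled toward the policy's smooth suggestion:
blended = blend([0.9, -0.4], [1.0, 0.0], 0.7)
print([round(v, 3) for v in blended])  # -> [0.97, -0.12]
```

At `confidence = 0` the user retains full manual control; at `confidence = 1` the system acts fully autonomously, which mirrors the manual-vs-assistive comparison in the user study.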