STRP

New Jersey Institute of Technology

Developing an Intelligent Multi-Modal, IoT-enabled, AI-integrated, Sensor Fusion-based Wearable Device for Improving Human-Robot Interaction

Lead: Rayan H. Assaad

Partner: Gille Albeaino, Ph.D., Assistant Professor in the Department of Construction Science at Texas A&M University

Keywords: Wearable Technology, Human-Machine Interaction, sEMG-IMU Fusion, Multimodal Sensing

Amount: $25,000

Intellectual Property Status: None

Award Date: 07/01/2024

End Date: 06/30/2025

ABSTRACT

Advancements in human-machine interaction (HMI) demand intuitive, real-time control mechanisms that seamlessly integrate with natural human movement. This project presents a wearable multi-modal device designed to capture precise hand gestures for machine and/or robotic control based on the fusion of surface electromyography (sEMG) and inertial measurement unit (IMU) sensors. The system features a custom-designed PCB integrating six sEMG sensors and three IMUs with a microcontroller for onboard signal processing and wireless communication. The IMUs are strategically placed on the wrist, forefinger, and thumb to track fine-grained hand motions, while the sEMG sensors capture muscle activity for enhanced gesture recognition. An AI-based framework with Temporal Convolutional Networks, Bidirectional Cross-Attention, and Transformer Encoders processes the multimodal sensor data in real time to predict hand poses with high accuracy. Compared to conventional camera-based tracking and gesture gloves, the developed innovation offers enhanced real-time performance, lower latency, improved adaptability, and operation in any environment without external dependencies, thus facilitating inclusive HMI across people with various behavioral, physiological, and motor abilities. The system has wide-ranging applications, including machine-computer interaction, robot teleoperation, prosthetic control, assistive robotics, industrial automation, and virtual/augmented reality (VR/AR). The custom PCB design optimizes power efficiency, minimizes latency, and provides a compact, ergonomic solution for seamless wearability. Ongoing efforts focus on improving robustness, expanding use cases to exoskeletons and surgical robotics, and pursuing industry collaborations for real-world implementation and testing. This work represents a significant step toward the future of responsive wearable human-machine interfaces.
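The fusion pipeline described above can be illustrated with a minimal sketch: each sensor stream passes through a causal dilated (TCN-style) temporal layer, the two streams exchange information via bidirectional cross-attention, and the fused features are pooled and mapped to a pose vector. All shapes, channel counts, and weights below are illustrative stand-ins (random values), not the project's actual model or parameters, and the Transformer Encoder stage is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
T, D = 64, 32  # window length and feature width (illustrative choices)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def tcn_layer(x, w, dilation):
    """Causal dilated 1-D convolution: x is (T, C_in), w is (k, C_in, C_out)."""
    k = w.shape[0]
    pad = (k - 1) * dilation
    xp = np.pad(x, ((pad, 0), (0, 0)))          # left-pad so taps only look back
    y = np.zeros((x.shape[0], w.shape[2]))
    for t in range(x.shape[0]):
        for i in range(k):
            y[t] += xp[pad + t - i * dilation] @ w[i]
    return relu(y)

def cross_attention(q_src, kv_src, Wq, Wk, Wv):
    """q_src attends over kv_src (single-head scaled dot-product attention)."""
    Q, K, V = q_src @ Wq, kv_src @ Wk, kv_src @ Wv
    return softmax(Q @ K.T / np.sqrt(Q.shape[-1])) @ V

# Synthetic stand-ins for one sensor window.
emg = rng.standard_normal((T, 6))   # 6 sEMG channels
imu = rng.standard_normal((T, 9))   # 3 IMUs x 3 motion channels (assumed layout)

# Per-modality temporal encoders (one dilated layer each, for brevity).
emg_f = tcn_layer(emg, rng.standard_normal((3, 6, D)) * 0.1, dilation=2)
imu_f = tcn_layer(imu, rng.standard_normal((3, 9, D)) * 0.1, dilation=2)

# Bidirectional cross-attention: each stream queries the other.
P = [rng.standard_normal((D, D)) * 0.1 for _ in range(6)]
emg2imu = cross_attention(emg_f, imu_f, *P[:3])
imu2emg = cross_attention(imu_f, emg_f, *P[3:])

# Fuse, pool over time, and map to a hand-pose vector (15 DoF, illustrative).
fused = np.concatenate([emg2imu, imu2emg], axis=-1).mean(axis=0)
pose = fused @ (rng.standard_normal((2 * D, 15)) * 0.1)
print(pose.shape)  # (15,)
```

In a trained system the random matrices would be learned weights, and the pooled features would feed the Transformer Encoder stage before the pose head; the sketch only shows how the two modalities are temporally encoded and cross-attended.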