Augmented Reality-based Robot Trajectory Programming

Enabling Safe and Intuitive Human-Robot Interaction

Lead: Machiel Van der Loos

Wesley Chan interacting with AR robot

This project develops a forward-looking approach to robot programming using augmented reality (AR), with the goal of enabling safe and intuitive human-robot interaction in collaborative manufacturing. Using a mixed reality head-mounted display (Microsoft HoloLens) and a pair of surface electromyography (EMG) and gesture-sensing armbands (Myo armband), we designed a multimodal AR user interface that eases the robot programming task by providing multiple interactive functions:

1) Trajectory specification

2) Virtual previews of robot motion

3) Visualization of robot parameters

4) Online reprogramming during simulation and execution

5) Gesture and EMG-based control of robot trajectory execution

6) Online virtual barrier creation and visualization
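The motion-preview and virtual-barrier functions above can be illustrated with a simple geometric check: sample the programmed trajectory densely and verify that no sampled point enters a user-defined keep-out region. This is a minimal sketch for intuition only; the function names and the axis-aligned barrier model are assumptions, not the system's actual implementation.

```python
# Hypothetical sketch of trajectory preview against a virtual barrier.
# All names are illustrative; the real system renders this check in AR.
from dataclasses import dataclass
from typing import List, Tuple

Point = Tuple[float, float, float]

@dataclass
class Barrier:
    """Axis-aligned box the robot's tool must not enter (an assumed model)."""
    lo: Point
    hi: Point

    def contains(self, p: Point) -> bool:
        return all(l <= c <= h for c, l, h in zip(p, self.lo, self.hi))

def interpolate(a: Point, b: Point, steps: int) -> List[Point]:
    """Densely sample the straight segment a -> b for preview checking."""
    return [tuple(a[i] + (b[i] - a[i]) * t / steps for i in range(3))
            for t in range(steps + 1)]

def preview_safe(waypoints: List[Point], barrier: Barrier,
                 steps: int = 20) -> bool:
    """Simulate the waypoint trajectory and flag any barrier violation."""
    for a, b in zip(waypoints, waypoints[1:]):
        for p in interpolate(a, b, steps):
            if barrier.contains(p):
                return False
    return True

barrier = Barrier(lo=(0.4, -0.1, 0.0), hi=(0.6, 0.1, 0.5))
path_ok = [(0.0, 0.0, 0.3), (0.2, 0.3, 0.3), (0.8, 0.3, 0.3)]
path_bad = [(0.0, 0.0, 0.3), (0.5, 0.0, 0.3)]
print(preview_safe(path_ok, barrier))   # True: this path skirts the box
print(preview_safe(path_bad, barrier))  # False: segment ends inside the box
```

In the actual interface, a failed check like this would surface to the operator as a visual warning on the rendered trajectory, allowing online reprogramming before execution.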

We validated our AR robot programming interface by comparing it with kinesthetic teaching and other standard robot control methods, with promising results. We are also currently collaborating with the German Aerospace Center (DLR) to apply our system to the pleating process in industrial carbon-fiber-reinforced-polymer (CFRP) manufacturing.