

Gesture Commanding of a Robot with EVA Gloves Project

Published By National Aeronautics and Space Administration

Issued almost 10 years ago


Summary

Type of release
a one-off release of a single dataset

Data Licence
Not Applicable

Content Licence
Creative Commons CCZero

Verification
automatically awarded

Description

<p>Gesture commanding can be applied and evaluated with NASA robot systems. Applying this input modality could improve the way crewmembers interact with robots during EVA.</p><p>Gesture commands allow a human operator to interact directly with a robot without the use of intermediary hand controllers. There are two main types of hand gesture interfaces: data glove-based devices and computer vision techniques. Data glove-based devices are worn by the human and capture hand movements through embedded sensors. Computer vision techniques interpret hand movements using the video feed from cameras. There is a need to assess the feasibility of both approaches when the person commanding a robot is wearing EVA gloves, because EVA gloves can restrict hand movement and affect gesture recognition accuracy and speed.</p>
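To illustrate the glove-based approach described above, the sketch below shows one simple way sensor readings could be mapped to gesture commands: a nearest-template classifier over per-finger flex values. The gesture names, sensor layout, and template values are hypothetical assumptions for illustration only; they are not drawn from the actual dataset or from NASA's recognition pipeline.

```python
import math

# Hypothetical flex-sensor templates: one value per finger,
# 0.0 = fully straight, 1.0 = fully bent. Illustrative only.
GESTURE_TEMPLATES = {
    "stop": [0.0, 0.0, 0.0, 0.0, 0.0],  # open hand
    "go":   [1.0, 0.0, 0.0, 1.0, 1.0],  # index/middle extended
    "halt": [1.0, 1.0, 1.0, 1.0, 1.0],  # closed fist
}

def classify_gesture(reading):
    """Return the template gesture nearest to the reading (Euclidean distance).

    A stiff EVA glove adds noise and reduces finger travel, which is why a
    tolerance-based matcher (rather than exact thresholds) is sketched here.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GESTURE_TEMPLATES, key=lambda g: dist(GESTURE_TEMPLATES[g], reading))

# A noisy reading close to an open hand still maps to "stop".
print(classify_gesture([0.10, 0.05, 0.00, 0.10, 0.02]))  # -> stop
```

A vision-based interface would replace the sensor vector with features extracted from camera frames, but the matching step could follow the same pattern.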