

Real-time Stitched Video for Contextual Path Planning Project

Published By National Aeronautics and Space Administration

Issued over 9 years ago


Summary

Type of release
a one-off release of a single dataset

Data Licence
Not Applicable

Content Licence
Creative Commons CCZero

Verification
automatically awarded

Description

Initial Innovation Charge Account (ICA) Investigation:

Preventing collisions is the first priority for safe operation of the Space Station Remote Manipulator System (SSRMS). This depends on the ability of the crew and flight controllers to verify that enough clearance exists between the SSRMS, its payload, and surrounding structure. In the plan, train, and fly stages of each mission, significant time is spent developing, documenting, and executing a camera plan that allows each portion of the SSRMS trajectory to be monitored. This time could be decreased, and operational situational awareness increased, by using an array of cameras mounted around a boom of the SSRMS and pointing along the boom. The output of these cameras could be stitched together into a single composite view that provides clearance monitoring 360° around the boom. Further, this technology could be used in any application where it is desirable to see proximity on two or more sides of an object, such as surgery, tele-robotics, and deep-sea exploration.

This investigation will ask operators (crew and flight controllers) to compare clearance monitoring of a sample trajectory using conventional external camera sources versus a stitched video presentation from a camera array. A test plan, script, and scoring scheme will be used to determine whether stitched camera arrays lend themselves to clearance monitoring. The project investigator researched the technology required to perform video stitching, including hardware and software, to identify an approach that can be used for operator evaluation in the ICA project.

Initial ICA results:

A cadre of robotics professionals from JSC Robotics Operations and the Astronaut Office participated in a benchmarking effort to quantify efficiency and safety metrics with and without the use of a stitched camera array. A modified Cooper-Harper scale was used to determine operator workload. Other metrics included the time required to perform the task, whether motion was stopped due to a lack of clearance views, and whether contact was made with external structures. Results showed reduced operator workload, faster completion of the task, and reduced contact with external structure. Additionally, the technology was presented to the JSC community at Innovation Day 2012, where it won the People's Choice Award.

Second Phase:

Rearranging image pixels from multiple cameras to accomplish a perspective shift is computationally expensive. In the last decade, advances in CPU performance and direct-to-memory image capture methods have improved the frame rate and latency associated with video stitching. In the previous phase (the FY '12 ICA, People's Choice winner), the collaborator was able to achieve 10 frames per second with less than a second of latency using off-the-shelf CPU and camera hardware. The purpose of Phase 2 is to demonstrate the technology on a larger vehicle, the Multi-Mission Space Exploration Vehicle (MMSEV), using a high-bandwidth (GigE) network, increased CPU/GPU resources, and high-performance cameras.
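The perspective shift described above is driven by a 3x3 homography that maps pixels from one camera's image plane into another's. The project's implementation was written in MATLAB; purely as an illustration, the following Python/OpenCV sketch estimates a homography between two overlapping camera frames from matched features and warps one frame into the other's perspective. The feature detector, RANSAC threshold, and function names here are illustrative assumptions, not the project's actual pipeline.

```python
# Illustrative sketch only; the project itself used MATLAB, not OpenCV.
import cv2
import numpy as np

def align_pair(base, overlap, min_matches=10):
    """Warp `overlap` into `base`'s image plane via an estimated homography.

    ORB keypoints are matched between the two frames, RANSAC rejects
    outlier matches, and the resulting 3x3 homography drives the warp.
    """
    orb = cv2.ORB_create(nfeatures=2000)
    kp_base, des_base = orb.detectAndCompute(base, None)
    kp_ovl, des_ovl = orb.detectAndCompute(overlap, None)

    # Brute-force Hamming matching suits ORB's binary descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_ovl, des_base), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise RuntimeError("too few matches to estimate a homography")

    src = np.float32([kp_ovl[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_base[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC discards mismatched features before solving for H.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # Re-project the overlapping camera into the base camera's perspective.
    h, w = base.shape[:2]
    return cv2.warpPerspective(overlap, H, (w, h))
```

For rigidly mounted cameras such as an array fixed around a boom or vehicle, the homographies would normally be estimated once during calibration and then reused on every frame, leaving only the per-pixel warp in the real-time loop; that is what makes per-frame stitching tractable at all.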
Second Phase Results:

Ten video cameras (the minimum required to obtain coverage around the vehicle while providing enough image overlap) were placed around the upper surface of the MMSEV. The video streams were piped to an on-board high-end PC, where software written in MATLAB performed the perspective shifts and homographic alignment. The resulting single view was displayed in a graphical user interface (GUI) that allowed the operator to see the composite bird's-eye view or zoom in on the view from a particular camera when clearance was a concern. The MMSEV was maneuvered around the simulated Martian landscape at JSC known as the Rock Pile. To date, the maximum achieved frame rate is 2 frames per second. To increase the frame rate, current efforts are focused on transferring the homographic algorithms to a Xilinx field-programmable gate array (FPGA).
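To make the composite bird's-eye presentation concrete, here is a minimal sketch, again in Python/OpenCV rather than the project's MATLAB, of overlaying several camera feeds onto one shared canvas. It assumes each camera's homography to a common ground-plane canvas was calibrated in advance (the calibration is not shown), and the canvas size and simple overwrite compositing (no blending) are illustrative simplifications.

```python
# Illustrative sketch only; canvas size, pre-calibrated homographies, and
# overwrite compositing are assumptions, not the project's implementation.
import cv2
import numpy as np

def composite_birds_eye(frames, homographies, canvas_size=(1200, 1200)):
    """Warp each camera frame onto a shared ground-plane canvas and overlay.

    `frames` and `homographies` are parallel lists; where cameras overlap,
    later cameras simply overwrite earlier ones.
    """
    w, h = canvas_size
    canvas = np.zeros((h, w, 3), dtype=np.uint8)
    for frame, H in zip(frames, homographies):
        warped = cv2.warpPerspective(frame, H, (w, h))
        covered = warped.any(axis=2)      # pixels this camera actually fills
        canvas[covered] = warped[covered]
    return canvas

def select_view(composite, frames, selected=None):
    """Mirror the GUI behavior described above: show the composite by
    default, or one camera's full-resolution frame when the operator
    zooms in to check clearance."""
    return composite if selected is None else frames[selected]
```

In a loop like this the per-frame cost is dominated by the ten perspective warps, each a dense per-pixel transform, which is consistent with the motivation for moving the homographic arithmetic onto an FPGA, where that transform can be pipelined in hardware.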