Bronze level, automatically awarded

This data achieved Bronze level on 22 October 2015, which means it makes a great start at the basics of publishing open data.

Real-time Stitched Video for Contextual Path Planning Project

Summary

Type of release
a one-off release of a single dataset

Data Licence
Not Applicable

Content Licence
Creative Commons CCZero

Verification
automatically awarded

Release Date
9 April 2015
Modified Date
8 July 2015
Publishers
National Aeronautics and Space Administration
Keywords
completed, johnson-space-center, project
Identifier
real-time-stitched-video-for-contextual-path-planning-project
Landing Page
http://techport.nasa.gov/view/12118
Maintainers
Ronald Clayton (ronald.g.clayton@nasa.gov)
Language
en-US

Community verification

Other people can verify whether the answers on this certificate are correct.

This certificate is automatically awarded.



Description

<p>Initial (ICA) Investigation:</p><p>Preventing collisions is the first priority for safe operation of the Space Station Remote Manipulator System (SSRMS).  This depends on the ability of the crew and flight controllers to verify that enough clearance exists between the SSRMS, its payload, and the surrounding structure.  In the plan, train, and fly stages of each mission, significant time is spent developing, documenting, and executing a camera plan that allows each portion of the SSRMS trajectory to be monitored.  This time could be decreased, and operational situational awareness increased, by using an array of cameras mounted around a boom on the SSRMS that point along the boom.  The output of these cameras could be stitched together into a single composite view that provides 360° clearance monitoring around the boom.  Further, this technology could be used in any application where it is desirable to see proximity on two or more sides of an object: surgery, tele-robotics, and deep sea exploration.</p><p>This investigation will ask operators (crew and flight controllers) to compare clearance monitoring of a sample trajectory using conventional external camera sources versus a stitched video presentation from a camera array.  A test plan, script, and scoring for comparison will be used to determine whether stitched camera arrays lend themselves to clearance monitoring.  The project investigator researched the required technology, including hardware and software, to perform video stitching and to identify an approach that can be used for operator evaluation in the ICA project.</p><p>Initial Innovation Charge Account (ICA) project results:</p><p>A cadre of robotics professionals from JSC Robotics Operations and the Astronaut Office participated in a benchmarking effort to quantify efficiency and safety metrics both with and without the use of a stitched camera array.  A modified Cooper-Harper scale was used to determine operator workload.
Other metrics included the time required to perform the task, motion stops due to a lack of clearance views, and whether contact was made with external structures.  Results showed a reduced operator workload, faster completion of the task, and reduced contact with external structures.  Additionally, the technology was presented to the JSC community at Innovation Day 2012, where it won the People's Choice Award.</p><p>Second Phase:</p><p>Rearranging image pixels from multiple cameras to accomplish a perspective shift is computationally expensive.  In the last decade, advances in CPU performance and direct-to-memory image capture methods have improved the frame rate and latency associated with video stitching.  In the previous phase (FY ’12 ICA, People’s Choice Winner), the collaborator was able to achieve 10 frames per second with less than a second of latency using off-the-shelf CPU and camera hardware.  The purpose of Phase 2 is to demonstrate the technology on a larger vehicle, the Multi-Mission Space Exploration Vehicle (MMSEV), using a high-bandwidth (GigE) network, increased CPU/GPU resources, and high-performance cameras.</p><p>Second Phase Results:</p><p>Ten video cameras (the minimum required to obtain coverage around the vehicle while providing enough image overlap) were placed around the upper surface of the MMSEV.  The video streams were piped to an on-board high-end PC, where software written in MATLAB performed the perspective shifts and homographic alignment.  The resulting single view was displayed in a Graphical User Interface (GUI) that allowed the operator to see the composite ‘bird’s-eye’ view or zoom in on a view from a particular camera when clearance was a concern.  The MMSEV was maneuvered around the simulated Martian landscape at JSC known as the Rock Pile.  To date, the maximum frame rate achieved is 2 frames per second.
To increase the frame rate, current efforts are focused on transferring the homographic algorithms to a Xilinx field-programmable gate array (FPGA).</p>
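The homographic alignment step described above can be sketched briefly. The following is a minimal NumPy illustration of inverse-warping one camera's image into a base frame through a 3×3 homography, using nearest-neighbour sampling for brevity; the function names are hypothetical, and the project's actual implementation was written in MATLAB and handled ten streams.

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of (x, y) points."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coordinates
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # divide out the w term

def stitch(base, overlay, H, out_shape):
    """Inverse-warp `overlay` into `base`'s frame and composite the two.

    For each output pixel we map through H^-1 to find the source pixel in
    `overlay` (nearest-neighbour sampling); pixels that fall outside the
    overlay keep the base image's value.
    """
    out = base.copy()
    Hinv = np.linalg.inv(H)
    ys, xs = np.mgrid[0:out_shape[0], 0:out_shape[1]]
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    src = warp_points(Hinv, coords)
    sx = np.round(src[:, 0]).astype(int)
    sy = np.round(src[:, 1]).astype(int)
    valid = (0 <= sx) & (sx < overlay.shape[1]) & (0 <= sy) & (sy < overlay.shape[0])
    out[ys.ravel()[valid], xs.ravel()[valid]] = overlay[sy[valid], sx[valid]]
    return out
```

A full stitcher repeats this warp for each camera and blends the overlapping regions; doing so per frame is what makes the operation computationally expensive and motivates the FPGA work.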


General Information


Legal Information

This dataset has been created by the US Government, which means it is required to be in the public domain. However, since US copyright law applies only within the US, we have assumed the data is equivalently licensed as CC0 for the rest of the world, as this is in the spirit of the US Government’s Open Data policy.
  • The rights statement is at http://catalog.data.gov/dataset/real-time-stitched-video-for-contextual-path-planning-project
  • Outside the US, this data is available under Creative Commons CCZero
  • There are rights in the content: yes, and the rights are all held by the same person or organisation
  • The content is available under Creative Commons CCZero
  • The rights statement includes data about its data licence
  • This data contains no data about individuals


Practical Information

  • The data appears in this collection: http://catalog.data.gov/organization/nasa-gov
  • The accuracy or relevance of this data will go out of date, but it is timestamped
  • The data is backed up offsite


Technical Information

  • This data is published at http://techport.nasa.gov/xml-api/12118
  • This data is machine-readable
  • The format of this data is a standard open format
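Because the record is published as machine-readable XML, it can be fetched and parsed with standard tooling. A minimal Python sketch follows; the element tag names are assumptions for illustration, since the real TechPort schema may name or nest these fields differently.

```python
import urllib.request
import xml.etree.ElementTree as ET

TECHPORT_URL = "http://techport.nasa.gov/xml-api/12118"  # from this certificate

def fetch_project_xml(url=TECHPORT_URL):
    """Download the raw XML record for the project."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8")

def extract_fields(xml_text, tags=("title", "description")):
    """Pull a few named elements out of the record.

    The tag names here are illustrative; adjust them to match the
    actual TechPort XML schema.
    """
    root = ET.fromstring(xml_text)
    found = {}
    for tag in tags:
        el = root.find(f".//{tag}")
        if el is not None and el.text:
            found[tag] = el.text.strip()
    return found
```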


Social Information

  • The documentation includes machine-readable data for: title, description, identifier, landing page, publisher, keyword(s) or tag(s), distribution(s), release date, modification date, temporal coverage, and language
  • The documentation about each distribution includes machine-readable data for: release date, a URL to access the data, a URL to download the dataset, and type of download media
  • Find out how to contact someone about this data at http://catalog.data.gov/dataset/real-time-stitched-video-for-contextual-path-planning-project
  • Find out how to suggest improvements to publication at http://www.data.gov/issue/?media_url=http://catalog.data.gov/dataset/real-time-stitched-video-for-contextual-path-planning-project
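The checklist of machine-readable fields above maps naturally onto the DCAT-style data.json records that catalog.data.gov publishes. A small sketch of checking a dataset record against the certificate's field list; the key names follow the Project Open Data convention and are assumptions, not taken from this certificate.

```python
# The certificate's checklist of machine-readable fields, expressed as a
# simple validator over a DCAT-style (data.json) record.  Key names follow
# the Project Open Data schema used by catalog.data.gov (an assumption).
DATASET_FIELDS = ["title", "description", "identifier", "landingPage",
                  "publisher", "keyword", "distribution", "issued",
                  "modified", "temporal", "language"]
DISTRIBUTION_FIELDS = ["issued", "accessURL", "downloadURL", "mediaType"]

def missing_fields(record):
    """Return the checklist fields absent from a dataset record (a dict)."""
    missing = [f for f in DATASET_FIELDS if f not in record]
    for dist in record.get("distribution", []):
        missing += [f"distribution.{f}" for f in DISTRIBUTION_FIELDS
                    if f not in dist]
    return missing
```

An empty result means the record documents everything this certificate's checklist asks for; anything returned names a gap to fix before re-verification.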