Humanoid Robonaut "sees" with two AVT Prosilica GC2450C GigE Cameras
March 9, 2011 - Built in partnership with General Motors (GM) and Oceaneering Space Systems of Houston, NASA's Robonaut 2 (R2) is the second generation of highly dexterous humanoid robot, designed to work alongside humans and execute simple, repetitive or dangerous tasks on Earth or on board the International Space Station (ISS).
R2: The New Generation
Developed in 1997, the first generation of Robonaut (R1) was a human-like robotic assistant capable of performing simple maintenance tasks. Its successor, R2, is a fully modular, highly dexterous 300-pound robot consisting of a head and a torso with two arms and two hands. R2's many technological improvements include an expanded range of sensors, featuring two Prosilica GC2450 color cameras from Allied Vision Technologies and an infra-red Time-of-Flight (TOF) camera. Capable of speeds more than four times faster than R1, R2 features a total of 350 sensors (for tactile, force, position, range-finding and vision sensing) and 38 PowerPC processors, enabling it to perform functions such as object recognition and manipulation.
R2 is also able to react to its surroundings and operate semi-autonomously.
Other technological improvements include an optimized overlapping dual-arm dexterous workspace, series elastic joint technology, extended finger and thumb travel, miniaturized 6-axis load cells, redundant force sensing, ultra-high-speed joint controllers and extreme neck travel. With 42 degrees of freedom (including 24 in its hands and fingers alone), R2's dexterity allows it to use the same tools as astronauts, removing the need for robot-specific tools.
Real-Time Vision Recognition
R2's vision equipment is housed inside its helmet. The system uses color-, pixel-intensity- and texture-based segmentation as well as advanced pattern-recognition techniques to extract the necessary information. To simplify processing, the system focuses on certain areas of the image using Regions of Interest (ROI). ROI is a camera function that reads out only a selected portion of the available pixels, resulting in a much faster frame rate and less data to process. In addition, the TOF sensor data allows the background to be removed in order to focus on the object of interest (tools, boxes, etc.). Built-in classification techniques within the software perform 3D and pattern-recognition functions in real time, allowing R2 to compute feasible trajectories and decide where to place its hands to execute a set of pre-determined tasks, such as opening boxes autonomously.
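The two processing steps described above — reading out only an ROI window and masking the background using TOF depth data — can be sketched as follows. This is a minimal illustration with NumPy, not NASA's or Halcon's actual code; the function names, the ROI coordinates and the depth threshold are all hypothetical (real GigE Vision cameras apply the ROI on-sensor, before image transfer).

```python
import numpy as np

def read_roi(frame, x, y, width, height):
    # Simulate an ROI readout: return only the selected pixel window,
    # reducing the amount of data to be processed downstream.
    return frame[y:y + height, x:x + width]

def remove_background(roi_color, roi_depth, max_distance_m):
    # Keep only pixels whose time-of-flight depth is within the
    # threshold, zeroing out the distant background.
    mask = roi_depth <= max_distance_m
    return roi_color * mask[..., np.newaxis].astype(roi_color.dtype)

# Hypothetical full frame at the GC2450's 2448 x 2050 resolution,
# paired with a depth map from the TOF sensor.
frame = np.full((2050, 2448, 3), 128, dtype=np.uint8)
depth = np.full((2050, 2448), 5.0)           # background ~5 m away
depth[1000:1100, 1200:1300] = 0.8            # a nearby tool at ~0.8 m

# Read out a 300 x 200 window around the tool, then mask the background.
roi = read_roi(frame, 1150, 950, 300, 200)
roi_depth = read_roi(depth, 1150, 950, 300, 200)
foreground = remove_background(roi, roi_depth, max_distance_m=2.0)
```

The payoff is in the data volume: a 300 x 200 ROI is roughly 1% of the pixels of the full 2448 x 2050 frame, which is what makes the faster frame rates mentioned above possible.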
The software used by the system is MVTec's HALCON 9.0. MVTec is an AVT software partner.
R2 underwent a series of rigorous tests prior to its launch on space shuttle Discovery on February 24, 2011. During stage 1, R2 will initially be hard-mounted and stationed in the Destiny laboratory on board the ISS, where it will be monitored while executing tasks and operations similar to those performed on Earth.
If successful, R2 could move on to stage 2 of the mission and become mobile, performing station maintenance tasks such as vacuuming or cleaning filters. The ultimate goal is to send R2 outside the ISS during stage 3 to perform dangerous extra-vehicular activity (EVA) tasks. There are no plans to return R2 to Earth.