NASA SBIR 2014 Solicitation
FORM B - PROPOSAL SUMMARY
PROPOSAL NUMBER: 14-1 H20.01-9561
SUBTOPIC TITLE: Human-Robotic Systems - Manipulation Subsystem and Human-System Interaction
PROPOSAL TITLE: Adaptive LIDAR Vision System for Advanced Robotics
SMALL BUSINESS CONCERN (Firm Name, Mail Address, City/State/Zip, Phone)
Honeybee Robotics Spacecraft Mechanisms Corporation
460 West 34th Street
New York, NY 10001-2320
(646) 459-7819
PRINCIPAL INVESTIGATOR/PROJECT MANAGER (Name, E-mail, Mail Address, City/State/Zip, Phone)
Jason Herman
herman@honeybeerobotics.com
460 West 34th Street
New York, NY 10001-2320
(646) 459-7819
CORPORATE/BUSINESS OFFICIAL (Name, E-mail, Mail Address, City/State/Zip, Phone)
Chris Chapman
chapman@honeybeerobotics.com
460 West 34th Street
New York, NY 10001-2320
(646) 459-7802
Estimated Technology Readiness Level (TRL) at beginning and end of contract:
Begin: 1
End: 3
Technology Available (TAV) Subtopics
Human-Robotic Systems - Manipulation Subsystem and Human-System Interaction is a Technology Available (TAV) subtopic that includes NASA Intellectual Property (IP).
Do you plan to use the NASA IP under the award? No
TECHNICAL ABSTRACT (Limit 2000 characters, approximately 200 words)
Advanced robotic systems demand an enhanced vision system and image processing algorithms to reduce the percentage of manual operation required. Unstructured environments, whether man-made (e.g., the International Space Station) or natural (e.g., Mars), present significant challenges to supervised-autonomy and fully autonomous systems; advanced perception sensors and associated software are required. This will be particularly important both for future long-duration exploration missions, where the transmit (Tx) / receive (Rx) delay will be substantial and a high degree of autonomy will be required to maximize science gain, and for telerobotic systems where a human operator is IVA and advanced operations on a short timeline are desired. No solution currently exists for small robotic platforms. Honeybee Robotics proposes to develop a compact, wide-angle Light Detection and Ranging (LIDAR) system that can detect dynamic changes in the field of view (FOV) and focus the laser scan pattern on the area of interest while maintaining a lower-resolution fixed FOV for robotic path planning, navigation, inspection, and identification tasks.
POTENTIAL NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
A recent collaborative survey, 'A Roadmap for US Robotics: From Internet to Robotics,' identified robust 3D perception, planning and navigation, and intuitive human-robot interfaces as critical capability gaps that are cross-cutting for the robotics industry, including space exploration. The Adaptive LIDAR System is ideally suited to telerobotic navigation, path planning, inspection, and identification. This is directly applicable to NASA's planetary exploration initiatives (e.g., the Moon, Mars, and NEOs). Current robotic platforms, such as the MER and MSL rovers, require significant manpower to analyze and plan mobility operations, both to ensure obstacle avoidance and to identify objects of interest for science operations. Some of these tasks could be automated with an adaptive LIDAR system, greatly enhancing tactical planning algorithms and simplifying crew telerobotic interfaces. In addition, an adaptive LIDAR system can be used for advanced telerobotic research and development at NASA centers. Realizing advanced telerobotic systems for space exploration requires extensive development and testing of both sensing technologies and intelligent control algorithms. The proposed system will provide a platform on which such advanced algorithms can be developed and implemented.
POTENTIAL NON-NASA COMMERCIAL APPLICATIONS (Limit 1500 characters, approximately 150 words)
Over the past decade, Unmanned Ground Vehicles (UGVs) have proven their worth both on the battlefield and in search-and-rescue operations. Thousands of man-transportable PackBot and TALON platforms have been deployed overseas. From explosive ordnance disposal (EOD) to urban search and rescue (USAR) to intelligence, surveillance, and reconnaissance (ISR), UGV use for defense and homeland security initiatives is increasing.
The UGVs of the future must have an advanced degree of autonomy, lowering the attention demands on the operator. Three-dimensional sensing technology is at the heart of such functionality, enabling sophisticated telerobotic manipulation, robust autonomous navigation, and detailed survey and inspection. A compact LIDAR system that can detect dynamic changes in the FOV and focus the laser scan pattern on the area of interest while maintaining a lower-resolution fixed FOV for path planning and navigation tasks is the next advancement for UGVs.
In industry, the automation of operations in partially unstructured environments (e.g., pallet transport and stowage, earth moving, steel construction, and crop harvesting) requires advanced sensors. Automation in these more challenging environments is beginning to mature in the mining, agricultural, personal-assistance, and logistics industries. The coming decade will see a large increase in demand for sensors that enable smarter, more flexible operations.
TECHNOLOGY TAXONOMY MAPPING (NASA's technology taxonomy has been developed by the SBIR-STTR program to disseminate awareness of proposed and awarded R/R&D in the agency. It is a listing of over 100 technologies, sorted into broad categories, of interest to NASA.)
3D Imaging
Image Analysis
Perception/Vision
Ranging/Tracking
Robotics (see also Control & Monitoring; Sensors)
Form Generated on 04-23-14 17:37