We will implement dynamic saccadic redirection methods for 2-person Redirected Walking (RW), scaling large virtual spaces into single-room & multi-room real XR environments. We will use targeted numerical simulations & user-scenario test data to evaluate XR performance metrics against TRL-4 goals. RW can deliver a walkable virtual area up to 10x the real space without the user becoming aware of real-space boundaries. Our goal is to achieve the best virtual-to-real space expansion with 2 users in the same room; a minimal sketch of the core redirection step appears below.
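The sketch below illustrates the core idea of saccadic redirection, assuming a per-frame update loop with eye-tracking input: during a saccade, visual sensitivity is briefly suppressed (saccadic masking), so a small world rotation toward a steering target goes unnoticed. The velocity threshold, per-saccade gain, & steering-target handling are illustrative assumptions, not our tuned Phase I values; for 2 users, the two steering targets must additionally be coordinated so the redirected paths avoid real-space collisions.

```python
import numpy as np

# Minimal sketch of saccadic redirection, assuming a per-frame update loop.
# Threshold and gain values are illustrative assumptions, not tuned results.
SACCADE_VEL_DEG_S = 180.0   # eye angular velocity above which a saccade is assumed
MAX_SHIFT_DEG = 0.5         # world rotation injected per detected saccade (assumed imperceptible)

def redirect_step(eye_vel_deg_s: float, user_yaw_deg: float, target_yaw_deg: float) -> float:
    """Return a world-rotation offset (degrees) to apply this frame.

    During a saccade, visual sensitivity is suppressed, so a small world
    rotation toward the steering target goes unnoticed by the user.
    """
    if eye_vel_deg_s < SACCADE_VEL_DEG_S:
        return 0.0  # no saccade detected: do not rotate the world
    # Signed yaw error toward the steering target, wrapped to [-180, 180)
    error = (target_yaw_deg - user_yaw_deg + 180.0) % 360.0 - 180.0
    # Rotate the world a bounded amount toward the steering target
    return float(np.clip(error, -MAX_SHIFT_DEG, MAX_SHIFT_DEG))
```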
We will demonstrate hyperrealistic rendering of large virtual environments, with terrain detail appropriate for surface operations & with object models such as instruments, tools, vehicles, & structures. Photorealistic, accurate scenarios improve the sensation of full immersion & training outcomes.
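One standard technique for keeping large terrains renderable at high detail is distance-based level-of-detail (LOD) selection; the sketch below is a minimal illustration under assumed tile size & LOD count, not our rendering pipeline.

```python
import math

# Minimal sketch of distance-based terrain level-of-detail (LOD) selection,
# one standard way to render large terrains at high detail near the viewer.
# Tile size and LOD count are illustrative assumptions.
TILE_SIZE_M = 64.0
MAX_LOD = 6  # LOD 0 = full-detail mesh, higher = coarser mesh

def select_lod(camera_pos, tile_center) -> int:
    """Pick a mesh LOD for a terrain tile: halve detail per doubling of distance."""
    dist = math.dist(camera_pos, tile_center)
    if dist <= TILE_SIZE_M:
        return 0  # nearby tiles get the full-detail mesh
    return min(MAX_LOD, int(math.log2(dist / TILE_SIZE_M)))

# Usage: select_lod((0, 0, 0), (500, 0, 0)) -> 2 (coarser mesh for a distant tile)
```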
We will demonstrate accurate finger, hand, & object tracking across 2 rooms, including tracking of objects with limited visibility during typical astronaut activities, e.g., unloading, transporting, & assembling; one standard way to bridge such visibility gaps is sketched below.
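The sketch below shows one standard way to carry an object's position estimate through brief occlusions: a constant-velocity Kalman predictor that coasts while the object is hidden & fuses measurements when it reappears. This is a minimal single-axis sketch under assumed frame rate & noise values, not our tracking stack; in practice it would run per axis.

```python
import numpy as np

# Minimal sketch of bridging brief tracking dropouts (e.g., an occluded tool)
# with a constant-velocity Kalman predictor on one axis.
# Frame rate and noise parameters are illustrative assumptions.
class OcclusionTracker:
    def __init__(self, dt: float = 1 / 90):     # 90 Hz tracking frame assumed
        self.x = np.zeros(2)                    # state: [position, velocity]
        self.P = np.eye(2)                      # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
        self.Q = np.eye(2) * 1e-4               # process noise (assumed)
        self.R = 1e-3                           # measurement noise (assumed)

    def step(self, measurement=None) -> float:
        # Predict forward one frame; this alone carries the object through occlusion.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        if measurement is not None:             # object visible: fuse the measurement
            H = np.array([1.0, 0.0])            # we observe position only
            y = measurement - H @ self.x        # innovation
            S = H @ self.P @ H + self.R         # innovation variance
            K = self.P @ H / S                  # Kalman gain
            self.x = self.x + K * y
            self.P = self.P - np.outer(K, H @ self.P)
        return float(self.x[0])                 # current position estimate
```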
We will explore novel human-computer interface methods, such as gesture commands & guided next-step scenario choice/presentation driven by eye, hand, & finger motions, to determine applicability for Phase I & Phase II R&D. XR user interfaces can guide users, track responses to AR visual cues, present checklists, measure the appropriateness of each response, & use trainee or user responses to guide the next training or operations step, improving memory & eye-hand coordination retention during training, operational planning, & operational management; a minimal sketch of such a response-driven checklist appears below.
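The sketch below illustrates a response-driven checklist of the kind described above, assuming the XR runtime delivers discrete interaction events such as ("gaze", target) or ("gesture", target). The event types, step names, & example task are hypothetical placeholders, not a defined Phase I interface.

```python
from dataclasses import dataclass, field

# Minimal sketch of a gaze/gesture-driven training checklist, assuming the XR
# runtime delivers discrete events like ("gaze", "cargo_latch") or ("gesture",
# "pinch"). Event types and step names are hypothetical.
@dataclass
class ChecklistStep:
    prompt: str
    expected: tuple          # (event_type, event_target) that completes this step

@dataclass
class Checklist:
    steps: list
    index: int = 0
    errors: list = field(default_factory=list)

    def on_event(self, event_type: str, target: str) -> str:
        """Advance on a correct response; log incorrect ones for later review."""
        step = self.steps[self.index]
        if (event_type, target) == step.expected:
            self.index += 1
            if self.index == len(self.steps):
                return "checklist complete"
            return self.steps[self.index].prompt   # present the next step
        self.errors.append((step.prompt, event_type, target))
        return f"retry: {step.prompt}"             # repeat guidance on a miss

# Usage: a two-step unload-and-stow task
task = Checklist([
    ChecklistStep("Look at the cargo latch", ("gaze", "cargo_latch")),
    ChecklistStep("Pinch to release the latch", ("gesture", "pinch")),
])
print(task.on_event("gaze", "cargo_latch"))  # -> "Pinch to release the latch"
```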
Our deliverables include a theoretical framework implemented in hardware that demonstrates basic functionality & critical test environments, with key software components integrated & functionally validated to establish interoperability, & with documented test performance demonstrating agreement with analytical predictions. Our TRL-4 goal is to show breadboard systems running novel RW algorithms in a basic operating environment. All deliverables will be shown in a 2-person hands-on XR demo.
XR technologies can facilitate many missions, including those related to human space exploration, for planning, training, & operations support as well as for modeling & simulation of future orbital, transportation, & stationary structures for robotic & human use. The Human Exploration & Operations Mission Directorate, Space Technology Mission Directorate, Science Mission Directorate, & the Artemis & Gateway programs could benefit from this technology. The crosscutting nature of XR technologies allows them to support all of NASA's Mission Directorates.
More “realistic” training environments deliver better training outcomes due to improved “muscle memory.” Commercial applications include pilot training for aerospace; workplace injury reduction among construction, freight, & material-moving workers (2.8 million U.S. workplace injuries in 2019); tele-robotics; surgical training; strength training; telepresence; education; & gaming & entertainment.