AI-Enabled Virtual Environment Interaction
Various military organizations are developing extended reality (XR) solutions to fit within their overall training curricula but are encountering similar challenges in implementing interaction with virtual environments. To overcome these challenges, we developed a flexible hand-tracking software layer for XR applications that:
- Fuses data from multiple COTS sensors (e.g., passive optical sensors, controllers, VR gloves) and presents the resulting higher-fidelity tracking data to higher-level applications as if it originated from a single sensor, in a manner agnostic to the choice of sensors.
- Presents sensor data to higher-level applications via a single hand-tracking application programming interface (API).
- Enables additional sensors or sensor combinations to be integrated with VR applications without modifications to those applications.
- Enables instructors to insert their hands into trainees’ VR environment from tablet-based Instructor Operator Stations.
- Enables artificial intelligence (AI)-based recognition (>95% accuracy) of cockpit-control interaction gestures.
The developed software overcomes common XR environment interaction challenges and dramatically decreases the cost of fielding, scaling, and expanding XR training systems by seamlessly incorporating multiple disparate COTS components. Our software is integrated, or is currently being integrated, with:
- USAF Pilot Training Next (PTN) T-6 VR Flight Simulators
- PTN T-6 VR Emergency Procedure Trainer
- USAF AC-130U VR Checklist Trainer
- USAF AC-130J VR Technology Demonstrator
- USAF AC-130J VR Combat Mission Trainer
- USAF A-10C VR Flight and Mission Simulator
- USAF VR Fuel Cell Maintenance Trainer
- USN MR MK 16 Rebreather EP Trainer