MindFlight: Brain-Computer Interface-Driven Design of Simulated Aircraft Control Frameworks
(Room S320C)
29 Nov 17
2:00 PM - 2:30 PM
Speaker(s):
Denise D'Angelo, Nathan Turner, Zachary Koterba, James Beaty, Daniel Cybyk, Eric Pohlmeyer, Johnathan Pino, Matthew Rich, Francesco Tenore, Jonathan Ellsworth, Matthew Fifer, David Handelman, Michael McLoughlin, Robert Ide, Brock Wester, Brendan Johnson (all Johns Hopkins Applied Physics Laboratory)
As the complexity of military systems advances, so too must the human-machine interfaces that operators use to control those systems. In support of DARPA’s Revolutionizing Prosthetics Program, we developed a method and associated technologies to test the efficacy of novel control interfaces with scalable virtual and live aircraft control frameworks. These test frameworks are designed for compatibility with multiple control modalities, ranging in complexity from joysticks, eye-tracking, and electromyography sensors to the direct decoding of neural activity within the brain. MindFlight’s virtual fixed-wing aircraft control framework leverages a commercial off-the-shelf simulation and visualization platform commonly used for civilian and military aviation training. Early control evaluations featured basic aircraft control tasks, such as two-degree-of-freedom control of a single aircraft in free flight. Additional features support increasingly complex test paradigms involving navigation through hoop courses or simultaneous control of multiple aircraft. The test system also supports tasks in which the pilot must make control decisions based on novel information provided via visual cues, vibrotactile stimulation of the skin, or even intracortical microstimulation of neurons. Several tests with this platform have demonstrated its usefulness for assessing brain-computer interface control of simulated aircraft.
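To illustrate how a framework can stay compatible with many input modalities, the sketch below shows one way to hide each input source behind a common two-degree-of-freedom command interface, with normalized pitch/roll commands mapped to control-surface deflections. All names, limits, and the joystick stand-in are hypothetical, not taken from the MindFlight implementation.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

# Hypothetical modality-agnostic command: any input source (joystick,
# eye tracker, EMG decoder, neural decoder) produces the same 2-DOF output.
@dataclass
class Command2DOF:
    pitch: float  # normalized to [-1, 1]
    roll: float   # normalized to [-1, 1]

def clamp(x: float) -> float:
    """Keep a normalized command within [-1, 1]."""
    return max(-1.0, min(1.0, x))

def to_surface_deflections(cmd: Command2DOF,
                           max_elevator_deg: float = 20.0,
                           max_aileron_deg: float = 25.0) -> Tuple[float, float]:
    """Map a normalized 2-DOF command to elevator/aileron deflection angles.

    The deflection limits are illustrative placeholders.
    """
    return (clamp(cmd.pitch) * max_elevator_deg,
            clamp(cmd.roll) * max_aileron_deg)

# Each modality is just a callable returning Command2DOF, so sources
# can be swapped without changing the aircraft-side code.
def joystick_source() -> Command2DOF:
    return Command2DOF(pitch=0.5, roll=-0.25)  # stand-in reading

elevator_deg, aileron_deg = to_surface_deflections(joystick_source())
```

Behind an interface like this, swapping a joystick for a neural decoder is a one-line change on the input side, which is the kind of interchangeability the abstract describes.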
Proof-of-concept demonstrations involving live aircraft will be critical to assessing the effectiveness of MindFlight’s method for evaluating the operational utility of novel control interfaces. A custom-built live aircraft test framework enables a single operator to control one or more quadrotor unmanned aerial vehicles (UAVs) using any of the same control modalities and interfaces as the virtual fixed-wing aircraft control framework. The quadrotor framework supports multiple control modes and enables test operations from a remote control site.
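One simple way to let a single operator command one or more vehicles under multiple control modes is to fan each command out to every registered vehicle, tagged with the active mode. The mode names and classes below are illustrative assumptions, not the framework's actual API.

```python
from enum import Enum
from typing import Dict, List

class ControlMode(Enum):
    """Hypothetical control modes for a quadrotor test framework."""
    VELOCITY = "velocity"   # operator commands body-frame velocity directly
    WAYPOINT = "waypoint"   # operator selects waypoints; autopilot flies them

class Quadrotor:
    def __init__(self, uav_id: int) -> None:
        self.uav_id = uav_id
        self.last_cmd = None  # (mode, payload) most recently applied

    def apply(self, mode: ControlMode, payload: Dict[str, float]) -> None:
        """Record the latest command; a real vehicle would act on it."""
        self.last_cmd = (mode, payload)

class OperatorStation:
    """Single operator, multiple vehicles: broadcast one command to all."""
    def __init__(self, vehicles: List[Quadrotor]) -> None:
        self.vehicles = vehicles

    def command_all(self, mode: ControlMode, payload: Dict[str, float]) -> None:
        for v in self.vehicles:
            v.apply(mode, payload)

# One operator commanding three quadrotors at once.
station = OperatorStation([Quadrotor(i) for i in range(3)])
station.command_all(ControlMode.VELOCITY, {"vx": 1.0, "vy": 0.0, "vz": 0.0})
```

Because the command payload is the same modality-agnostic output used in simulation, the same operator interface could drive either the virtual fixed-wing framework or the live quadrotor framework.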