Incorporating intent inference and feedback control into the design of brain-machine interfaces for navigating vehicles

Speaker:

Abdullah Akce

Date and Time:

January 28, 2011, 2:50pm – 3:10pm

Presentation Abstract:

In this talk, I will describe a new approach to the design of brain-machine interfaces for controlling robotic systems. These interfaces translate measurements of brain activity into commands for an external device, effectively allowing people to control robots just by thinking. The field has made significant progress in recent years, enabling human patients to control robotic systems such as artificial limbs, wheelchairs, and humanoids. In each case, the role of the interface is to facilitate quick and reliable communication of intent, i.e., a description of what the user wants the robot to do. Because of the inherent uncertainty in the measurement and interpretation of brain activity, this process requires a compact representation of possible intent that lends itself to statistical inference. For navigating vehicles, intent takes the form of a desired path to be followed by the vehicle. Viewing interface design as a problem of efficiently communicating intent, our approach is to learn a prior distribution over intent from past observations and to construct an optimal communication protocol that specifies exactly how user inputs and feedback should be generated to infer the user's intent. I will demonstrate this approach in two interfaces that we have designed. The first allows a human pilot to fly an unmanned aircraft using input only from an electroencephalograph (EEG). Hardware experiments with a real aircraft show that this interface works in practice. The second allows a human user to navigate a wheelchair in home environments without colliding with obstacles.
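The core idea of inferring intent from noisy inputs can be illustrated with a minimal sketch. This is not the protocol from the talk; the candidate paths, the uniform prior, and the noise rate `P_CORRECT` are all illustrative assumptions. It simply shows a Bayesian posterior update over a small discrete set of candidate paths, given noisy binary commands of the kind an EEG might produce:

```python
# Hypothetical sketch: Bayesian inference of a user's intended path from
# noisy binary inputs (e.g., left/right commands decoded from EEG).
# Candidate paths, prior, and noise rate are illustrative assumptions.

# Candidate intents: each path is a sequence of turns the vehicle could take.
paths = ["LL", "LR", "RL", "RR"]

# Prior over intents, e.g., learned from past navigation data (uniform here).
prior = {p: 1.0 / len(paths) for p in paths}

# Probability that a noisy binary input correctly reflects the intended symbol.
P_CORRECT = 0.85

def update(posterior, step, observed):
    """One Bayesian update: the user was queried about turn `step`,
    and we observed the (noisy) binary input `observed` ('L' or 'R')."""
    unnormalized = {}
    for path, prob in posterior.items():
        likelihood = P_CORRECT if path[step] == observed else 1.0 - P_CORRECT
        unnormalized[path] = prob * likelihood
    total = sum(unnormalized.values())
    return {path: v / total for path, v in unnormalized.items()}

# Simulate: the user intends "LR"; inputs for steps 0 and 1 arrive noisily.
posterior = update(prior, 0, "L")
posterior = update(posterior, 1, "R")
best = max(posterior, key=posterior.get)  # most probable intent so far
```

After two consistent inputs, the posterior concentrates on the intended path "LR", and the interface could either act on it or query further until confidence is high enough.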