Learning How the Brain Processes Language
The National Science Foundation (NSF) has awarded a $1.02 million grant to scientists at Rice University and McGovern Medical School at UTHealth to study how the brain processes language. The collaborative research may one day help people who have lost the ability to communicate.
The grant is part of a $13.1 million initiative announced by the NSF last August to support integrative, fundamental research for the federal BRAIN Initiative introduced by President Barack Obama in 2013. The funds will support the analysis of data from intracranial recordings in patients with epilepsy who undergo brain surgery at Memorial Hermann Mischer Neuroscience Institute at the Texas Medical Center.
A language team assembled by Nitin Tandon, M.D., a professor in the Vivian L. Smith Department of Neurosurgery at McGovern Medical School at UTHealth, includes Rice University electrical and computer engineers Behnaam Aazhang, Ph.D., and Aydin Babakhani, Ph.D. The long-term goal is to design and prototype wireless, inductively charged implants that could enable neurosurgeons like Dr. Tandon to help patients communicate through a computer interface.
“The human vocabulary is large, yet we are able to select the most appropriate words at very high speeds and assemble them in a way that conveys meaning,” Dr. Tandon says. “How we do so is not at all understood.” The laboratory teams of Dr. Aazhang, Dr. Tandon and Dr. Babakhani will first collect and analyze data from patients under Dr. Tandon’s care who volunteer for the study. As director of epilepsy surgery at the Mischer Neuroscience Institute, Dr. Tandon has performed hundreds of surgeries to implant electrodes in patients with epilepsy to monitor and treat their conditions. “While our patients are in the hospital waiting for seizures to occur so we can localize their epilepsy, we ask them to participate in a variety of language experiments so we can study how brain regions are engaged in this process,” he says.
Translating the Brain's Signals
Electrodes that monitor signals produced by neurons provide only coarse data from a limited region of the brain and require connections that pass through the skull. By the end of the three-year project, the team hopes to develop a wireless implant prototype that will transmit data from hundreds of deep-brain and subsurface electrodes.
“People often ask, ‘What does this part of the brain do? What does that part do?’” Dr. Tandon says. “But nothing in the brain does anything by itself. The parts that can load an abstract concept into a word and then tell your mouth to move are not all in one spot. And they have to communicate with each other. We want the ability to intercept and translate those signals.”
Pairing Brain Signals with Technology
Dr. Tandon observes that as many as 100,000 Americans suffer brain injuries that impair speech each year. “We hope one day to be able to provide wireless brain implants that will help these patients communicate via computer programs,” he says. “Using the incomplete language network that remains, these prosthetics would reconstruct speech and allow folks to communicate their basic needs and emotions. A computer would try to understand what the person wants to say and create a response. That individual would then agree or disagree with the response.”
The first step will be a low-power prototype that acquires neural signals and stimulates neurons. The Rice team has extensive experience designing, building and testing integrated analog/radio-frequency chips for signal acquisition and stimulation.
Dr. Tandon says scientists at Rice, McGovern Medical School at UTHealth and Baylor College of Medicine have worked closely in recent years to form neural engineering collaborations. “This grant is part of a greater effort on our part to make the Texas Medical Center the best place to develop neural devices,” he says. “We have a long track record of innovation in cardiology and cardiothoracic surgery. It’s now time for this to happen in neuroscience.”