A button at a crosswalk

The same motor task can make different groups of brain cells light up

You’re standing at a crosswalk when the signal changes from “don’t walk” to “walk.” You might step out into the street straight away, or you might look both ways before you cross.

In either scenario, you see the light change and you cross the street. But the context is different: in one case, you don’t think twice; in the other, you pause, look to the left and right, and then step into the street.

New research from Pitt’s Swanson School of Engineering shows that even though those two scenarios involve the same action, they light up parts of the brain in different ways. The discovery may help researchers understand how other structures of the brain work and even develop new algorithms for self-driving cars.

Researchers have long known that some of the brain activity that occurs when you see the light change and when you step out into the street is the same no matter the context: there’s a known “pathway” that a neuron’s activity travels.

Neeraj Gandhi, a bioengineering professor in the Swanson School of Engineering, wanted to know whether anything happens along that pathway between the time you see the light change (a stimulus) and the moment you step into the street (an action). Or does the pathway for “crossing the street” look the same, no matter the context?

Measuring the activity of neurons in a part of the brain called the superior colliculus, which governs reactions to visual stimuli, the team found that different groups of brain cells burst into activity when a task was immediate and when it was delayed.

“If there are two different contexts, even though you’re making exactly the same movement, the neural activity in the brain is different,” Gandhi said. “In addition to the motor/action command, there is other activity there that tells you something about what’s going on cognitively in a given structure.”

Gandhi and his team published their findings Sept. 29 in the Proceedings of the National Academy of Sciences. 

From an engineering standpoint, Gandhi said, the finding may have implications for algorithm design. For instance, a similarly designed system could serve as a framework for an autonomous vehicle that accelerates when a light turns green but delays that action if it senses something in the crosswalk. The vehicle could analyze the object and, once the coast is clear, begin to drive.
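As a rough sketch of that decision flow (not the team’s actual model or any real vehicle software), the logic might look something like this in Python; the function name and its inputs are purely illustrative:

```python
# Illustrative sketch of the stimulus-to-action logic described above.
# "go_immediately" mirrors the no-hesitation case; "check_then_go" mirrors
# the delayed case, where the vehicle inspects the crosswalk before moving.

def decide(light_is_green: bool, object_in_crosswalk: bool) -> str:
    """Return the action a hypothetical autonomous vehicle might take."""
    if not light_is_green:
        return "wait"              # the stimulus hasn't arrived yet
    if object_in_crosswalk:
        return "check_then_go"     # same eventual action, different context
    return "go_immediately"        # act on the stimulus right away


# The same end action (driving) is reached through two different contexts:
print(decide(light_is_green=True, object_in_crosswalk=False))  # go_immediately
print(decide(light_is_green=True, object_in_crosswalk=True))   # check_then_go
```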

Lead author Eve Ayar, a PhD student at Carnegie Mellon University and a member of Gandhi’s lab, said their results may have implications for better understanding the mechanisms underlying executive function, and the ways in which it can be impaired.

“There are a lot of disorders out there where people are unable to take in that sensory stimulus in [their] environment and make some kind of movement or action in response to that,” Ayar said. Soon researchers may be able to build models to understand how these systems work and the ways in which they can be disrupted.

“I think this is valuable not only for better understanding this structure of the brain, but potentially it will help us understand how other regions in the brain are operating as well,” Ayar said.


— Brandie Jefferson, photography by Aimee Obidzinski