Vision team’s findings can help victims of stroke, brain injury

Doug Crawford, associate director of York’s pioneering Centre for Vision Research, and his team have released a groundbreaking study that could aid the rehabilitation of people recovering from stroke and brain injuries. Using measurements of eye and head movements, computer simulations and brain wave recordings, Crawford and his associates discovered how a complex brain function guides human movement, a finding that could also help in the development of “smart” prosthetic devices for amputees. The study appears in the Dec. 16 issue of the journal Neuron.

“In this study,” says Crawford, “we discovered a function that sets humans and higher primates apart from lower mammals in terms of how vision controls movements.” The research was conducted at York, which, Crawford noted, has the only lab in the world suitable for conducting the experiments.

In previous studies, Crawford’s team demonstrated that the parietal cortex, located at the back and top of the brain, maps what the eyes are looking at to tell us where things are in space relative to where we are looking. Team researchers observed that this spatial map is updated each time the eyes move and that this function is a basic, primitive mechanism probably common to all mammals.
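The gaze-centred updating the team observed can be sketched in miniature: if a remembered target is stored relative to where the eyes point, each eye movement must be subtracted from the stored location. A simplified one-dimensional sketch (the function names and angles here are illustrative, not from the study):

```python
# Simplified 1-D sketch of gaze-centred remapping: a remembered target
# location, stored relative to the current gaze direction, is updated
# by subtracting each eye movement (saccade) from it.

def remap(target_re_gaze, saccade):
    """Update a gaze-relative target angle (degrees) after an eye movement."""
    return target_re_gaze - saccade

# Target starts 12 degrees right of gaze; the eyes then jump 8 degrees right,
# leaving the target 4 degrees right of the new gaze direction.
updated = remap(12.0, 8.0)
print(updated)  # 4.0
```

The point of the sketch is only that the map is gaze-relative: the world has not moved, yet the stored value must change with every saccade.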

“With higher mammals like humans, we have now determined that an area in the frontal cortex called the Supplementary Eye Fields (SEF) contains a much more complex map of space for guiding movements of the eyes and head that shift visual gaze,” Crawford said. He added that, by correlating the activation of specific SEF regions with eye and head movements and comparing them to computer model predictions, the team was able to show that the SEF contains several separate maps of space for coding targets relative to the eyes, head or body.

While his team knew the SEF governed complex signals related to high-level vision, attention and planning, Crawford said they did not know much about how the signals actually worked. “To put it in layperson’s terms,” said Crawford, “the parietal cortex relies on one fairly simple spatial language to guide movement, while the SEF is multilingual.”

Likening the SEF to a computer that receives all kinds of processed information from the visual system, Crawford said, “It then lets you link it up to different kinds of behaviour, which is the basis of thought. To extend the language analogy, it is like a translator, except that this translator also makes decisions and can learn.”

“Since the discovery of the SEF,” Crawford said, “neuroscientists have argued about what kind of coding scheme it uses to command movements. Some said it codes movement goals relative to the direction the eyes are pointed, some said it codes targets relative to the head, and some said relative to the body. By using the unique technology available to our lab, we were able to show that the SEF uses all of these coding schemes, depending on which part of the SEF is activated.”
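The three candidate coding schemes Crawford describes can be illustrated with a toy one-dimensional model: the same target has different coordinates depending on whether it is expressed relative to the eyes, the head or the body. All names and numbers below are illustrative, not taken from the study:

```python
# Toy 1-D illustration of eye-, head- and body-centred target coding.
# A target's retinal angle is measured relative to the line of gaze;
# adding the eye-in-head and head-on-body angles re-expresses it in
# progressively less gaze-dependent reference frames.

def recode_target(retinal_angle, eye_in_head, head_on_body):
    """Return the target angle (degrees) in three candidate reference frames."""
    eye_centred = retinal_angle                 # relative to gaze direction
    head_centred = eye_centred + eye_in_head    # relative to the head
    body_centred = head_centred + head_on_body  # relative to the body
    return {"eye": eye_centred, "head": head_centred, "body": body_centred}

# A target 5 degrees right of gaze, with the eyes turned 10 degrees right
# in the head and the head turned 20 degrees right on the body:
frames = recode_target(5.0, 10.0, 20.0)
print(frames)  # {'eye': 5.0, 'head': 15.0, 'body': 35.0}
```

The study's claim, in these terms, is that different SEF regions command movements using different entries of that dictionary, rather than one frame for the whole area.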

Crawford noted several labs are working on ways to link up brain areas like the SEF with prosthetic devices to allow stroke and other patients to move. “In order to do this,” he said, “we need to know what these brain areas are coding [and] what signals they normally send to other parts of the brain.”

In addition to being team leader, lab director and holder of the Canadian Institutes of Health Research grant that funded this work, Crawford is Canada Research Chair in Visuomotor Neuroscience and won a teaching excellence award in 2004 (see story in the March 12, 2004 issue of YFile). The study’s first author, Dr. Julio Martinez-Trujillo, now Canada Research Chair and assistant professor of physiology at McGill, and Pieter Medendorp, who did the theoretical simulations and is now a faculty member in the Department of Psychology at the University of Nijmegen, Netherlands, were both research associates at York. Dr. Hongying Wang, Crawford’s current research associate, also assisted Martinez-Trujillo with the experiments.

The study involved interdisciplinary cooperation from a number of departments including the Laboratory of Visuomotor Neuroscience in the Centre for Vision Research, the Canadian Institutes of Health Research Group for Action and Perception, York’s departments of Biology, Computer Science and Psychology, and its School of Kinesiology & Health Science.