What is the relationship between neural activity in the brain and the perceptual world we experience? We explore this connection by using physiological, psychophysical, and computational techniques in animals and humans. This allows us to answer questions about how visual information is represented in the activity of neurons, where in the system critical perceptual computations are performed, and how neural activity is related to our visual experience.
Perception in multiple sensory modalities is an active process that involves exploratory behaviors. A critical form of exploratory behavior is sensory sampling driven by motor action. The various forms this takes include sniffing in olfaction, whisking in somatosensation, and saccadic eye movements in vision. A question of fundamental importance in each of these modalities is whether the active form of sampling has some role in perception beyond simply providing a series of sensory snapshots to the brain. We have found that saccades have widespread effects on the activity of single neurons and neural networks; these effects make visual processing more efficient and optimized for extracting detail from natural visual input. The mechanism responsible for the influence of visual exploration on perception appears to involve a copy of the motor signal that produces eye movements (i.e. a corollary discharge signal) which is fed into early visual processing.
One example of how active visual sensing via saccadic eye movements shapes perception is a change in the spectral sensitivity of neurons: saccades reduce neural and perceptual sensitivity to low spatial frequencies and enhance sensitivity to high spatial frequencies. These changes may simultaneously reduce the perception of blur that eye movements would otherwise produce and accentuate the fine visual details that are important for object recognition, reading, and so on. Another consequence of active sensing concerns the interactions between neurons in visual cortex. Just after a saccade, when a new visual fixation begins, noise correlations between neurons drop precipitously. This decrease in coupling between cells may reduce redundancy in the neural representation and increase information transmission in visual cortex.
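To make the noise-correlation measure concrete, the sketch below computes the Pearson correlation of trial-to-trial spike counts for a pair of simulated neurons. The data here are synthetic and purely illustrative (the shared "gain" term standing in for common variability before a fixation is an assumption of this toy model, not recorded data): a shared fluctuation couples the pair in one condition, while the other condition mimics the decorrelated state observed just after a saccade.

```python
import numpy as np

def noise_correlation(counts_a, counts_b):
    """Pearson correlation of trial-to-trial spike counts
    between two neurons for a single stimulus condition."""
    a = np.asarray(counts_a, dtype=float)
    b = np.asarray(counts_b, dtype=float)
    # Subtract each neuron's mean response so that only shared
    # trial-to-trial variability ("noise") contributes.
    a = a - a.mean()
    b = b - b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
n_trials = 200

# Condition 1: a shared fluctuation couples the pair (pre-fixation).
shared = rng.normal(0.0, 2.0, n_trials)
pre_a = 10 + shared + rng.normal(0.0, 1.0, n_trials)
pre_b = 12 + shared + rng.normal(0.0, 1.0, n_trials)

# Condition 2: independent variability (just after a saccade).
post_a = 10 + rng.normal(0.0, 2.0, n_trials)
post_b = 12 + rng.normal(0.0, 2.0, n_trials)

print(noise_correlation(pre_a, pre_b))    # high: shared variability
print(noise_correlation(post_a, post_b))  # near zero: decorrelated
```

In real recordings the same computation would be applied per stimulus condition and averaged across neuron pairs; the drop in this quantity at fixation onset is what the text above refers to as reduced cell coupling.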
These findings related to active visual sensing appear related to active perception in other modalities. In rodent somatosensory and olfactory systems, active perception via sniffing and whisking is associated with changes in neural correlations as well as improvements in perceptual discrimination. Thus, there appear to be general principles used by the nervous system to optimize perception with the aid of motor input.
A Visual Assistant for People with Low Vision and Blindness
The National Eye Institute estimates that there are presently over 1 million legally blind people in the United States (32 million worldwide), and that this number will increase to about 4 million by 2050. Some of these people can be helped by relatively simple and inexpensive devices such as magnifiers and canes. With more severe vision loss, however, people are likely to lose independence and experience a reduced quality of life: over 70% of working-age people with significant visual impairment are unemployed.
While some causes of blindness can be readily corrected, in many cases there is no effective treatment. One approach is to restore vision by surgically implanting electrodes in the retina or visual cortex and electrically stimulating the tissue based on images recorded by a head-mounted camera. Systems of this type are being developed and have great potential. However, implanted systems have downsides, including the risks of surgery, high cost, and a form of vision that is severely limited compared to normal perception. There is also a wide array of devices and smartphone apps that perform specialized functions (text readers, obstacle detectors, color identifiers).
The goal of our research is to develop and test a new type of inexpensive, general-purpose visual assistive device for individuals with blindness and low vision. It takes advantage of recent advances in artificial intelligence, including computer vision, machine learning, optical character recognition, speech recognition, and 3D sound rendering, to give users the ability to recognize, localize, and interact with objects and people in their vicinity.
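One way such a device can convey object locations is to announce each recognized object with spatialized audio. The sketch below is a minimal, hypothetical illustration of that idea, not the actual system: detections are assumed to arrive already labeled with a bearing and distance (in practice from a computer-vision model and depth sensing), and full 3D sound rendering is stood in for by a simple constant-power stereo pan.

```python
import math
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # object name, e.g. from a recognition model
    azimuth_deg: float  # bearing relative to the camera; + = right
    distance_m: float   # estimated distance to the object

def stereo_pan(azimuth_deg):
    """Map a bearing in [-90, +90] degrees to (left, right) channel
    gains using a constant-power pan law; a crude stand-in for
    full 3D sound rendering."""
    theta = math.radians(max(-90.0, min(90.0, azimuth_deg)))
    pan = (theta + math.pi / 2) / 2   # shift bearing into [0, pi/2]
    return math.cos(pan), math.sin(pan)

def announce(detections):
    """Order detections nearest-first and attach pan gains; a real
    device would pass each phrase to a text-to-speech engine and
    spatial audio renderer."""
    out = []
    for d in sorted(detections, key=lambda d: d.distance_m):
        left, right = stereo_pan(d.azimuth_deg)
        out.append((f"{d.label}, {d.distance_m:.0f} meters", left, right))
    return out

scene = [Detection("door", 30.0, 4.0), Detection("person", -45.0, 2.0)]
for phrase, left, right in announce(scene):
    print(phrase, round(left, 2), round(right, 2))
```

The constant-power pan keeps perceived loudness roughly constant as an object moves across the field of view; an object straight ahead gets equal gains, while an object at the far right plays almost entirely in the right channel.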