Speaker: Prof. Michael E. Goldberg
Date/Time: March 19 (Wed), 14:30 - 16:00
Place: CiNet Building, Conference Room A&B
Title: "How the parietal cortex creates an accurate representation of visual space for action and perception despite a constantly moving eye."
Abstract: In order to link perception and action, the brain must have a spatially accurate representation of the visual world, so that it can generate actions appropriate to the objects it perceives. The only way visual information enters the brain is through the retina, which moves constantly between brief fixations. The retinal location of a target is therefore not, by itself, useful for calculating the movement needed to acquire it. Two strategies have been postulated for calculating the accurate location of movement targets. Helmholtz suggested that the brain knows the command to move the eye and can therefore use that motor command to update the sensory representation. Hering suggested that the brain can calculate accurate target location if it knows the position of the eye in the world. In keeping with Helmholtz's suggestion, the receptive fields of parietal neurons are remapped around the time of an eye movement, so that a neuron will respond to a stimulus outside its receptive field (as determined in a fixation task) if an impending saccade will bring that stimulus into the receptive field. This remapping is accomplished by a stretching of the receptive field along the trajectory of the saccade: a stimulus flashed at an intermediate location crossed by the receptive field during the saccade, lying in neither the current nor the future receptive field as defined by the saccade, will briefly drive the cell around the time of the saccade. The receptive field stretches like a rubber band: the intermediate location drives the cell before the future receptive field does. In keeping with Hering's suggestion, there is a representation of eye position in somatosensory cortex. Neurons in parietal cortex have their visual responses modulated by the position of the eye in the orbit (Andersen and Mountcastle, 1983), and this modulation can be used to calculate target position in space (Zipser and Andersen, 1988). However, the somatosensory eye-position signal lags the eye by 60 ms, and the eye-position modulation of parietal visual responses lags the eye by at least 150 ms and depends on somatosensory cortex. The eye-position signal is thus too slow to do the job for stimuli that flash briefly around the time of a saccade, but it could be used to calibrate the efference copy signal.
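
The two strategies reduce to simple vector arithmetic. The following minimal sketch (not part of the talk; the variable names and the 2-D, degrees-of-visual-angle formulation are illustrative assumptions) shows the Helmholtz strategy as subtracting the saccade vector from a target's current retinal location, and the Hering strategy as adding the eye's orbital position to the retinal location to obtain a world-centered location.

import numpy as np

# Illustrative sketch of the two postulated strategies, using 2-D position
# vectors in degrees of visual angle. Names and formulation are assumptions.

def remap_retinal_location(retinal_loc, saccade_vector):
    # Helmholtz / efference-copy strategy: use the motor command (the saccade
    # vector) to predict where the target will fall on the retina after the
    # eye movement.
    return np.asarray(retinal_loc) - np.asarray(saccade_vector)

def target_in_world(retinal_loc, eye_position):
    # Hering / eye-position strategy: combine the retinal location with the
    # position of the eye in the orbit (the modulation of Andersen and
    # Mountcastle, 1983) to recover a world-centered target location.
    return np.asarray(retinal_loc) + np.asarray(eye_position)

# Example: a target 10 deg right of the fovea, viewed while the eye points
# 5 deg left of straight ahead, just before a 10-deg rightward saccade.
retinal = [10.0, 0.0]
eye = [-5.0, 0.0]
saccade = [10.0, 0.0]

print(remap_retinal_location(retinal, saccade))  # [0. 0.]: on the fovea after the saccade
print(target_in_world(retinal, eye))             # [5. 0.]: 5 deg right of straight ahead

In this toy arithmetic the efference-copy route needs only the motor command, whereas the eye-position route needs an up-to-date orbital signal, which is why the 60-150 ms lags described in the abstract make the latter too slow for stimuli flashed around the time of a saccade.
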
Michael E. Goldberg M.D.
David Mahoney Professor of Brain and Behavior in the Departments of Neuroscience, Neurology, Psychiatry, and Ophthalmology
Columbia University College of Physicians and Surgeons
Website: mahoney.cpmc.columbia.edu/goldberglab