Although the visual deficits associated with amblyopia are well described, eye-hand coordination in amblyopia has received relatively little attention. In our lab we are conducting a series of experiments on eye movements (saccades and pursuit), visually-guided reaching and eye-hand coordination in patients with amblyopia.
These experiments are conducted using high-speed binocular 3D infrared video-based pupil and cornea tracking, a high-frame-rate 3D infrared camera-based motion capture system and a high-resolution CRT video display. We have real-time access to eye and limb movement data, allowing us to open the feedback loops for eye and limb movements.
Our custom virtual surface apparatus (VSA) allows reaching and pointing tasks to be conducted while eye and limb movement data are recorded. The unique configuration of the VSA enables experiments employing "virtual" targets (i.e., intangible stimuli that appear to be located at a real position in 3D space) while retaining the ability to accurately track a participant's eye movements. Using a partially-silvered mirror and optional back-lighting, stimuli on the virtual plane can be presented to a participant with or without visual feedback of limb position. Integration with real-time eye and limb tracking instruments allows the VSA to implement complex experimental paradigms involving eye or limb movement-contingent changes in stimuli.
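The core of such a movement-contingent paradigm is to monitor the real-time tracking stream and change the stimulus the moment an eye movement is detected, so the change lands while vision is degraded mid-saccade. A minimal sketch of that logic is below; the velocity threshold, sample rate, and all function names are illustrative assumptions, not the lab's actual implementation.

```python
# Sketch of a saccade-contingent target jump (illustrative, not the lab's code).
# Assumes 1D eye position samples in degrees at a fixed sample interval dt (s).

SACCADE_VELOCITY_THRESHOLD = 30.0  # deg/s; a common saccade-detection criterion


def detect_saccade(eye_positions, dt):
    """Return the index of the first sample whose velocity exceeds the
    saccade threshold, or None if no saccade occurs in the trace."""
    for i in range(1, len(eye_positions)):
        velocity = abs(eye_positions[i] - eye_positions[i - 1]) / dt
        if velocity > SACCADE_VELOCITY_THRESHOLD:
            return i
    return None


def run_trial(eye_positions, dt, target_position, jump=5.0):
    """Displace the target as soon as a saccade is detected, so the change
    is contingent on the participant's own eye movement."""
    onset = detect_saccade(eye_positions, dt)
    if onset is not None:
        target_position += jump  # stimulus change triggered mid-movement
    return target_position, onset
```

In a real experiment this decision would run inside the display loop, with the jump applied on the next video frame after detection; the sketch only shows the contingency logic itself.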
To date, our work has focused on quantifying changes in the motor planning and online control of visually-guided movements. Research topics under investigation using the VSA include the effect of conflicting visual and proprioceptive information on reaching and eye movements, and how motor planning adapts to sudden changes in one or more characteristics of a target that occur while a movement is already in progress.