Journal of Neurophysiology

Smooth Pursuit of Nonvisual Motion

Marian E. Berryhill, Tanya Chiu, Howard C. Hughes


Unlike saccades, smooth pursuit eye movements (SPEMs) are not under voluntary control and their initiation generally requires a moving visual target. However, there are various reports of limited smooth pursuit of the motion of a subject's own finger in total darkness (pursuit based on proprioceptive feedback) and of combined proprioceptive and tactile motion as an unseen finger was moved voluntarily over a smooth surface. In contrast, SPEMs to auditory motion are not distinguishable from pursuit of imagined motion. These reports of smooth pursuit of nonvisual motion used a variety of paradigms and different stimuli, and the results have often relied primarily on qualitative descriptions of the pursuit. Here, we directly compare measurements of smooth pursuit gain (eye velocity/stimulus velocity) for visual, auditory, proprioceptive, tactile, and combined tactile + proprioceptive motion stimuli. The results demonstrate high gains for visual pursuit, low gains for auditory pursuit, and intermediate, statistically indistinguishable gains for tactile, proprioceptive, and tactile + proprioceptive pursuit.


Although most formal models of smooth pursuit eye movements consider retinal slip the controlling input to the smooth pursuit control system (Krauzlis and Lisberger 1994; Robinson 1965), it is known that other forms of input support some degree of smooth pursuit. Expectancies form one class of these nonvisual inputs because brief periods of pursuit can occur in anticipation of the movement of a stationary visual target (Kowler 1989), and smooth pursuit continues briefly after a visual stimulus is extinguished (e.g., Becker and Fuchs 1985; Kveraga et al. 2001; Whittaker and Eaholtz 1982). This predictive ability helps overcome phase lags that would otherwise result from the relatively long latencies of the visual control signals (e.g., Bahill and McDonald 1983; Deno et al. 1995).

Proprioceptive signals can also elicit smooth pursuit. Several studies have examined smooth pursuit eye movements as subjects follow the movement of their own hand in the dark [Gauthier and Hofferer 1976; Gertz 1916 (cited in Ilg 1997); Glenny and Heywood 1979; Hashiba et al. 1996; Steinbach 1969, 1976], or in the light but with the hand remaining invisible (Watanabe and Shimojo 1997). In contrast, auditory signals are apparently unable to generate motion signals capable of supporting smooth pursuit eye movements; in fact, quantitative analyses indicate there is no difference between smooth pursuit of moving auditory stimuli and of imagined moving stimuli (Boucher et al. 2004). Although the somatosensory system has access to cutaneous motion information through Meissner's corpuscles (for a recent review see Johnson 2001), this information does not seem to be very effective either: tactile information produced by tracking a hand as it slid across a stationary object did not support smooth pursuit eye movements (Watanabe and Shimojo 1997). To our knowledge, the ability to track motion across an extended area of the skin surface has not been investigated, but it is explored in the present experiments. The general pattern of results indicates that the smooth pursuit system has a very limited ability to use motion information from sensory modalities other than vision.

Previous studies of smooth pursuit of nonvisual motion relied on qualitative descriptions of the pursuit eye movements, making direct comparisons between different nonvisual modalities difficult. The goal of this study is to quantitatively compare smooth pursuit eye movements to several modalities of motion stimuli in the same participants and under the same conditions, thereby permitting an accurate assessment of the ability of the smooth pursuit control system to access motion signals from the following modalities: vision, audition, proprioception, touch, and combined touch + proprioception.



METHODS

Participants

Ten participants (eight female), ranging in age from 19 to 56 yr, contributed data; they were drawn primarily from the graduate community. The three authors participated, and two (MB, HCH) had previous experience in smooth pursuit experiments. The seven naïve participants had not taken part in smooth pursuit eye movement studies, although two were regular participants in saccade studies. Participants were paid $6.00 per session. All protocols were approved by the Dartmouth College Committee for the Protection of Human Subjects, and each participant signed an informed consent document before participation.


Apparatus

A custom-made pendulum (84 cm long) was equipped with a green light-emitting diode (LED), a piezoelectric speaker, and a rubber wheel at its base. Pendulum position was determined by the voltage across a potentiometer and was recorded simultaneously with the eye movements. Eye movements were recorded using scleral search coils (Skalar Medical; Robinson 1963). The spatial resolution of the system is 2.0 min of arc. Eye position was digitized with 12-bit resolution at a sampling rate of 1,000 Hz and stored to disk for off-line analysis.

The pendulum was calibrated before experimentation, and the correspondence between voltage output and degrees of eccentricity was determined. Each session began with a calibration of eye position. During calibration, participants sat at a viewing distance of about 20 cm and fixated four stationary LEDs arrayed in a rectangle located at 85° horizontal eccentricity and 45° vertical eccentricity, plus a fifth LED on the pendulum located at the origin.
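Because the potentiometer output is linear in pendulum angle, the voltage-to-degrees correspondence can be recovered by a least-squares line fit. A minimal sketch in Python (the voltage and eccentricity values below are illustrative placeholders, not the actual calibration data):

```python
import numpy as np

# Hypothetical calibration pairs: potentiometer voltage recorded with
# the pendulum held at known eccentricities (illustrative values only).
volts = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
degrees = np.array([-40.0, -20.0, 0.0, 20.0, 40.0])

# Least-squares linear fit: degrees ≈ slope * volts + offset
slope, offset = np.polyfit(volts, degrees, 1)

def volts_to_deg(v):
    """Convert a potentiometer reading to pendulum eccentricity (deg)."""
    return slope * v + offset
```

The same fit can then be applied sample-by-sample to the digitized pendulum trace before computing velocities.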


Procedure

Before each trial, the experimenter told the participant what type of trial was about to take place and when to start. Trial types were presented in pseudorandom order (random without replacement). For visual and auditory trials, the experimenter raised the pendulum to a marked position to the right of the participant and released it when the piezoelectric speaker or LED turned on. For proprioceptive trials, participants held the pendulum with their preferred hand and moved it back and forth horizontally for the duration of the trial. In the tactile condition, the wheel at the base of the pendulum was placed on the dorsal surface of the participant's bent forearm and the experimenter moved the pendulum back and forth along the arm while the participant tried to track this movement by eye. In the combined proprioceptive + tactile condition, participants moved the pendulum along their own arm. In all but the visual condition, the experiment took place in the dark and participants could not see the pendulum. In each session, five to ten trials of each condition were recorded; each participant performed two sessions and contributed ten trials per condition, for a total of 50 trials per person.

Data analysis

Eye and pendulum velocity traces were low-pass filtered using a Butterworth filter to remove noise >25 Hz. Saccades were identified as regions where the eye velocity exceeded 1.5 SDs of the mean velocity over a 10-ms period; these samples were removed from both the eye and pendulum traces by a saccade-detecting algorithm, and the desaccaded traces were visually inspected to ensure accuracy. Samples where the pendulum velocity was >100° s−1 were also excluded. Gain was calculated by dividing the horizontal eye velocity by the pendulum velocity for the saccade-free velocity traces. Gain measurements excluded the first 500 ms of each trial, when the pendulum and eyes began moving; thus all gain measurements were made under “closed-loop” conditions that optimize smooth pursuit performance. Eye movement latency was also computed, by determining the point in time when eye movement velocity increased by >0.8 SDs over a 10-ms interval; this procedure detected both the smooth pursuit eye movements and the saccadic eye movements that characterized responses to nonvisual stimuli. Root mean square error (RMSE) was calculated by squaring the difference between pendulum position and eye position at each sample, taking the mean of these squared differences, and then taking the square root of that mean. Analyses were conducted using Matlab 6.5 (The MathWorks, Natick, MA).
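The core of this analysis pipeline can be sketched as follows (in Python rather than the Matlab actually used; the thresholds come from the text, but the function names are ours, the saccade-detection loop is a simplified assumption about the algorithm, and the `vmin` guard against near-zero divisors is our addition):

```python
import numpy as np

FS = 1000  # sampling rate in Hz (from METHODS)

def desaccade(eye_vel, pend_vel, sd_thresh=1.5, win_ms=10):
    """Drop samples from both traces wherever eye velocity deviates by
    more than sd_thresh SDs from the mean velocity within a win_ms
    window (a simplified stand-in for the paper's algorithm)."""
    win = int(win_ms * FS / 1000)
    mu, sd = eye_vel.mean(), eye_vel.std()
    keep = np.ones(eye_vel.size, dtype=bool)
    for i in range(eye_vel.size - win + 1):
        if np.any(np.abs(eye_vel[i:i + win] - mu) > sd_thresh * sd):
            keep[i:i + win] = False
    return eye_vel[keep], pend_vel[keep]

def pursuit_gain(eye_vel, pend_vel, skip_ms=500, vmax=100.0, vmin=1.0):
    """Mean eye/pendulum velocity ratio over saccade-free samples,
    skipping the first skip_ms of the trial and samples where the
    pendulum moved faster than vmax deg/s."""
    start = int(skip_ms * FS / 1000)
    e, p = eye_vel[start:], pend_vel[start:]
    ok = (np.abs(p) <= vmax) & (np.abs(p) > vmin)
    return float(np.mean(e[ok] / p[ok]))

def rmse(eye_pos, pend_pos):
    """Root mean square positional error: square the differences,
    average, then take the square root."""
    return float(np.sqrt(np.mean((eye_pos - pend_pos) ** 2)))
```

For example, an eye-velocity trace that is everywhere 80% of a ±40° s−1 sinusoidal pendulum trace yields a gain of 0.8 from `pursuit_gain`.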


RESULTS

Figure 1 presents the mean gain values for each condition. These data were subjected to a repeated-measures ANOVA comparing gain values across the five conditions (visual, auditory, tactile, proprioceptive, and combined proprioceptive + tactile). Mauchly's test indicated a violation of sphericity (P = 0.03), so the Greenhouse–Geisser correction was applied. There was a significant main effect of condition [F(2.1,18.6) = 114.25, P < 0.001, partial η2 = 0.93]. Pairwise comparisons, Bonferroni-corrected for multiple comparisons, revealed significantly greater gains for the visual condition than for all other conditions (all P values <0.001). In addition, smooth pursuit gain for the auditory condition was significantly smaller than for all other conditions (all P values <0.02). There were no significant differences among the mean gain values in the tactile, proprioceptive, and combined tactile + proprioceptive conditions. Additional repeated-measures ANOVAs examined eye movement latency and RMSE as a function of condition. There was no main effect of stimulus condition on eye movement latency (all P values >0.1). The RMSE analysis revealed significantly higher accuracy for the visual traces than for the other conditions (all P values <0.01). Figure 2 illustrates example traces from the visual (V), auditory (A), tactile (T), proprioceptive (P), and combined tactile + proprioceptive (T + P) conditions.

FIG. 1.

Overall gain values per condition. A, auditory; T, tactile; P, proprioceptive; T + P, tactile and proprioceptive combination; V, visual. Error bars: SE.

FIG. 2.

Individual-trial example eye traces from subject CG before desaccading. Each panel (A: visual; B: combined tactile + proprioceptive; C: proprioceptive; D: tactile; E: auditory) shows the pendulum trace (gray) and the horizontal eye trace (black). Gain values for the respective traces are 0.82, 0.50, 0.46, 0.39, and 0.14.

The motion paths in these experimental conditions were not identical. The tactile motion was smaller in amplitude because the experimenter could not slide the pendulum along the participants' arms to the same extent that occurred when the pendulum swung freely. Examination of velocity histograms of the pendulum trajectories also revealed differences in pendulum velocity across conditions: stimulus velocities for the tactile and tactile + proprioceptive combination trials were on average slower because, in these conditions, the pendulum was not swinging freely but was moved manually (by the participant in the proprioceptive and combination conditions or by the experimenter in the tactile condition). To control for this difference, we recalculated the gain values using only samples with pendulum velocities <50° s−1, a manipulation that successfully equilibrated the velocity distributions across conditions. The statistical results were unchanged after this correction [F(4,36) = 95.9, P < 0.001, partial η2 = 0.91], and there was no violation of sphericity for these data.
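This velocity-matched reanalysis amounts to recomputing gain over a restricted sample mask. A minimal sketch (the function name is ours, and the `vmin` guard against near-zero divisors is our addition, not from the paper):

```python
import numpy as np

def gain_below_cutoff(eye_vel, pend_vel, vmax=50.0, vmin=1.0):
    """Gain recomputed from only those samples where pendulum speed
    is below vmax deg/s, equating the velocity distributions across
    conditions as described in the text."""
    ok = (np.abs(pend_vel) < vmax) & (np.abs(pend_vel) > vmin)
    return float(np.mean(eye_vel[ok] / pend_vel[ok]))
```

Applying the same cutoff to every condition ensures that any remaining gain differences cannot be attributed to differences in the stimulus velocity distributions.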


DISCUSSION

The present study compared ocular tracking of stimulus motion in four different modalities. After saccades were removed, we determined the gain of the remaining smooth pursuit eye movements supported by visual and nonvisual stimulus motion under closed-loop conditions. We used a moving pendulum equipped with an LED and a speaker; participants watched, listened to, held, felt, or both held and felt the pendulum moving back and forth in the frontoparallel plane. The results confirm and extend previous findings and support the following conclusions: 1) visual input produces the highest smooth pursuit gain values; 2) auditory input produces the least smooth pursuit; and 3) tactile, proprioceptive, and combined tactile + proprioceptive signals support smooth pursuit gains intermediate between vision and audition. We attribute the lower than expected visual gain values to several factors: untrained subjects; variations in pendulum velocity; and, perhaps most important, the unusually short viewing distance across all stimulus conditions (20 cm) and the prevention of anticipatory tracking by the randomization of trial conditions. If anything, these relatively low pursuit gains probably underestimate the differences between pursuit of visual and nonvisual motion.

The saccadic system operates on position signals. Because the auditory, proprioceptive, and somatosensory systems all provide position signals and distribute those signals to the saccadic control system, it is not surprising that saccades are initiated by auditory (see Zambarbieri 2002) and somatosensory targets (Groh and Sparks 1996). These nonvisually guided saccades tend to be less accurate (Zambarbieri 2002)—a decrease that parallels the difference in spatial resolution between the auditory and visual systems (e.g., Boucher et al. 2004). In contrast to saccades, sustained smooth pursuit eye movements clearly require visual signals that specify the relative retinal motion of a visual target. Visual motion detectors, commonplace in the visual system, provide a critical source of input to the smooth pursuit control system (e.g., Yamasaki and Wurtz 1991). Although the possibility remains that nonvisual motion signals exist within the nervous system and simply are not provided as inputs to the smooth pursuit control system, it is also possible that measures of smooth pursuit of nonvisual motion can serve as a behavioral index of motion detectors in nonvisual sensory modalities. In this context, it is interesting to note the especially poor pursuit supported by auditory motion because the very existence of auditory motion detectors remains questionable (Ahissar et al. 1992; Grantham 1986; Perrott and Marlborough 1989; see review by Middlebrooks and Green 1991).

Similarly, proprioceptive afferents provide information concerning egocentric body position over time, but it is not known whether the proprioceptive system contains motion detectors that are in any way comparable to visual motion detectors. The finding that proprioceptive signals support smooth pursuit eye movements that are superior to those using auditory motion suggests that the neural representation of body motion might be more robust than the neural representation of auditory motion. Similar considerations apply to tactile motion. The cutaneous system also provides motion information (Hagen et al. 2002), probably through stimulation of Meissner’s corpuscles (Johnson 2001). Hashiba et al. (1996) suggested that the smooth pursuit they observed in auditory and somatosensory conditions might arise from a common gating mechanism (for a recent review see Krauzlis 2003). Our data suggest that if there is a single pursuit gating mechanism it is not efficiently accessed by all sensory modalities. Recent evidence indicates that both tactile and visual information is processed in several common regions, including the superior colliculus (Maravita et al. 2003). Apparently, both the cutaneous and the proprioceptive systems provide motion signals that are significantly more effective than the auditory system in supporting smooth pursuit.


  • The costs of publication of this article were defrayed in part by the payment of page charges. The article must therefore be hereby marked “advertisement” in accordance with 18 U.S.C. Section 1734 solely to indicate this fact.

