## Abstract

One of the open questions in oculomotor control of visually guided eye movements is whether it is possible to smoothly track a target along a curvilinear path across the visual field without changing the torsional stance of the eye. We show in an experimental study of three-dimensional eye movements in subhuman primates (*Macaca mulatta*) that although the pursuit system is able to smoothly change the orbital orientation of the eye's rotation axis, the smooth ocular motion was interrupted every few hundred milliseconds by a small quick phase with amplitude <1.5° while the animal tracked a target along a circle or ellipse. Specifically, during circular pursuit of targets moving at different angular eccentricities (5°, 10°, and 15°) relative to straight ahead at frequencies of 0.067 and 0.1 Hz, the torsional amplitude of the intervening quick phases was typically around 1° or smaller and changed direction for clockwise vs. counterclockwise tracking. Reverse computations of the eye rotation based on the recorded angular eye velocity showed that the quick phases facilitate the overall control of ocular orientation in the roll plane, thereby minimizing torsional disturbances of the visual field. On the basis of a detailed kinematic analysis, we suggest that quick phases during curvilinear smooth tracking serve to minimize deviations from Donders' law, which are inevitable due to the spherical configuration space of smooth eye movements.

- eye movements
- three-dimensional rotations
- Listing's law
- Donders' law
- vision

Although the eye can rotate in multiple ways, the kinematics of many eye movements simply correspond to a fixed-axis rotation of the eye with respect to the bony orbit. This is true for the vestibuloocular reflexes, where the rotation axis of the eye parallels the axis of head rotation. As a result, these reflexes move the fovea along the shortest path from one position to the next in retinocentric coordinates. Even though saccades are typically also encoded as fixed-axis rotations, they do not always follow the shortest path from one target to the next. They are characterized by a principle of minimal cost of rotations: they minimize the total angular excursion of the eye (Hepp et al. 1997; Nakayama 1983) or, equivalently, the extent of ocular torsion (Listing's law: Helmholtz 1867; Tweed and Vilis 1990). In contrast, the evolutionarily young pursuit system deploys a much richer kinematic repertoire, enabling it to track targets along straight as well as curvilinear paths. Indeed, if a target moves off to one side along a curved path, it cannot be tracked by a fixed-axis rotation without generating a distortion of the entire visual space. To avoid such disturbances, the pursuit system might choose, among the infinitely many non-fixed-axis rotations, those corresponding to a minimal torsion of the visual space, a strategy first proposed by von Helmholtz (1867) as the basis of Listing's law (Hepp 1995; Hepp et al. 1997). Although there is experimental evidence suggesting that Listing's law is obeyed under such circumstances, it has been overlooked that the brain in fact faces an intractable problem when trying to generate perfectly smooth tracking movements requiring non-fixed-axis rotations. Consider, for example, an observer who tracks a fly along a curvilinear path (Fig. 1). First, he intercepts the fly's track with a saccade and subsequently manages, supposedly, to track it smoothly.
When saccading back to the starting position after a period of smooth tracking, he might find that the world has appreciably rotated opposite to the direction of path rotation. Thus smooth tracking bears the risk of violating a fundamental law, due to Donders (1848), which states that the eye always returns exactly back to its original angular orientation in three dimensions after a closed-path motion. The question thus is what strategy the oculomotor system employs to minimize violations of Donders' law during curvilinear target tracking.

At this time there exist a fair number of studies on two-dimensional, curvilinear smooth pursuit focusing on questions of predictive mechanisms and other aspects of response dynamics (De'Sperati and Viviani 1997; Engel et al. 1999; Kettner et al. 1996; Leung and Kettner 1997; Mrotek et al. 2006; Mrotek and Soechting 2007a, 2007b; Soechting et al. 2005). One important outcome of these studies is the finding that the performance of two-dimensional pursuit cannot be predicted from the system's performance during horizontal and vertical tracking (De'Sperati and Viviani 1997; Engel et al. 1999; Leung and Kettner 1997; Schwartz and Lisberger 1994). It has been noted also that the system is able to smoothly change not only the velocity (Lisberger and Westbrook 1985) but also the direction of tracking, suggesting that angular target velocity might be an important input for the neural control mechanism (Engel et al. 1999). In contrast, saccades, triggered by directional position errors, have been reported to have no appreciable effect on the direction of tracking and are considered to contribute insignificantly to the input of the pursuit system (Engel et al. 1999).

To address Donders' problem of smooth curvilinear tracking, we first investigated how target motion in space can be generally encoded in terms of three-dimensional angular eye velocity during steady-state smooth pursuit. We then show experimentally that eye movements during curvilinear pursuit consist of smooth tracking phases interrupted by small-amplitude quick phases with significant torsional components, a strategy that minimizes the expected violation of Donders' law. Thus, while the system keeps the fovea close to the target, it also controls the orientation of the eye to avoid accumulation of torsion. We show that this dual control of the eye's torsional orientation and gaze velocity cannot be achieved by velocity feedback alone but must be supplemented by quick phases to correct torsional position errors.

## MATERIALS AND METHODS

Four female rhesus monkeys (*Macaca mulatta*) were used in these experiments. The animals were chronically prepared during sterile surgery under isoflurane anesthesia with skull bolts for head restraint. Dual search coils were implanted on one eye under the conjunctiva for eye movement recording as described previously (Hess 1990). All procedures accorded with the National Institutes of Health *Guide for the Care and Use of Laboratory Animals* and were approved by the Veterinary Office of the Canton of Zurich.

### Recording and Representation of Eye Positions

Three-dimensional eye positions were measured using the magnetic search coil technique (Robinson 1963) with an Eye Position Meter 3000 (Skalar, Delft, The Netherlands). Eye position was calibrated as described in Hess et al. (1992), digitized at a sampling rate of 833.3 Hz, and stored on a computer for off-line data analysis. To express eye positions as rotation vectors (Haustein 1989), the zero, or reference, position was defined to be the eye's orientation while the monkey fixated a target 0.8 m straight ahead. In all four animals, Listing's plane tilted less than 4° vertically and less than 1° horizontally from the frontal plane.

During the behavioral tasks, the animal was seated upright, with the head restrained in a primate chair that was mounted within an opaque sphere 1.6 m across. The animals were trained to track a small laser spot (0.35°) that was projected onto the inner wall of the sphere, describing circular (radius 5°, 10°, or 15°; 0.067 or 0.1 Hz) or moderately elliptic paths (major axis 20°, minor axis 15°; 0.1 Hz) on a structured background. The quality of smooth tracking was controlled with behavioral windows 1–2° across. During successful tracking the animal received fluid rewards. Experiments were performed in light, i.e., with background illumination inside the opaque sphere, which completely surrounded the animal.

### Data Analysis

All tracking responses were analyzed cycle by cycle. Saccades, quick phases, and blink artifacts were detected and marked by applying time and amplitude windows to the time derivative of eye acceleration (jerk). Cycles with saccades or blink artifacts were eliminated by visual inspection. To facilitate identification of quick-phase events in terms of magnitude, duration, and peak velocity, eye position traces were rectified by subtracting the sinusoidal modulation determined by least-squares fitting. Three-dimensional angular eye velocity was computed in two fundamentally different ways. In the first, conventional approach, we also eliminated all the small steplike shifts in horizontal and vertical eye position, which obviously were components of quick phases that reposition torsional eye position (see results, Fig. 3). We then computed the angular eye velocity (**Ω**) with the formula (Hepp 1990) **Ω** = 2(d**E**/d*t* + **E** × d**E**/d*t*)/(1 + ‖**E**‖^{2}), where d**E**/d*t* is the time derivative of eye position, expressed as a rotation vector [**E** = tan(ρ/2)*ê*, closely related to the eye position quaternion *E*_{q} = cos(ρ/2) + sin(ρ/2)*ê*, where *ê* is the rotation axis and ρ is the rotation angle]. Finally, we fitted each of the three components of the angular eye velocity (Ω_{tor}, torsional; Ω_{ver}, vertical; and Ω_{hor}, horizontal) with a sum of sinusoids up to the third harmonic of the stimulus frequency using the Levenberg-Marquardt algorithm (MATLAB, The MathWorks). This procedure accurately estimated the actual slope of the slow-phase segments between quick phases. The resulting angular velocity fits are traditionally called slow-phase or slow phase-related angular eye velocity (SPV). In a variant of this approach, we fitted a generic Listing model, calling the resulting fits Listing law-based slow phase-related angular eye velocity (LSPV). This procedure is described in more detail below. In the second approach, we fitted three-dimensional eye position with the same sum of sinusoids without removing the small steplike changes in eye position due to quick phases. On the basis of these position-related fits, we estimated the angular eye velocity using the same formula as above. We refer to this angular eye velocity (fits) as eye position-related angular eye velocity (EPV).
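The angular velocity formula above can be sketched numerically. This is our own illustrative Python reconstruction (variable names are ours), not the authors' MATLAB pipeline; rotation-vector components are in radians:

```python
import numpy as np

def angular_velocity(E, dt):
    """Angular eye velocity from rotation-vector eye positions.

    Implements Omega = 2*(dE/dt + E x dE/dt) / (1 + ||E||^2)  (Hepp 1990).
    E : (N, 3) array of rotation vectors sampled at interval dt (s).
    """
    dE = np.gradient(E, dt, axis=0)                 # dE/dt, sample by sample
    cross = np.cross(E, dE)                         # E x dE/dt
    norm2 = np.sum(E**2, axis=1, keepdims=True)     # ||E||^2
    return 2.0 * (dE + cross) / (1.0 + norm2)

# Sanity check: a fixed-axis rotation about the z-axis at 10 deg/s.
dt = 1.0 / 833.3                                    # sampling interval (s)
t = np.arange(0.0, 1.0, dt)
rho = np.deg2rad(10.0) * t                          # rotation angle (rad)
E = np.stack([np.zeros_like(rho), np.zeros_like(rho),
              np.tan(rho / 2.0)], axis=1)           # E = tan(rho/2) * e_z
Omega = angular_velocity(E, dt)
# For a fixed-axis rotation, Omega is constant and parallel to the axis.
```

Because **E** stays parallel to its derivative here, the cross product vanishes and the recovered velocity is exactly the 10°/s rotation rate.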

#### Notational remarks.

In this article, we denote vectors as bold letters and represent the components as column vectors or, for convenience, also as row vectors within the text. Entities marked with a caret denote unit vectors. Entities marked with a tilde represent control parameters (vectors, scalars) that ultimately are based on the brain's estimate of target motion. For example, eye position reflecting estimated target position in space is represented as **Ẽ**; it is a two-dimensional vector that, however, also may be written **Ẽ** = (0, *Ẽ*_{ver}, *Ẽ*_{hor}). For simplicity, we do not accumulate carets and tildes on the same entity.

#### Extended generic Listing model.

To describe steady-state smooth pursuit eye movements in a generic way in three dimensions (3-D), we started from the general formulas that express 3-D angular eye velocity as a function of 3-D eye position and velocity, namely, **Ω** = 2(d**E**/d*t* + **E** × d**E**/d*t*)/(1 + ‖**E**‖^{2}) and its inverse, d**E**/d*t* = [**Ω** + **Ω** × **E** + (**Ω** · **E**)**E**]/2 ("×" stands for vector cross product, "·" stands for scalar product; Hepp 1990, 1994). In the following, we refer to these two vector equations, respectively, as **Ω** = **Ω**(d**E**/d*t*, **E**) and d**E**/d*t* = d**E**/d*t*(**Ω**, **E**). For successful tracking of a far target, the angular eye position and velocity must approximately equal target angular position and velocity taken relative to the eyes (see also Mays and Sparks 1980). In the simple case of straight-line pursuit, the angular velocity of the target is a vector in the observer's frontal plane. Thus, by substituting **E** and d**E**/d*t* on the right-hand side of **Ω** = **Ω**(d**E**/d*t*, **E**) by the brain's estimate of target angular position **Ẽ** and angular velocity **Ω̃** = 2d**Ẽ**/d*t* in space, this equation suggests that the angular velocity of the eye (as measured at the motor output) is proportional to the estimated target velocity plus a nonlinear target position- and velocity-dependent term (*Eq. 1*), which accounts for **Ẽ** and **Ω̃** being confined to the *y-z* or frontal plane.

*Equation 1* predicts a nonzero torsional angular eye velocity component at the motor output due to the vector cross product on the right-hand side. How and where this vector cross product is implemented in the brain is beyond the scope of this study. Although this simple heuristic approach is sufficient to explain steady-state straight-line pursuit responses as a function of eye position, it poses serious problems if the target's angular velocity is not confined to the horizontal-vertical plane (see example in Fig. 1: the angular velocity component "dψ/d*t*" is not in the observer's *y-z* plane). In that case, the predicted angular velocity of the eye would violate Listing's law as a consequence of the implied (eye position independent) nonzero torsional eye velocity. How can the brain then encode a general target angular velocity in oculomotor commands without violating Listing's law? The most efficient solution is to make use of the rotational degree of freedom of the eye about the gaze line. Specifically, by counterrotating at a certain velocity about the gaze line, the eye can encode the torsional component of the estimated target angular velocity. Thus we define "ocular counterroll" as angular velocity of the eye about the current gaze direction and devise the following necessary condition for a tracking strategy compatible with Listing's law. First, we express the estimated target angular velocity in terms of a magnitude (**ω̃**_{target}) times a unit direction (in space), i.e., **Ω̃**_{target} = **ω̃**_{target}*n̂*, and assume that a certain fraction of its magnitude is encoded by ocular counterroll, say **ω̃**_{CR} = λ**ω̃**_{target}. Second, we assume that the motor system encodes the estimated target angular velocity in the form of a linear combination of target angular velocity and ocular counterroll velocity, i.e., **Ω̃**′ = **Ω̃**_{target} + **Ω̃**_{CR} = **ω̃**_{target}(*n̂* + λ*ĝ*), where λ is a certain fraction of ocular counterroll velocity and *ĝ* denotes current gaze direction. We express this relation more concisely in terms of the two direction vectors by writing **f** = *n̂* + λ*ĝ* (*Eq. 2*). The parameter λ in *Eq. 2* has to be chosen such that the forward component of the estimated target angular velocity cancels out, resulting in [**f**]_{x} = 0. Thus a necessary condition for encoding angular target velocity compatible with Listing's law is that the ratio of counterroll velocity to the estimated target velocity, **ω̃**_{CR}/**ω̃**_{target} = λ, compensates the forward directional component of target angular velocity, [**Ω̃**_{target}]_{x}/**ω̃**_{target} = *n̂*_{x}. If there is no such forward component, Listing's law can be satisfied without any counterroll. With the use of this constraint, the generic equation for eye velocity at the premotor level is d**Ẽ**/d*t* = (**ω̃**_{target}/2)**f** (*Eq. 3*), with **f** = (0, *f*_{y}, *f*_{z}). This relation encodes the estimated target angular velocity according to the first, linear term of the above-mentioned equation d**E**/d*t* = d**E**/d*t*(**Ω**, **E**). As previously noted, **ω̃**_{target} is the brain's best estimate of instantaneous target angular velocity. The complexity of this vector equation lies in the vector **f**, which changes from moment to moment as a function of gaze direction (*Eq. 2*). Note that this eye velocity is not yet the angular velocity measured at the motor output. To obtain the angular velocity at the motor output, denoted by **Ω**_{eye}, a second, nonlinear transformation must be postulated, which follows from the above-mentioned kinematical equation **Ω** = **Ω**(d**E**/d*t*, **E**). Inserting the estimated horizontal and vertical target position (**Ẽ**) and velocity (d**Ẽ**/d*t* as defined in *Eq. 3*) yields **Ω**_{eye} = **ω̃**_{target}(**f** + **Ẽ** × **f**)/(1 + ‖**Ẽ**‖^{2}) (*Eq. 4*), where ‖**Ẽ**‖ is the magnitude of **Ẽ**. The normalized rotation axis *f̂* = **Ω**_{eye}/‖**Ω**_{eye}‖, measured at the motor output, is thus a function of horizontal and vertical eye position. This equation, together with the implicit condition *f*_{x} = 0 in *Eq. 2*, is not only an accurate implementation of the half-angle rule of Listing's law when the eye rotates about a fixed axis but also a good approximation when it performs a more complex form of rotation. Specifically, during straight-line tracking the eye movement is a fixed-axis rotation (i.e., *n̂*_{x} = 0: *f̂* = *n̂*, λ = 0: no counterrotation). For example, during horizontal tracking the target rotates about the head-vertical axis (i.e., *n̂*_{x} = *n̂*_{y} = 0, *n̂*_{z} = 1), whereas the eye rotates about an axis that tilts by half the angle of vertical eccentricity (ϕ), i.e., ϕ/2 = tan^{−1}(*Ê*_{y}) = tan^{−1}[(**Ω**_{eye})_{x}/(**Ω**_{eye})_{z}] (half-angle rule, equivalent to Listing's law). The same is true for vertical tracking (i.e., *n̂*_{x} = *n̂*_{z} = 0, *n̂*_{y} = 1). During circular tracking, *Eq. 4* approximates the half-angle rule with less than 2% error for gaze eccentricities ε ≤ 16°. In this range, the angular eye velocity depends on current gaze direction approximately as **Ω**_{eye} = **ω̃** tan(ε)/[1 + tan^{2}(ε/2)]·[tan(ε/2), sin(ψ), cos(ψ)], where ψ is the angle and **ω̃** = dψ/d*t* is the velocity of the target along the circular path (see Fig. 1; for computational details see appendix a). The angular velocity tilts out of Listing's plane by half the angle of gaze eccentricity ε, i.e., ε/2 = tan^{−1}[(**Ω**_{eye})_{x}/‖(**Ω**_{eye})_{y,z}‖], and its projection onto the *y-z* plane continuously changes orientation with **ω̃** = dψ/d*t*. Experimentally, we found that this prediction is accurate only up to an additional eye position-independent torsional term δ**Ω** = **ω̃**_{target}(*f*_{x}, 0, 0) that can capture the observed violation of Listing's law during circular pursuit. We refer to this extension of *Eq. 4* as the extended generic Listing model. The rationale for this extension is both theoretical (see below, *Compensation of torsion by ocular counterroll*) and experimental (see results).
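The Listing constraint of *Eq. 2* can be illustrated with a short numeric sketch. This is our own illustration (variable names are ours): it builds **f** = *n̂* + λ*ĝ* for circular-pursuit geometry, chooses λ so that [**f**]_{x} = 0, and recovers the eccentricity dependence λ = −1/cos(ε):

```python
import numpy as np

def listing_direction(n_hat, g_hat):
    """Combine target rotation axis n and gaze direction g per f = n + lambda*g,
    choosing lambda so that the forward (x) component of f vanishes."""
    lam = -n_hat[0] / g_hat[0]          # enforces [f]_x = 0
    f = n_hat + lam * g_hat
    return f, lam

# Circular pursuit: the target rotates about the forward x-axis while gaze
# sits at eccentricity eps on the circle (angle psi along the path).
eps, psi = np.deg2rad(15.0), np.deg2rad(30.0)
n_hat = np.array([1.0, 0.0, 0.0])       # estimated target rotation axis
g_hat = np.array([np.cos(eps),
                  np.sin(eps) * np.sin(psi),
                  np.sin(eps) * np.cos(psi)])   # current gaze direction
f, lam = listing_direction(n_hat, g_hat)
# f lies in Listing's (y-z) plane, and lam = -1/cos(eps): the counterroll
# ratio grows in magnitude with eccentricity, as tested in the RESULTS.
```

With *n̂* along *x*, the resulting **f** = (0, −tan ε sin ψ, −tan ε cos ψ) stays in the frontal plane for every path angle ψ.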

#### Listing law-based slow phase-related angular eye velocity.

We also used the extended model [*Eq. 4* plus torsion term δ**Ω** = **ω̃**_{target}(*f*_{x}, 0, 0)] to fit the circular pursuit responses. To estimate the magnitude **ω̃**_{target}, we used the equation of Tchelidze and Hess (2007), fitting sums of sinusoids with amplitudes *a*_{μk} and phase angles α_{μk} (*Eqs. 5* and *6*; μ = tor, ver, and hor; *k* = 1, 2, 3). The directional vector **f** = (*f*_{x}, *f*_{y}, *f*_{z}) was fitted componentwise with a similar sum of sinusoids. The second and higher harmonics of these fits were very small and could therefore be neglected, except for the second harmonics of torsional eye position and velocity (see results). The frequency parameter (ν) was set equal to the stimulus frequency. We determined the coefficient of determination of these fits by computing the generalized *R*^{2} of the (extended) model vs. a reduced model with *a*_{μk} = 0 and α_{μk} = 0 (μ = tor, ver, and hor; *k* = 1, 2, 3) using the formula *R*^{2} = 1 − RSS(full model)/RSS(reduced model), where RSS is the residual sum of squares (for details see Anderson-Sprecher 1994).
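The generalized *R*^{2} comparison can be sketched as follows. This is an illustrative reconstruction on synthetic data: for brevity the sinusoid amplitudes are obtained by linear least squares on a sine/cosine basis rather than by Levenberg-Marquardt, and all names are our own:

```python
import numpy as np

def sinusoid_basis(t, nu, n_harmonics=3):
    """Design matrix: constant plus sines/cosines up to the n-th harmonic of nu."""
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols += [np.sin(2 * np.pi * k * nu * t), np.cos(2 * np.pi * k * nu * t)]
    return np.stack(cols, axis=1)

def generalized_r2(y, y_full, y_reduced):
    """R^2 = 1 - RSS(full)/RSS(reduced)  (Anderson-Sprecher 1994)."""
    rss_full = np.sum((y - y_full) ** 2)
    rss_reduced = np.sum((y - y_reduced) ** 2)
    return 1.0 - rss_full / rss_reduced

rng = np.random.default_rng(0)
nu = 0.1                                        # stimulus frequency (Hz)
t = np.arange(0.0, 30.0, 0.01)                  # 3 cycles, 100-Hz sampling
y = (2.0 * np.sin(2 * np.pi * nu * t)           # fundamental
     + 0.3 * np.sin(2 * np.pi * 2 * nu * t + 0.5)   # small 2nd harmonic
     + 0.1 * rng.standard_normal(t.size))       # measurement noise
X = sinusoid_basis(t, nu)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_full = X @ beta
y_reduced = np.full_like(y, y.mean())           # reduced model: all a_k = 0
r2 = generalized_r2(y, y_full, y_reduced)       # close to 1 for this fit
```

The reduced model with all amplitudes zero degenerates to the mean level, so the statistic reduces to the familiar *R*^{2} in this linear case.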

So far, no use has been made of *Eq. 2*, which formulates a necessary condition for Listing's law whenever the angular velocity of the target has a nonvanishing *x*-component. To check the prediction that the counterrolling velocity increases as a function of target eccentricity ε, we estimated the counterroll velocity ratio λ by solving the quadratic equation ‖*n̂*‖^{2} = ‖**f** − λ*ĝ*‖^{2} = 1, which leads to λ_{±} = **f**·*ĝ* ± [(**f**·*ĝ*)^{2} − ‖**f**‖^{2} + 1]^{1/2} and *n̂*_{±} = **f** − λ_{±}*ĝ*. We used this relation to test the prediction of *Eq. 2* that λ increases with target eccentricity ε as 1/cos(ε) for circular tracking. For this we fitted the curve λ = *a* + *b*/cos(ε) to the λ data obtained from fitting the extended generic Listing model to angular eye position and velocity data and computed *R*^{2} by comparing this model with a reduced model with *b* = 0 (Anderson-Sprecher 1994).
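The inversion of *Eq. 2* for λ can be sketched numerically. In this illustration (values hypothetical, names our own) we forward-construct **f** = *n̂* + λ*ĝ*, then recover λ from **f** and *ĝ* via the quadratic:

```python
import numpy as np

def counterroll_ratio(f, g_hat):
    """Solve ||f - lambda*g||^2 = 1 for lambda (inversion of Eq. 2).

    Returns both roots lambda_+/- and the corresponding axes n_+/-,
    using the discriminant (f.g)^2 - ||f||^2 + 1 of the quadratic.
    """
    fg = np.dot(f, g_hat)
    disc = fg ** 2 - np.dot(f, f) + 1.0
    lams = np.array([fg + np.sqrt(disc), fg - np.sqrt(disc)])
    ns = [f - lam * g_hat for lam in lams]      # n_+/- = f - lambda_+/- * g
    return lams, ns

eps = np.deg2rad(10.0)                          # target eccentricity
g_hat = np.array([np.cos(eps), 0.0, np.sin(eps)])
n_hat = np.array([1.0, 0.0, 0.0])
lam_true = -1.0 / np.cos(eps)                   # predicted counterroll ratio
f = n_hat + lam_true * g_hat                    # Eq. 2, forward direction
lams, ns = counterroll_ratio(f, g_hat)
# One root recovers lam_true; both n_+/- come out as unit vectors.
```

For this geometry the discriminant simplifies to cos²(ε), so the two roots are λ and λ + 2cos(ε); the sign of the physically relevant root matches the negative λ values observed in the RESULTS.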

#### Compensation of torsion by ocular counterroll.

To comply with Listing's law, the eye must compensate the torsional velocity of the target by counterroll. In this report we provide a geometrical argument showing that this compensation cannot be fully achieved by a smooth (i.e., nonsaccadic) motion of the eye (for a quantitative argument see appendix b and c). We use this geometric argument in the following to backcompute the 3-D angular motion of the eye based on the experimentally measured eye angular position and velocity.

To estimate the required counterroll of the eye during circular or curvilinear tracking, consider three mutually orthogonal vectors labeled *ĝ*, *ĥ*, and *v̂* associated with the gaze line. In the initial orientation (labeled with subscript 0), these vectors coincide with the *x*-, *y*-, and *z*-axes, respectively. For tracking a target that moves in front of the subject along a circle, the eye needs to move first from primary position A straight up to intercept the target at B and then rotate about the *x*-axis, which transports the gaze line from B toward C along a circular path (Fig. 2, *A* and *B*). By this maneuver, the angular orientation of the eye changes as indicated by the respective orientations of the *ĝ*, *ĥ*, and *v̂* frames in positions A, B, and C (see Fig. 2, *A* and *B*; in primary position A the vectors *ĝ*, *ĥ*, and *v̂* coincide with the directions of the *x*-, *y*-, and *z*-coordinate axes). To keep the vertical retinal meridian invariant in space, the eye simultaneously counterrolls about the gaze line (Fig. 2, *B* and *C*). By comparing the orientations of the vectors *v̂* in position B and *v̂*′ in position C, it is obvious that the torsion of the globe, acquired by moving gaze from B to C, cannot be fully compensated by counterroll about the gaze line. This is possible only in the four cardinal positions where ψ = *k*π/2 (*k* = 0, 1, 2, and 3). Thus, if the eye were to saccade back from C, or from any other noncardinal position, to primary position A after perfect smooth tracking along the circular path, Donders' law would be violated. For a quantitative proof, see appendix b and c.
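The geometric argument can be checked numerically. The sketch below is our own illustration (not taken from the appendixes): it transports the frame vector *v̂* once around the closed circular gaze path using only minimal rotations between successive gaze directions, and the frame fails to return to itself by the spherical holonomy angle 2π(1 − cos ε), a standard result for parallel transport on the sphere:

```python
import numpy as np

def rotate(axis, angle, v):
    """Rodrigues' rotation of vector v about a unit axis."""
    return (v * np.cos(angle) + np.cross(axis, v) * np.sin(angle)
            + axis * np.dot(axis, v) * (1.0 - np.cos(angle)))

eps = np.deg2rad(15.0)                        # gaze eccentricity of the circle
N = 3600                                      # discretization of the closed path
psi = np.linspace(0.0, 2.0 * np.pi, N + 1)
# Gaze directions on a cone around the x-axis (one full revolution).
g = np.stack([np.full_like(psi, np.cos(eps)),
              np.sin(eps) * np.sin(psi),
              np.sin(eps) * np.cos(psi)], axis=1)
v0 = np.array([-np.sin(eps), 0.0, np.cos(eps)])  # "vertical" frame vector at start
v = v0.copy()
for k in range(N):                            # transport v by minimal rotations
    axis = np.cross(g[k], g[k + 1])
    s = np.linalg.norm(axis)
    if s > 0.0:
        v = rotate(axis / s, np.arcsin(s), v)
# Net frame rotation about the (returned) gaze line after the closed loop:
theta = np.arctan2(np.dot(g[0], np.cross(v0, v)), np.dot(v0, v))
# |theta| equals 2*pi*(1 - cos(eps)) > 0: the frame does not close up,
# so perfectly smooth tracking around the circle must violate Donders' law.
```

Each step rotates the frame only about an axis orthogonal to the gaze line (no roll about gaze itself), yet torsion accumulates over the closed loop; for ε = 15° the residual is about 12°.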

#### Reconstruction of ocular torsion.

We used the three different definitions of angular velocity (SPV, LSPV, and EPV) to backcompute the actual rotation of the eye. Since the angular velocity represents at each moment the direction of rotation of the eye, we took this information to reconstruct the motion of gaze in space. By normalizing the angular velocity, we obtained the direction of the rotation axis *f̂* = **Ω**_{eye}/‖**Ω**_{eye}‖ at any moment in time. Let us denote the time samples by superscripts, i.e., *f̂*^{k} is the rotation axis computed from the data sample *k* that was recorded at the time *t* = *k*·Δ*t* (Δ*t* = sampling interval). From this, we calculated the rotation *R*^{k} for each sample *k*, which we specify by the axis *f̂*^{k} and angular increment Δψ, summarized as *R*^{k} = *R*(*f̂*^{k}, Δψ). The time evolution of gaze direction {*ĝ*^{k}, *k* = 1:*N*} and its orthogonal vertical component {*v̂*^{k}, *k* = 1:*N*} is thus obtained by applying *R*^{k} at each step, starting from *ĝ*^{0} = *ĝ*(*t* = 0), i.e., gaze orientation at onset of tracking, and *v̂*^{0} = *v̂*(*t* = 0). Since the vectors *v̂*^{k}, *ĝ*^{k}, and *ĥ*^{k} = *v̂*^{k} × *ĝ*^{k} form an orthogonal triple of unit vectors, it is straightforward to compute an estimate of 3-D eye position *Ẽ*^{k} = (*Ẽ*_{tor}^{k}, *Ẽ*_{ver}^{k}, *Ẽ*_{hor}^{k}) from them. Using this approach, we were particularly interested in comparing the torsional component *Ẽ*_{tor} based on the two different estimates of eye angular velocity, the eye position-related (EPV) and the slow-phase angular eye velocity (SPV). Finally, we also computed *Ẽ*^{k} based on a pure slow phase-based counterroll strategy with no intervening quick phases. The rationale for these procedures was to determine the influence of quick phases on the ocular torsion of the eye during curvilinear pursuit. For this, we postulated that the plane spanned by the current gaze direction and its orthogonal vertical component always remained vertical, as in the initial orientation at onset of tracking. Using at each step in time the counterrotation *R̃*^{k} = *R*(*ĝ*^{k}, ρ^{k}) about the momentary direction of gaze (*ĝ*^{k}), we computed the corresponding sequence of orthogonal vertical gaze directions, starting from *û*^{0} = *û*(*t* = 0), with counterroll angle ρ^{k} = −tan^{−1}(‖*c*_{x}^{k} × *c*_{z}^{k}‖/*c*_{x}^{k}·*c*_{z}^{k}) and *c*_{l}^{k} = *ĝ*^{k} × *ê*_{l} (*l* = *x*, *z*). From the vectors *û*^{k}, *ĝ*^{k}, and *l̂*^{k} = *û*^{k} × *ĝ*^{k}, which also formed an orthogonal triple of unit vectors, we obtained a different estimate of 3-D eye positions, denoted *Ẽ̃*^{k}, where we were particularly interested in comparing the torsional component with *Ẽ*_{tor}^{k} from the eye position vector *Ẽ*^{k}.

To evaluate the accuracy of the reconstructions, we computed the normalized root mean square error between the reconstructed and the actual least-squares fitted eye positions, as well as *R*^{2} values based on the residual sums of squares obtained from the fitted and reconstructed rotations by reverse computation.
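The back-computation by composing sampled fixed-axis rotations *R*^{k} = *R*(*f̂*^{k}, Δψ) can be sketched as follows. This is our own illustrative implementation (names are ours), verified here on a constant rotation rather than on recorded data:

```python
import numpy as np

def rot(axis, angle):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def reconstruct_frames(Omega, dt, g0, v0):
    """Propagate gaze g and its orthogonal vertical v through sampled angular
    velocity: R^k = R(f^k, dpsi^k) with f^k = Omega^k/||Omega^k|| and
    dpsi^k = ||Omega^k|| * dt."""
    g, v = g0.copy(), v0.copy()
    gs, vs = [g0], [v0]
    for w in Omega:
        speed = np.linalg.norm(w)
        if speed > 0.0:
            R = rot(w / speed, speed * dt)
            g, v = R @ g, R @ v
        gs.append(g)
        vs.append(v)
    return np.array(gs), np.array(vs)

# Check: constant rotation about z at 10 deg/s for ~1 s of samples.
dt = 1.0 / 833.3
Omega = np.tile(np.deg2rad([0.0, 0.0, 10.0]), (int(1.0 / dt), 1))
g0, v0 = np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])
gs, vs = reconstruct_frames(Omega, dt, g0, v0)
# Gaze swings ~10 deg about z; v, parallel to the axis, stays put.
```

From the propagated orthogonal triple (*v̂*, *ĝ*, *ĥ* = *v̂* × *ĝ*) one can then read off the 3-D eye position estimate *Ẽ*^{k} at each sample, as described above.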

## RESULTS

### Basic Observations

Steady-state tracking eye movements consisted of smooth pursuit segments lasting about 100–300 ms separated by small quick phases with a torsional component. This pattern was observed most clearly during tracking of targets describing an elliptic path, as illustrated in Fig. 3, but also during the simpler circular tracking. In the following, we first characterize the global properties of these quick phases before addressing the more intricate analyses of their interactions with slow-phase segments. Since we did not find significant differences for circular and elliptic pursuit trials at the mentioned moderate ellipse eccentricity (*e* = 0.66), we focused our analyses on circular pursuit trials.

We analyzed quick phases during steady-state pursuit in two ways. First, we determined the frequency distribution of the amplitudes in terms of horizontal, vertical, and torsional components. We found that these amplitude components were mostly confined to within ±1.5° relative to the sinusoidal least-squares fit of the tracking response. Only rarely did quick phases show larger components in the horizontal or vertical direction. Second, we analyzed the main-sequence properties of these rapid eye movements. We found that the peak velocity was linearly correlated with the amplitude of quick phases (Fig. 4, *A* and *B*). For example, linear fits of quick-phase peak velocities during circular pursuit at 15° eccentricity and a frequency of 0.067 Hz (*N* = 155, distributed across 6 clockwise and 6 counterclockwise cycles) yielded offset *a* = 0.3 ± 0.1°/s and slope *b* = 16 ± 0.2 s^{−1} (*R*^{2} = 0.83) for torsional components, *a* = −0.5 ± 0.1°/s and *b* = 31 ± 0.1 s^{−1} (*R*^{2} = 0.97) for vertical components, and *a* = −0.6 ± 0.1°/s and *b* = 26 ± 0.2 s^{−1} (*R*^{2} = 0.84) for horizontal components. Fitting these quick phases separately for clockwise and counterclockwise pursuit delivered significantly different values only for the torsional components (clockwise: *a* = 1.9 ± 0.1°/s and *b* = 18 ± 0.4 s^{−1} with *R*^{2} = 0.88; counterclockwise: *a* = −2.3 ± 0.2°/s and *b* = 22 ± 0.5 s^{−1} with *R*^{2} = 0.90). The duration of these quick phases was approximately constant across the small amplitude range (Fig. 4*C*), whereas the peak velocity strongly decreased with increasing duration (Fig. 4*D*).

### Impact of Quick Phases on Ocular Kinematics

To evaluate the impact of the quick phases during steady-state pursuit trials, particularly with respect to the torsional component, we analyzed the ocular kinematics as follows. First, we estimated angular eye velocity based on sinusoidal eye position fits up to second order without removing the small steplike position modulations due to quick phases. These fits perfectly matched the eye position modulation (Fig. 5, *top* panels, dashed lines superimposed on eye position traces), and the angular velocity, EPV, derived from these fits also provided good estimates of the horizontal and vertical slow-phase angular velocity, except for the torsional slow-phase angular velocity (Fig. 5, *bottom* panels, dashed lines superimposed on slow-phase angular velocity traces). Thus the angular velocity derived from the torsional eye position fit provided a relatively poor description of the admittedly rather small modulation of torsional slow-phase angular velocity. The coefficients of determination (*R*^{2}) summarized in Table 1 (sinusoidal fits) emphasize this observation: the position fits predicted slow-phase angular velocity with average *R*^{2} values >0.85 in the vertical and horizontal directions compared with *R*^{2} values <0.15 in the torsional direction. In a second approach, we fitted the data using the extended generic Listing model, which included an eye position-independent torsional term (labeled δ**Ω**_{tor} = **ω̃**_{target}*f*_{x} in materials and methods) to account for a torsional component violating Listing's law (*Eqs. 4–6* in materials and methods). Note that the eye position-dependent torsional component of the (not extended) generic Listing model predicts no modulation at higher harmonics unless such modulation were present in the horizontal and/or vertical component. We found that the extended generic Listing model provided superior fits of the torsional responses, although there was no significant nonlinearity in the fits of the vertical and/or horizontal components, as documented below (see solid lines in Fig. 5, *bottom* panels). A comparison of the coefficients of determination showed significant improvements of the *R*^{2} values when fitting the extended generic Listing model compared with the sinusoidal fits, which also included a second harmonic (*P* < 0.001, *t*-test). This was true in all animals (Table 1). In the remaining parts of this study, we further analyzed the torsional nonlinearity based on the fits provided by the extended generic Listing model.

The potential impact of the torsional nonlinearity can best be analyzed by characterizing the relative power of the second and higher harmonic contributions in terms of that at the fundamental frequency. From the sinusoidal fits to eye position, we computed the squared ratios (*a*_{μi}/*a*_{μ1})^{2} of the Fourier components *a*_{μi} vs. *a*_{μ1} at the fundamental frequency for each eye movement component (see materials and methods, *Eq. 6*). This approach takes into account that the total power is proportional to the sum of squares of significant Fourier components. Since during circular pursuit the input signal is a pure sine at the frequency of target motion, these ratios are a measure of the amount of harmonic distortion at each frequency. We found that the power contributed by the second harmonic to the horizontal and vertical response components amounted to less than 1‰ of the power of the fundamental. In contrast, the mean relative power of the second harmonic of torsional angular velocity was around 13% and 19% for clockwise and counterclockwise circular pursuit, respectively (Table 2). At higher Fourier frequencies there was a strong roll-off in power.
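The relative-power measure can be sketched on a synthetic trace. This illustration is our own (a discrete Fourier transform stands in for the paper's least-squares fits); a 36% second-harmonic amplitude gives a relative power of (0.36)² ≈ 0.13, comparable to the torsional values reported:

```python
import numpy as np

def relative_harmonic_power(y, fs, f0, n_harmonics=3):
    """Power of each harmonic of f0 relative to the fundamental, (a_k/a_1)^2,
    with amplitudes a_k estimated from the discrete Fourier transform of y."""
    Y = np.fft.rfft(y) / len(y)
    freqs = np.fft.rfftfreq(len(y), 1.0 / fs)
    amps = [2.0 * np.abs(Y[np.argmin(np.abs(freqs - k * f0))])
            for k in range(1, n_harmonics + 1)]
    return [(a / amps[0]) ** 2 for a in amps]

fs, f0 = 100.0, 0.1                 # sampling rate (Hz), pursuit frequency (Hz)
t = np.arange(3000) / fs            # 30 s = 3 full cycles at 0.1 Hz
y = (1.0 * np.sin(2 * np.pi * f0 * t)
     + 0.36 * np.sin(2 * np.pi * 2 * f0 * t))   # torsion-like distorted trace
ratios = relative_harmonic_power(y, fs, f0)
# ratios[0] = 1 (fundamental); ratios[1] ~ 0.13 (second-harmonic distortion).
```

The record length is chosen as an integer number of stimulus cycles so that each harmonic falls exactly on a Fourier bin and no spectral leakage contaminates the amplitude estimates.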

### Evaluation of the Rotation Axis of the Eye During Steady-State Pursuit

The geometric relation between gaze direction and the estimated target rotation (see materials and methods, *Eq. 2*) can be used to reconstruct the actual rotation of the eyeball (relative to the head). To analyze this motion, we first estimated the ocular rotation vector *f̂* as a function of time by fitting the extended generic Listing model to 3-D eye position and 3-D slow-phase angular velocity, both being an essential input to the model. Second, we computed the rotation of the gaze vector *ĝ* as a function of time from the fits of 3-D eye position. With these vector functions *f̂*(*t*) and *ĝ*(*t*) at hand, we computed the associated parameter λ(*t*) by solving the respective quadratic equation and computed the underlying estimate of the axis of target rotation, represented by the unit vector *n̂*(*t*). Plots in the time domain showed that the gaze vector oscillated in the horizontal (*z*) and vertical (*y*) directions with very little jitter across cycles (Fig. 6*A*). The estimated target rotation axis pointed in the direction of the *x*-axis, exhibiting only small modulations in the *y*- and *z*-directions during a rotation cycle (Fig. 6*B*). Most interestingly, the angular velocity vector of the eye oscillated in the *y-z* plane with some jitter around zero in the *x* direction (Fig. 6*C*). The lambda parameter (*Eq. 2*), which relates the gaze vector to the estimated rotation axis, modulated at negative values with magnitude exceeding 1 (Fig. 6*D*). The minus sign reflects the fact that counterrotation of the eye opposed the estimated angular target velocity in the *x* direction, whereas the magnitude >1 accounted for the target eccentricity (see further analysis of this point below). In the spatial domain, the angular velocity vector of the eye traveled around a circle in the subject's frontal plane (*z-y* plane, Fig. 7*A*).
Its projections onto the yaw and pitch plane always showed some wobble of the circular trajectory due to the small oscillations in the *x* direction (Fig. 7, *B* and *C*). This wobble reflected the deviation of circular pursuit responses from ideal Listing's law behavior. It is accounted for by the term δ**Ω̃** = **ω̃**_{target}*f*_{x} in the extended generic Listing model. As mentioned, this term significantly improved the fits of the small torsional modulations. Despite the torsional wobble of the angular velocity vector, the gaze vector showed virtually no distortion of its circular trajectory in the target plane. Similarly, the estimated rotation axis of the target motion was often surprisingly stable in space. Variability in the *y* and *z* directions was compensated for by corresponding modulations of λ. To further corroborate the validity of our approach, we tested the dependence of λ on eccentricity as explained in the next paragraph.

### Estimating Ocular Counter Rotation

We evaluated the dependence of λ on gaze eccentricity for circular pursuit at gaze eccentricities of ∼5°, ∼10°, and ∼15°. According to *Eq. 2* in materials and methods, λ is expected to lie on the curve 1/cos(ε), where ε is the target eccentricity. A geometric explanation of this effect is illustrated in Fig. 8*B*. Describing the target motion as a circle on a unit sphere (with the observer's eye in the center), the eccentricity ε is related to the radius *r* of the target circle by ε = arcsin(*r*). This relation is exact if the observer's frontal plane parallels the plane of target motion. Otherwise, the target's eccentricity will vary about an average value during circular tracking. Indeed, we found that λ varied along each cycle, suggesting that ε was not perfectly constant throughout one tracking cycle (Fig. 8*A*). To verify the predicted 1/cos(ε) behavior, we fitted the curve λ = *a* + *b*/cos(ε) to the λ data, which we obtained from fitting the extended generic Listing model to the angular eye position and velocity data. Although the predicted increase of |λ| is small in the tested range of eccentricities, we found an excellent correspondence of the experimental data with the prediction (Fig. 8*A*). *R*^{2} values and fitted parameters *a* and *b* from experiments in three animals are summarized in Table 3.
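A minimal sketch of this fit, assuming scipy's `curve_fit` and using illustrative λ values close to the ideal −1/cos(ε) (not measured data):

```python
import numpy as np
from scipy.optimize import curve_fit

def lam_model(eps_deg, a, b):
    """Predicted eccentricity dependence: lambda = a + b/cos(eps)."""
    return a + b / np.cos(np.radians(eps_deg))

# illustrative lambda estimates near -1/cos(eps) at the three tested
# eccentricities (hypothetical numbers, not data from the study)
eps = np.array([5.0, 10.0, 15.0])
lam = np.array([-1.004, -1.015, -1.035])

(a_fit, b_fit), _ = curve_fit(lam_model, eps, lam, p0=(0.0, -1.0))
```

For the ideal relation λ = −1/cos(ε), the fit returns *a* ≈ 0 and *b* ≈ −1.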

### Reconstruction of 3-D Ocular Rotation From Steady-State Pursuit Data

Having established the relation between target motion and ocular rotation in terms of how the brain might encode 3-D target angular velocity (*Eqs. 1–4* in materials and methods), we finally tackled the question of whether it is possible to predict the angular eye position, including its small but consistent second harmonic modulation in torsion, from the angular velocity observed during circular pursuit. Motivated by the observation that quick phases appear to be an integral part of steady-state pursuit in primates, we reconstructed the ocular rotation based on two different ways of estimating the angular velocity of the eye (see SPV vs. EPV in materials and methods). In the first approach, we estimated angular eye velocity by fitting slow-phase angular velocity obtained from eye position after removing all quick phases (SPV) or, in a variant, by using the extended generic Listing model (LSPV, based on fits of eye position and slow-phase eye velocity). Alternatively, we estimated angular eye velocity based on the best sinusoidal fits of the angular eye position traces (EPV), including the small quick phases (see Fig. 3, *A* and *B*). We call this alternative eye position-related angular velocity. Using the axis-angle method for calculating rotations, we then took the estimated angular eye velocity to compute the instantaneous direction of the eye's rotation axis and to reconstruct the rotation of the eye sample by sample by reverse computation. We found that the slow phase-related angular velocity (both variants) failed to reproduce the experimentally observed torsional eye position modulation, despite the fact that it could predict the observed vertical and horizontal eye position modulation. More specifically, although torsional eye position initially changed in the correct direction (compared with the actual data), torsion soon accumulated in the same direction as the tracking (Fig. 9*B*).
This meant that in these simulations torsional eye position ran off unbounded from cycle to cycle, since eye position at the end of one cycle equaled the initial position of the following cycle. In these simulations, the normalized root mean square error was 2.7 ± 0.4 for torsion, 0.07 ± 0.02 for vertical, and 0.06 ± 0.009 for horizontal eye position in the clockwise direction and 6.5 ± 1.5 for torsion, 0.05 ± 0.02 for vertical, and 0.03 ± 0.01 for horizontal eye position in the counterclockwise direction (data shown in Fig. 9*B*). In contrast, with the use of the angular eye position-related angular velocity, the reconstructed rotation of the eye consistently reproduced not only the vertical and horizontal eye position modulation but also the observed modulation of torsional eye position (Fig. 9*C*). Most importantly, torsional eye position did not run off, because at the end of each cycle it returned to the value at which it had started. The normalized root mean square error was 0.11 ± 0.02 for torsion, 0.01 ± 0.003 for vertical, and 0.02 ± 0.004 for horizontal eye position in the clockwise direction and 0.10 ± 0.04 for torsion, 0.008 ± 0.006 for vertical, and 0.01 ± 0.003 for horizontal eye position in the counterclockwise direction (same data as in Fig. 9*C*). Notice that although the normalized root mean square error for the reconstruction of torsion was roughly a factor of 10 larger than that for vertical and horizontal eye position, it was not significantly larger in absolute terms, because the peak-to-peak modulation of torsional eye position was about a factor of 10 smaller than that of vertical or horizontal eye position. We also computed the average coefficient of determination for the reconstructed ocular rotation, as summarized in Table 4.
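The reverse computation can be sketched as follows, integrating angular velocity sample by sample with the axis-angle method. This is a minimal illustration using scipy's rotation class, not the authors' implementation; the constant torsional velocity input is purely illustrative:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def reconstruct_rotation(omega, dt):
    """Reverse computation: integrate head-fixed angular velocity
    sample by sample with the axis-angle method.

    omega : (N, 3) angular velocity samples in rad/s
    dt    : sample interval in s
    Returns the (N, 3) eye position trajectory as rotation vectors (rad).
    """
    r = R.identity()
    out = np.empty_like(omega)
    for i, w in enumerate(omega):
        # incremental rotation about the instantaneous axis w/|w|
        r = R.from_rotvec(w * dt) * r
        out[i] = r.as_rotvec()
    return out

# sketch: pure torsional rotation about x at 10 deg/s for 1 s
fs = 1000.0
omega = np.tile([np.radians(10.0), 0.0, 0.0], (int(fs), 1))
traj = reconstruct_rotation(omega, 1.0 / fs)
```

Because each cycle starts from the previous cycle's end position, any uncorrected torsional bias in the velocity input accumulates across cycles in exactly the run-off manner described above.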

To compare these results with a smooth counterrolling strategy of the eye (i.e., without quick phases), we also reconstructed the rotation of the eye by using the angular velocity to reconstruct the rotation of the gaze vector and the principle of minimal torsion to compute the torsional position modulation as described in materials and methods. This procedure yielded the predicted modulation of ocular torsion at the second harmonic of the spatial tracking frequency (see appendix b, *Eq. B1* and Fig. A3). The peak amplitudes of this modulation were comparable to the experimentally observed peak amplitudes (Fig. 9, *A* and *C*).

## DISCUSSION

To track target motion in space, the oculomotor system has to be able to generate motion patterns that are geometrically more complex than the relatively simple fixed-axis rotations underlying the vestibuloocular reflex movements or the ballistic movements of the eye during visually guided saccades. A major challenge in the motor control of smooth tracking movements lies in the restrictions that prevent the eyes from redirecting gaze by the most efficient, shortest-path rotation. This difficulty arises in particular during curvilinear tracking, where the requirement of controlling the eye's torsional orientation comes into conflict with the requirement of rotating the eye in the most efficient way to smoothly track the target. We have shown that this dual task requires that the rotation axis of the eye continuously changes its orientation relative to the orbit. This particular motion pattern results from two simultaneous rotations of the eye, one to keep gaze on the moving target and another one to control ocular orientation by counterrolling the eye about the instantaneous gaze direction. Although this particular motion strategy is needed to prevent accumulation of ocular torsion, we have shown that it also implies that Donders' law is no longer automatically maintained. We suggest that the intervening small quick phases serve to minimize the inevitable deviations from this fundamental law during curvilinear smooth tracking.

### Extended Generic Listing Model

It was noted almost two decades ago that smooth pursuit is governed by Listing's law (Haslwanter et al. 1991; Tweed et al. 1992). These studies reported that Listing's law was followed with a precision of about ±1.5° in humans and less than ±1° in (rhesus) monkeys (where slippage of the search coil can be excluded). Compared with the tested oculomotor range of about ±50° and 40° in humans and monkeys, respectively, the conclusion that smooth pursuit obeys rather than violates Listing's law (see Westheimer and McKee 1973 and the pertinent discussions in Haslwanter et al. 1991 and Tweed et al. 1992) seems well justified. In the present work, we focused our attention on the situation that poses the greatest challenge to Donders' law: the generation of curvilinear smooth pursuit. Without knowing the details about how the brain might encode 3-D target motion in space, we start from the simple assumption that this process depends on estimating and predicting the momentary target angular velocity and position relative to the observer's eye (for saccades see Ghasia et al. 2008; Mays and Sparks 1980; for a review see Crawford et al. 2003). With this assumption in mind, we used 3-D rotation kinematics and a principle akin to Helmholtz's principle of minimal torsion of the visual space as guiding principles (Helmholtz 1867; Hepp et al. 1997). Assuming smooth target tracking, it is possible to express the kinematics of the eye in terms of the 3-D angular velocity and position of the target. For this one has to assume that the motor commands to the ocular plant implement, at least on average, a counterroll condition similar to the one described by *Eq. 2* to maintain control of the ocular attitude. This implies that an observer who tries to track a target along a circular path must transform the estimated motion into a sequence of rotations of the eye to avoid accumulation of ocular torsion (see Figs. 2 and 10).
The mentioned guiding principles lead to a generic expression of Listing's law that accurately implements the half-angle rule (Helmholtz 1867) for straight-line pursuit but also shows that it can only be approximately followed during circular or curvilinear pursuit (<0.5° deviation of the predicted tilt of angular eye velocity for gaze eccentricities ≤20°). Indeed, we found that the generic Listing model cannot explain the finer experimental details of curvilinear tracking responses: during circular tracking, there is in fact a significant second harmonic in the modulation of torsional eye position, which is not predicted by the model. Specifically, there is no such second harmonic modulation of horizontal and vertical eye position, excluding the possibility that the observed second harmonic in ocular torsion reflects the multiplicative interaction of velocity and position signals predicted by the term **ω̃**_{target}(*f*_{z}*Ẽ*_{ver} − *f*_{y}*Ẽ*_{hor}) in the generic Listing model (*Eq. 4*). Straightforward calculations show that this term does not generate a second harmonic if none of its constituents includes such a harmonic. Consistent with this observation, the ad hoc addition of an extra term to the generic model, free to modulate at the second harmonic independently of horizontal and vertical eye position, led to a significant improvement of the model fits in the torsional direction. This observation suggests that smooth tracking of curvilinear targets is not possible without violation of Listing's law. An independent kinematic analysis based on the counterroll condition (*Eq. 2*), without making use of any of the approximations leading to *Eq. 4*, supports this finding (see appendix b and c). For reasons that will become clear in the following paragraph, the required additional torsion term, providing only a small but nevertheless significant correction of torsional angular velocity, most likely reflects the adjustments caused by quick phases to maintain ocular orientation during tracking.
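The half-angle rule itself can be verified numerically under the assumption that eye positions obey Listing's law (rotation vectors confined to the *y-z* plane); this sketch uses quaternion-based rotations in place of the paper's formalism:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def half_angle_tilt(ecc_deg, step_deg=0.01):
    """Tilt (deg) of angular eye velocity out of Listing's plane for a small
    horizontal movement at vertical eccentricity ecc_deg, with eye positions
    obeying Listing's law (zero torsional rotation-vector component)."""
    e = np.radians(ecc_deg)
    dh = np.radians(step_deg)
    q1 = R.from_rotvec([0.0, e, -dh / 2])   # zero-torsion eye positions
    q2 = R.from_rotvec([0.0, e, +dh / 2])
    w = (q2 * q1.inv()).as_rotvec()         # head-fixed velocity increment
    return np.degrees(np.arctan2(w[0], np.hypot(w[1], w[2])))

tilt = half_angle_tilt(20.0)   # tilt of ~10 deg, i.e., half of 20 deg
```

The angular velocity axis tilts out of Listing's plane by exactly half the gaze eccentricity, as the rule states.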

### Functional Role of Quick Phases During Steady-State Pursuit

Since the experiments of Rashbass (1961), saccades or rapid eye movements have been recognized as an important part of pursuit eye movements. During steady-state pursuit they are in fact an integral part, alternating with portions of smooth tracking (Collewijn and Tamminga 1984; De'Sperati and Viviani 1997; Kettner et al. 1996). The typical role of rapid eye movements during tracking is to correct motor errors in eye position, keeping the fovea, which is far more sensitive to image motion than the periphery, close to the target (Dubois and Collewijn 1979). The rapid eye movement events observed in this study had amplitudes mostly below 1.5°. Their function is most likely not to catch up with the target position, since they have a clear-cut repositioning phase in the torsional direction. We therefore refer to them as quick phases. The torsional onset component of these quick phases typically drives the eye across the torsional equilibrium position (defined as the average zero torsion position), either from clockwise to counterclockwise or vice versa. During the subsequent slow phase, 3-D eye position drifts back toward the previous torsional offset position (see *insets* in Fig. 3). Thus, altogether, angular eye velocity never quite matches target velocity, confirming earlier observations from other studies performed in 2-D (De'Sperati and Viviani 1997; Engel et al. 1999; Mrotek et al. 2006).

What sort of motor error drives these small quick phases during curvilinear pursuit? As mentioned, it is unlikely that a horizontal and/or vertical position error causes these events in the sense of catch-up saccades. Our animals were well trained to track targets during highly predictable paradigms, so there is little reason to assume that velocity feedback signals were not sufficient to control pursuit (Engel et al. 1999; Lisberger and Ferrara 1997; Mrotek et al. 2006). There is, however, an ongoing debate about the role of position input during smooth pursuit (Noda and Warabi 1982; Pola and Wyatt 1980), which might reflect the fact that the system's control of eye position and velocity is nonlinearly interconnected. At the level of 2-D oculomotor control, there seems to be growing evidence for such interaction (Blohm et al. 2005; Carl and Gellman 1987; Morris and Lisberger 1987). From the perspective of 3-D control, the challenge for the oculomotor system during continuous tracking is twofold: one is to smoothly control gaze position and velocity to keep the retinal target image close to the fovea, and the other is to control 3-D ocular orientation, which involves the holding function of all eye muscles while at the same time gaze is tracking the target. This task is much more complex in the case of circular pursuit than during straight-line pursuit, because it involves the activation of muscles or muscle compartments with mutually orthogonal pulling directions that smoothly change orientation in the roll plane. For the simple paradigm of circular tracking, the geometric analysis reveals that the required counterrolling of the eye is a trigonometric function of horizontal and vertical gaze direction (see Fig. 2, appendix b and c).
Since this constraint cannot be expressed in terms of target velocity alone (e.g., by transforming it into the velocity domain), it cannot be realized inside a simple velocity feedback loop, although velocity feedback represents the major drive during steady-state pursuit (Goldreich et al. 1992; Lisberger et al. 1981; Robinson 1965; Robinson et al. 1986). Rather, it is bound to involve independent position control, simultaneously with or alternating with smooth tracking control. Our results suggest that the system uses a strategy in which smooth tracking alternates with quick phases that correct deviations of ocular torsion.

From a strategic point of view, one might wonder why the ocular counterroll during pursuit is not fully parametrically controlled in a continuous smooth manner. One of the reasons is likely related to Donders' law, which is not automatically satisfied during smooth curvilinear tracking. The particular motion pattern required during circular tracking, which is a combination of tracking and counterrolling of the eye, undermines the premises on which Donders' law is based (Hepp et al. 1995, 1997). To minimize the inevitable torsion due to the spherical geometry of the eye's configuration space, the system apparently invokes quick-phase control. Another related reason might be that quick-phase control reduces the computational load, since it can take advantage of existing neuronal hardware and might ultimately be more efficient. In fact, we found that the average modulation of torsional eye position only exhibited a fraction of the power at the second harmonic (Table 2) compared with that predicted by a full parametric control strategy (compare Fig. 9*A* and Fig. A3*B*). Furthermore, the peak-to-peak modulation was smaller than the predicted smooth modulation of ±1° at a target eccentricity of 15° (compare Fig. 9, *A* and *C*).

### Role of the Oculomotor Plant

Over the last two decades an increasing number of anatomical and theoretical studies have provided evidence converging on the idea that the half-angle rule of angular eye velocity might originate from an intricate neuromechanical control of the oculomotor plant (Demer 2004; Demer et al. 1995, 2000; Kono et al. 2002a, 2002b; Miller 1989, 2007; Miller et al. 2003; Quaia and Optican 1998; Raphan 1998). The theory was kindled by the observation that fibroelastic sheaths, surrounding the extraocular muscles, could in fact change the pulling direction of eye muscles and mechanically produce the half-angle rule. This implies that both agonist and antagonist muscles change their action plane appropriately as a function of eye position (for a review see Angelaki and Hess 2004). Since not all eye movements, notably not the vestibuloocular eye reflexes, follow the half-angle rule (Misslisch and Hess 2000; Misslisch and Tweed 2001; Misslisch et al. 1994; Tchelidze and Hess 2008), the required coordinated changes of muscle action planes must necessarily be under some neuronal control, as in fact proposed in the form of the so-called active pulley hypothesis (Demer et al. 1997, 2000). In a recent study of smooth pursuit initiation, evidence was presented suggesting that the oculomotor commands do undergo a reference frame transformation, taking ocular torsion into account (Blohm and Lefèvre 2010). This finding supports the notion that the brain has access to 3-D eye position signals coded in space coordinates, independent of whether the oculomotor plant is controlled by 2-D or 3-D motor commands (Blohm and Lefèvre 2010; Ghasia et al. 2008; Green et al. 2007). In recent electrophysiological studies, Klier et al. (2006, 2010) have shown that electrical stimulation of the abducens nerve in subhuman primates generates eye movements whose angular velocity profiles approximately follow the half-angle rule, irrespective of static head orientation or orbital eye position.
Since peripheral stimulation of motor nerves precludes the intervention of central processing, these observations emphasize the importance of the peripheral biomechanics in the implementation of the half-angle rule.

Listing's law requires that the torsional component of the eye's angular velocity at the output is numerically equivalent to a multiplicative interaction of current horizontal and vertical position and velocity signals (half-angle rule; see *Eq. 4*). Although recent evidence suggests that this interaction does not happen at the premotor level (Ghasia and Angelaki 2005), its biomechanical implementation remains obscure. It has been pointed out that such an implementation requires that the same muscles controlling eye velocity must also exert appropriate position control (Crawford and Guitton 1997; Optican and Quaia 1998; Raphan 1998; Smith and Crawford 1998). To illustrate the geometric complexity of the required muscle activation pattern, circular pursuit provides an ideal example. In the first place, one notices that the vectors of eye angular acceleration and velocity geometrically move in different planes, even though in phase quadrature: while the angular velocity tilts by half the angle of gaze eccentricity, as predicted by Ω_{tor} ∼ **ω̃**_{target}(*f*_{z}*Ẽ*_{ver} − *f*_{y}*Ẽ*_{hor}) in *Eq. 4*, the angular acceleration remains approximately confined to the *z-y* plane because dΩ_{tor}/d*t* ≈ 0 while rotating about the *x*-axis (Fig. 10). At the same time, the gaze-holding mechanisms must continuously change orientation to keep ocular torsion minimal and to maintain gaze at constant eccentricity during circular tracking. Altogether, this requires highly coordinated muscle activity patterns, encoding simultaneously the continuously changing tonic and phasic motor command signals by differential activation of phasic and tonic muscle fibers. Although it has long been known that extraocular muscles contain different muscle fiber types (Bach-y-Rita and Ito 1966; Hess and Pilar 1963; Kern 1965; Kono et al. 2002b; Lim et al. 2007; Oh et al. 2001; Scott and Collins 1973) with different functional properties, our knowledge about their actual deployment in eye movement control is unfortunately still rather limited (Anderson et al. 2009; Davis-Lopez de Carrizosa 2011; Dieringer and Precht 1986). Independently of how the details of a neuromechanical implementation of the half-angle rule during pursuit might look, our analysis suggests that circular or curvilinear pursuit requires an intricate dual control of ocular orientation and gaze, involving a close interaction of slow and fast eye movement mechanisms at some premotor level of 3-D oculomotor control.

## GRANTS

This work was supported by the Marie Curie Action EST-“SensoPrim,” Swiss National Science Foundation Grant 31-47287.96, and the Betty and David Koetser Foundation for Brain Research.

## DISCLOSURES

No conflicts of interest, financial or otherwise, are declared by the author(s).

## ACKNOWLEDGMENTS

We thank Carla Bettoni and Urs Scheifele for technical assistance and Dora Angelaki for valuable comments on the manuscript.

## APPENDIX A: LIMITED VALIDITY OF HALF-ANGLE RULE DURING CIRCULAR PURSUIT

Consider tracking a target that moves on a circle centered straight ahead in a frontoparallel plane at constant velocity. To evaluate the generic Listing model (*Eq. 4* in materials and methods), we first compute the underlying premotor eye velocity command (*Eq. 3*) by evaluating the vector **f** with *n̂* = (1, 0, 0) and λ = −1/*ĝ*_{x} (*Eq. 2*). In the polar coordinates ε and ψ (see Fig. A1), the gaze vector writes *ĝ* = (cos(ε), −sin(ε)sin(ψ), sin(ε)cos(ψ)) such that **f** = tan(ε)(0, sin(ψ), −cos(ψ)). With this, we find d**Ẽ**/d*t* = (**ω̃**/2)**f**. To estimate the eye position rotation vector that moves the eye from straight ahead to the target and then along a circular path as illustrated in Fig. 2, we note that the time integral of the estimated eye velocity, ∫d**Ẽ** = −½tan(ε)*ê* with *ê* = (0, cos(ψ), sin(ψ)), is a close approximation of the rotation vector written as **E** = −tan(ε/2)*ê*. For angular target eccentricities ε ≤ 16°, the ratio of tan(ε)/2 to tan(ε/2) is ≤ 1.02. This rotation vector describes every eye position along the circular path as an eye movement about the axis *ê* = (0, cos(ψ), sin(ψ)) through the angle +ε or −ε for ψ varying in the interval [0, π]. With this approximation we can write the generic Listing equation for circular pursuit (*Eq. 4* in materials and methods). First, we show that although the counterroll condition (*Eq. 2*) guarantees no accumulation of torsion during circular tracking across cycles, zero ocular torsion is generally not maintained in tertiary eye positions. Second, we show that these violations of Listing's law are due to violations of Donders' law, which does not hold in general during non-fixed-axis rotations.
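The stated bound on the approximation can be checked with a short computation (a sanity check added here for illustration, not part of the original derivation):

```python
import numpy as np

# ratio of tan(eps)/2 (scaling of the integrated eye velocity) to
# tan(eps/2) (exact rotation-vector scaling) over the stated range
eps = np.radians(np.linspace(0.5, 16.0, 200))
ratio = (np.tan(eps) / 2.0) / np.tan(eps / 2.0)
max_ratio = ratio.max()   # stays near 1.02 at eps = 16 deg
```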

## APPENDIX B: LIMITATIONS OF OCULAR ROLL COMPENSATION

The optimal counterroll motion during curvilinear pursuit can be evaluated as follows. Consider the change in ocular orientation if during circular pursuit the eye would simply rotate about the forward-pointing *x*-axis, say through an angle ψ from position B to C as illustrated in Fig. 2. Using three mutually orthogonal unit vectors *ĝ*, *ĥ*, and *v̂* (defined in materials and methods), this change in orientation is the difference in angular orientation of the frame *ĝ*′, *ĥ*′, *v̂*′ relative to *ĝ*, *ĥ*, *v̂* in the respective positions (see Fig. 2, *A* and *B*). Let *R* = *R*(*ê*, ρ) represent the usual rotation operator with rotation axis *ê* and rotation angle ρ. Applied to the vectors *ĝ*, *ĥ*, and *v̂* with axis *ê* ≡ *ĝ*, we find their new orientation *ĥ*′ = *ĝ*′ × *v̂*′ and *v̂*′ with *v̂*′ = *Rv̂* = sin(ρ)*ĝ* × *v̂* − cos(ρ)*ĝ* × (*ĝ* × *v̂*). Since the vectors *ĝ*, *ĥ*, and *v̂* represent a right-handed orthogonal triple, we can use *ĝ* × *v̂* = −*ĥ* and *ĝ* × (*ĝ* × *v̂*) = −*v̂* to simplify the equation for *v̂*′ to *v̂*′ = −sin(ρ)*ĥ* + cos(ρ)*v̂*. To find the counterroll angle ρ that keeps the vector *v̂*′ in a vertical plane through the new gaze direction *ĝ*′, one has to require that the projection of *v̂*′ onto a vector parallel to the *z*-axis, *ê*_{z}, is minimal (compare *v̂* and *v̂*′ in Fig. 2*C*). Taking the scalar product of *v̂*′ and *ê*_{z}, we find the relation *f*(ρ) = *v̂*′·*ê*_{z} = −sin(ρ)sin(ψ) + cos(ρ)cos(ψ)cos(ε) and evaluate the angles ρ for which the slope ∂*f*(ρ)/∂ρ = 0. We find that for ψ = π/2, π, 3π/2, and 2π, the eye has to counterrotate through ρ = −π/2, −π, −3π/2, and −2π, at which angles *f*(ρ) is minimal [i.e., ∂²*f*(ρ)/∂ρ² = 0]. However, for angles in between, the eye has to counterrotate through slightly larger angles. Specifically, one finds the remaining minima of *f*(ρ) at ρ = −tan^{−1}[tan(ψ)/cos(ε)] for ψ ≠ *k*π/2, *k* = 1, 2, and 3. Thus during circular pursuit, ocular counterroll does not completely compensate the ocular roll for rotation angles in between the four quadrants (i.e., for ψ ≠ *k*π/2, *k* = 1, 2, and 3). If there are no intervening torsional quick phases, the residual torsion of the eye in these quadrants should follow the relation δρ = ψ − tan^{−1}[tan(ψ)/cos(ε)] for ψ ≠ *k*π/2, where *k* = 1 to 3. The residual torsion δρ modulates at twice the frequency of ψ (see Fig. A3*B*).
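A short numerical sketch of this relation (illustrative; the tan^{−1} is evaluated with atan2 so that the branch stays quadrant-matched and the expression remains continuous across ψ = ±π/2):

```python
import numpy as np

def residual_torsion(psi, eps):
    """Residual torsion (rad): delta_rho = psi - atan[tan(psi)/cos(eps)],
    written with arctan2 to select the quadrant-matched branch."""
    return psi - np.arctan2(np.sin(psi), np.cos(psi) * np.cos(eps))

eps = np.radians(15.0)
psi = np.linspace(-np.pi / 2, np.pi / 2, 721)
drho_deg = np.degrees(residual_torsion(psi, eps))
# zero at psi = 0 and +/- pi/2, peak magnitude near psi = +/- 45 deg,
# modulating at twice the frequency of psi
```

At ε = 15°, the peak residual torsion comes out at ≈1°, consistent with the modulation amplitude noted in appendix c.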

## APPENDIX C: OCULAR ROTATION DURING SMOOTH CIRCULAR PURSUIT EYE MOVEMENTS

To compute 3-D eye position during curvilinear pursuit, we represent gaze direction *ĝ* in spherical polar coordinates ϑ (horizontal gaze deviation, positive leftward) and ϕ (vertical gaze deviation, positive downward; Fig. A2). The unit vectors *ê*_{x}, *ê*_{y}, and *ê*_{z} can be represented by 3 × 1 matrices *ê*_{x} = (1, 0, 0)^{T}, *ê*_{y} = (0, 1, 0)^{T}, and *ê*_{z} = (0, 0, 1)^{T} (where superscript T denotes transpose), pointing, respectively, along the *x*-, *y*-, and *z*-axis. For the following calculations, it is advantageous to use an alternative basis represented by the four Clifford numbers γ̂_{1}, γ̂_{2}, γ̂_{3}, and *I*, the multiplicative unit, with the following two properties: (γ̂_{1})² = (γ̂_{2})² = (γ̂_{3})² = *I* and γ̂_{1}γ̂_{2} + γ̂_{2}γ̂_{1} = γ̂_{1}γ̂_{3} + γ̂_{3}γ̂_{1} = γ̂_{2}γ̂_{3} + γ̂_{3}γ̂_{2} = 0. This Clifford basis can be represented by real 4 × 4 matrices, also called Dirac matrices (Snygg 1997).

To rotate the gaze vector *ĝ*, we use the rotation operator *R*(*n̂*, ρ) = *I*cos(ρ/2) − sin(ρ/2)(*n*_{x}γ̂_{23} + *n*_{y}γ̂_{31} + *n*_{z}γ̂_{12}), specifying the angle ρ and the axis *n̂* in terms of the three Clifford products γ̂_{23} = γ̂_{2}γ̂_{3}, γ̂_{31} = γ̂_{3}γ̂_{1}, and γ̂_{12} = γ̂_{1}γ̂_{2} (for details see Snygg 1997). The inverse rotation is obtained as *R*^{−1} = *R*(*n̂*, −ρ).

To compute the rotation operator for a rotation of the eye about the current gaze position *ĝ*, we compute the basis vectors tangential to the coordinates ϑ and ϕ (Fig. A2). Their Clifford product defines the bivector γ̂_{ϕϑ}, which represents the oriented tangent plane to the current gaze direction at position ϑ, ϕ. Taking the metric of this coordinate change from the flat Euclidean space to the spherical non-Euclidean space into account yields a factor cos(ϕ) (i.e., the square root of the determinant of the metric tensor) in the normalization of γ̂_{ϕϑ}.

Finally, to compute the motion of gaze during circular pursuit as illustrated in Fig. A3*A* (see also Fig. 2, *A* and *B*), we first write the rotation operator for moving gaze from straight ahead to up about the *y*-axis, *P* = *I*cos(ϕ/2) − sin(ϕ/2)γ̂_{31}, then the rotation about the *x*-axis, *R* = *I*cos(ψ/2) − sin(ψ/2)γ̂_{23}, and finally the counterrotation of the eye about the current gaze direction, *R*′ = *I*cos(ρ/2) − sin(ρ/2)γ̂_{ϕϑ}. The compound rotation of gaze is then obtained by conjugating the gaze vector in primary position with the compound rotation operator (*R*′)^{−1}*RP*^{−1} (which is a function of the angles ϑ, ϕ, ψ, and ρ).

In the same way, the rotation of *ĥ* and *v̂* (see Fig. 2*A*) can be computed. With any two of these vectors, we obtain the associated eye position (rotation) vector **E** = tan(ψ/2)*ê* with rotation axis *ê* and roll angle ψ (relative to straight ahead), as illustrated in Fig. A3*B*.

From these analyses in the eye position domain, three important observations can be made (Fig. A3*A*). First, although torsion does not accumulate under the imposed counterroll condition requiring ρ = −tan^{−1}(tanψ/cosε), it cannot be completely compensated, i.e., Listing's law cannot be preserved: note the continuous deviation of the moving rotation axes from the frontal plane during one pursuit cycle (3-D view and projections of the trace of *n̂* in Fig. A3, *A* and *D*). Second, torsional eye position modulates at twice the spatial frequency of circular pursuit (1° amplitude for circular pursuit at an eccentricity of 15°). Third, Donders' law is not fulfilled except at the four cardinal eye positions. For example, if circular pursuit terminates at position C with a saccade back to the starting position A, the eye carries with it a residual torsion that was built up during circular pursuit from position B to C (Fig. A3, *A* and *B*). It can be demonstrated that this is always the case no matter how the counterroll angle ρ is defined as a (smooth) function of ψ.
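These observations can be reproduced with a short quaternion-based sketch of the compound rotation (scipy rotations standing in for the Clifford/Dirac formalism; the axis convention x forward, y left, z up is an assumption of this sketch):

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def eye_orientation(psi, eps):
    """Compound eye rotation during circular pursuit: move gaze up by eps,
    carry it about the x-axis through psi, then counterroll about the
    current line of sight through rho = -atan[tan(psi)/cos(eps)]."""
    P = R.from_rotvec([0.0, -eps, 0.0])            # straight ahead -> up
    Rx = R.from_rotvec([psi, 0.0, 0.0])            # around the circle
    gaze = Rx.apply([np.cos(eps), 0.0, np.sin(eps)])
    rho = -np.arctan2(np.sin(psi), np.cos(psi) * np.cos(eps))
    Rc = R.from_rotvec(rho * gaze)                 # counterroll (appendix B)
    return Rc * Rx * P

eps = np.radians(15.0)
# torsion = x-component of the eye's rotation vector, in deg
tor = [np.degrees(eye_orientation(p, eps).as_rotvec()[0])
       for p in np.linspace(0.0, np.pi / 2, 91)]
```

Torsion vanishes at the cardinal positions (ψ = 0 and π/2) but builds up to ≈1° in between: no accumulation across the cycle, yet a transient violation of Donders' law.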

- Copyright © 2011 the American Physiological Society