Eye-hand coordination is crucial for our ability to interact with the world around us. However, many of the visually guided reaches that we perform require a spatial decoupling between gaze direction and hand orientation. These complex decoupled reaching movements contrast with more standard reaching movements in which the eyes and the hand are coupled. The superior parietal lobule (SPL) receives converging eye and hand signals; however, it is not yet understood how the activity within this region is modulated during decoupled eye and hand reaches. To address this, we recorded local field potentials within SPL from two rhesus macaques during coupled vs. decoupled eye and hand movements. Overall, we observed a distinct separation in synchrony between the lower 10- to 20-Hz beta range and the higher 30- to 40-Hz gamma range. Specifically, within the early planning phase, beta synchrony dominated; however, the onset of this sustained beta oscillation occurred later during eye-hand decoupled vs. coupled reaches. As the task progressed, there was a switch to low-frequency and gamma-dominated responses, specifically for decoupled reaches. More importantly, we observed that local field potential activity was a stronger predictor of task (coupled vs. decoupled) and behavioral state (planning vs. execution) than single-unit activity alone. Our results provide further insight into the computations of SPL for visuomotor transformations and highlight the necessity of accounting for the decoupled eye-hand nature of a motor task when interpreting movement control research data.
- superior parietal lobule
- visuomotor transformation
- eye-hand coordination
- cognitive motor integration
Eye-hand coordination is an important aspect of our ability to perform different types of visually guided reaches, allowing us to interact with objects in a variety of ways. We can reach for objects we are not looking at, manipulate a tool like a joystick, and even perform laparoscopic surgery. These types of complex eye-hand coordination tasks usually require the location of the eyes and the hand to remain decoupled throughout the movement, a type of action referred to as a decoupled, or eye-hand decoupled, reach (Wise et al. 1996). Decoupled reaches require an implicit spatial algorithm and/or an explicit cognitive rule to be incorporated into the motor plan to relate the visual stimulus to the direction of the end effector movement (Murray et al. 2000; Sergio et al. 2009; Wise et al. 1996). This differs from the more basic forms of visually guided reaches in which the visual object guiding the movement is the spatial target of the action itself, referred to as a standard, or eye-hand coupled, reach (Wise et al. 1996). Decoupled reaching requires inhibition of the natural tendency to link the actions of the eyes and the hand (Gauthier and Mussa Ivaldi 1988; Gielen et al. 1984; Gorbet and Sergio 2009; Henriques et al. 1998; Morasso 1981; Neggers and Bekkering 2000; Prablanc et al. 1979; Sergio and Scott 1998; Terao et al. 2002; Vercher et al. 1994), as well as additional processing to calculate the new spatial mapping between the eyes and the hand (Gorbet et al. 2004; Granek and Sergio 2012; Sergio et al. 2009).
Although previous human and animal studies have found that performing decoupled reaches alters the activity of regions located within the parietofrontal reach network (Andersen et al. 1987, 1997; Battaglia-Mayer et al. 2001; Connolly et al. 2000; Gail et al. 2009; Gorbet et al. 2004; Grafton et al. 1996; Granek et al. 2010; Hawkins et al. 2013; Prado et al. 2005; Sayegh et al. 2013), many of these studies focused only on foveated vs. extra-foveated reaching (Andersen et al. 1997; Battaglia-Mayer et al. 2001; Clavagnier et al. 2007; Gail et al. 2009; Prado et al. 2005). Thus the specific contribution of parietofrontal regions, known to integrate eye and hand information, to the control of decoupled but foveated movements is largely unknown.
Situated between sensory and motor cortices, the superior parietal lobule (SPL) integrates eye and hand signals to successfully calculate the reach vector under sensory guidance (Andersen et al. 1987; Battaglia-Mayer and Caminiti 2002; Graziano et al. 2000; Grefkes et al. 2004; Rushworth et al. 1997a; Vesia and Crawford 2012; Vesia et al. 2010). Recently, our laboratory found that, during an identical decoupled reach task, single units within SPL demonstrate a reduction in mean discharge rate during decoupled reach planning and execution (Hawkins et al. 2013). To fully characterize the contribution of a region to a particular behavior, analyzing both the single units and the oscillatory activity provides a richer repertoire of information than one technique alone. While spiking activity is known to reflect suprathreshold inputs or outputs from pyramidal cells, local field potentials (LFPs) reflect subthreshold inputs within local cell assemblies (Scherberger et al. 2005). In addition, LFP activity is thought to be a better predictor of certain behavioral states than the activity of single units alone (Engel and Fries 2010; Mitzdorf 1985; Pesaran et al. 2002; Scherberger et al. 2005). Thus it is reasonable to suggest that spike and LFP activity each carry a different set of information and can, therefore, be complementary tools for brain analysis (Pesaran et al. 2002; Sanes and Donoghue 1993). Finally, because LFP activity has been shown to have a stronger relationship with blood oxygen level-dependent functional magnetic resonance imaging signals (Goense and Logothetis 2008; Nir et al. 2007) than single-unit activity (Fries et al. 2001), the results obtained from LFP studies can help bridge the gap between neurophysiological data in animals and human functional magnetic resonance imaging recordings.
The results of the present study will thus enrich the findings from our laboratory's previous report on the single-unit activity within SPL and improve our understanding about the computations of SPL in eye-hand decoupled reaches.
Given the crucial role of the SPL in the integration of vision and proprioception for limb guidance, we hypothesize that a spatial decoupling between the actions of the eyes and the hand will affect the neural activity within SPL during reach planning and execution. Decoupling the eyes from the hand provides incongruent eye and hand signals. Thus a greater reliance on proprioceptive input will be required during decoupled reaches, when the visual information about the reach target is inaccurate (Buneo and Andersen 2006; Engel et al. 2002; Flanders et al. 1992; Nixon et al. 1992; Rushworth et al. 1997a, 1997b). We predict that these incongruent eye and hand signals will modulate the LFPs within SPL in a way that differs from that which occurs during direct object interaction. An exploratory aspect of the current study was to examine how LFP activity in the different frequency bands varies in both the planning and execution phases of a decoupled relative to a coupled reach. A second aim of this study was to test the hypothesis that LFP activity within SPL will vary between the planning and execution phases of the movement, when movement control switches from a planning phase, involving feedforward control and movement inhibition, to an execution phase, relying on proprioceptive feedback and efference copy. We discuss our findings in the context of other research characterizing the alteration of activity in the parietofrontal movement control network during the control of complex, rule-based behaviors.
Animals and Apparatus
Two rhesus monkeys (female Macaca mulatta, both 5.2 kg) were trained to perform a visually instructed delayed reaching task under coupled and decoupled eye-hand conditions, as described previously (Hawkins et al. 2013; Sayegh et al. 2013). All surgical and animal handling procedures were in accordance with Canadian Council on Animal Care guidelines on the use of laboratory animals, and preapproved by the York University Animal Care Committee.
During the experiment, the monkey was seated in a custom-built primate chair 40 cm in front of a 38.1-cm vertical screen, which was set at monkey eye level and centered with her midline. An additional 38.1-cm horizontal touch-sensitive screen (Touch Controls, San Diego, CA) was set in front of the animal, between the animal's waist and xyphoid process, so that she could reach over the entire surface of the screen comfortably (Fig. 1). The horizontal touch screen was designed to detect spatial displacements as small as 3 mm using infrared beams, at a sampling rate of 100 Hz. Continuous tracking of the eye was monitored using the ISCAN-ETL 200 Eye Tracking System (ISCAN, Burlington, MA) at a sampling rate of 60 Hz. To minimize any interference from the nonreaching limb, the animal was trained to maintain its nonreaching hand on a metal lever just beyond the lower corner of the horizontal touch screen throughout the experiment. Only when the metal lever was depressed would the tasks begin and continue. In this way it was ensured that the animal only used the appropriate arm without having to forcefully restrain the unused limb.
The schematic describing the sequence of each trial is shown in Fig. 1 and is as follows: a red circular target (70 mm in diameter) appeared at the center of the screen with an additional smaller white circular target (40 mm in diameter; 5.7° of visual angle) on top of it. The monkey was instructed to touch the red target and maintain eye fixation on the white target. After a baseline period of 500 ms, one of eight green-colored peripheral targets appeared (70 mm in diameter). All eight targets were equally spaced (45°) and appeared randomly, based on a randomized-block design. The peripheral target appeared 5 times at each location for a total of 40 trials per condition. After a variable instructed delay period (IDP; 2,000 ± 500 ms, Gaussian distribution), the red central target extinguished and the white target jumped to the peripheral target. This served as the go signal (GO) instructing the animal to move the eyes and hand from the central target to the peripheral target (Fig. 1). Once the eyes and hand arrived at the peripheral target, the monkey was required to hold both the eyes and the hand there for 500 ms. The movements were made from the middle of the center target to the middle of the peripheral target (roughly 80 mm, Fig. 2, E and F). Visual presentation of the task was identical across conditions. In the coupled condition, the actions of the eyes and hand remained congruent. The visual presentation of the targets and the reaching movements were both made on the horizontal touch-sensitive screen placed in front of the animal (Fig. 1A). In the decoupled condition, the actions of the eyes and hand were incongruent. The visual presentation of the targets was on the vertical screen, while the animal's limb movements remained on the horizontal touch screen (Fig. 1B). Thus the animal was required to direct its gaze along the vertical monitor, but move its hand along the horizontal touch screen to displace the cursor from the central to the peripheral target. 
To ensure that the animal did not track its hand position extra-foveally, an opaque screen was placed 100 mm over the animal's arm to block vision of the limb. For each condition, two epochs during the trial were considered. The delay epoch (IDP) was composed of the 500-ms baseline period and the first 2,000 ms of the IDP. While this epoch length means that a small percentage of trials will include early reaction time, we found in preliminary analyses that the results were stable using a longer epoch. The movement epoch (MOVE) was aligned to the onset of the reach and was composed of the last 500 ms of the instructed delay period and 500 ms after movement onset. The animals were trained to perform similar movements during both conditions, and the biomechanical features of the reach movements were monitored to ensure that the movement profiles were similar between conditions. To reinforce similar hand paths between conditions, movement alleys were included to ensure that reaches were directed along a fairly straight trajectory (Fig. 1). These alleys were set at ±40 mm from a straight line spanning from the central to the peripheral targets. If the cursor moved outside of these alleys, the trial would stop. To provide feedback on the current position of the hand, a cross-hair representing the position of the hand on the touch screen was displayed. Muscle activity was also recorded from 13 proximal-arm muscles in separate recording sessions. Pairs of Teflon-insulated 50-μm single-stranded stainless steel wires were implanted percutaneously. Implantations were verified by passing current through the wires to evoke focal muscular contractions (<1.0 mA, 30 Hz, 300-ms train; Sergio and Kalaska 2003). Multiunit electromyography (EMG) activity was amplified, band-pass filtered (100–3,000 Hz), half-wave rectified, integrated (5-ms time bins), and digitized online at 200 Hz.
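The EMG processing chain just described (band-pass filter, half-wave rectification, integration into 5-ms bins yielding a 200-Hz signal) can be sketched as follows. This is a minimal illustration, not the authors' code; the raw sampling rate `fs` and the function name are assumptions of the sketch.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def process_emg(raw, fs=10000.0):
    """Band-pass, half-wave rectify, and bin-integrate an EMG trace.

    raw: 1-D array of raw EMG samples at fs Hz (fs is assumed here;
    the 100-3,000 Hz band, 5-ms bins, and 200-Hz output rate follow
    the recording description above).
    """
    # Band-pass 100-3,000 Hz (4th-order Butterworth, zero-phase)
    b, a = butter(4, [100.0, 3000.0], btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, raw)
    # Half-wave rectification: keep positive deflections only
    rectified = np.clip(filtered, 0.0, None)
    # Integrate over 5-ms bins, one output sample per bin (200 Hz)
    bin_len = int(round(fs * 0.005))
    n_bins = len(rectified) // bin_len
    return rectified[: n_bins * bin_len].reshape(n_bins, bin_len).sum(axis=1)
```

With a 10-kHz raw trace, each 5-ms bin spans 50 samples, so 1 s of input yields 200 integrated values.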
The muscles studied included the anterior deltoid, medial deltoid, posterior deltoid, dorsoepitrochlearis, infraspinatus, latissimus dorsi, pectoralis, supraspinatus, teres major, rostral trapezius, caudal trapezius, triceps lateralis, and triceps medialis. These recordings were performed to assess the general effects of the coupled and decoupled tasks on EMG activity and were not designed as a definitive biomechanical study of the muscle properties.
A gaze-only control task, which has been described previously in detail (Hawkins et al. 2013; Sayegh et al. 2013), was also included to determine if the neural activity was affected solely by the overall shift in gaze angle. This condition was performed for every recording. The visual display consisted of nine white circles (40 mm in diameter; 5.7° of visual angle) that appeared one at a time in the same locations (i.e., one central and eight peripheral) as the white targets that appeared during the experimental conditions. The monkey was instructed to fixate on each of these white circles while maintaining both hands beside the horizontal touch screen. The white circles appeared in each location three times for a total of 27 fixation points for each plane.
To confirm that the movements were biomechanically similar between conditions, hand paths were recorded and analyzed. The individual movement paths were first low-pass filtered at 10 Hz, and the movement onsets and endpoints were scored at 8% of peak velocity. The movements were then divided into 21 equal segments. The five trials for each direction were pooled, and the mean standard deviations were calculated at each segment along the path for both the coupled and decoupled tasks. An equality of variance test was performed between the two conditions on the mean X and Y components of the trajectory for each target (Snedecor and Cochran 1989). Mean reaction times (from the GO signal to movement onset) and mean reach velocity (from movement onset to the end of the movement) were also calculated for each condition, and paired-samples t-tests were performed to compare reaction times and velocity between the coupled and decoupled conditions. Repeated-measures ANOVAs were performed on the EMG data during the IDP and MOVE epochs for each muscle recorded to determine the effect of target (reach direction) and condition (coupled vs. decoupled) on maximum EMG amplitude. It was expected that reach direction would have an effect on EMG amplitude, but task condition would not.
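The trajectory scoring above (10-Hz low-pass, onset and offset at 8% of peak velocity, division of the path into 21 equal segments) might be sketched like this. The function name, sampling rate, and use of SciPy filters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def score_movement(pos, fs=100.0, thresh=0.08, n_seg=21):
    """Score reach onset/offset and resample the path into segments.

    pos: (N, 2) array of X/Y touch-screen samples at fs Hz. The 10-Hz
    low-pass, 8% peak-velocity criterion, and 21 segments follow the
    description above; everything else is illustrative.
    """
    # Zero-phase 10-Hz low-pass on each coordinate
    b, a = butter(4, 10.0, btype="low", fs=fs)
    smooth = filtfilt(b, a, pos, axis=0)
    # Tangential speed from finite differences
    speed = np.linalg.norm(np.diff(smooth, axis=0), axis=1) * fs
    # Onset/offset: first and last samples above 8% of peak speed
    above = np.flatnonzero(speed > thresh * speed.max())
    onset, offset = above[0], above[-1]
    # Resample the scored portion of the path into n_seg equal segments
    path = smooth[onset : offset + 2]
    t_old = np.linspace(0.0, 1.0, len(path))
    t_new = np.linspace(0.0, 1.0, n_seg + 1)
    resampled = np.column_stack(
        [np.interp(t_new, t_old, path[:, k]) for k in range(2)]
    )
    return onset, offset, resampled
```

Per-segment standard deviations across the five pooled trials per direction can then be computed on the resampled paths, which share a common parameterization.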
Monkeys were implanted with a recording cylinder under standard aseptic surgical techniques (Kalaska et al. 1989). The stereotaxic coordinates for chamber placement over SPL (monkey A, interaural, anterior: −12.30 mm, lateral: 18.40 mm; and monkey B, interaural, anterior: −7.80 mm, lateral: 00.00 mm, Fig. 2, A and B) were determined using The Rhesus Monkey Brain in Stereotaxic Coordinates (Paxinos et al. 2000). Space limitations for monkey A, due to a previously implanted premotor chamber and a posterior head-post, required the chamber to be positioned on a 25° angle to allow access to the desired brain regions through deep electrode penetrations in the medial-anterior quadrant of the chamber (Fig. 2, A and C). Thus, while the surface entry points are more rostral, the angle and depth of the electrodes provided access to neurons from within the medial intraparietal sulcus. A hydraulic multichannel driver (MCM-4, FHC, Bowdoin, ME) mounted to the implanted chamber was used in conjunction with a multichannel processing system (MCP, Alpha-Omega Engineering). The multidriver provided simultaneous recording from up to two electrodes at a time and allowed us to examine the LFP collected at each electrode site. Neural activity from each electrode was preamplified (5,000×), band-pass filtered (1 Hz to 10 kHz), and split into lower (LFP) and higher (single units) frequencies. Higher frequency signals were sampled at 12.5 kHz, and spikes of single units were sorted using template matching (Hawkins et al. 2013). The LFP (below 100 Hz) was sampled at 390.6 Hz. Following the completion of all experiments, anatomical brain images of both animals were obtained using a 3T Siemens Tim Trio MRI scanner to verify chamber location (T1 anatomical, field of view: 131 × 122.8 mm, repetition time: 2300 ms, echo time: 3.54 ms, flip angle: 9°).
All successfully recorded LFP sites were used for all of the analyses reported in this paper. Open-source Chronux script files were used in MATLAB (The Mathworks, Natick, MA) to analyze the spectral data and to generate time-frequency spectrograms for all penetrations in both conditions (Jarvis and Mitra 2001; Pesaran et al. 2002). To estimate the frequency structure of the LFP activity, we used multitaper spectrum analysis (previously described in Jarvis and Mitra 2001; Pesaran et al. 2002). The multitaper estimates of the spectrum Sx(f) were calculated for each recording (Pesaran et al. 2002; Scherberger et al. 2005). Normalizing to a baseline period was necessary to directly compare between conditions; therefore the spectrum for each site was z-transformed to its own baseline period centered on the final 350–500 ms of the baseline window, which reflected the most stable time window prior to directional cue onset (averaged across trials) (see Fig. 3D). Spectrograms were then calculated using a 500-ms window shifted in 20-ms increments with a 6-Hz frequency resolution. Currently there is some discrepancy regarding the relationship between high-frequency gamma activity and spiking activity. While some research suggests that a strong relationship exists (Ray and Maunsell 2011; Zanos et al. 2011, 2012), others suggest a difference between spiking and high gamma activity (Flint et al. 2012; Pesaran et al. 2002). To remain consistent with our laboratory's previous work, we decided to only include data from below 60 Hz (Sayegh et al. 2013).
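As a rough stand-in for the Chronux routines, a multitaper spectrogram with the stated parameters (500-ms windows stepped by 20 ms, roughly 6-Hz resolution, z-transform to a baseline period) could be sketched as below. The taper count, function names, and DPSS implementation are assumptions of this sketch, not the Chronux code.

```python
import numpy as np
from scipy.signal.windows import dpss

def mt_spectrogram(lfp, fs=390.6, win_s=0.5, step_s=0.02, bw_hz=6.0):
    """Sliding-window multitaper spectrum estimate.

    DPSS tapers with time-half-bandwidth NW = win_s * bw_hz / 2 give
    roughly the quoted 6-Hz resolution; the taper count K = 2*NW - 1
    is a common default, not necessarily what was used here.
    """
    n_win = int(round(win_s * fs))
    step = int(round(step_s * fs))
    nw = win_s * bw_hz / 2.0            # time-half-bandwidth product
    tapers = dpss(n_win, nw, Kmax=max(1, int(2 * nw) - 1))
    freqs = np.fft.rfftfreq(n_win, d=1.0 / fs)
    starts = np.arange(0, len(lfp) - n_win + 1, step)
    spec = np.empty((len(starts), len(freqs)))
    for i, s in enumerate(starts):
        seg = lfp[s : s + n_win]
        seg = seg - seg.mean()          # remove the DC offset per window
        ft = np.fft.rfft(tapers * seg, axis=1)
        spec[i] = np.mean(np.abs(ft) ** 2, axis=0)  # average over tapers
    return freqs, starts / fs, spec

def z_to_baseline(spec, base_rows):
    """Z-transform each frequency bin to its own baseline period,
    mirroring the normalization described above."""
    mu = spec[base_rows].mean(axis=0)
    sd = spec[base_rows].std(axis=0)
    return (spec - mu) / sd
```

For a 390.6-Hz LFP, a 500-ms window is ~195 samples, giving ~2-Hz frequency bins before the multitaper smoothing widens the effective resolution toward 6 Hz.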
To determine significant task-related differences (at a P < 0.05 alpha level), the normalized spectra from each trial for each electrode site were calculated. The average spectral value across each time-frequency bin was determined for each condition. A bootstrapping permutation test (Hawkins et al. 2013; Sayegh et al. 2013; Sergio and Kalaska 2003) was used to assess whether the difference in power between the two conditions was significant. According to the null hypothesis, power would be the same irrespective of condition (coupled/decoupled), and as such the observed difference in mean power would not exceed the 95% confidence limits of a distribution of differences in LFP power. The difference distribution is obtained by shuffling (permuting) the condition assignment of each trial, taking the power difference, then repeating this 1,000 times. To depict the regularity of significant differences for each time and frequency bin across the population of SPL sites, we generated a color plot mapping the proportion of sites showing significant differences at a given point in time and a given frequency. A receiver-operating characteristic (ROC) analysis was performed on the spectrum to measure the discriminability of two alternatives by an ideal observer. Here, we calculated the probability of an ideal observer correctly predicting the behavioral epoch (IDP vs. MOVE) and of correctly predicting the task (coupled vs. decoupled). ROC values were determined across the population of sites (N = 44) using the normalized power values at each time and frequency band used to generate the average spectra. To determine the ROC values for predicting the state of the animal (planning vs. execution), we pooled the ROC values from each condition for each epoch separately.
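The permutation procedure above can be sketched as follows: shuffle the condition labels, recompute the difference in mean power, repeat 1,000 times, and compare the observed difference against the resulting null distribution. The function name, seed, and two-sided formulation are assumptions of this illustration.

```python
import numpy as np

def permutation_pvalue(power_a, power_b, n_perm=1000, seed=0):
    """Permutation test on a difference in mean LFP power.

    power_a / power_b: one power value per trial for the two
    conditions (e.g. one time-frequency bin, coupled vs. decoupled).
    1,000 label shuffles build the null distribution, as in the text.
    """
    rng = np.random.default_rng(seed)
    power_a = np.asarray(power_a, dtype=float)
    power_b = np.asarray(power_b, dtype=float)
    observed = power_a.mean() - power_b.mean()
    pooled = np.concatenate([power_a, power_b])
    n_a = len(power_a)
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(pooled)  # shuffle the condition assignment
        null[i] = perm[:n_a].mean() - perm[n_a:].mean()
    # Two-sided: fraction of shuffles at least as extreme as observed
    return float(np.mean(np.abs(null) >= abs(observed)))
```

A difference is then flagged as significant when the returned p-value falls below 0.05, i.e., when the observed difference lies outside the 95% limits of the shuffled distribution.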
Lastly, we analyzed the oscillatory activity in the gaze-only condition to determine whether the overall shift in gaze angle that occurred from viewing the stimuli in two different planes had an effect on the oscillatory activity within SPL. The mean power (0–70 Hz) at each gaze location was calculated from a 500-ms window while the animal was fixating at each target. Within each condition, the mean power across the nine target locations was computed. The permutation procedure described above was used to assess whether the difference in power between the two gaze planes could have occurred by chance.
A single-unit analysis was previously described elsewhere (Hawkins et al. 2013); however, a subset of the data was selected for a distinct analysis in this study. See Hawkins et al. (2013) for details on single-unit isolation and task-selective analysis. Cells that were directionally tuned to the epoch of interest were selected, and mean firing rates were normalized to individual baseline firing rates. Baseline firing rates were calculated as the mean firing rate during the first 300 ms of each trial when the animal was instructed to hold its hand at the central target. This generated a normalized firing rate for comparison with the LFP data, which were also normalized to the same baseline time period. Significant task-related differences were determined by performing paired-samples t-tests between the spike rates during each condition. As was done for the LFP data, single units were tested in the gaze-only condition to assess the effect of the viewing plane on spike activity. Lastly, an ROC analysis was performed on the firing rates of all task-related single units. Area under the curve (AUC) values were determined across the population of cells (N = 26, IDP epoch; N = 17, MOVE epoch) using a sliding-window analysis with the same timing as that used for the LFPs. We compared the firing rates of each cell during the coupled vs. decoupled conditions to generate ROC values for task probability. To determine the ROC values for predicting the behavioral state (planning vs. execution), we pooled the firing rate for each condition and compared across epochs.
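An ROC/AUC value of the kind computed above, for firing rates or power values alike, is the probability that an ideal observer shown one trial from each condition ranks them correctly. It can be obtained from the rank-sum (Mann-Whitney U) identity; this sketch is illustrative and not the authors' code.

```python
import numpy as np
from scipy.stats import rankdata

def roc_auc(cond1, cond2):
    """Area under the ROC curve via the Mann-Whitney U identity.

    cond1 / cond2: per-trial values (e.g. firing rates) for the two
    alternatives; midranks from rankdata handle tied values.
    """
    n1, n2 = len(cond1), len(cond2)
    ranks = rankdata(np.concatenate([cond1, cond2]))
    # U statistic for cond1, normalized to a probability in [0, 1]
    u = ranks[:n1].sum() - n1 * (n1 + 1) / 2.0
    return u / (n1 * n2)
```

An AUC of 0.5 indicates chance discrimination; values approaching 1 (or 0) indicate that the two alternatives are reliably separable, as reported for the beta band in the state comparison.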
To ensure that task-related differences in the neural data were not a result of differences in the movement of the hand, we compared the hand trajectories between the two conditions. Our comparisons confirm that the kinematic and EMG features of the limb movement between task conditions were not significantly different. Therefore, we can interpret the task-related differences in the neural data as being due to rule-processing rather than motor behavior. The use of alleys helped support the animal in maintaining similar hand trajectories during both conditions (see methods). Figure 2, E and F, shows the mean reach trajectories during both conditions for each animal. Except for a few segments, there were no significant differences in the extent or variability of the reach trajectories between coupled and decoupled conditions (P > 0.05). To analyze reach kinematics, the mean reach velocity was calculated for each animal, and no significant difference between conditions was observed (P > 0.05). An analysis of the EMG data revealed that, for 11 of 13 muscles, there was no main effect of condition during the IDP and MOVE epochs (P > 0.01). For two muscles, medial deltoid and teres major, there was a marginal effect of condition on EMG activity during the IDP period (0.05 > P > 0.01). This may have been due to a slight alteration in the animals' starting posture in reaction to the board placed over their arm in the plane dissociation condition. There was, as expected, a main effect of target for all proximal arm muscles studied during the MOVE epoch (P < 0.01). Lastly, reaction times between the coupled (M = 537.9 ms, SEM ± 12.82) and decoupled (M = 522 ms, SEM ± 9.89) conditions were also not significantly different [t(45) = 0.927, P = 0.359].
Taken collectively, these results strengthen the conclusion that any neural differences observed between conditions are not a direct result of changes in the kinematics or biomechanics of the reaching movement, but rather the neural control of the movement.
We obtained 44 LFP recordings from SPL (28 from monkey A, 16 from monkey B; Fig. 2). All successfully recorded LFP sites were used for the analysis of this project. In addition, 91 single cells were recorded during each condition, of which only 41 (45%) were found to be task-related (see Hawkins et al. 2013). In line with our first hypothesis, the neural activity within SPL, both at the single-cell and LFP level, changed between conditions. In support of our second hypothesis, we observed that the magnitude of these changes varied with behavioral epoch.
Task-related differences during IDP epoch.
We observed salient differences in the oscillatory activity within SPL during coupled vs. decoupled reaches (example site, Fig. 3). Specifically, we observed strong synchrony within the 10- to 20-Hz beta frequency band shortly after peripheral cue onset for each site (Fig. 3A). The signal is more easily seen in Fig. 3C for both conditions. This was consistent across the population (Fig. 4, A–C). Importantly, there was a delay in the oscillatory activity within this range during the IDP of decoupled relative to coupled reaches (10–20 Hz, compare Fig. 4A with Fig. 4B). Figure 4C shows the number of sites within the population that showed significant task-related differences (P < 0.05). The heightened beta synchrony in the coupled condition emerged more quickly after directional cue onset than that observed during the decoupled condition (Fig. 4D). As the task progressed, these differences diminished so that, by ∼500 ms into movement planning, power within the beta band was indistinguishable between conditions. In concert with the 10- to 20-Hz IDP oscillation, we also observed a reduction in the mean firing rate of single cells during this epoch in the decoupled condition relative to the coupled one (Fig. 5A, P < 0.05). Furthermore, similar to the oscillatory activity results (Fig. 4D), as the task progressed, this difference in mean firing rate was no longer significant (Fig. 5A, P > 0.05). We also observed a reduction of gamma-band activity for both conditions following peripheral cue onset (Fig. 4, A and B). This gamma-band reduction was stronger during the coupled relative to the decoupled condition (Fig. 4D). As the task progressed, gamma synchrony increased again during the IDP and remained stronger in the decoupled condition, right up until the GO cue that signaled the beginning of the MOVE epoch.
Task-related differences during MOVE epoch.
Whereas 10- to 20-Hz beta synchrony in SPL dominated the IDP, by movement onset this pattern shifted to reveal even stronger synchrony that was focused in the gamma (>25 Hz) and low-frequency (<10 Hz) bands (Fig. 3B, Fig. 4, E–G). Across the population, a clear enhancement in oscillatory power within the low-frequency band occurred roughly 200 ms before the onset of the reach movement (Fig. 4, E and F). Furthermore, this activity was stronger during decoupled compared with coupled reaches (Fig. 4H, P < 0.05). Prior to movement onset, during the late planning phase of the reach, many sites also showed significant alterations in the gamma frequency band (30–60 Hz, Fig. 4, E–G). As the task progressed, gamma-band activity became stronger and, similar to the IDP epoch, the 30- to 40-Hz low-gamma band was significantly enhanced during the decoupled compared with the coupled condition (Fig. 4, G and H, P < 0.05). Figure 4G shows not only the time course of the gamma recovery at the end of the IDP/beginning of the MOVE epoch, but also that the low-gamma power is stronger and occurs earlier in the decoupled condition (Fig. 4, G and H). In summary, as the end of the delay approaches and reaching behavior progresses from planning a decoupled reaching movement to executing one, a significant enhancement in the oscillatory power under 10 Hz and in the low-gamma band occurs relative to that observed during coupled reaches. In support of the LFP findings, we also observed an enhancement in the mean discharge rate of single units within SPL during decoupled reaching relative to direct target interaction, albeit at a slower time course that begins after the start of the MOVE epoch (Fig. 5B, P < 0.05).
Task probability estimates.
We calculated the average ROC probability for each site in predicting the correct condition (coupled vs. decoupled) and behavioral state (planning vs. execution) of the animal. Whereas the lower beta-band range (10–20 Hz) was a dominant oscillatory frequency across conditions, our results showed that the greatest task predictability (coupled/decoupled) was within the low-gamma frequency range (30–40 Hz) during both the planning (Fig. 6, A and B) and MOVE epochs (Fig. 6, C and D). Furthermore, oscillations within the gamma band showed stronger task predictability than the single units, specifically during the late planning phase (Fig. 6, B and D). Thus gamma-band activity was a better predictor of the type of reach the animal was performing. In contrast to the task discrimination, the behavioral state of the animal was most strongly discriminated by the 10- to 20-Hz frequency band (AUC = 0.8861, planning vs. execution), compared with that of spiking (AUC = 0.5277) or low-gamma-band activity (AUC = 0.6715). In summary, although gamma-band activity was good at predicting the type of reach the animal was performing, beta-band activity was better at predicting the behavioral epoch, providing a reliable signature of the delay epoch. These results support the idea that oscillatory activity carries a richer set of information than single-unit firing rate alone.
Performance during each condition required arm movements that were biomechanically similar (Fig. 2, C and D); however, the overall eye-in-head angle, equivalent to the gaze angle since the head was fixed in place, shifted between conditions. To ensure that the task-related differences we observed were not a direct result of this change in gaze angle, we recorded and analyzed the neural activity during a gaze-only condition (see methods). No change in single-unit mean discharge rate occurred within SPL, as previously reported (Hawkins et al. 2013). Similarly, we found no effect of gaze plane on any frequency range of the oscillatory activity within SPL (Fig. 7, P > 0.05).
Patient data and limited imaging studies suggest that parietal and premotor areas are crucial to the control of goal-directed voluntary movement and may contribute to eye-limb coordination under conditions requiring cognitive rule integration. How key nodes within this network help to accomplish goal-directed voluntary movement in the face of decoupled gaze-hand mapping is not yet understood. The vast amount of research available on SPL activity has demonstrated its importance in the general representation of posture and movement of the body and eyes for visuomotor transformations (Andersen et al. 1997; Breveglieri et al. 2006; Caminiti et al. 1998; Kalaska 1996, 1997). Specifically, the SPL is thought to be involved in transforming sensory information into the appropriate reference frames to guide hand movements (Batista et al. 1999; Buneo et al. 2002; Galletti et al. 2003; Kalaska et al. 1983; Vesia and Crawford 2012). Regions within SPL are important in the planning and execution of goal-directed reaches (Colby 1998; Colby and Goldberg 1999; Culham et al. 2006; Galletti et al. 1999a, 1999b, 2003; Prado et al. 2005), maintaining an internal representation of one's body in the surrounding space (Breveglieri et al. 2006; Mountcastle et al. 1975), and calculating the reach vector from the initial hand position (Eskandar and Assad 1999; Vesia et al. 2010). Taken collectively, while it is known that SPL shows reach-related activity, our investigation into the oscillatory and single-unit activity within SPL demonstrates that this activity varies, depending on the type of visuomotor transformation being performed, specifically between the different stages of the movement. The oscillatory activity during the IDP, a period concerned with planning and holding the motor plan in working memory, was dominated by synchrony within the 10- to 20-Hz frequency range.
This activity was specifically evident during the coupled transformation, while, in contrast, the contribution of 10- to 20-Hz oscillations to the decoupled IDP seemed to be disrupted, at least in the early part of the epoch. As the task progressed, we observed a clear distinction in the role of SPL in the planning vs. the execution of a decoupled motor act. By late planning and movement onset, there was a switch to low-frequency- and gamma-dominated responses, specifically for decoupled reaches. This switch may represent the increased reliance on proprioceptive inputs and online control mechanisms required during decoupled eye-hand coordination (Battaglia-Mayer and Caminiti 2002).
Planning a Decoupled Reach: Decreased Neural Activity Within SPL
The planning of a decoupled reaching movement produced a significant delay in the dominant beta oscillation (10–20 Hz) and single-cell activity within SPL compared with coupled reach planning. Previous work into the contribution of SPL to the planning of visually guided reaches has shown that cells within this region receive converging information from the eyes, as visual feedback, and the arm, as proprioceptive feedback (Kalaska 1996; Kalaska and Crammond 1995). As a result, it has been suggested that the regions within SPL preferentially represent automatic or sensory-driven reaching movements (Desmurget et al. 1999; Gail et al. 2009; Pisella et al. 2000). Coupled reaching movements that involve direct interaction with objects of interest are innate and natural to produce, while decoupled reaching movements are not innate and must be learned over time (Bo et al. 2006; Piaget 1965; Sergio et al. 2009). Successfully decoupling the action of the eyes from that of the hand demands inhibition of our natural tendency to couple them, thus increasing the processing that must occur to incorporate the transformational rule into the motor plan (Gorbet and Sergio 2009; Sergio et al. 2009).
Sub-gamma frequencies (<30 Hz), including beta oscillations, are primarily observed within the infragranular layers of a region and thus are suggested to reflect feedback projections to distant regions, or involvement in "top down" neural processing (Bastos et al. 2012; Bosman et al. 2012; Maier et al. 2010). This is in contrast to neuronal synchrony and spike-field coherence in the gamma range, which are observed in the superficial and granular cortical layers of a region and thus suggested to reflect "bottom up" processing (Bastos et al. 2012; Brovelli et al. 2004; Buschman et al. 2012; Donner and Siegel 2011; Engel and Fries 2010; Siegel et al. 2012). This idea is further supported by recent work into the functional role of beta oscillations by Engel and Fries (2010). They suggest that beta oscillations signal the current behavioral state, or the "status quo," by promoting preferential or top-down processing of that state (such as the motor plan) (Engel and Fries 2010; Pesaran et al. 2002; Scherberger et al. 2005). In the present study, it is important to clarify the different uses of the terms feedback and feedforward. In the movement control sense, feedback refers to updating the current state of the system using incoming sensory information once the movement has begun. In the neural anatomical sense, top-down types of anatomical feedback are thought to modulate typical integration or processing of incoming signals into a given region, as may occur following memory, attention, context, or voluntary inhibition of a typical response.
Our results demonstrate 1) enhanced beta synchrony during the delay epoch, and 2) a delay of this oscillation during decoupled reaches. Based on the observations that SPL integrates eye and hand signals from various regions of the brain (Battaglia-Mayer and Caminiti 2002; Graziano et al. 2000), the appearance of beta may be a signal to indicate that the behavior has been planned successfully, possibly as a "hold" signal to maintain the current motor plan. Indeed, others have observed increases in beta-band activity during working memory paradigms (Pesaran et al. 2002), when the motor plan would need to be held. Beta coherence has also been observed between area 5 and M1 during movement hold (Witham et al. 2007), and enhanced beta synchrony is associated with motor slowing in healthy individuals (Pogosyan et al. 2009) and in patients with Parkinson's disease (Schnitzler and Gross 2005). In addition, beta-band activity shows strong state (planning vs. execution) predictability (see section below, Pesaran et al. 2002).
Our observation of a delay in beta synchrony during decoupled reaching movements may indicate that eye-hand segregation leads to neural interference with, or a delay of, this top-down control over movement planning. Indeed, the incongruent eye and hand signals demand additional processing to incorporate the new spatial transformation into the motor plan (Gorbet and Sergio 2009), as discussed below. This additional processing would require more time to calculate the transformed reach vector and incorporate the new spatial transformation between the eyes and the hand, a requirement that may be reflected in the delayed beta-band and single-unit activity observed here. These results also support a functional role for beta in the planning of visually guided movements, possibly as a signal indicating the maintenance of an already established movement plan prior to execution.
In contrast to the task-related beta-band differences, gamma-band synchrony was reduced during the delay epoch of coupled compared with decoupled reaches. Previous reports have observed enhanced gamma synchrony within early sensory areas during active sensory processing, whereas alpha/beta-band synchrony is generally reduced. Proprioceptive information arising from the somatosensory cortex and visual information arising from the primary visual cortex terminate in the superficial layers of SPL (Pandya and Seltzer 1982; Rockland and Pandya 1979), where gamma oscillations often dominate (Bastos et al. 2012; Bosman et al. 2012; Maier et al. 2010). The incongruent signal between eye and hand locations requires additional processing between parietal and frontal regions (Geyer et al. 2000; Gorbet and Sergio 2009; Matelli and Luppino 2001) so that the correct relative position code can be calculated to guide the eyes and the hand to their new appropriate spatial locations. The extra reliance on proprioceptive and visual signals is important in calculating the spatial transformation required for a decoupled reach, and we propose that the increase in gamma synchrony is a reflection of this extra reliance, relative to a coupled reach.
Executing Decoupled Reaches: Enhanced SPL Activity
As the trial progresses, the pattern of neural activity shifts to one of enhanced activity during the execution of decoupled reaches. This shift occurs prior to movement onset in the low-frequency (5–10 Hz) and low-gamma oscillations (30–40 Hz) within SPL. Note here that the clearly delineated bands of activity do not fall strictly within typical EEG bands; namely, our “beta” covers the high alpha range (10 Hz), and our time resolution during the MOVE epoch prohibits a clear delineation within low-frequency bands (e.g., delta from theta and even lower alpha).
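The band boundaries used above (our "beta" at 10–20 Hz, low gamma at 30–40 Hz) can be made concrete with a minimal periodogram sketch. This is an illustration on a synthetic signal with hypothetical sampling parameters, not the spectral pipeline used in this study:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Integrate simple periodogram power within [f_lo, f_hi] Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return power[mask].sum()

# Synthetic 1-s "LFP" trace: strong 15-Hz component plus weaker 35-Hz component
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
lfp = (2.0 * np.sin(2 * np.pi * 15 * t)
       + 0.5 * np.sin(2 * np.pi * 35 * t)
       + 0.1 * rng.standard_normal(t.size))

beta = band_power(lfp, fs, 10, 20)    # the 10- to 20-Hz "beta" band used here
gamma = band_power(lfp, fs, 30, 40)   # the 30- to 40-Hz low-gamma band
print(beta > gamma)                   # the 15-Hz component dominates: True
```

Note that with a 1-s window the frequency resolution is 1 Hz; shorter analysis windows (as in the MOVE epoch) coarsen this resolution, which is why the low-frequency bands cannot be cleanly delineated there.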
During visually guided reaching movements, SPL receives information to maintain an updated representation of the relative position between the hand and the reach goal in eye-centered coordinates (Buneo and Andersen 2006; Jackson et al. 2009; Rushworth et al. 1997a; Wolpert et al. 1998). The rapid online updating of limb position relies on forward model predictions that combine efference copy motor commands, sensory feedback (visual and proprioceptive), and an internal model of the dynamics of the arm (Battaglia-Mayer et al. 2013; Buneo and Andersen 2006; Desmurget et al. 1999; Desmurget and Grafton 2000; Vesia and Crawford 2012; Wolpert et al. 1998). During coupled eye and hand reaches, the visual and proprioceptive information regarding the location of the limb and its relative position to the reach target are in alignment and thus provide equally accurate information. During decoupled reaching movements, SPL receives mismatched visual and sensory information. Thus, in order for SPL to maintain an updated representation of the position of the hand relative to the target, the hand position must be derived predominantly from proprioceptive feedback and efference copy information (Buneo and Andersen 2006; Engel et al. 2002; Flanders et al. 1992; Nixon et al. 1992; Rushworth et al. 1997a, 1997b). Numerous studies suggest that a reach performed under visually reliable conditions is controlled in eye-centered coordinates (Buneo and Andersen 2006; Vesia and Crawford 2012). However, when visual information is unreliable, a limb-centered, posture-defined coordinate system must be used to control the reach (Batista et al. 1999; Buneo and Andersen 2006; Jackson et al. 2000, 2009; Pellijeff et al. 2006; Rushworth et al. 1997a). Within SPL, the frames of reference used to plan and control reaching movements are highly flexible and task specific (Battaglia-Mayer and Caminiti 2002; Buneo et al. 2002; Newport et al. 2006).
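The reliability weighting described above can be illustrated with the standard inverse-variance cue-combination rule. This is a toy sketch with hypothetical values, not a model fitted in this study: as the variance of the visual cue grows, the combined limb-position estimate is drawn toward the proprioceptive reading.

```python
def combine_cues(y_vis, var_vis, y_prop, var_prop):
    """Inverse-variance (maximum-likelihood) combination of two position cues.

    Each cue is weighted by its reliability (1/variance), so an unreliable
    visual signal shifts the combined estimate toward proprioception.
    """
    w_vis, w_prop = 1.0 / var_vis, 1.0 / var_prop
    return (w_vis * y_vis + w_prop * y_prop) / (w_vis + w_prop)

# Equally reliable cues: the estimate falls midway between the two readings
print(combine_cues(0.0, 1.0, 1.0, 1.0))    # 0.5
# Unreliable vision (decoupled-like condition): estimate near proprioception
print(combine_cues(0.0, 100.0, 1.0, 1.0))  # ~0.99
```

A full forward model would additionally propagate an efference-copy prediction through the arm's dynamics before applying a correction of this kind; the weighting step alone captures the shift toward proprioception when vision is unreliable.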
In addition, damage to the SPL results in misreaching due to proprioceptive deficits that impair the integration of visual and proprioceptive information (Blangero et al. 2007). Together these results suggest that, during decoupled, context-dependent visuomotor transformations, the reliability of the visual information provided could influence how the updated limb state is determined (Buneo and Andersen 2006). However, future investigations specifically designed to test this hypothesis will need to be conducted to address this suggestion.
As previously stated, proprioceptive inputs from sensorimotor cortex terminate in the superficial layers of SPL (Pandya and Seltzer 1982; Rockland and Pandya 1979) where gamma oscillations dominate (Bastos et al. 2012; Bosman et al. 2012; Maier et al. 2010). If executing a decoupled reach relies more heavily on proprioceptive and efference copy processing than does a coupled reach, then enhanced gamma-band activity during these types of movements is not surprising. This also helps to explain the progressive increase in gamma-band synchrony observed in the current study as the trial progresses. To maintain an updated representation regarding the current state of the limb, SPL must be able to incorporate proprioceptive feedback into the ongoing motor command (Buneo and Andersen 2006; Jackson et al. 2009; Rushworth et al. 1997a; Wolpert et al. 1998). The reciprocal communication between parietal and frontal structures (such as dorsal premotor cortex) is likely critical to the incorporation of an updated estimate of limb position into the current motor plan throughout movement execution (Geyer et al. 2000; Luppino and Rizzolatti 2000; Matelli et al. 1998; Wise et al. 1997). Since alpha/beta synchrony has been suggested to reflect top-down, feedback processing (Bastos et al. 2012; Bosman et al. 2012; Maier et al. 2010), one possibility is that the enhanced low-frequency synchrony (10 Hz) observed in the present study could signal the maintenance of the updated limb estimates within the motor plan. This is also supported by the ROC analysis, which demonstrates beta-band activity to be the strongest predictor of behavioral state (see Probability Estimates section). Future studies examining the synchrony between parietal and premotor structures during decoupled reaches would need to be conducted in order to address this possibility.
In addition to characterizing the oscillatory activity within SPL during different types of reaching movements, we also examined the receiver-operating characteristic for task epoch and condition probability estimates. We found that oscillations within the beta and gamma bands showed either strong task epoch or strong condition predictability. Gamma-band activity, and to a lesser extent single units, were a stronger predictor of which condition the animal was performing (coupled vs. decoupled) than beta-band activity. In contrast, beta-band activity was a better predictor of the behavioral state of the animal (i.e., task epoch, planning vs. execution) than spikes and gamma-band activity. Previously, Pesaran et al. (2002) and others have found that they could decode the behavioral state of the animal (planning vs. execution) more reliably with beta-band activity, while gamma-band and single-unit activity could reliably be used to decode the movement direction (Engel and Fries 2010; Pesaran et al. 2002; Scherberger et al. 2005). Previous work by Battaglia-Mayer and colleagues (Battaglia-Mayer et al. 2001; Battaglia-Mayer and Caminiti 2002) has demonstrated that SPL neurons combine different eye and hand information into a global-tuning field, representing different frames of reference for eye-hand coordination. This idea fits with the conclusion of Pesaran et al. (2002) that gamma-band and single-unit activity carry information about movement direction, which alters the relative position of the eyes and the hand. Our finding that gamma-band and single-unit activity carry information about which condition is being performed agrees with this interpretation, since performance of the decoupled task required an overall shift in the relative position of the eyes and the hand. These results support the idea that SPL activity is modulated by different types of reaching movements.
In addition, our results support previous work showing that LFP activity carries a richer set of information than single-unit activity alone.
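The receiver-operating-characteristic predictability discussed above reduces to an area-under-the-curve (AUC) computation: the probability that a response drawn from one class exceeds a response drawn from the other. A minimal sketch with hypothetical band-power values (not the study's data), using the rank-sum identity for the AUC:

```python
def roc_auc(group_a, group_b):
    """AUC via the rank-sum (Mann-Whitney) identity:
    P(random sample from group_b > random sample from group_a),
    counting ties as half. 0.5 = chance; 1.0 = perfectly separable."""
    n_pairs = len(group_a) * len(group_b)
    wins = sum((b > a) + 0.5 * (b == a) for a in group_a for b in group_b)
    return wins / n_pairs

# Hypothetical band-power samples from planning vs. execution epochs
planning = [1.0, 1.2, 0.9, 1.1]
execution = [1.8, 2.0, 1.7, 2.1]
print(roc_auc(planning, execution))  # 1.0: a strong state predictor
```

A band whose power distributions overlap heavily across epochs (or across conditions) yields an AUC near 0.5 and is thus a poor predictor of that variable, which is the sense in which beta was a strong state predictor but a weaker condition predictor here.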
The current report supports and expands upon recent work demonstrating that different types of visually guided reaching movements alter the activity of regions within the parietofrontal reach network. Although the role of SPL in reach planning and execution is well understood, its role in decoupled visuomotor transformations has not been thoroughly studied. The work presented here supports a role for SPL in decoupled reach planning and execution. Specifically, we suggest that, because of the nature of decoupled reaching movements, decoupling the action of the eyes from that of the hand will alter the weight of proprioceptive feedback and online monitoring required during a decoupled relative to a coupled movement. This increased reliance on proprioceptive and efference copy information will manifest itself as enhanced neural processing (increased firing rate and activity) during the movement and suggests that SPL may have a prominent role in providing ongoing proprioceptive and efference copy information about the hand.
This work was supported by Canadian Institutes of Health Research grant MOP-74634 (L. E. Sergio), the Canadian Foundation for Innovation and the Ontario Innovation Trust (L. E. Sergio), and the Ontario Ministry of Training, Colleges, and Universities (P. F. Sayegh).
No conflicts of interest, financial or otherwise, are declared by the author(s).
Author contributions: P.F.S., K.M.H., B.N., and L.E.S. performed experiments; P.F.S., K.M.H., and L.E.S. analyzed data; P.F.S., K.M.H., K.L.H., and L.E.S. interpreted results of experiments; P.F.S. prepared figures; P.F.S. drafted manuscript; P.F.S., K.M.H., J.D.C., K.L.H., and L.E.S. edited and revised manuscript; P.F.S., K.M.H., J.D.C., K.L.H., and L.E.S. approved final version of manuscript; J.D.C. and L.E.S. conception and design of research.
We thank Taiwo McGregor, Tyrone Lew, Dr. Xiaogang Yan, and Dr. Hongying Wang for exceptional technical and surgical assistance, as well as Natasha Down, Veronica Scavo, Julie Panakos and Dr. Melissa Madden for invaluable animal care expertise.
- Copyright © 2014 the American Physiological Society