A recent report demonstrated that, while fixating a central light, lateral intraparietal area (LIP) neurons are not modulated by the location of auditory stimuli until monkeys learn to saccade to the location of an auditory stimulus. This finding suggests that auditory spatial responses in area LIP are dependent on auditory-saccadic training. We found that, in monkeys that had not been trained to make behavioral responses to auditory stimuli, LIP neurons are modulated by auditory-stimulus location when a central light is not present in the environment. These results indicate that LIP auditory responses are not wholly dependent on behavioral training with auditory stimuli.
Neurons in the lateral intraparietal area (area LIP) are modulated by the location of auditory stimuli (Cohen and Andersen 2002; Grunewald et al. 1999; Linden et al. 1999; Mazzoni et al. 1996; Stricanne et al. 1996). However, a recent study (Grunewald et al. 1999) found that, during a visual-fixation task, LIP neurons were not modulated by the location of band-pass noise bursts until the monkeys learned to saccade to the location of an auditory stimulus. This result suggests that auditory spatial responses in area LIP are dependent on auditory-saccadic training. The goal of this study was to explore this hypothesis further in monkeys that had not been trained to make behavioral responses to auditory stimuli. First, we hypothesized that, during a visual-fixation task, LIP neurons may be modulated by the locations of auditory stimuli with spectrotemporal properties like those found in the natural environment (e.g., species-typical vocalizations). Second, we hypothesized that, if auditory spatial responses are wholly due to auditory-saccadic training, these responses should not depend on the context in which auditory stimuli are presented. We now report that, independent of the spectrotemporal properties of the auditory stimulus, LIP neurons are modulated by the spatial location of an auditory stimulus when a central light is removed from the environment. This result indicates that auditory activity in area LIP is not wholly dependent on behavioral training.
Rhesus monkeys (Macaca mulatta) were placed in front of a circular stimulus array. The array contained eight speakers that were separated by 12° relative to a “central” speaker. Visual stimuli were produced by a red light-emitting diode (LED) that was mounted and centered on each speaker. Eye position was recorded with 1-ms resolution using a scleral eye coil. Extracellular action potentials were recorded with tungsten electrodes (FHC) that were inserted into a recording chamber (Crist Instruments); the recording chamber was centered at stereotaxic coordinates 6 mm posterior and 12 mm lateral. The location of the lateral bank of the parietal cortex was determined by visualizing a recording microelectrode in the posterior parietal cortex of each monkey with magnetic resonance images (Groh et al. 2001). To ensure that recordings were made from area LIP, rather than area 7a, which lies on the brain surface, the electrode was advanced 3.0 mm below the dura at the start of each recording session (Andersen et al. 1990; Lewis and Van Essen 2000). LIP neurons were identified by their responses to visual stimuli and their perisaccadic and saccadic responses. All surgical procedures and protocols were approved by Dartmouth College's Institutional Animal Care and Use Committee and were in accordance with federal guidelines for the care and use of animals in research.
Band-pass noise was generated from Gaussian white noise and filtered to have a pass-band between 0.55 and 15.25 kHz. Species-typical vocalizations were recorded and digitized as part of an earlier set of studies (Hauser 1998). The durations of the noise bursts were variable and matched the variance in the duration of the species-typical vocalizations [326 ± 129 (SD) ms]. Each auditory stimulus was presented at a sound level of 65 dB SPL (sound pressure level, relative to 20 μPa). The stimuli were presented through a D/A converter (DA1, Tucker Davis Technologies), an amplifier (SA1, Tucker Davis Technologies; MPA-250, Radio Shack), and a speaker (Pyle, PLX32).
Three tasks were used in this study: the “visual-saccade,” “visual-fixation,” and “gap-fixation” tasks. In the visual-saccade task, 500–1,000 ms after fixating the LED mounted on the central speaker (i.e., the “central LED”), one of the eight peripheral LEDs was illuminated. After an additional 500–1,000 ms, the central LED was extinguished, signaling the monkeys to shift their gaze to the illuminated peripheral LED. In the visual-fixation task (Fig. 1A), 1,000–1,500 ms after fixating the central LED, an auditory stimulus was presented. The monkeys maintained their gaze at the central LED during auditory-stimulus presentation and for an additional 1,000–1,500 ms after auditory-stimulus offset to receive a juice reward. In the gap-fixation task (Fig. 1B), 1,000–1,500 ms after fixating the central LED, it was extinguished. The monkeys, however, were required to maintain their gaze at the location of the extinguished central LED. Three hundred to 500 ms after the central LED was extinguished, an auditory stimulus was presented on 50% of the trials. Seven hundred to 800 ms after offset of the auditory stimulus, the central LED was re-illuminated, and the monkeys continued to maintain their gaze at its location for an additional 500–1,000 ms to receive a juice reward. During this task, monkeys kept their gaze within 1.5° of this fixation point, a variance comparable to that observed during the visual-fixation task, and did not systematically vary their eye position with auditory-stimulus location.
To avoid selection bias and to ensure our population of neurons was comparable to previous studies (Grunewald et al. 1999), the activity of any well-isolated neuron was recorded. The monkeys first participated in a block of trials of the visual-saccade task. LIP activity during this task was correlated with the location of the peripheral LED to construct a spatial response field. The visual-stimulus location that elicited the highest firing rate during the period in which the peripheral LED was illuminated was designated as the “IN” location. The location diametrically opposite (180° away) was designated the “OUT” location. For those LIP neurons that were not modulated during the visual-saccade task, we operationally defined the speaker location that was 12° to the right of the central LED as the IN location and the speaker location 12° to the left of the central LED as the OUT location. Next, the monkeys participated in a block of trials of the visual-fixation task or the gap-fixation task. The location of the auditory stimuli (IN or OUT) and the stimulus type (band-pass noise or species-typical vocalizations) were varied randomly on a trial-by-trial basis. Since LIP neurons code comparable regions of auditory and visual space and since visual responses predict the presence of auditory responses (Linden et al. 1999; Mazzoni et al. 1996; Mullette-Gillman et al. 2002) (see results), defining the IN and OUT locations through the visual-saccade task did not bias us against finding LIP neurons sensitive to auditory stimuli.
Neural activity was examined during the stimulus period. This period was the time when an auditory stimulus was in the environment; the time of occurrence of action potentials was aligned relative to stimulus onset. Since the durations of the vocalization and the noise exemplars were different (see Auditory stimuli), neural activity during the stimulus period was normalized and expressed in terms of firing rate (i.e., the number of action potentials divided by the auditory-stimulus duration). A two-factor (stimulus location × auditory-stimulus type) ANOVA tested whether the stimulus-period firing rate of an LIP neuron was modulated by stimulus location or auditory-stimulus type. This analysis was done independently on data collected during the visual-fixation and gap-fixation tasks.
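The normalization and two-factor ANOVA described above can be sketched as follows. This is a minimal illustration in Python (NumPy/SciPy), not the authors' analysis code; the function name, the balanced two-location × two-type synthetic design, and the trial counts are our own assumptions.

```python
import numpy as np
from scipy import stats

def two_factor_anova(rates, loc, typ):
    """Balanced two-factor ANOVA (stimulus location x stimulus type) on
    firing rates; returns p-values for the two main effects and the
    interaction."""
    rates, loc, typ = map(np.asarray, (rates, loc, typ))
    grand = rates.mean()
    a_levels, b_levels = np.unique(loc), np.unique(typ)
    a, b = len(a_levels), len(b_levels)
    n_cell = len(rates) // (a * b)  # trials per cell (balanced design assumed)

    # Sums of squares for each main effect
    ss_a = (len(rates) // a) * sum((rates[loc == l].mean() - grand) ** 2
                                   for l in a_levels)
    ss_b = (len(rates) // b) * sum((rates[typ == t].mean() - grand) ** 2
                                   for t in b_levels)
    # Interaction and error sums of squares, cell by cell
    ss_ab, ss_e = 0.0, 0.0
    for l in a_levels:
        for t in b_levels:
            cell = rates[(loc == l) & (typ == t)]
            ss_ab += n_cell * (cell.mean() - rates[loc == l].mean()
                               - rates[typ == t].mean() + grand) ** 2
            ss_e += ((cell - cell.mean()) ** 2).sum()

    df_a, df_b = a - 1, b - 1
    df_ab, df_e = df_a * df_b, len(rates) - a * b
    ms_e = ss_e / df_e
    p_of = lambda ss, df: stats.f.sf((ss / df) / ms_e, df, df_e)
    return p_of(ss_a, df_a), p_of(ss_b, df_b), p_of(ss_ab, df_ab)

# Illustrative use: spike counts converted to rates (spikes / duration),
# with a strong location effect and no stimulus-type effect.
rng = np.random.default_rng(1)
loc = np.repeat([0, 1], 40)                 # IN vs. OUT
typ = np.tile(np.repeat([0, 1], 20), 2)     # noise vs. vocalization
rates = np.where(loc == 0, 20.0, 5.0) + rng.normal(0, 1, 80)
p_loc, p_typ, p_int = two_factor_anova(rates, loc, typ)
```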
Additionally, we quantified the amount of stimulus-location information (Cohen et al. 2002; Cover and Thomas 1991; Gnadt and Breznen 1996; Grunewald et al. 1999) contained in each neuron's firing rate. Stimulus-location information is a nonparametric index of a neuron's spatial selectivity. In brief, firing rates were binned to form a matrix in which stimulus location constituted one dimension and firing rate was the other dimension. Stimulus-location information was given by

I = Σ_s Σ_r P(s,r) log₂ [P(s,r) / (P(s) P(r))]

where s is the index of each stimulus location, r is the index of the firing-rate bins, P(s,r) is the joint probability, and P(s) and P(r) are the marginal probabilities.
To facilitate comparisons across monkeys and stimulus modalities, data are reported in terms of relative information (Grunewald et al. 1999; Panzeri and Treves 1996). We computed relative information, on a neuron-by-neuron basis, by calculating the amount of stimulus-location information from the original data and from bootstrapped trials. In bootstrapped trials, the relationship between a neuron's firing rate and stimulus location was randomized, and the amount of information was calculated. This process was repeated 100 times, and the median value from this distribution of values was determined. The amount of relative stimulus-location information was calculated by subtracting the median amount of information obtained from bootstrapped trials from the amount obtained from the original data.
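The relative-information procedure — binning firing rates, computing mutual information, and subtracting the median information of shuffled trials — can be sketched as follows. This is a minimal Python/NumPy illustration under our own assumptions (function names, four quantile bins, 100 shuffles as in the text), not the authors' code.

```python
import numpy as np

def mutual_info(locs, rates, n_bins=4):
    """Stimulus-location information I(s; r), in bits, from quantile-binned
    firing rates."""
    edges = np.quantile(rates, np.linspace(0, 1, n_bins + 1))
    r_idx = np.clip(np.digitize(rates, edges[1:-1]), 0, n_bins - 1)
    loc_ids = {s: i for i, s in enumerate(np.unique(locs))}
    joint = np.zeros((len(loc_ids), n_bins))
    for s, r in zip(locs, r_idx):
        joint[loc_ids[s], r] += 1
    joint /= joint.sum()                      # P(s, r)
    ps = joint.sum(axis=1, keepdims=True)     # P(s)
    pr = joint.sum(axis=0, keepdims=True)     # P(r)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())

def relative_info(locs, rates, n_shuffles=100, seed=0):
    """Bias-corrected information: raw MI minus the median MI obtained after
    randomizing the rate-location relationship (100 shuffles in the text)."""
    rng = np.random.default_rng(seed)
    raw = mutual_info(locs, rates)
    shuffled = [mutual_info(rng.permutation(locs), rates)
                for _ in range(n_shuffles)]
    return raw - float(np.median(shuffled))
```

A spatially selective neuron (rates that differ by location) yields relative information near 1 bit with two locations, whereas location-independent rates yield values near 0 bits.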
We recorded from 96 LIP neurons from the left hemispheres of two monkeys. Seventy-two percent (n = 69) of these neurons were modulated by the location of the peripheral stimulus during the visual saccade task, a proportion comparable with those reported in previous studies using a similar recording strategy (Barash et al. 1991; Grunewald et al. 1999). The activity of 60 of these neurons was recorded during both the visual-fixation and gap-fixation tasks.
We found that LIP neurons are modulated by auditory-stimulus location primarily during the gap-fixation task. An example neuron is shown in Fig. 2. During the visual-fixation task (Fig. 2, A and B), the firing rate of this neuron was not modulated by band-pass noise or species-typical vocalizations at either the IN or OUT locations. In contrast, during the gap-fixation task (Fig. 2, C and D), the firing rate of the neuron was modulated substantially by the location of species-typical vocalizations but not by the location of band-pass noise.
For each LIP neuron, a two-factor ANOVA examined whether the mean firing rate was modulated by the auditory-stimulus location (IN vs. OUT) or the type of auditory stimulus (band-pass noise vs. species-typical vocalizations). During the visual-fixation task, we found that five LIP neurons (n = 5/86; 6%) were modulated significantly by auditory-stimulus location and only one neuron (n = 1/86; 1%) was modulated by auditory-stimulus location and auditory-stimulus type. Neither of these two proportions of LIP neurons was different from that expected by chance (binomial probability, P > 0.05).
A different pattern emerged during the gap-fixation task. During this task, we found that a significant (binomial probability, P < 0.05) proportion of LIP neurons were modulated by auditory-stimulus location (n = 9/70; 13%). However, only one neuron (n = 1/70; 1%) was modulated by auditory-stimulus location and auditory-stimulus type, a proportion that was not different from that expected by chance (binomial probability, P > 0.05). Importantly, only 1 of the 60 neurons was modulated by auditory-stimulus location during both the gap-fixation and the visual-fixation tasks.
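The chance-level comparisons above reduce to a binomial tail probability: the chance that k or more of n independent neurons reach significance at level α = 0.05. A minimal sketch (SciPy assumed; the function name is ours):

```python
from scipy.stats import binom

def p_at_least(k, n, alpha=0.05):
    """Probability that k or more of n independently tested neurons are
    significant at level alpha purely by chance (binomial tail)."""
    return binom.sf(k - 1, n, alpha)
```

With these conventions, 9 of 70 neurons is unlikely by chance (p_at_least(9, 70) is well below 0.05), whereas 5 of 86 is not, matching the pattern reported above.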
Finally, to confirm the results of the ANOVA analysis, we calculated the amount of relative stimulus-location information that was contained in the firing rate of LIP neurons. The distributions of relative stimulus-location information are shown in Fig. 3. The amount of relative stimulus-location information was dependent on the context in which the auditory stimuli were presented. During the visual-fixation task (Fig. 3, A and B), the mean amount of relative stimulus-location information did not differ significantly from zero (band-pass noise: mean = 0.004 bits, t = 1.2, df = 85, P > 0.05; species-typical vocalizations: mean = 0.004 bits, t = 1.06, df = 85, P > 0.05). However, the mean amount of relative stimulus-location information was significantly greater than zero during the gap-fixation task (Fig. 3, C and D): when band-pass noise was presented, the mean bit rate was 0.02 bits (t = 3.4, df = 69, P < 0.05), and when species-typical vocalizations were presented, the mean rate was 0.01 bits (t = 1.9, df = 69, P < 0.05).
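The comparison of per-neuron relative information against zero is a one-sample t test; a minimal sketch (SciPy assumed; the function name is ours — note that scipy.stats.ttest_1samp returns a two-sided p-value by default, and a one-tailed test would halve it for a positive t):

```python
import numpy as np
from scipy import stats

def info_above_zero(rel_info):
    """One-sample t test of per-neuron relative information against 0 bits;
    returns the t statistic and two-sided p-value."""
    t, p = stats.ttest_1samp(np.asarray(rel_info), 0.0)
    return float(t), float(p)
```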
To put these bit rates into perspective, we performed two additional analyses. First, we calculated the amount of relative stimulus-location information contained in the firing rate of LIP neurons during the visual-saccade task; the data for this analysis were the firing rates recorded during the time period beginning when the peripheral LED was illuminated and ending before the monkey made a saccade to its location. The mean bit rate for this analysis was 0.31, a mean value significantly greater than zero (P < 0.05) and substantially higher than the values observed during the visual-fixation and gap-fixation tasks. Second, as a control, we calculated the amount of relative stimulus-location information due to fluctuations in the monkeys' eye position during the gap-fixation task; the data for this analysis were the average positions of the monkeys' eyes during the stimulus period. The mean bit rate for this analysis was 0.004 bits, which was not reliably different from 0 bits (t-test, P > 0.05). This analysis indicated that changes in the monkeys' eye position cannot wholly account for our observation that the firing rates of LIP neurons carry significant amounts of information about the location of auditory stimuli during the gap-fixation task.
Auditory spatial responses in area LIP have been previously reported to be contingent on monkeys learning to saccade to the location of an auditory stimulus (Grunewald et al. 1999). In contrast, we demonstrated that auditory spatial responses were not dependent on this auditory training: spatially selective auditory responses were dependent on the context in which the auditory stimuli were presented. Specifically, we found that when auditory stimuli were presented concurrently with the central LED, LIP neurons were not significantly modulated by the location of band-pass noise bursts or species-typical vocalizations (Figs. 2 and 3). These results confirm those reported by Grunewald et al. (1999) and extend them to include another class of auditory stimuli, species-typical vocalizations. However, when auditory stimuli were presented without the central LED, LIP neurons were modulated significantly by auditory-stimulus location (see Figs. 2 and 3), independent of auditory-stimulus type. It is important to note that the responses of LIP and other parietal neurons to peripheral visual stimuli are stronger when monkeys fixate a central LED than when the stimuli are presented without a central LED (Ben Hamed and Duhamel 2002; Mountcastle et al. 1981). Future work should examine directly whether the presence or absence of a central LED differentially affects the responses of LIP neurons to auditory and visual stimuli.
One possible interpretation of this study is to consider the hypothesis that area LIP codes salient or task-relevant stimuli (Assad 2003; Gottlieb et al. 1998; Kusunoki et al. 2000). The central LED may be a salient visual stimulus, because it was relevant for successful completion of the task: the monkeys had to maintain their gaze at its location to receive a juice reward. The salience of the band-pass noise bursts and the species-typical vocalizations may be attributable to their abrupt presentation in the environment, which has been shown with visual stimuli to attract an observer's attention (Egeth and Yantis 1997; Gottlieb et al. 1998; Kusunoki et al. 2000). Additionally, the salience of the species-typical vocalizations may be attributable to the fact that they are a class of ethological stimuli (Gifford et al. 2003; Hauser 1997; Seyfarth and Cheney 2003) or have relatively complex spectrotemporal properties.
If both the visual and auditory stimuli were salient, why were LIP neurons modulated more during the gap-fixation task than during the visual-fixation task (Figs. 2 and 3)? We speculate that, during the visual-fixation task, auditory stimuli were not coded by LIP neurons due to the presence of the task-relevant central LED; the salience of this LED may be greater than the salience of the non-task-related auditory stimuli. In contrast, during the gap-fixation task, the central LED was not present in the environment, and LIP neural resources were available to code the salient features of the auditory stimuli. Alternatively, it is possible that neural modulation during the gap-fixation task may reflect a potential location of a planned eye movement (Snyder et al. 1997, 2000) or changes in the monkey's spatial attention (Colby and Goldberg 1999).
Does behavioral training have an impact on auditory processing in area LIP? Behavioral training does not appear to increase the proportion of neurons that are modulated by auditory-stimulus location: Grunewald et al. (1999) and data from our laboratory (Mullette-Gillman et al. 2002) indicate that 12–13% of LIP neurons were modulated by auditory stimuli following behavioral training, a proportion comparable to the 13% reported in this study. However, while the proportion of auditory LIP neurons is comparable, behavioral training may increase the spatial selectivity of auditory responses: the mean bit rate in the Grunewald et al. (1999) study was ∼0.036 bits, which is substantially higher than our mean bit rate of 0.015 bits. This result is consistent with the observation that LIP activity is modulated by the behavioral demands of an auditory task (Linden et al. 1999).
Overall, these results indicate that LIP neurons are modulated by the spatial location of auditory stimuli without behavioral training with auditory stimuli. These data also further implicate area LIP as playing an important role in multimodal integration (Cohen and Andersen 2002).
The authors thank A. Underhill and L. Boucher for help with animal care and training. Also, we greatly appreciate the helpful suggestions of M. Goldberg and the generosity of M. Hauser in providing recordings of rhesus vocalizations.
Y. E. Cohen was supported by grants from the Whitehall Foundation and National Institutes of Health and a Burke Award.
The costs of publication of this article were defrayed in part by the payment of page charges. The article must therefore be hereby marked “advertisement” in accordance with 18 U.S.C. Section 1734 solely to indicate this fact.
- Copyright © 2004 by the American Physiological Society