Humans localize sounds by comparing inputs across the two ears, resulting in a head-centered representation of sound-source position. When the head moves, information about head movement must be combined with the head-centered estimate to correctly update the world-centered sound-source position. Spatial updating has been extensively studied in the visual system, but less is known about how head movement signals interact with binaural information during auditory spatial updating. In the current experiments, listeners compared the world-centered azimuthal position of two sound sources presented before and after a head rotation that depended on condition. In the Active condition, subjects rotated their head by ~35° to the left or right, following a pre-trained trajectory. In the Passive condition, subjects were rotated along the same trajectory in a rotating chair. In the Cancellation condition, subjects rotated their head as in the Active condition, but the chair was counter-rotated based on head-tracking data such that the head effectively remained fixed in space while the body rotated beneath it. Subjects updated most accurately in the Passive condition but erred in the Active and Cancellation conditions. Performance is interpreted as reflecting the accuracy of perceived head rotation across conditions, which is modeled as a linear combination of proprioceptive/efference copy signals and vestibular signals. Resulting weights suggest that auditory updating is dominated by vestibular signals but with significant contributions from proprioception/efference copy. Overall, results shed light on the interplay of sensory and motor signals that determine the accuracy of auditory spatial updating.
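The linear-combination model of perceived head rotation described above can be sketched as follows; the symbols (the weights $w_{pe}$, $w_{v}$ and the signal terms) are illustrative notation, not the authors' published parameterization:

```latex
% Perceived head rotation \hat{\theta} modeled as a weighted sum of
% a proprioceptive/efference-copy signal \theta_{pe} and a
% vestibular signal \theta_{v} (hypothetical notation):
\hat{\theta} = w_{pe}\,\theta_{pe} + w_{v}\,\theta_{v}
```

Under this sketch, the abstract's finding that updating is "dominated by vestibular signals but with significant contributions from proprioception/efference copy" would correspond to $w_{v} > w_{pe}$ with both weights nonzero.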
- sound localization
- spatial updating
- efference copy
- Copyright © 2016, Journal of Neurophysiology