Illusory hand movements can be elicited by a textured disk or a visual pattern rotating under one's hand while proprioceptive inputs convey immobility information (Blanchard et al., 2013). Here we investigated whether visuo-tactile integration can optimize velocity discrimination of illusory hand movements in line with Bayesian predictions. We induced illusory movements in fifteen volunteers by visual and/or tactile stimulation delivered at six angular velocities. Participants compared hand-illusion velocities with a 5°/s hand reference movement in a forced-choice paradigm. Results showed that the discrimination threshold decreased in the visuo-tactile condition compared to the unimodal (visual or tactile) conditions, reflecting better bimodal discrimination. The perceptual strength (gain) of the illusions also increased: the stimulation required to give rise to a 5°/s illusory movement was slower in the visuo-tactile condition than in either unimodal condition. The Maximum Likelihood Estimation (MLE) model satisfactorily predicted the improved discrimination threshold but not the increase in gain. When we added a zero-centered prior, reflecting immobility information, the Bayesian model did predict the gain increase but systematically overestimated it. Interestingly, the predicted gains better fit the visuo-tactile performances when proprioceptive noise was generated by co-vibrating antagonist wrist muscles. These findings show that kinesthetic information of visual and tactile origins is optimally integrated to improve velocity discrimination of self-hand movements. However, a Bayesian model alone could not fully describe the illusory phenomenon, pointing to the crucial importance of the omnipresent muscle proprioceptive cues, relative to other sensory cues, for kinesthesia.
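The two model predictions tested above can be sketched numerically. The snippet below uses the standard MLE rule for bimodal variance and Gaussian prior shrinkage toward zero velocity; all numerical values are illustrative assumptions, not the study's data:

```python
import math

# Hypothetical unimodal discrimination thresholds (deg/s), standing in
# for the standard deviations of the visual and tactile likelihoods.
sigma_v = 2.0   # visual-only threshold (assumed value)
sigma_t = 1.6   # tactile-only threshold (assumed value)

# MLE cue combination: bimodal variance is the harmonic combination of
# the unimodal variances, so the bimodal threshold is always below the
# better unimodal threshold.
sigma_vt = math.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))

# Zero-centered Gaussian prior (immobility signalled by proprioception):
# the posterior mean is the likelihood mean shrunk toward 0 deg/s.
# A noisier (wider) prior, as with co-vibrated antagonist muscles,
# shrinks the percept less.
sigma_p = 4.0   # assumed prior width (deg/s)
shrink = sigma_p**2 / (sigma_p**2 + sigma_vt**2)
stimulus_velocity = 5.0  # deg/s
perceived_velocity = shrink * stimulus_velocity

print(round(sigma_vt, 2), round(perceived_velocity, 2))
```

With these assumed values, the bimodal threshold falls below both unimodal thresholds (the MLE prediction), and the zero-centered prior biases the perceived velocity below the physical stimulus velocity, which is the mechanism the Bayesian model uses to account for gain changes.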
- Bayesian modeling
- Multisensory integration
- Muscle proprioception
- Copyright © 2015, Journal of Neurophysiology