At any given moment, our brains receive input from multiple senses. Successful behaviour depends on our ability to prioritise the most important information and ignore the rest. A multiple-demand (MD) network of frontal and parietal regions is thought to support this process by adjusting to code whatever information is currently relevant (Duncan, 2010). Accordingly, the network is proposed to encode a range of different types of information, including perceptual stimuli, task rules, and responses, as needed for the current cognitive operation. However, most MD research has used visual tasks, leaving it unclear whether these regions also encode information from other sensory modalities. We used multivoxel pattern analysis (MVPA) of functional magnetic resonance imaging (fMRI) data to test whether MD regions code the details of somatosensory stimuli, in addition to tactile-motor transformation rules and button-press responses. Participants performed a stimulus-response task in which they discriminated between two possible vibrotactile frequencies and applied a stimulus-response transformation rule to generate a button-press response. In MD regions, we found significant coding of tactile stimulus, rule, and response. Primary and secondary somatosensory regions encoded the tactile stimuli and the button-press response, but did not represent task rules. Our findings provide evidence that MD regions can code non-visual somatosensory task information, commensurate with a domain-general role in cognitive control.
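The MVPA approach described above can be illustrated with a minimal sketch: a linear classifier is trained to distinguish two conditions (here, two vibrotactile frequencies) from multivoxel activity patterns, and cross-validated decoding accuracy above chance indicates that the region carries condition information. The data below are synthetic and the effect size is arbitrary; a real analysis would use per-trial response estimates from preprocessed fMRI data within a region of interest.

```python
# Illustrative MVPA decoding sketch (synthetic data, not the study's pipeline).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 80, 50

# Labels: two hypothetical conditions, e.g. low vs high vibrotactile frequency.
labels = np.repeat([0, 1], n_trials // 2)

# Voxel patterns: Gaussian noise plus a small condition-specific mean shift,
# standing in for a reliable multivoxel difference between conditions.
patterns = rng.normal(size=(n_trials, n_voxels))
patterns[labels == 1] += 0.5

# Cross-validated linear classification; accuracy above 0.5 (chance for two
# balanced classes) indicates the patterns discriminate the conditions.
clf = LinearSVC()
scores = cross_val_score(clf, patterns, labels, cv=5)
print(scores.mean() > 0.5)
```

In practice, significance would be assessed against chance across participants (or via permutation tests), and the same scheme can be applied separately to stimulus, rule, and response labels to ask which variables a region encodes.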
- cognitive control
- somatosensory perception
- Copyright © 2016, Journal of Neurophysiology