Natural behavior occurs across multiple sensory and motor modalities and, in particular, depends on sensory feedback that constantly adjusts behavior. Investigating the neuronal correlates of natural behavior benefits from state-of-the-art recording techniques (e.g., two-photon imaging, patch-clamp recordings) that frequently require head fixation. This limitation has been addressed with approaches such as virtual reality combined with air-ball or treadmill systems; however, achieving multimodal, realistic behavior in these systems can be challenging, and they are often complex and expensive to implement. Here we present "Air-Track", an easy-to-build, head-fixed behavioral environment that requires only minimal computational processing. The Air-Track is a lightweight physical maze floating on an air table that has all the properties of the real world, including multiple sensory modalities tightly coupled to motor actions. To test this system, we trained mice in Go/No-Go and two-alternative forced-choice tasks in a plus maze. Mice chose lanes and discriminated apertures or textures by moving the Air-Track back and forth and rotating it around themselves. Mice rapidly adapted to moving the track and used visual, auditory, and tactile cues to guide their performance of the tasks. A custom-controlled camera system monitored animal location and generated data used to calculate reaction times in the visual and somatosensory discrimination tasks. We conclude that the Air-Track system is ideal for eliciting natural behavior in concert with virtually any system for monitoring or manipulating brain activity.
- active sensing
- virtual reality
- multi-sensory head-fixed behavior
- Copyright © 2016, Journal of Neurophysiology