
From vision to navigation (and back)

Arne F. Meyer

The Research of Arne F. Meyer

Background

The mammalian brain’s navigation system is informed in large part by visual signals. For example, we rely on vision to determine where the walls and doors of a room are, and we recognize places by visual landmarks, such as the shop around the corner. Together with path integration, an internal computation that transforms self-motion information into a sense of location, vision is one of the main pillars of our ability to navigate. Yet how visual signals are integrated into the brain’s navigation network is poorly understood. The aim of my research is to fill this gap by investigating how the visual images observed by the eyes are transformed in visual and spatial brain areas to support natural functions such as navigation.

I am using a “sensory coding” approach to understand which aspects of an animal’s visual input are encoded by “spatial” cells, cells that represent objects such as walls or landmarks, or an animal’s location in space. During my PhD at the Institute of Physics at the University of Oldenburg, I developed and applied computational models to investigate how complex, naturalistic sensory input is transformed into neural responses in subcortical and cortical brain areas [1–4]. These models are a promising approach to studying vision in navigating animals because they can be applied to a wide range of data, including naturalistic visual images and spatial coding variables.
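To make the flavor of this approach concrete, here is a minimal, purely illustrative sketch (not the actual models from [1–4]): a regularized linear receptive field estimated by ridge regression from a stimulus matrix to simulated spike counts. All names, the simulated data, and the penalty value are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: each row of X is one (flattened) stimulus frame,
# y holds the spike count evoked by that frame.
n_frames, n_pixels = 5000, 100
X = rng.normal(size=(n_frames, n_pixels))
true_rf = np.exp(-0.5 * ((np.arange(n_pixels) - 50) / 5.0) ** 2)
y = rng.poisson(np.exp(X @ true_rf * 0.3))

# Ridge-regularized receptive field estimate:
# w = (X'X + lambda*I)^(-1) X'y. The penalty lambda trades off fit
# against shrinkage and would normally be chosen by cross-validation.
lam = 10.0
w = np.linalg.solve(X.T @ X + lam * np.eye(n_pixels), X.T @ y)
```

The recovered weight vector w approximates the simulated receptive field; real models of this kind add temporal lags, nonlinearities, and spatial coding variables as extra predictors.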

A major challenge that has so far held back progress is measuring what an animal actually sees while it navigates its environment. A few years ago, I set out to overcome this challenge in the mouse, a species in which vision and navigation have each been studied extensively, but largely in isolation. By combining experimental and computational work, I hope to provide a step towards understanding how the brain’s high-level, abstract spatial code arises in part from transient, fluctuating images on the retinas.

Measuring head and eye movements and neural activity in freely moving mice

Figure 1: Simultaneous measurement of multiple behavioral variables and neural activity in a freely moving mouse. (A) Neural activity is recorded with a chronic neural implant. Video data are simultaneously recorded using a miniature CMOS image sensor and an infrared (IR) mirror mounted on the implant with a custom holder. (B) Example traces of simultaneously recorded eye positions, head orientation, and single-cell activity from primary visual cortex. (Adapted from Meyer et al., Neuron, 2018.)

During my postdoc with Jennifer Linden and Maneesh Sahani (both University College London), I developed a miniature, lightweight head-mounted video camera system (weight 1.3 grams), combined with movement sensors and chronic neural recordings, to obtain simultaneous measurements of multiple behavioral variables, including body, head, and eye movements, together with neural activity in freely moving mice [5]. This system measures the two main factors that determine what an animal sees – head and eye movements – going beyond the head-position tracking typically used in navigation research. It also avoids the complications of studying navigation in head-restrained animals: while head restraint facilitates neural recordings and stimulus control, it can lead to qualitative changes in visual (e.g., [5,6]) and spatial processing (e.g., [7,8]). We have made the system openly available as part of the Open Ephys project (http://www.open-ephys.org/mousecam), one of the most widely used platforms for behavioral electrophysiology.
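As a rough illustration of what such combined recordings enable (this is not the published analysis pipeline), a common first step is to place behavioral samples and spikes on a shared clock and compute an occupancy-normalized tuning curve. The sampling rates, variable names, and fake data below are assumptions made for the sketch.

```python
import numpy as np

# Hypothetical, already-synchronized timestamps (seconds):
# IMU samples at 100 Hz and spike times for one cell.
imu_t = np.arange(0.0, 600.0, 0.01)           # IMU sample times
head_pitch = np.sin(2 * np.pi * 0.2 * imu_t)  # fake head pitch trace (rad)
spike_t = np.sort(np.random.default_rng(1).uniform(0, 600, size=2000))

# Head pitch at each spike time, by linear interpolation -- the basis
# for relating neural activity to head orientation.
pitch_at_spikes = np.interp(spike_t, imu_t, head_pitch)

# Tuning curve: spike count per pitch bin, normalized by the time
# spent in that bin (occupancy).
bins = np.linspace(-1, 1, 21)
occupancy, _ = np.histogram(head_pitch, bins)   # in IMU samples
spikes, _ = np.histogram(pitch_at_spikes, bins)
rate = spikes / (occupancy * 0.01 + 1e-12)      # spikes per second
```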

Gaze dynamics in mice

Figure 2: Head and eye movements in freely moving mice consist of two types of eye-head coupling. “Head tilt compensation” stabilizes gaze relative to the horizontal plane (top). “Saccade and fixate” sampling of the environment provides a sequence of stable images on the retina (bottom). (Adapted from Meyer et al., Current Biology, 2020.)

Together with John O’Keefe (University College London) and Jasper Poort (University of Cambridge), I used the head and eye tracking system to reveal the dynamical structure of head and eye movements (gaze) in freely moving mice [9]. We found that the complex head and eye movement patterns of freely moving mice can be decomposed into two distinct, independent components, each linking eye and head movements in a different way (Figure 2). The first type of eye-head coupling stabilizes the visual field relative to the ground. The second type allows the mouse to “saccade and fixate” similar to humans and non-human primates, but with gaze shifts mostly parallel to the ground. Mice thus see their environment as a sequence of stable images when moving their heads. These findings relate eye movements in the mouse to those in other species and provide a foundation for studying active vision during natural behaviors, such as visually guided navigation, in the mouse.
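The logic behind such a decomposition can be sketched as follows (illustrative only, not the analysis code from [9]): regress eye position on head tilt; the fitted part is the tilt-compensation component, and the residual retains the saccade-and-fixate component. All signals below are simulated and the coupling gains are made up.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical traces at 100 Hz: head pitch/roll (rad) and vertical
# eye position (deg).
t = np.arange(0.0, 60.0, 0.01)
head_tilt = np.column_stack([np.sin(0.5 * t), 0.5 * np.cos(0.3 * t)])
# Rare, step-like gaze shifts mimicking saccades:
saccades = np.cumsum(rng.normal(0, 1, t.size) * (rng.random(t.size) < 0.005))
eye = head_tilt @ np.array([-8.0, -4.0]) + saccades + rng.normal(0, 0.2, t.size)

# Least-squares fit of eye position on head tilt (plus offset):
# the fit is the tilt-compensation component, the residual keeps
# the saccade-and-fixate component.
A = np.column_stack([head_tilt, np.ones_like(t)])
coef, *_ = np.linalg.lstsq(A, eye, rcond=None)
tilt_component = A @ coef
saccade_component = eye - tilt_component
```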

Mouse visual cortex “sees” better in front of the animal

Figure 3: Relation between visual cortical organization and visual input. (A) Wide-field calcium imaging (left) reveals a visual cortical region with small population receptive field (pRF) size in mice (right). Smaller pRFs integrate visual input from a smaller part of visual space, resulting in an enhanced representation of that part of space in mouse visual cortex. (B) Reconstruction of optic flow in a freely moving mouse running through an environment. (C) Regions of higher spatial resolution coincide with regions of low optic flow. Optic flow fields during locomotion (body speed 10 cm/s) for the left (top) and right (bottom) eye; flow vectors were computed using head and eye positions and the geometry of the environment. Black arrows show average flow vectors across 4 mice. Green/purple shaded circles illustrate pRF size. Grey circles: pRF size at an azimuth of 50 degrees relative to the animal’s heading direction. (Adapted from van Beest et al., Nature Communications, in press.)
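For readers who want the geometry spelled out: flow vectors like those in Figure 3C follow from the standard motion-field equation for a static scene point viewed by a translating, rotating eye. The sketch below is a generic version of that computation, with hypothetical names and example values (only the 10 cm/s body speed is taken from the caption); the published analysis additionally uses the measured head/eye positions and arena geometry.

```python
import numpy as np

def optic_flow_direction(point, eye_pos, eye_vel, eye_omega):
    """Apparent angular motion of a static world point seen from an eye
    translating with eye_vel and rotating with eye_omega (rad/s)."""
    d = point - eye_pos                 # eye-to-point vector (m)
    r = np.linalg.norm(d)
    u = d / r                           # line-of-sight unit vector
    # Translation: nearby points sweep past faster (1/r scaling).
    flow_trans = -(eye_vel - u * (u @ eye_vel)) / r
    # Rotation: the whole image rotates opposite to the eye.
    flow_rot = -np.cross(eye_omega, u)
    return flow_trans + flow_rot        # rate of change of u (per second)

# Example: forward run at 10 cm/s, static ground point 20 cm ahead,
# 10 cm to the left, 3 cm below the eye (all values hypothetical).
flow = optic_flow_direction(point=np.array([0.20, 0.10, -0.03]),
                            eye_pos=np.zeros(3),
                            eye_vel=np.array([0.10, 0.0, 0.0]),
                            eye_omega=np.zeros(3))
```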

In a recent collaboration with Pieter Roelfsema’s group at the Netherlands Institute for Neuroscience, we investigated how gaze dynamics relate to cortical organization [10]. The representation of space in mouse visual cortex was thought to be relatively uniform, with no strong bias towards any particular region of space. This contrasts with primate visual cortex and its overrepresentation of the fovea, placing potential limits on the translation of research in mice to humans. We identified a previously unrecognized organization of mouse visual cortex that resembles the fovea-centric organization of human visual cortex. Using population receptive field (pRF) mapping techniques, which provide an estimate of aggregate RF sizes, we found that mouse visual cortex contains a region in which pRFs are considerably smaller than elsewhere (Figure 3A). Importantly, eye movements keep this region at strategic locations in front of the animal, centered on the horizon, where natural scenes tend to contain features such as target locations and landmarks (Figure 3B,C). This suggests that, already at an early cortical level, the mouse visual system is adapted to process information relevant for navigation.
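In its simplest form (a sketch, not the mapping procedure used in [10]), a pRF estimate amounts to fitting a 2D Gaussian to a map of responses across visual space; the fitted width is the pRF size. The response map below is simulated and all parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, x0, y0, sigma, amp):
    """Isotropic 2D Gaussian pRF model over (azimuth, elevation)."""
    x, y = coords
    return amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2))

# Hypothetical response map over azimuth/elevation (deg).
az, el = np.meshgrid(np.linspace(-60, 60, 25), np.linspace(-30, 30, 13))
rng = np.random.default_rng(3)
resp = gauss2d((az, el), 10, 5, 8, 1.0) + rng.normal(0, 0.05, az.shape)

# Fit the model; popt = (x0, y0, sigma, amp), popt[2] is the pRF size.
popt, _ = curve_fit(gauss2d, (az.ravel(), el.ravel()), resp.ravel(),
                    p0=(0, 0, 15, 1))
```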

Outlook: The integration of visual signals into the brain’s navigation network

In 2019, I joined the Donders Institute for Brain, Cognition and Behaviour (Radboud University) as a Radboud Excellence Fellow, working with Francesco Battaglia. I also hold a secondary appointment at the Sainsbury Wellcome Centre in London. At the Donders Institute, I am studying cortical regions at the intersection of the visual and navigation systems, in particular the posterior parietal cortex and the retrosplenial cortex. These regions have long been hypothesized to play a crucial role in converting visual images that constantly move with the eyes into a stable spatial code anchored either to the animal (egocentric) or to the world (allocentric). How this transformation is accomplished in the rodent cognitive mapping system remains largely theoretical [11,12]. Building on the methods developed in my previous work, I will investigate the processing steps involved in this transformation with single-cell resolution in freely moving animals. I will also investigate how knowledge of an animal’s position in the environment modifies processing in visual brain areas.
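To make the egocentric/allocentric distinction concrete, here is the underlying coordinate geometry (a textbook transform, not a claim about how the circuits implement it): a landmark seen at some egocentric distance and bearing maps to a world position via the animal’s own position and head direction. All names and values are illustrative.

```python
import numpy as np

def ego_to_allo(animal_pos, head_dir, distance, bearing):
    """Egocentric (distance, bearing relative to head direction) ->
    allocentric (world) coordinates. Angles in radians."""
    phi = head_dir + bearing  # world-frame direction to the landmark
    return animal_pos + distance * np.array([np.cos(phi), np.sin(phi)])

# A landmark 30 cm away, 45 degrees to the left of the heading, seen
# by an animal at (1.0, 0.5) facing along the x-axis:
landmark = ego_to_allo(np.array([1.0, 0.5]), 0.0, 0.30, np.pi / 4)
```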

References

  1. Meyer, A. F., Diepenbrock, J.-P., Happel, M. F., Ohl, F. W., & Anemüller, J. (2014). Discriminative learning of receptive fields from responses to non-Gaussian stimulus ensembles. PLOS ONE, 9(4), e93062.
  2. Meyer, A. F., Diepenbrock, J.-P., Ohl, F. W., & Anemüller, J. (2014). Temporal variability of spectro-temporal receptive fields in the anesthetized auditory cortex. Frontiers in Computational Neuroscience, 8, 165.
  3. Meyer, A. F., Diepenbrock, J.-P., Ohl, F. W., & Anemüller, J. (2015). Fast and robust estimation of spectro-temporal receptive fields using stochastic approximations. Journal of Neuroscience Methods, 246, 119–133.
  4. Meyer, A. F., Williamson, R. S., Linden, J. F., & Sahani, M. (2017). Models of neuronal stimulus-response functions: elaboration, estimation, and evaluation. Frontiers in Systems Neuroscience, 10, 109.
  5. Meyer, A. F., Poort, J., O’Keefe, J., Sahani, M., & Linden, J. F. (2018). A head-mounted camera system integrates detailed behavioral monitoring with multichannel electrophysiology in freely moving mice. Neuron, 100(1), 46–60.
  6. Guitchounts, G., Masís, J., Wolff, S. B. E., & Cox, D. (2020). Encoding of 3D head orienting movements in the primary visual cortex. Neuron, 108, 512–525.
  7. Aghajan, Z. M., Acharya, L., Moore, J. J., Cushman, J. D., Vuong, C., & Mehta, M. R. (2015). Impaired spatial selectivity and intact phase precession in two-dimensional virtual reality. Nature Neuroscience, 18, 121–128.
  8. Minderer, M., Harvey, C. D., Donato, F., & Moser, E. I. (2016). Virtual reality explored. Nature, 533, 324–325.
  9. Meyer, A. F., O’Keefe, J., & Poort, J. (2020). Two distinct types of eye-head coupling in freely moving mice. Current Biology, 30(11), 2116–2130.
  10. van Beest, E., Mukherjee, S., Kirchberger, L., Schnabel, U. H., van der Togt, C., Teeuwen, R. R., Barsegyan, A., Meyer, A. F., Poort, J., Roelfsema, P., & Self, M. W. (in press). Mouse visual cortex contains a region of enhanced spatial resolution. Nature Communications.
  11. Bicanski, A., & Burgess, N. (2020). Neuronal vector coding in spatial cognition. Nature Reviews Neuroscience, 21, 453–470.
  12. Wang, C., Chen, X., & Knierim, J. J. (2020). Egocentric and allocentric representations of space in the rodent brain. Current Opinion in Neurobiology, 60, 12–20.

Arne F. Meyer

Donders Institute for Brain, Cognition and Behaviour
Radboud University,
Nijmegen 6525 AJ,
The Netherlands

Sainsbury Wellcome Centre for Neural Circuits and Behaviour
University College London,
London W1T 4JG,
UK

E-mail: a1.meyer[at]donders.ru.nl