How it looks matters. Motion processing for perception and action
G. S. Masson

Prof. Dr. Guillaume S. Masson

Abstract

The ability to judge speed and direction is a fundamental property of biological visual motion
systems, governing many visuomotor behaviors, from basic ocular tracking reflexes to complex
social interactions. In humans, visual motion information is essential for organizing the perception of
our 3D visual environment into distinct, coherent and behaviorally relevant visual entities from
numerous, noisy local measurements.
numerous, noisy local information. Think of a flock of birds: despite the tremendous amount of local
motions, ones can perceive both the global (the flock) and local (a bird) trajectories at multiple
spatial and temporal scales in order to track or focus our attention back and forth between them.
To do so, several computational steps must be solved dynamically: local motion detection and
integration, object segmentation, region-based integration and ultimately, interpretation. Still, how
biological systems solve and implement these computations is poorly understood.
Over recent years, we have reappraised this question using a new class of motion stimuli called
Motion Clouds (MCs). These are dynamic random-phase textures whose energy
distributions can be finely and naturalistically specified along several dimensions (spatial and
temporal frequencies, speed/direction, spatial structure…). Using them, we have begun to better
understand how different spatiotemporal frequency channels interact to optimally compute image
speed for either motion perception or simple goal-directed actions such as smooth pursuit. We have also
developed a new approach to probing the perceptual organization of motion inputs, to understand
how and when integration and segmentation rules apply. I will discuss how these results lead us to reconsider
how low- and mid-level visual motion processing are intertwined.
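To make the core idea behind Motion Clouds concrete, the sketch below generates one frame of a random-phase texture whose amplitude spectrum is a Gaussian envelope around a chosen spatial frequency and orientation. This is only a minimal illustration of the principle described above; the function and parameter names are hypothetical and do not reproduce the actual stimuli used in this work.

```python
import numpy as np

def motion_cloud_frame(size=128, sf0=0.125, bsf=0.05,
                       theta=0.0, btheta=0.4, seed=0):
    """One frame of a Motion-Cloud-like random-phase texture.

    The amplitude spectrum is a Gaussian envelope centred on spatial
    frequency `sf0` (cycles/pixel) and orientation `theta` (radians);
    phases are drawn uniformly at random, so the image carries the
    specified energy statistics but no phase structure.
    (Illustrative sketch only; parameter names are assumptions.)
    """
    rng = np.random.default_rng(seed)
    fx, fy = np.meshgrid(np.fft.fftfreq(size), np.fft.fftfreq(size))
    f = np.sqrt(fx**2 + fy**2)        # radial spatial frequency
    ang = np.arctan2(fy, fx)          # orientation of each component
    # Gaussian envelopes on radial frequency and (wrapped) orientation
    env = np.exp(-0.5 * ((f - sf0) / bsf) ** 2)
    env *= np.exp(-0.5 * (np.angle(np.exp(1j * (ang - theta))) / btheta) ** 2)
    env[0, 0] = 0.0                   # remove the DC component
    phase = rng.uniform(0.0, 2.0 * np.pi, (size, size))
    frame = np.real(np.fft.ifft2(env * np.exp(1j * phase)))
    return frame / (np.abs(frame).max() + 1e-12)   # normalise to [-1, 1]
```

A dynamic stimulus would stack such frames while also shaping the temporal-frequency envelope, which is what lets speed and direction content be controlled independently of spatial structure.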