We tested the hypothesis that the brain uses variance-based weighting of multisensory cues to estimate head rotation when perceiving which way is up. The hypothesis predicts that the known bias in perceived vertical, which occurs when the visual environment is rotated in the vertical plane, will be reduced by the addition of visual noise. Ten healthy participants sat head-fixed in front of a vertical screen presenting an annulus filled with coloured dots, which could rotate clockwise or counter-clockwise at six angular velocities (1, 2, 4, 6, 8, 16°/s) and with six levels of noise (0, 25, 50, 60, 75, 80%). Participants kept a central bar vertical by rotating a hand-held dial, making continuous adjustments to counteract low-amplitude, low-frequency noise added to the bar's angular position. During visual rotation, the bias in verticality perception increased over time to reach an asymptotic value. Increases in visual rotation velocity significantly increased this bias, while the addition of visual noise significantly reduced it without affecting perception of visual rotation velocity. These biasing phenomena were reproduced by a model that combines a multisensory variance-weighted estimate of head rotation velocity with a gravito-inertial acceleration (GIA) signal from the vestibular otoliths. The time-dependent asymptotic behaviour depends on internal feedback loops that act to pull the brain's estimate of gravity direction towards the GIA signal. The model's prediction of our experimental data furthers our understanding of the neural processes underlying human verticality perception.
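The variance-based weighting principle named in the abstract can be sketched as inverse-variance (maximum-likelihood) cue combination, in which each cue's weight is proportional to its reliability. The sketch below is illustrative only, not the authors' full model: the function name, the two-cue simplification, and the numerical values are all assumptions for demonstration.

```python
# Minimal sketch of inverse-variance (maximum-likelihood) weighting of two
# head-rotation cues. This is the general principle behind the abstract's
# hypothesis, not the paper's full model; names and values are hypothetical.

def combine_cues(visual_est, visual_var, vestibular_est, vestibular_var):
    """Fuse two noisy velocity estimates, weighting each by its inverse variance."""
    w_vis = 1.0 / visual_var          # reliability of the visual cue
    w_ves = 1.0 / vestibular_var      # reliability of the vestibular cue
    fused = (w_vis * visual_est + w_ves * vestibular_est) / (w_vis + w_ves)
    fused_var = 1.0 / (w_vis + w_ves)  # fused estimate is more reliable than either cue
    return fused, fused_var

# Hypothetical example (deg/s): a visual cue signalling 8 deg/s rotation,
# a vestibular cue signalling 0 deg/s. Increasing visual_var (adding visual
# noise) shifts weight toward the vestibular cue and shrinks the visually
# driven component of the estimate, consistent with the reduced bias observed.
est, var = combine_cues(8.0, 4.0, 0.0, 1.0)
```

With these hypothetical numbers the visual cue receives a relative weight of 0.2, so the fused velocity estimate is pulled most of the way toward the vestibular cue; quadrupling the visual variance again would shrink the visual contribution further, which is the qualitative pattern the experiment tested.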

doi.org/10.1371/journal.pone.0227040, hdl.handle.net/1765/123943
PLoS ONE
Department of Neuroscience

Dakin, C.J. (Christopher J.), Kumar, P. (Prateek), Forbes, P.A. (Patrick A.), Peters, A. (Amy), & Day, B.L. (Brian L.). (2020). Variance based weighting of multisensory head rotation signals for verticality perception. PLoS ONE, 15(1). doi:10.1371/journal.pone.0227040