To investigate form-related activity in motion-sensitive cortical areas, we recorded cell responses to animate implied motion in macaque middle temporal (MT) and medial superior temporal (MST) cortex and investigated these areas using fMRI in humans. In the single-cell studies, we compared responses to static images of human or monkey figures walking or running left or right with responses to the same human and monkey figures standing or sitting still. We also investigated whether the view of the animate figure (facing left or right) that elicited the highest response was correlated with the preferred direction for moving random dot patterns. First, figures were presented inside the cell's receptive field. Subsequently, figures were presented at the fovea while a dynamic noise pattern was presented at the cell's receptive field location. The results show that MT neurons did not discriminate between figures on the basis of their implied motion content. Instead, response preferences for implied motion correlated with preferences for low-level visual features such as orientation and size. No correlation was found between the preferred view of figures implying motion and the preferred direction for moving random dot patterns. Similar findings were obtained in a smaller population of MST cortical neurons. Testing human MT+ responses with fMRI further corroborated the notion that low-level stimulus features might explain implied motion activation in human MT+. Together, these results suggest that prior human imaging studies demonstrating animate implied motion processing in area MT+ are best explained by sensitivity to low-level features rather than sensitivity to the motion implied by animate figures.

doi.org/10.1162/jocn.2010.21533, hdl.handle.net/1765/33960
Journal of Cognitive Neuroscience
Erasmus MC: University Medical Center Rotterdam

Lorteije, J., Barraclough, N., Jellema, T., Raemaekers, M., Duijnhouwer, J., Xiao, D., … Wezel, R. (2011). Implied motion activation in cortical area MT can be explained by visual low-level features. Journal of Cognitive Neuroscience, 23(6), 1533–1548. doi:10.1162/jocn.2010.21533