The visual world is complex and continuously changing. Nevertheless, within a few hundred milliseconds, our brain transforms patterns of light falling on our retina into coherent scene percepts. The brain presumably accomplishes this extraordinary feat by exploiting structural regularities in visual scenes to efficiently encode and analyze visual input. We wanted to know to what extent visual regularities carry information relevant for perceptual and emotional processing of visual input. To this end, we recorded behavioral and neural responses to natural pictures and movies from a number of subjects. In the behavioral experiment, the subjects were asked to assign pictures and movies to one of a fixed set of emotional categories. We found that evoked emotion can be predicted on the basis of two simple visual scene statistics: feature energy and spatial coherence. In a separate neural experiment, these two scene statistics explained ERP and fMRI responses in visual cortex and higher-order brain areas. Importantly, representational similarity analysis of ERP and fMRI responses revealed that feature energy and spatial coherence carry category-specific information: dissimilarities in neural responses to different picture and movie categories corresponded with dissimilarities between the same categories in terms of feature energy and spatial coherence. These converging computational, behavioral and neural results suggest that simple visual regularities in the natural environment mediate information relevant for perceptual and emotional categorization.
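The representational similarity analysis described above compares two representational dissimilarity matrices (RDMs): one built from neural responses to the stimulus categories, the other from the two scene statistics. A minimal sketch of this comparison follows, using synthetic data, since the paper's actual stimuli and responses are not reproduced here; the variable names (`stats`, `neural`) and the choice of Euclidean distance and Spearman rank correlation are illustrative assumptions, not the authors' exact pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Hypothetical data: rows = stimulus categories, columns = features.
# "stats" stands in for the two scene statistics (feature energy,
# spatial coherence); "neural" for ERP/fMRI response patterns.
rng = np.random.default_rng(0)
stats = rng.normal(size=(6, 2))            # 6 categories x 2 scene statistics
neural = stats @ rng.normal(size=(2, 50))  # responses driven by the statistics
neural += 0.1 * rng.normal(size=neural.shape)  # measurement noise

# RDM: vector of pairwise distances between category representations.
rdm_stats = pdist(stats, metric="euclidean")
rdm_neural = pdist(neural, metric="euclidean")

# RSA: rank-correlate the two RDMs. A high correlation means the
# scene statistics and the neural responses impose a similar
# category geometry, as reported in the abstract.
rho, p = spearmanr(rdm_stats, rdm_neural)
print(f"RDM correlation: rho = {rho:.2f}")
```

Because the synthetic neural responses are a noisy linear function of the scene statistics, the two RDMs share most of their structure and the rank correlation comes out strongly positive; with unrelated feature sets it would hover near zero.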