Helmholtz Retreat 6-8 June 2012
The title of his talk is: Where we look determines what we see: effects of eye movements on lightness perception
Abstract: Judging the lightness of visual stimuli has been studied for centuries. The light reaching the eye is the product of the illumination and the reflectance of the object, and also depends on the scene geometry. However, only the proportion of reflected light is an invariant property of the object and thus of great importance for vision. There are several well-established factors that support lightness constancy in the face of these challenges. On one hand, lateral inhibition between retinal neurons filters out shallow intensity gradients which are mostly due to illumination effects. On the other hand, more complex factors also have an effect on lightness perception, such as object shape or the interpretation of transparent surfaces. Eye movements, however, have been almost completely neglected so far, even though the visual system does sample the local properties of objects by moving the eyes around. Since visual acuity, luminance sensitivity, contrast sensitivity and color sensitivity change with retinal eccentricity, in order to finely analyze the visual scene, our visual system has to stitch together its representation of the world from many small samples. We therefore tested the hypothesis of a link between the local information sampled from those fixations and the apparent lightness of an object in a color matching task. Our results show that where we look can have massive effects on perception. When observers matched the lightness of natural objects they based their judgments on the brightest parts of the objects, and at the same time they tended to fixate points with above-average luminance. When we forced participants to fixate a specific point on the object using a gaze-contingent display setup, the matched lightness was higher when observers fixated bright regions. This indicates a causal link between the luminance of the fixated region and the lightness match for the whole object. 
Simulations with rendered physical lighting show that this fixation strategy is an efficient and simple heuristic for the visual system to arrive at accurate and invariant judgments of lightness.
About his research: The emphasis of my current research is on information processing in the visual system. Specifically, I am concerned with the relationship between low-level sensory processes, higher-level visual cognition, and sensorimotor integration. My goal is to answer the questions of how complex scenes and objects are perceived in a natural environment, how they are represented in the brain, and how visual information is used to drive the motor system.
The title of his talk is: Reflections of the legacy of Paul Bertelson.
Abstract: Paul Bertelson (1926-2008) was a professor of cognitive science at the Université Libre de Bruxelles, Belgium, and Tilburg University, the Netherlands. He started his career with Donald Broadbent (Cambridge) working on ‘mental chronometry’, and then moved to Brussels, where he worked on topics such as the refractory period, time-order perception, cerebral lateralization, spatial attention, spoken word recognition, reading, Braille, and illiteracy. In this talk, I will reflect on his multisensory work while in Tilburg. I will share some of his insights on the relation between vision and audition in space, time, and speech. I will also show some recent demos that Paul could not experience anymore, but that I am sure he would have loved.
About his research: In our lab, we investigate how information from different sense organs is combined so that a coherent representation of the world is obtained. We focus on the intersensory integration of auditory, visual, and tactile information in the spatial, temporal, phonetic, and emotional domains. Besides traditional behavioral methods, we use ERPs and fMRI with normal subjects and patients (e.g., with blindsight, neglect, or schizophrenia). Recently, we have also started to look at intersensory integration in infants.
The title of his talk is: Seeing motion.
Abstract: In this talk I will review the multisensory approach to the visual perception (no contradiction implied) of movement, as this is the approach I have taken for the better part of my scientific career. The talk will focus on the empirical paradigm developed to quantitatively measure the size and gain of what is known as the efference copy, the role of neural noise, and the effects of concurrent self-motion and vestibular stimulation. The framework explains well-known phenomena such as the Filehne illusion (seeing illusory object motion during eye movements made across the object) and the Aubert-Fleischl paradox (object velocity is underestimated when the object is pursued with the eyes), and some demonstrations will be given of illusions predicted from the theory.
About his research: Wertheim (1942) studied psychology at the Hebrew University of Jerusalem in Israel and Groningen University in the Netherlands. A large part of his career was spent at the TNO Human Factors Institute in Soesterberg. Apart from his basic research on motion perception, he has been working on a new approach to visual search and on measurement methods to quantify the concept of visual conspicuity. He also has a strong interest in applied research, having collaborated with several international organisations in human factors and cognitive ergonomics.
Heidelberglaan 2
3584 CS Utrecht