Much of the work in our lab integrates cognitive, computational, and neuroimaging techniques to better understand a broad range of visual cognitive behaviors. We monitor how people move their eyes as they perform various visual tasks, including visual search, object representation, working memory, and scene perception. We then describe this eye-movement behavior in the context of an image-based computational model and compare the model's simulated behaviors to the actual human eye movements. Through this reciprocal experimental and computational research plan, we hope to better understand not only the behavioral primitives that we enlist during a task, but also the computational language our cognitive systems use to accomplish it.
Keywords: visual attention, eye movements, computational modeling