Visual Perception and Attention Laboratory

Welcome

The sensory world is a “blooming, buzzing” confusion of information. The brain processes different types of information in different areas, then reunites them to produce the impression of a seamless, integrated world. How the brain solves this “binding problem” is still poorly understood. VPAL tackles it with multiple converging techniques, studying how the brain binds perceptual information as it directs motor behaviour in real time.

Behaviour
The first step toward understanding the neural circuitry that mediates attention and binding is to characterize participants’ behaviour. We measure the accuracy and reaction time of participants’ judgments about sensory information. Recording their eye movements with infrared cameras also lets us investigate how the brain extracts sensory information in real time.
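As an illustration of how such eye-movement traces are commonly analyzed, here is a minimal sketch of velocity-threshold saccade detection (the I-VT rule) applied to gaze samples from an infrared tracker. The function name, 1000 Hz sampling rate, and 30 deg/s threshold are illustrative assumptions, not the lab’s actual pipeline:

```python
import numpy as np

def detect_saccades(x_deg, y_deg, fs=1000.0, vel_thresh=30.0):
    """Flag saccades in a gaze trace with a velocity-threshold (I-VT) rule.

    x_deg, y_deg : gaze position in degrees of visual angle
    fs           : eye-tracker sampling rate in Hz (assumed 1000 Hz here)
    vel_thresh   : angular velocity cutoff in deg/s (30 deg/s is a
                   common, but illustrative, default)
    """
    # Sample-to-sample angular velocity in deg/s
    vx = np.diff(x_deg) * fs
    vy = np.diff(y_deg) * fs
    speed = np.hypot(vx, vy)

    # Samples moving faster than the threshold are treated as saccadic
    fast = speed > vel_thresh

    # Group consecutive fast samples into discrete saccade events
    edges = np.diff(fast.astype(int))
    onsets = np.flatnonzero(edges == 1) + 1
    offsets = np.flatnonzero(edges == -1) + 1
    if fast[0]:
        onsets = np.r_[0, onsets]
    if fast[-1]:
        offsets = np.r_[offsets, fast.size]

    # Return (onset, offset) times in seconds
    return [(on / fs, off / fs) for on, off in zip(onsets, offsets)]
```

On a 1000 Hz trace this returns (onset, offset) times in seconds for each detected saccade; production pipelines typically add minimum-duration and blink-rejection criteria on top of the raw threshold.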

Neuroimaging
EEG measures synchronized activity across populations of neurons, with high temporal resolution, while participants perform behavioural tasks. The technique lends itself well to a wide range of participants, including children and special populations. fMRI complements EEG data by providing high spatial resolution. Together they produce a clear picture of the brain activity underlying cognition.
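To make the averaging logic behind event-related potential (ERP) analysis concrete, here is a toy sketch that cuts continuous EEG into epochs around event markers, baseline-corrects each epoch, and averages across trials. The array shapes, 500 Hz sampling rate, and function name are assumptions for illustration only:

```python
import numpy as np

def erp_average(eeg, event_samples, fs=500.0, pre_s=0.2, post_s=0.8):
    """Average EEG epochs time-locked to events to estimate the ERP.

    eeg           : array of shape (n_channels, n_samples), in microvolts
    event_samples : sample indices of stimulus onsets
    fs            : sampling rate in Hz (assumed 500 Hz here)
    pre_s, post_s : epoch window around each event, in seconds
    """
    pre, post = int(pre_s * fs), int(post_s * fs)
    epochs = []
    for ev in event_samples:
        if ev - pre < 0 or ev + post > eeg.shape[1]:
            continue  # skip events too close to the recording edges
        epoch = eeg[:, ev - pre:ev + post].astype(float)
        # Baseline-correct each channel on the pre-stimulus interval
        epoch -= epoch[:, :pre].mean(axis=1, keepdims=True)
        epochs.append(epoch)
    # Averaging cancels activity that is not phase-locked to the event,
    # leaving the synchronized response (the ERP)
    return np.mean(epochs, axis=0)
```

Because the signal is preserved sample by sample, the averaged waveform can resolve processing stages on the order of milliseconds, which is what makes EEG well suited to tracking perception and attention in real time.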

Dr. Fallah and members of his lab use systems and cognitive neuroscience approaches to understand:

  • how attention works
  • how features are integrated across multiple brain areas to form object representations
  • how attention and object representations drive eye movements
  • multisensory processing, including auditory/visual and visual/proprioceptive integration
  • how the visual system prioritizes peripersonal space (the area within reach of our arms)
  • which networks in the brain perform these processes, using neuroimaging (EEG)
  • how these networks are impaired by damage such as concussion and subconcussive impacts
  • how athletics and exercise improve cognitive processing

CURRENT EXTERNALLY FUNDED PROJECTS

Assistive Gaze Tracking with the iPad TrueDepth Camera
How can the iPad gaze-tracking solution be used by individuals with limited mobility and speech impairments? The project involves both in-lab and clinical testing.
Role: Principal Investigator
Funded by: VISTA & NSERC Engage

Cortical Interactions in Eye Movements
We move our gaze around a rich, full visual world. How does the rest of the visual input affect saccades to a target of interest?
Role: Principal Investigator
Funded by: Natural Sciences and Engineering Research Council

Feature Integration and Object Processing
How are features, which are processed at different places and stages of the visual system, integrated into a coherent object representation?
Role: Principal Investigator
Funded by: Natural Sciences and Engineering Research Council

Feature Modulation of Executive Functions
How do the features of visual stimuli affect the strength of executive functions such as attention, working memory, and response inhibition? Can these effects be used to mitigate impairments in neurodegenerative diseases such as Alzheimer’s?
Role: Principal Investigator
Funded by: Natural Sciences and Engineering Research Council

SUPERVISION

Currently available to supervise graduate students: Yes

Currently taking on work-study students, graduate assistants, or volunteers: Yes

Available to supervise undergraduate thesis projects: Yes