Projects

We work at the interface between the mind, the brain, machines and the external world, using behavioral measurements (including eye-tracking), physiological measurements and computational modeling. We work with humans (and soon with animals, including non-human primates) as well as with open datasets. Much of our research focuses on vision, hearing and eye-movements, including sensory, attentional and cognitive aspects. We keep a keen eye on direct applications to devices, algorithms and human health.

We practice open science!

Funding: NSERC, McGill, CIRMMT, VHRN, IVADO and UNIQUE.

We are finally up and running in full swing after the long pandemic-imposed pause.

These are some of our ongoing projects:

Visual attentional dynamics and eye-movements

Lead: Amanda Pruss (co-supervised with Chris Pack) - We are continuing the work from the Yao et al. eLife paper and will investigate information transfer across eye-movements.

  • How anticipatory remapping along the saccade direction predicts perisaccadic biphasic mislocalization. OSF Preprints. Krishna BS (2023).

Neuro-AI

Lead: Yohai-Eliel Berreby - We are looking at both foundational and applied issues related to how artificial and biological neural networks function.

  • Correlations between different auditory temporal response properties of inferior colliculus neurons. OSF Preprints. Berreby YE, Krishna BS (2023).

Active vision

Leads: Katarzyna Jurewicz and Buxin Liao - Katarzyna is pursuing her IVADO-funded work on the computational, psychophysical and physiological aspects of active vision, while Buxin is working on modeling active vision.

  • Information integration across saccades plays a prominent role during goal-directed viewing of everyday scenes. PsyArXiv. Jurewicz K, Liao B, Krishna BS (2023).

  • Correction of saccadic decisions during free-viewing visual search in the monkey. OSF Preprints. Ipata AE, Bisley JW, Krishna BS (2023).

Audiovisual perception

Lead: Noa Kemp (co-supervised with Catherine Guastavino) - We are working on multiple projects involving visual and auditory localization, motion and objecthood.

Flow

Lead: Oren Gurevitch (co-supervised with Simone Dalla Bella) - We are beginning an exciting, highly exploratory project on characterizing the physiological and phenomenological correlates of flow states.

Strabismus

Lead: Suresh Krishna - We are starting a series of projects on visual perception and simple interventional possibilities in people with intermittent exotropia.

Google Summer of Code 2025 projects

  1. INCF - ActiveVision : a data and model portal for the study of goal-directed vision

  2. INCF - BrainHeart : an integrated open-source software tool for studying heart-brain interactions

  3. INCF - BreathState : an open-source Android/iOS and PC app for breathing and heart-rate synchronization (350h)

  4. INCF - GestureCap : a markerless, ML/AI-based gesture-recognition and motion-capture tool to drive music and speech generation and to develop neuroscientific/psychological theories of music creativity and music-movement-dance interactions

  5. INCF - SciCommons : a social-web tool for scientific discussion, interaction, rating, and peer-review

An example of a previous project, which aimed to build an open-source phone-based eye-tracker using PyTorch, is here.