Our lab works in the area of human and robot perception and cognition.  We study hearing and vision in humans using experimental psychology and dense-array EEG.  Our main interest is auditory perception and auditory attention: speech perception, sound localization, and selective attention in complex, noisy environments.  We use what we’ve learned to write AI software to control humanoid robots.

Hearing isn’t like seeing.  One sound doesn’t occlude another the way one object can block another from view.  Instead, sound waves simply add together as they propagate from their sources to our ears.  The human brain is excellent at hearing, even when many sounds are mixed together; robots aren’t very good at this.  Our goal is to understand how the human brain solves this problem, and to apply that understanding to build robots that can understand the auditory world as well as we can.
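A minimal sketch of this additive mixing, in Python.  The two tones, their frequencies, and the sample rate are illustrative assumptions, not the lab's code; the point is simply that the signal reaching the ear is the sum of the sources, and the sources must be recovered from that sum alone:

```python
import numpy as np

# Two hypothetical "sources": a 440 Hz tone and a 660 Hz tone,
# sampled at 16 kHz for one second (illustrative values).
fs = 16000
t = np.arange(fs) / fs
source_a = np.sin(2 * np.pi * 440 * t)
source_b = np.sin(2 * np.pi * 660 * t)

# Unlike opaque objects in a visual scene, sounds don't occlude each
# other: the pressure waves simply add.  The ear receives only this sum.
mixture = source_a + source_b

# Given only `mixture`, recovering source_a and source_b separately is
# the "cocktail party" separation problem described above -- easy for
# human listeners, hard for machines.
```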