My lab studies how information from one sensory system influences processing in other sensory systems, as well as how this information is integrated in the brain. Specifically, we investigate the mechanisms underlying basic auditory, visual, and tactile interactions, synesthesia, multisensory body image perception, and visual facilitation of speech perception. Our current research examines multisensory processes using a variety of techniques, including psychophysical testing and illusions, fMRI and DTI, electrophysiological measures of neural activity (both EEG and iEEG), and lesion mapping in patients with brain tumors.

Our intracranial electroencephalography (iEEG/ECoG/sEEG) recordings are a unique resource that allows us to record neural activity directly from the human brain via clinically implanted electrodes. These recordings are collected while patients perform the same auditory, visual, and tactile tasks that we use in our other behavioral and neuroimaging studies, but iEEG offers millisecond temporal resolution together with millimeter spatial precision, providing unparalleled information about the flow of neural activity through the brain. We use signal processing and machine learning methods to identify how information is encoded in the brain and how that encoding is disrupted in clinical contexts (e.g., in patients with a brain tumor).
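
To give a flavor of the decoding approach mentioned above, the sketch below shows one common way such analyses are set up: classifying stimulus modality (e.g., auditory vs. tactile trials) from trial-by-electrode iEEG high-gamma power using a cross-validated linear classifier. It is a minimal illustration with simulated data and assumed dimensions, not our actual analysis pipeline.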
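
```python
# Illustrative sketch only (simulated data, assumed dimensions), not the lab's
# actual pipeline: decode stimulus modality from hypothetical iEEG high-gamma
# features with a cross-validated linear classifier.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_electrodes = 200, 64             # assumed experiment size
labels = rng.integers(0, 2, size=n_trials)   # 0 = auditory, 1 = tactile (simulated)

# Simulated trial-by-electrode matrix of high-gamma (~70-150 Hz) power,
# with a weak modality-dependent signal added to a subset of electrodes.
features = rng.normal(size=(n_trials, n_electrodes))
features[:, :8] += 0.5 * labels[:, None]

# Standardize each electrode's features, then fit a logistic-regression decoder
# and estimate accuracy with 5-fold cross-validation.
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
accuracy = cross_val_score(decoder, features, labels, cv=5)
print(f"Cross-validated decoding accuracy: {accuracy.mean():.2f}")
```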