|Institution(s)||Max Planck Institute for Psycholinguistics, Nijmegen|
|Location||Virtual: Please send an e-mail to firstname.lastname@example.org to receive the Zoom invitation link. To be let in, use your proper name.|
During communication in real-life settings, the brain integrates information from auditory (e.g., speech) and visual (e.g., gestures) signals to form a unified percept of our environment. In this talk, I will first focus on previous work that investigated whether and how visual signals enhance speech comprehension in adverse listening conditions for both native and non-native listeners. I will then discuss our more recent work, where we used rapid invisible frequency tagging to generate steady-state evoked fields to study how visual and auditory signals interact in the brain. Finally, I will present findings from two recent dual-EEG studies that investigated multimodal language processing in interactive settings.
Dr. Linda Drijvers is a research group leader at the Max Planck Institute for Psycholinguistics, where she heads the Communicative Brain group. She completed a bachelor’s degree in Dutch Language & Literature, a research master’s degree in Cognitive Neuroscience, and a PhD at Radboud University investigating the oscillatory dynamics underlying speech-gesture integration in clear and adverse listening conditions. Her research focuses on how the brain combines what we see and hear during face-to-face communication. The core question her group aims to answer is whether and how oscillatory neural activity plays a role in integrating these different sources of information, both within and between conversational partners. She investigates this using cutting-edge techniques, including dual-EEG, MEG, rapid invisible frequency tagging, and detailed behavioural analyses of auditory and visual signals in interactive contexts.