Leibniz-Zentrum Allgemeine Sprachwissenschaft Leibniz-Gemeinschaft

Multimodal language processing in brain and behaviour

Speaker Linda Drijvers
Affiliation(s) Max Planck Institute for Psycholinguistics, Nijmegen
Date 30.11.2022
Time 11:00
Venue Virtual: Please send an e-mail to zas.xtalks@leibniz-zas.de to receive the Zoom invitation link. To be admitted, please join under your real name.

Abstract

During communication in real-life settings, the brain integrates information from auditory (e.g., speech) and visual (e.g., gestures) signals to form a unified percept of our environment. In this talk, I will first focus on previous work that investigated whether and how visual signals enhance speech comprehension in adverse listening conditions for both native and non-native listeners. I will then discuss our more recent work, where we used rapid invisible frequency tagging to generate steady-state evoked fields to study how visual and auditory signals interact in the brain. Finally, I will present findings from two recent dual-EEG studies that investigated multimodal language processing in interactive settings.

About Linda Drijvers

Dr. Linda Drijvers is a research group leader at the Max Planck Institute for Psycholinguistics, where she heads the Communicative Brain group. She completed a bachelor’s degree in Dutch Language & Literature and a research master’s degree in Cognitive Neuroscience, followed by a PhD at Radboud University investigating the oscillatory dynamics underlying speech-gesture integration in clear and adverse listening conditions. Her research asks how the brain combines what we see and hear during face-to-face communication. The core theory her group aims to test is whether and how oscillatory neural activity plays a role in integrating these different sources of information within and between conversational partners. She investigates this using cutting-edge techniques, including dual-EEG, MEG, rapid invisible frequency tagging, and detailed behavioural analyses of auditory and visual signals in interactive contexts.