
CORTICAL PHASE SYNCHRONISATION MEDIATES NATURAL FACE-SPEECH PERCEPTION

How the brain accomplishes multisensory perception remains an open question, and the neural mechanisms involved are still a matter of theoretical conjecture. According to a hypothesised cortical model of natural audiovisual stimulation, phase-synchronised communication between participating brain regions plays a mechanistic role in natural audiovisual perception. The purpose of this study was to test this hypothesis by investigating oscillatory dynamics in ongoing EEG recordings whilst participants passively viewed ecologically realistic face-speech interactions in film. Lagged-phase synchronisation measures were computed for conditions of eyes-closed rest (REST), speech-only (auditory-only, A), face-only (visual-only, V) and face-speech (audiovisual, AV) stimulation. Statistical contrasts examined AV > REST, AV > A, AV > V and AV-REST > sum(A,V)-REST effects. The results indicated that cross-communication between the frontal lobes, intraparietal associative areas and primary auditory and occipital cortices is specifically enhanced during natural face-speech perception, and that phase synchronisation mediates the functional exchange of face-speech information between sensory and associative regions in both hemispheres. Furthermore, phase synchronisation between cortical regions was modulated in parallel within multiple frequency bands.
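To illustrate the general idea behind phase-synchronisation measures between two EEG channels, the following is a minimal sketch of the classic phase-locking value (PLV), computed from instantaneous phases obtained via the analytic signal. Note this is a simplified illustration, not the specific lagged-phase synchronisation measure used in the thesis (which additionally discounts zero-lag contributions attributable to volume conduction); the signals and parameters below are toy examples.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT (equivalent to scipy.signal.hilbert)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1
    if n % 2 == 0:
        h[n // 2] = 1
        h[1:n // 2] = 2
    else:
        h[1:(n + 1) // 2] = 2
    return np.fft.ifft(spectrum * h)

def phase_locking_value(x, y):
    """PLV between two narrow-band signals: |mean(exp(i*(phi_x - phi_y)))|.

    Returns 1 for a perfectly constant phase lag, ~0 for unrelated phases.
    """
    phi_x = np.angle(analytic_signal(x))
    phi_y = np.angle(analytic_signal(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Toy example: two 10 Hz (alpha-band) signals with a fixed phase lag.
fs = 250                              # hypothetical sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)
y = np.sin(2 * np.pi * 10 * t - np.pi / 4) + 0.1 * np.random.randn(len(t))
print(phase_locking_value(x, y))      # near 1: phases are locked despite the lag
```

Because the PLV is lag-invariant, two regions communicating with a constant delay still score near 1; the lagged variant referenced in the abstract refines this by removing the zero-lag component before quantifying synchronisation.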

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-122825
Date: January 2015
Creators: Blomberg, Rina
Publisher: Linköpings universitet, Institutionen för datavetenskap
Source Sets: DiVA Archive at Uppsala University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess