How the brain solves multisensory perception remains a challenging question, and the neural mechanisms involved are still a matter of theoretical conjecture. According to a hypothesised cortical model for natural audiovisual stimulation, phase-synchronised communication between participating brain regions plays a mechanistic role in natural audiovisual perception. The purpose of this study was to test that hypothesis by investigating oscillatory dynamics in ongoing EEG recordings whilst participants passively viewed ecologically realistic face-speech interactions in film. Lagged-phase synchronisation measures were computed for conditions of eyes-closed rest (REST), speech-only (auditory-only, A), face-only (visual-only, V) and face-speech (audiovisual, AV) stimulation. Statistical contrasts examined AV > REST, AV > A, AV > V and (AV − REST) > (A + V − REST) effects. Results indicated that cross-communication between the frontal lobes, intraparietal associative areas and primary auditory and occipital cortices is specifically enhanced during natural face-speech perception, and that phase synchronisation mediates the functional exchange of face-speech information between both sensory and associative regions in both hemispheres. Furthermore, phase synchronisation between cortical regions was modulated in parallel within multiple frequency bands.
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-122825 |
Date | January 2015 |
Creators | Blomberg, Rina |
Publisher | Linköpings universitet, Institutionen för datavetenskap |
Source Sets | DiVA Archive at Uppsala University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |