ERF and scale-free analyses of source-reconstructed MEG brain signals during a multisensory learning paradigm

The analysis of human brain activity in magnetoencephalography (MEG) is generally conducted in one of two ways: either by focusing on the average response evoked by a stimulus repeated over time, known as the "event-related field" (ERF), or by decomposing the signal into functionally relevant oscillatory frequency bands (such as alpha, beta or gamma). However, most brain activity is arrhythmic, and these approaches fail to capture its complexity, particularly in the resting state. As an alternative, the analysis of the 1/f-type power spectrum observed at very low frequencies, a hallmark of scale-free dynamics, can overcome these limitations. Yet it remains unclear whether this scale-free property is functionally relevant and whether its fluctuations matter for behavior. To address this question, our first goal was to establish a visual learning paradigm that would entail functional plasticity within an MEG session. To optimize the training effects, we developed new audiovisual (AV) stimuli (an acoustic texture paired with a colored visual motion) that induced multisensory integration and indeed improved learning compared with visual training alone (V) or visual training accompanied by acoustic noise (AVn). This led us to investigate the neural correlates of these three types of training, first using a classical method, ERF analysis. After source reconstruction on each individual cortical surface using MNE-dSPM, the network involved in the task was identified at the group level. The selective plasticity observed in the human motion area (hMT+) correlated across individuals with the behavioral improvement and, in the AV condition, was supported by a larger network comprising multisensory areas. On the basis of these findings, we further explored the links between behavior and the scale-free properties of the same source-reconstructed MEG signals. Whereas most studies restrict their analysis to a global measure of self-similarity (i.e. long-range fluctuations), we also considered local fluctuations (i.e. multifractality) using the Wavelet Leader Based Multifractal Formalism (WLBMF). We found intertwined modulations of self-similarity and multifractality in the same cortical regions as those revealed by the ERF analysis. Most strikingly, the degree of multifractality observed in each individual converged during training towards a single attractor that reflected the asymptotic behavioral performance in hMT+. Finally, these findings and their associated methodological issues are compared with those arising from the ERF analysis.
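As an illustrative aside (not taken from the thesis): the global scale-free exponent mentioned above can, in its simplest form, be estimated from the slope of the log-log power spectrum at low frequencies. The Python sketch below synthesizes a 1/f-type signal and recovers its exponent via a Welch PSD fit; the function names, frequency band, and spectral-synthesis method are hypothetical choices for illustration, and the thesis' actual analysis relies on the wavelet-based WLBMF rather than this spectral fit.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)

def one_over_f_noise(n, beta=1.0, rng=rng):
    """Synthesize noise with power spectrum ~ 1/f^beta (spectral synthesis)."""
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2.0)        # amplitude ~ f^(-beta/2)
    phases = rng.uniform(0.0, 2 * np.pi, len(freqs))
    return np.fft.irfft(amp * np.exp(1j * phases), n=n)

def spectral_exponent(x, fs=1.0, fmin=0.005, fmax=0.1):
    """Estimate beta from a log-log fit of the Welch PSD in a low-frequency band."""
    f, pxx = signal.welch(x, fs=fs, nperseg=len(x) // 8)
    band = (f >= fmin) & (f <= fmax)
    slope, _ = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
    return -slope                               # PSD ~ f^(-beta)

x = one_over_f_noise(2 ** 16, beta=1.0)
beta_hat = spectral_exponent(x)
print(f"estimated beta: {beta_hat:.2f}")        # should be close to the generating beta = 1
```

This global spectral slope is one common proxy for self-similarity; it says nothing about the local fluctuations (multifractality) that the WLBMF additionally captures.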

Identifier: oai:union.ndltd.org:CCSD/oai:tel.archives-ouvertes.fr:tel-00984990
Date: 10 March 2014
Creators: Zilber, Nicolas
Publisher: Université Paris Sud - Paris XI
Source Sets: CCSD theses-EN-ligne, France
Language: English
Detected Language: English
Type: PhD thesis