This dissertation studies the neural basis of song, a universal human behavior. The relationship between words and melodies in the perception of song was investigated at phonological, semantic, melodic, and rhythmic levels of processing, using the fine temporal resolution of electroencephalography (EEG). The observations reported here may shed light on a ubiquitous human experience and also inform the debate over whether language and music share neural resources or recruit domain-specific neural mechanisms.

Experiment 1 was designed to determine whether words and melody in song are processed interactively or independently. Participants listened to sung words in which the melodies and/or the words were similar or different, and performed a same/different task while attending to the linguistic and musical dimensions in separate blocks of trials. Event-related potential (ERP) and behavioral data converged in showing interactive processing between the linguistic and musical dimensions of sung words, regardless of the direction of attention. In particular, the N400 component, a well-established marker of semantic processing, was modulated by musical melody. The observation that variations in musical features affect lexico-semantic processing in sung language was a novel finding with implications for shared neural resources between language and music.

Experiment 2 was designed to explore the idea that well-aligned text-settings, in which strong syllables occur on strong beats, capture listeners' attention and help them understand song lyrics. EEG was recorded while participants listened to sung sentences whose linguistic stress patterns were well-aligned, misaligned, or variably aligned with the musical meter, and performed a lexical decision task on subsequently presented visual targets.
Results showed that induced beta and evoked gamma power were modulated differently for well-aligned and misaligned syllables, and that task performance was adversely affected when visual targets followed misaligned and variably aligned sentences. These findings suggest that alignment of linguistic stress and musical meter in song enhances beat tracking and linguistic segmentation by entraining periodic fluctuations in high-frequency brain activity to the stimuli. A series of follow-up studies is outlined to further investigate the relationship between rhythmic attending in speech and music, and the influence of metrical alignment in songs on childhood language acquisition.

by Reyna Leigh Gordon.
Thesis (Ph.D.)--Florida Atlantic University, 2010.
Includes bibliography.
Electronic reproduction. Boca Raton, Fla., 2010. Mode of access: World Wide Web.
Identifer | oai:union.ndltd.org:fau.edu/oai:fau.digital.flvc.org:fau_3532 |
Contributors | Gordon, Reyna Leigh., Charles E. Schmidt College of Science, Center for Complex Systems and Brain Sciences |
Publisher | Florida Atlantic University |
Source Sets | Florida Atlantic University |
Language | English |
Detected Language | English |
Type | Text, Electronic Thesis or Dissertation |
Format | xii, 194 p. : ill. (some col.), electronic |
Rights | http://rightsstatements.org/vocab/InC/1.0/ |