41

Shake your rattle down to the ground: infants' exploration of objects relative to surface.

Morgante, James D. 01 January 2006 (has links) (PDF)
No description available.
42

The role of experience in infants' representations of unseen, sounding objects.

Sylvia, Monica R. 01 January 1999 (has links) (PDF)
No description available.
43

The role of vision in infants' precision reaching.

Johnson, Renee L. 01 January 2001 (has links) (PDF)
No description available.
44

Tracking infant attention to talking faces

Unknown Date (has links)
Speech perception plays an important role in how infants begin to produce speech. This study aims to understand how changes in infant selective attention to various parts of talking faces guide their understanding of speech and their subsequent production. In this study, we tracked infant (4-12 months of age) and adult gaze patterns to determine where on a face they attend when hearing and seeing the face speak in either their native language (English) or a non-native language (Spanish). We also tracked infant selective attention to silent moving and silent static faces to determine whether these would elicit different patterns of attention. The findings suggest that there are two shifts in infant attention. The first shift occurs between four and eight months of age, when infants shift their gaze to the mouth of the talking face. The second shift occurs around twelve months of age, when infants begin to return their gaze to the eye region when hearing and seeing their native language but continue to attend to the mouth region when hearing and seeing the non-native language. Overall, the results of this study suggest that changes in selective attention to talking faces guide the development of speech production and depend on early language experience. / by Amy H. Tift. / Thesis (M.A.)--Florida Atlantic University, 2012. / Includes bibliography. / Mode of access: World Wide Web. / System requirements: Adobe Reader.
45

Infants' perception of synesthetic-like multisensory relations

Unknown Date (has links)
Studies have shown that human infants can integrate the multisensory attributes of their world and, thus, have coherent perceptual experiences. Multisensory attributes can specify either non-arbitrary properties (e.g., amodal stimulus/event properties and typical relations) or arbitrary ones (e.g., visuospatial height and pitch). The goal of the current study was to expand on Walker et al.'s (2010) finding that 4-month-old infants looked longer at rising/falling objects when accompanied by rising/falling pitch than when accompanied by falling/rising pitch. We did so by conducting two experiments. In Experiment 1, our procedure matched Walker et al.'s (2010) single-screen presentation, while in Experiment 2 we used a multisensory paired-preference procedure. Additionally, we examined infants' responsiveness to these synesthetic-like events at multiple ages throughout development (4, 6, and 12 months of age). ... In sum, our findings indicate that the ability to match changing visuospatial height with rising/falling pitch does not emerge until the end of the first year of life and call into question Walker et al.'s (2010) claim that 4-month-old infants perceive audiovisual synesthetic relations in a manner similar to adults. / by Nicholas Minar. / Thesis (M.A.)--Florida Atlantic University, 2013. / Includes bibliography. / Mode of access: World Wide Web. / System requirements: Adobe Reader.
46

Investigating the Mechanisms Underlying Infant Selective Attention to Multisensory Speech

Unknown Date (has links)
From syllables to fluent speech, it is important for infants to quickly learn and decipher linguistic information. To do this, infants must use not only their auditory perception but also their visual perception to understand speech and language as a coherent multisensory event. Previous research by Lewkowicz and Hansen-Tift (2012) demonstrated that infants shift their allocation of visual attention from the eyes to the mouth of a speaker's face throughout development as they become interested in speech production. This project examined how infants from 4 to 14 months of age allocate their visual attention during increasingly complex speech tasks. In Experiment 1, infants were presented with upright and inverted faces vocalizing syllables. In response to the upright faces, 4-month-old infants attended to the eyes, and 8- and 10-month-olds attended equally to the eyes and mouth; in response to the inverted faces, both the 4- and 10-month-olds attended equally to the eyes and mouth, but the 8-month-olds attended to the eyes. In Experiment 2, infants were presented with a phoneme matching task (Patterson & Werker, 1999, 2002, 2003), and the results demonstrated that the 4-month-old infants successfully matched the voice to the corresponding face, but that older infants did not. Measures of selective attention during this task showed that the 4-month-olds attended more to the eyes of the faces, not attending to the redundant speech information at the mouth, whereas the older infants attended equally to the eyes and mouth even though they did not match the voice to the face. Experiment 3 presented 12-14-month-old infants with a fluent-speech matching task (Lewkowicz et al., 2015); although these infants did not systematically match the voice to the corresponding face, they attended more to the mouth region, which would have provided them with the necessary redundant information. Overall, these studies demonstrate that there are developmental changes in how infants distribute their visual attention to faces as they learn about speech and that the complexity of the speech is a critical factor in how they allocate their visual attention. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2015. / FAU Electronic Theses and Dissertations Collection
47

Multisensory Cues Facilitate Infants’ Ability to Discriminate Other-Race Faces

Unknown Date (has links)
Our everyday world consists of people and objects that are usually specified by dynamic and concurrent auditory and visual attributes, which is known to increase perceptual salience and, therefore, to facilitate learning and discrimination in infancy. Interestingly, early experience with faces and vocalizations has two seemingly opposite effects during the first year of life: 1) it enables infants to gradually acquire perceptual expertise for the faces and vocalizations of their own race, and 2) it narrows their ability to discriminate other-race faces (Kelly et al., 2007). It is not known whether multisensory redundancy might help older infants overcome the other-race effect reported in previous studies. The current project investigated the discrimination of dynamic, vocalizing other-race faces in younger and older infants using habituation and eye-tracking methodologies. Experiment 1 examined 4-6- and 10-12-month-old infants' ability to discriminate either a native or a non-native face articulating the syllable /a/. Results showed that both the 4-6- and the 10-12-month-olds successfully discriminated the faces, regardless of whether they were same- or other-race faces. Experiment 2 investigated the contribution of auditory speech cues by repeating Experiment 1 in silence. Results showed that only the 10-12-month-olds tested with native-race faces successfully discriminated them. Experiment 3 investigated whether it was speech per se or sound in general that facilitated discrimination of the other-race faces in Experiment 1 by presenting a synchronous, computer-generated "boing" sound instead of audible speech cues. Results indicated that the 4-6-month-olds discriminated both types of faces but that the 10-12-month-olds only discriminated own-race faces. These results indicate that auditory cues, along with dynamic visual cues, can help infants overcome the perceptual narrowing previously reported for static, silent other-race faces. Critically, our results show that older infants can overcome the other-race effect when dynamic faces are accompanied by speech but not when they are accompanied by non-speech cues. Overall, a generalized auditory facilitation effect was found as a result of multisensory speech. Moreover, our findings suggest that infants' ability to process other-race faces following perceptual narrowing is more plastic than previously thought. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2015. / FAU Electronic Theses and Dissertations Collection
48

Examining the functions of infant musicality within a childcare community

Baxani, Nita January 2018 (has links)
The purpose of this case study was to explore and understand the function of music in an infant community. By observing the musical behaviors of seven children under the age of two in both childcare and home settings, I sought to gain new insights that can inform parents, caregivers, and educators about the engagement with and possible functions of music for infants. The theories of Communicative Musicality and psychobiological needs informed this study and provided the lenses through which I observed infant musicality. Data collection comprised semi-structured interviews with parents at home, interviews with teachers, weekly infant room observation fieldnotes, weekly infant music class video observations, parent and teacher diary entries, and artifacts such as memos, videos, and photos from the childcare and home settings. Data analysis involved identifying infant musical behaviors and their possible functions with respect to the child’s musical experience, framed as episodes. Through the use of portraiture, the individual music making of each infant was described within the contexts of the home, school, field observation, and music class settings, and relationships that developed through musical interactions were highlighted within the infant community. Results indicate that vocal and movement behaviors were the most prominent behaviors identified overall, and communication had the highest frequency of all functions. In contrast to the school-based teacher and researcher field observation settings where vocal behaviors were high, movement behaviors were identified as most prevalent during music class. The child-centered emergent curriculum provided space for the infants to demonstrate choice and leadership by setting up musical toys, pointing to an instrument, moving to indicate direction in a song, bringing song books to adults, singing fragments of songs, participating on the periphery, and gesturing for more. Infants listened and engaged in music class by moving and playing instruments and displayed their attentiveness by later recalling and initiating these activities in other settings. Increased infant room vocalizations outside music time included those resulting from delayed imitation and extensions from music class. Music is a social endeavor wherein infants build community, motivating leadership, friendship, and kinship.
49

The development of accuracy in early speech acquisition: relative contributions of production and auditory perceptual factors

Warner-Czyz, Andrea Dawn 28 August 2008 (has links)
Not available / text
50

Why are attractive faces preferred? An electrophysiological test of averageness theory

Griffin, Angela Marie 28 August 2008 (has links)
Not available / text