Within the current decade, the number of Hispanic students has doubled, and Spanish speakers now make up about 16% of the total student population in the United States (U.S. Census Bureau, 2017). With this growing population comes a responsibility to understand and implement best practices for educating these students. Because literacy is a building block for learning, one integral part of this responsibility is developing valid and reliable means of assessing pre-reading skills that are predictive of later reading abilities (Lonigan, Burgess, & Anthony, 2000; Wagner, Torgesen, & Rashotte, 1994).
English-language learning children are identified as having reading difficulties and disabilities two to three years later than their English-proficient peers (Chu & Flores, 2011). As a population, they are also disproportionately misidentified as having reading difficulties or disabilities and placed unnecessarily into special education (McCardle, Mele-McCarthy, Cutting, Leos, & D’Emilio, 2005b; Sanatullova-Allison & Robinson-Young, 2016). According to a nationwide survey of speech-language pathologists, one major contributing factor to this problem is the lack of appropriate assessment instruments (Roseberry-McKibbin, Brice, & O’Hanlon, 2005).
Phonological awareness is the ability to focus on and manipulate units of spoken language (words, syllables, onsets, rimes, and/or phonemes), and it is one of the most significant predictors of later reading abilities. A large body of evidence supports this relationship not only in English but also in other alphabetic languages, such as Spanish (e.g., Carillo, 1994; Durgunoglu, Nagy, & Hancin-Bhatt, 1993; Schneider, Kuspert, Roth, Vise, & Marx, 1997). Assessments of phonological awareness have thus been shown to be reliable measures that predict later reading abilities in Spanish-speaking and English-proficient children alike (Farver, Nakamoto, & Lonigan, 2007).
Many standardized assessments are available to test phonological awareness as an emergent literacy skill in English. Consistent with the nationwide survey mentioned above, Spanish-language assessments of phonological awareness are far less abundant. Those that exist tend to be expensive, time-consuming to administer, and dependent on administrator training. They are also static in nature and regularly require the child to comprehend complex administration instructions, which is often problematic for children with limited language skills in Spanish and/or English (Barker, Bridges, & Saunders, 2014).
The current study aims to build upon existing data on the development of the DAPA-S, a dynamic assessment of phonological awareness in Spanish, by evaluating the validity of a shorter version of the instrument (the DAPA-S Short Form) with children from Spanish-speaking backgrounds. The DAPA-S Short Form was designed to retain all of the test items of the full version while restructuring the test to allow a significantly shorter administration time. Both versions are computerized, use simple instructions, provide information about a child’s ability to learn from instruction, and do not require spoken responses.
Twelve participants were enrolled in this study and given the DAPA-S Short Form as well as other assessments of phonological awareness and emergent reading. Three of those participants did not complete the study because of poor attendance or behavioral challenges; this study therefore reports on the nine participants who completed the full assessment battery.
To investigate concurrent validity, correlational analysis was performed between the DAPA-S Short Form scores and scores from a measure of phonological awareness, the Test of Phonological Sensitivity in Spanish (TOPSS; Brea, Silliman, Bahr, & Bryant, 2003). The Elision, Rapid Automatic Naming (RAN), and Letter Name/Letter Sound subtests of the TOPSS were administered. No significant correlations were observed between either DAPA-S Short Form subtest and any of the TOPSS subtests (r = .49 for Elision, r = .36 for RAN, r = .43 for Letter Name/Letter Sound). Therefore, concurrent validity was not established as measured in this study.
To investigate convergent validity, correlational analysis was performed between the DAPA-S Short Form subtests and scores from a measure of Spanish emergent reading skills, the Letter-Word Identification (LWID) subtest of the Woodcock-Muñoz Language Survey – Revised (WMLS-R; Woodcock, Muñoz-Sandoval, Ruef, & Alvarado, 2005). A significant correlation was observed between the First Syllable subtest of the DAPA-S Short Form and the measure of emergent literacy (r = .87, p < .01); no significant correlation was observed for the Last Syllable subtest (r = .44). The First Syllable subtest of the DAPA-S Short Form therefore demonstrated good convergent validity, while the Last Syllable subtest did not.
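To illustrate the kind of correlational analysis described above, the following minimal sketch uses Python with SciPy; the score vectors are hypothetical placeholders for nine participants, not data from the study.

```python
# Minimal sketch of a Pearson correlation between two sets of test scores.
# All scores below are hypothetical placeholders, not the study's data.
from scipy.stats import pearsonr

# Hypothetical DAPA-S Short Form First Syllable scores for nine participants
dapa_first_syllable = [12, 15, 9, 20, 18, 7, 14, 16, 11]
# Hypothetical WMLS-R Letter-Word Identification (LWID) scores for the same children
wmls_lwid = [24, 30, 18, 41, 37, 15, 28, 33, 22]

r, p = pearsonr(dapa_first_syllable, wmls_lwid)
print(f"r = {r:.2f}, p = {p:.3f}")  # significance typically judged at p < .05
```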
The data suggest that the DAPA-S Short Form demonstrates excellent internal reliability (Cronbach’s alpha = .99 for both subtests) but requires modifications and further testing with a larger sample before it can be considered a valid measure of phonological awareness. If developed through further research, the DAPA-S Short Form, as well as the full version of the assessment, could prove to be an invaluable tool in educational and clinical settings.
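For reference, Cronbach’s alpha can be computed directly from an item-response matrix; the sketch below shows one common formulation in Python with NumPy, using hypothetical responses rather than the study’s data.

```python
# Minimal sketch of Cronbach's alpha: rows are participants, columns are test items.
# The response matrix below is hypothetical, not the study's data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores)"""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Example: nine hypothetical participants answering six hypothetical items (1 = correct)
responses = np.array([
    [1, 1, 1, 0, 1, 1],
    [0, 0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
    [1, 0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 0],
    [1, 1, 0, 1, 1, 1],
    [0, 1, 0, 0, 0, 1],
    [1, 1, 1, 1, 0, 1],
])
print(f"alpha = {cronbach_alpha(responses):.2f}")
```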
Identifier | oai:union.ndltd.org:USF/oai:scholarcommons.usf.edu:etd-8581
Date | 29 June 2018
Creators | Wyman Chin, Kelsey R.
Publisher | Scholar Commons
Source Sets | University of South Florida
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Graduate Theses and Dissertations |