1. Exploring Differential Item Functioning on reading achievement between English and isiXhosa. Mtsatse, Nangamso, January 2017.
Post-apartheid South Africa has undergone an educational language policy shift from only Afrikaans and English in education to the representation of all 11 official languages: Afrikaans, English, isiZulu, isiXhosa, isiNdebele, siSwati, Sepedi, Sesotho, Setswana, Tshivenda and Xitsonga. The national language policy included the Language in Education Policy (LiEP), which stipulates that learners in Grades 1-3 should, wherever possible, be given the opportunity to be taught in their home language (HL). With this change, there has been a need to increase access to African languages in education. The 2007 Status of LoLT report released by the Department of Education (DoE) revealed that since 1996 up to 65% of learners in the foundation phase have been taught in their home language. In this respect, the LiEP has been successful in bridging the gap of access to African languages in the basic education system.

At the same time, there has been rapid growth of interest in early childhood cross-cultural literacy assessment across the globe. Internationally, South Africa has participated in the Southern and Eastern Africa Consortium for Monitoring Educational Quality (SACMEQ) as well as the Progress in International Reading Literacy Study (PIRLS). The design of these international studies meant participation in the same assessment but in different languages, calling into question the equivalence of assessments across languages. Assessing across languages should aim to ensure linguistic, functional, cultural and metric equivalence. South Africa has taken part in three cycles of PIRLS. The purpose of the current study is to present a secondary analysis of the prePIRLS 2011 data in order to investigate differential item functioning (DIF) in the achievement scores between English and isiXhosa.

The Organisation for Economic Co-operation and Development (OECD) developed a framework of input, process and output for the curriculum process. The framework shows the multiple facets that need to be considered when implementing a curriculum in a country, and it was used as the theoretical framework for this study. The framework views curriculum success as a process of measuring how the intended curriculum (input) was implemented (process) and how this is reflected in the attained curriculum (output). In the adapted framework, the LiEP serves as the intended curriculum, as learners in prePIRLS 2011 are tested in the LoLT of Grades 1-3. The prePIRLS 2011 assessment serves as the implemented curriculum, testing the comprehension skills learners are required to have by Grade 4 in their HL. Lastly, the attained curriculum refers to the learners' achievement scores in the prePIRLS 2011 study.

A sample of 819 Grade 4 learners (539 English L1-speaking learners and 279 isiXhosa L1-speaking learners) who participated in the prePIRLS 2011 study was included in this study. These learners wrote a literary passage called The Lonely Giraffe, accompanied by 15 items. The study made use of the Rasch model to investigate any evidence of Differential Item Functioning (DIF) in the learners' reading achievement. The findings showed that the items did not reflect an equal distribution. In addition, an item-by-item DIF analysis revealed that some items favoured one subgroup over the other. A further investigation showed that this could be explained by inaccurate linguistic equivalence.
The lack of linguistic equivalence could be explained by mistranslation and/or dialectal differences. Subsequently, the complexities of dialects in African languages are presented by providing alternative isiXhosa translations of the items. The significance of the current study lies in its potential contribution to a further understanding of language complexities in large-scale assessments, and to attempts to provide valid, reliable and fair assessment data across subgroups. / Dissertation (MEd)--University of Pretoria, 2017. / Science, Mathematics and Technology Education / Centre for Evaluation & Assessment (CEA) / MEd / Unrestricted
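The abstract above describes an item-by-item DIF screen of 15 items across English and isiXhosa learners. The dissertation used the Rasch model for this; purely as an illustrative sketch of the underlying idea, the hypothetical snippet below screens a single dichotomous item for uniform DIF with a logistic-regression likelihood-ratio test that conditions on total score. This is a common alternative to Rasch-based DIF, not the study's own method, and all variable names are assumptions.

import numpy as np
from scipy.stats import chi2
import statsmodels.api as sm

def uniform_dif_screen(item_correct, total_score, group):
    """Likelihood-ratio test for uniform DIF on a single dichotomous item.

    item_correct: 0/1 responses to the item
    total_score:  each learner's total test score (ability proxy)
    group:        0 = reference language group, 1 = focal language group
    """
    item_correct = np.asarray(item_correct, dtype=float)
    base = sm.add_constant(np.asarray(total_score, dtype=float))
    full = np.column_stack([base, np.asarray(group, dtype=float)])

    m_base = sm.Logit(item_correct, base).fit(disp=0)   # ability only
    m_full = sm.Logit(item_correct, full).fit(disp=0)   # ability plus group term

    lr = 2 * (m_full.llf - m_base.llf)   # improvement gained from the group term
    p_value = chi2.sf(lr, df=1)          # 1 df: the added group coefficient
    return lr, p_value

In the prePIRLS setting one would, hypothetically, loop this over the 15 items of The Lonely Giraffe with English coded 0 and isiXhosa coded 1, and flag items with small p-values (after a multiple-comparison correction) for the kind of translation review described above.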
2. Exploring the scalar equivalence of the Picture Vocabulary Scale of the Woodcock Muñoz Language Survey across rural and urban isiXhosa-speaking learners. Brown, Qunita, January 2012.
Magister Artium (Psychology) - MA(Psych) / The fall of apartheid and the rise of democracy have brought assessment issues in multicultural societies to the forefront in South Africa. The rise of multicultural assessment demands the development of tests that are culturally relevant so as to enhance fair testing practices, and issues of bias and equivalence of tests become increasingly important. This study forms part of a larger project titled the Additive Bilingual Education Project (ABLE). The Woodcock Muñoz Language Survey (WMLS) was specifically selected to evaluate the language aims in the project, and was adapted from English to isiXhosa. Previous research has indicated that one of the scales in the adapted isiXhosa version of the WMLS, namely the Picture Vocabulary Scale (PV), displays some item bias, or differential item functioning (DIF), across rural and urban isiXhosa learners. Research has also indicated that differences in dialects can have an impact on test takers' scores. It is therefore essential to explore the structural equivalence of the adapted isiXhosa version of the WMLS on the PV scale across rural and urban isiXhosa learners, and to ascertain whether DIF affects the extent to which the same construct is measured across both groups. The results contribute to establishing the scalar equivalence of the adapted isiXhosa version of the WMLS across rural and urban isiXhosa-speaking learners. Secondary Data Analysis (SDA) was employed because this allowed the researcher to re-analyse the existing data in order to further evaluate construct equivalence. The sample of the larger study consisted of 260 learners, both male and female, selected from a population of Grade 6 and 7 learners attending schools in the Eastern Cape. The data were analysed using the statistical programme Comprehensive Exploratory Factor Analysis (CEFA) and the Statistical Package for the Social Sciences (SPSS). Exploratory factor analysis and Tucker's phi coefficient were used. The results indicated distinct factor loadings for both groups, but slight differences were observed which raised concerns about construct equivalence. Scatter plots were employed to investigate further, and these also gave cause for concern. It was therefore concluded that construct equivalence was only partially attained. In addition, Cronbach's alpha per factor was calculated, showing that internal consistency was displayed only for Factor 1 in the rural group, and not for Factor 2 in the rural group or for either factor in the urban group. Scalar equivalence across the two groups must therefore be explored further.
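The abstract above leans on two statistics: Tucker's phi (the congruence between the rural and urban factor-loading vectors) and Cronbach's alpha (internal consistency per factor). As a minimal sketch of both computations, assuming hypothetical loading vectors and an (n_persons, n_items) score matrix, something like the following could be used:

import numpy as np

def tuckers_phi(loadings_rural, loadings_urban):
    """Tucker's congruence coefficient between two factor-loading vectors.
    Values near 1 indicate that the factor looks the same in both groups."""
    a = np.asarray(loadings_rural, dtype=float)
    b = np.asarray(loadings_urban, dtype=float)
    return np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2))

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_persons, n_items) matrix of item scores."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_variance = x.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

Congruence values close to 1 (conventionally around .95 and above) are read as factorial similarity; lower values would echo the concerns about construct equivalence raised above.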
3. An evaluation of group differences and item bias, across rural isiXhosa learners and urban isiXhosa learners, of the isiXhosa version of the Woodcock Muñoz Language Survey (WMLS). Silo, Unathi Lucia, January 2010.
Magister Psychologiae - MPsych / In many countries defined by multilingualism, language has been identified as a major influence on psychological and educational testing. In South Africa (SA), factors such as changes in policies and social inequalities also influence testing. The literature supports the translation and adaptation of tests used in such contexts in order to avoid bias caused by language. Different language versions of tests then need to be evaluated for equivalence, to ensure that scores across the different language versions have the same meaning. Differences in dialects may also impact on the results of such tests. Results of an isiXhosa version of the Woodcock Muñoz Language Survey (WMLS), which is a test used to measure isiXhosa learners' language proficiency, show significant mean differences in test scores across rural and urban first-language speakers of isiXhosa. These results have indicated a possible problem regarding rural and urban dialects during testing. This thesis evaluates the item bias of the subtests in this version of the WMLS across rural and urban isiXhosa learners. This was accomplished by evaluating the reliability and item characteristics for group differences, and by evaluating differential item functioning across these two groups on the subtests of the WMLS. The sample in this thesis comprised 260 isiXhosa learners from the Eastern Cape Province in Grade 6 and Grade 7, both males and females. This sample was collected in two phases: (1) secondary data from 49 rural and 133 urban isiXhosa learners were included in the sample; (2) adding to the secondary data, primary data were collected from a further 78 rural isiXhosa learners to equalise the two sample groups. All ethical considerations were addressed in this thesis. The results were surprising and unexpected. Two of the subtests in the WMLS showed evidence of scalar equivalence, as only a few items were identified as problematic. However, two of the subtests demonstrated more problematic items. These results mean that the two subtests of the WMLS that demonstrated evidence of scalar equivalence can be used to measure the construct of language proficiency, while the other two subtests that showed problematic items need to be further investigated, as the responses given by learners on these items seem to be determined by their group membership and not by their ability.
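The item-bias evaluation described above hinges on checking, item by item, whether rural and urban learners of comparable ability differ in their chances of answering correctly. As a minimal, hypothetical sketch of one standard screen for this, the Mantel-Haenszel procedure stratified on total score (not necessarily the exact procedure used in the thesis; all names and the group coding are assumptions), one could write:

import numpy as np
from scipy.stats import chi2

def mantel_haenszel_dif(item_correct, group, total_score):
    """Continuity-corrected Mantel-Haenszel chi-square for uniform DIF on one
    dichotomous item, stratifying learners by total test score.
    group: 0 = urban (reference), 1 = rural (focal) -- hypothetical coding."""
    item_correct = np.asarray(item_correct)
    group = np.asarray(group)
    total_score = np.asarray(total_score)

    deviation, variance = 0.0, 0.0
    for s in np.unique(total_score):
        m = total_score == s
        a = np.sum((group[m] == 0) & (item_correct[m] == 1))  # reference, correct
        b = np.sum((group[m] == 0) & (item_correct[m] == 0))  # reference, incorrect
        c = np.sum((group[m] == 1) & (item_correct[m] == 1))  # focal, correct
        d = np.sum((group[m] == 1) & (item_correct[m] == 0))  # focal, incorrect
        n = a + b + c + d
        if n < 2 or (a + b) == 0 or (c + d) == 0:
            continue  # stratum carries no between-group information
        expected_a = (a + b) * (a + c) / n
        var_a = (a + b) * (c + d) * (a + c) * (b + d) / (n ** 2 * (n - 1))
        deviation += a - expected_a
        variance += var_a

    if variance == 0:
        return float("nan"), float("nan")
    stat = (abs(deviation) - 0.5) ** 2 / variance
    return stat, chi2.sf(stat, df=1)

Run per item over a subtest, items with large statistics (small p-values) are the candidates whose responses appear to depend on group membership rather than ability, matching the pattern reported for two of the four subtests above.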