1.
Preserving 20 Years of TIMSS Trend Measurements: Early Stages in the Transition to the eTIMSS Assessment
Fishbein, Bethany. January 2018.
Thesis advisor: Ina V.S. Mullis
This dissertation describes the foundation for maintaining TIMSS’ 20-year trend measurements through the introduction of a new computer- and tablet-based mode of assessment delivery: eTIMSS. Because mode effects could alter the psychometric behavior of the trend items that TIMSS relies on to keep scores comparable across assessment cycles, development efforts for TIMSS 2019 began more than three years in advance. This dissertation documents the development of eTIMSS over this period and presents the methodology and results of the eTIMSS Pilot/Item Equivalence Study. The study was conducted in 25 countries and employed a within-subjects, counterbalanced design to determine the effect of the mode of administration on the trend items. Further analysis examined score-level mode effects in relation to students’ socioeconomic status, gender, and self-efficacy for using digital devices, and strategies are discussed for mitigating threats of construct-irrelevant variance to students’ eTIMSS performance. The analyses by student subgroup, together with similar item discriminations, high cross-mode correlations, and equivalent rankings of country means, support the equivalence of the mathematics and science constructs between paperTIMSS and eTIMSS. However, the results revealed an overall mode effect on the TIMSS trend items: items were more difficult for students in digital formats than on paper, and the effect was larger in mathematics than in science. An approach is therefore needed to account for mode effects when carrying trend measurements forward from previous cycles to TIMSS 2019. Each eTIMSS 2019 trend country will administer the paper trend booklets to an additional nationally representative bridge sample of students, and a common population equating approach will secure the link between paperTIMSS and eTIMSS scores.
Thesis (PhD), Boston College, 2018. Submitted to: Boston College, Lynch School of Education. Discipline: Educational Research, Measurement and Evaluation.
2.
Validation of the performance of Tshivenda learners in PIRLS 2006
Labuschagne, Melissa J. January 2015.
The aim of this study is to validate the performance of Tshivenda learners in the Progress in International Reading Literacy Study (PIRLS) 2006, in which the Tshivenda language group presented an anomaly. The performance of the Tshivenda learners is compared with that of learners who wrote the PIRLS 2006 test in the other official South African languages. Performance is linked to translation equivalence: if the learners wrote equivalent instruments across all official languages, then differences in performance may be attributable to differences in translation equivalence. The validation of learner performance in this study is therefore directly linked to the validation of the translation.
The South African national results of PIRLS 2006 revealed that Tshivenda speakers who had written the PIRLS tests in a second language achieved higher scores than those who had written the tests in their mother tongue, Tshivenda. This result was considered an anomaly. This research investigated the role of translation as a factor influencing learner comprehension that may have contributed to it.
The PIRLS 2006 procedures and standards for translation and verification were examined. Issues of language and culture, with specific reference to the availability of media in Tshivenda, are discussed in the literature. The study further investigated what translation entails, including back-translation, equivalence and non-equivalence, as well as the comprehension processes required by each of the four released PIRLS 2006 texts.
This study is a secondary analysis of data gathered for PIRLS 2006. Permission to use the data was granted in 2011 by the Centre for Evaluation and Assessment at the University of Pretoria, the PIRLS National Centre. Details of the original sampling, collection, and analysis methods are provided as part of the discussion of the quality assurance, validity, and reliability of the original study. The secondary analysis used a mixed-methods approach, combining Classical Test Theory and content analysis to explore the data. The results indicated that, although the back-translation revealed many errors, the translation did not affect the learners’ level of comprehension.
Dissertation (MEd), University of Pretoria, 2015. Science, Mathematics and Technology Education.
3.
Using an ecological approach to analyse the results of standardized assessments: reading performance on the PASEC2014-Cameroon tests
Alioum. 09 1900.
The purpose of this study is to support the use of an ecological approach for the analysis of data from a standardized assessment. We implement this approach using a Latent Class Analysis (LCA) with covariates on the reading test data of the Programme d’Analyse des Systèmes Éducatifs de la CONFEMEN (PASEC) for francophone Grade 6 students in Cameroon (N = 617), and show how such an approach can cast new light on the results of this assessment.
Standardized large-scale assessment programmes aim to evaluate individuals’ learning and skills and provide decision-support data in many countries (Hogan, 2017; Loye, 2011; Wagemaker, 2014). In these assessments, performance is estimated solely from the responses candidates provide, and hence from their cognitive abilities (Zumbo et al., 2015). Yet a growing body of research suggests treating test performance as a phenomenon that unfolds within an interconnected network of knowledge, individual characteristics, and particular contexts (McNamara, 2007; McNamara & Roever, 2006; Mislevy, 2018; Zumbo et al., 2015). The ecological approach, which falls within this perspective, attends to contextual, social, and cultural assumptions when estimating test performance (McNamara, 2007; McNamara & Roever, 2006; Zumbo et al., 2015).
Our results highlight reading-performance ecologies that vary by region. Within each ecology, the reading-performance profiles that emerge depend on characteristics of the students and of the school and out-of-school environments in which they develop. We thus emphasise the situated nature of test performance, allowing a fairer reading of the performance of different candidates. In doing so, we formulate recommendations that take account of candidates’ contextual realities, in contrast to the uniform recommendations often issued following large-scale assessment results.
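To make the latent class methodology concrete, the sketch below fits a plain two-class latent class model to simulated binary item responses via the EM algorithm. It is a minimal illustration only: it omits the covariate extension used in the study, and the simulated data, class count, and item probabilities are all made up, not PASEC values.

```python
import numpy as np

rng = np.random.default_rng(0)

def lca_em(X, n_classes=2, n_iter=200, tol=1e-6):
    """Fit a latent class model to binary item responses X (n x J) via EM."""
    n, J = X.shape
    pi = np.full(n_classes, 1.0 / n_classes)          # class weights
    theta = rng.uniform(0.25, 0.75, size=(n_classes, J))  # P(correct | class)
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: posterior probability of each latent class per respondent.
        log_p = np.log(pi) + X @ np.log(theta).T + (1 - X) @ np.log(1 - theta).T
        log_norm = np.logaddexp.reduce(log_p, axis=1, keepdims=True)
        resp = np.exp(log_p - log_norm)
        # M-step: re-estimate class weights and item-endorsement probabilities.
        pi = resp.mean(axis=0)
        theta = (resp.T @ X) / resp.sum(axis=0)[:, None]
        theta = np.clip(theta, 1e-6, 1 - 1e-6)
        ll = log_norm.sum()                           # observed-data log-likelihood
        if ll - ll_old < tol:
            break
        ll_old = ll
    return pi, theta, resp

# Simulate two latent "reading proficiency" classes answering 10 binary items.
true_theta = np.vstack([np.full(10, 0.85), np.full(10, 0.30)])
z = rng.integers(0, 2, size=600)                      # true class of each learner
X = (rng.random((600, 10)) < true_theta[z]).astype(float)

pi, theta, resp = lca_em(X)
```

In the covariate variant used in the study, class membership probabilities are additionally regressed on student and context characteristics; here the mixing weights `pi` are simply constants.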