1. Guessing and cognitive diagnostics: A general multicomponent latent trait model for diagnosis
Lutz, Megan Elyse. 08 June 2015.
A common criticism of the traditional scoring of multiple-choice (MC) tests is that it confounds guessing and other false positives with partial and full knowledge. The current study reviews classical test theory (CTT) approaches to handling guessing and partial knowledge. Where those methods fall short, item response theory (IRT) and cognitive diagnostic modeling (CDM) approaches are considered, along with their relative strengths and weaknesses. Finally, a generalization of the Multicomponent Latent Trait Model for Diagnosis (MLTM-D; Embretson & Yang, 2013) is proposed. Results of a simulation study indicate that, in the presence of guessing, the proposed model produces more reliable and accurate item parameter estimates than the MLTM-D and generally yields better recovery of person parameters. Discussion of the methods and findings, along with suggested directions for further study, is included.
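The guessing confound this abstract describes is conventionally illustrated with the three-parameter logistic (3PL) IRT model, in which a pseudo-guessing parameter puts a floor under the probability of a correct response. This is a standard textbook sketch, not the MLTM-D generalization the thesis proposes; the parameter values below are invented for illustration.

```python
import math

def p_correct_3pl(theta, a, b, c):
    """3PL item response function: c is the pseudo-guessing floor,
    a the discrimination, b the difficulty, theta the ability."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# A low-ability examinee (theta = -2) on a moderately hard item:
# with c = 0 the success probability is small, but with c = 0.25
# (a four-option MC item) it can never fall below 0.25 -- chance
# success is indistinguishable from partial knowledge in the score.
p_no_guess = p_correct_3pl(theta=-2.0, a=1.0, b=0.5, c=0.0)
p_guess = p_correct_3pl(theta=-2.0, a=1.0, b=0.5, c=0.25)
```

The gap between the two probabilities is exactly the distortion that scoring models which ignore guessing must absorb into their ability estimates.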
2. Maximizing the Potential of Multiple-choice Items for Cognitive Diagnostic Assessment
Gu, Zhimei. 09 January 2012.
When applying cognitive diagnostic models, the goal is to accurately estimate students’ diagnostic profiles. The accuracy of these estimates may be enhanced by examining the types of incorrect options a student selects. This thesis research examines the additional diagnostic information available from the distractors in multiple-choice items used in large-scale achievement assessments and identifies optimal conditions for extracting that information. The study is based on analyses of both real student responses and simulated data. The real responses come from a large-scale provincial math assessment for grade 6 students in Ontario; data were then simulated under different skill-dimensionality and item-discrimination conditions. Comparisons were made between student profile estimates from the DINA and MC-DINA models. The MC-DINA model is a newly developed cognitive diagnostic model in which the probability of a student choosing a particular item option depends on how closely the student’s cognitive skill profile matches the skills tapped by that option. The simulation results suggested that when the simulated data included additional diagnostic information in the distractors, the MC-DINA model was able to use that information to improve the estimation of student profiles, demonstrating the utility of information obtained from item distractors. The value of that added information was greater under lower item discrimination and greater skill multidimensionality. In the real data, however, the keyed options provided more diagnostic information than the distractors, and there was little information in the distractors for the MC-DINA model to exploit. This implies that current math test items could be further developed to include diagnostically rich distractors. The study offers some suggestions for the design of multiple-choice test items and their formative use.
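The baseline DINA model the thesis compares against can be sketched in a few lines (a minimal illustration under standard assumptions, not the thesis's implementation; the slip and guess values are invented). The MC-DINA extension then models which option is chosen, rather than only whether the keyed option is chosen.

```python
def dina_p_correct(alpha, q_row, slip, guess):
    """DINA item response function (minimal sketch).

    alpha : tuple of 0/1 skill-mastery indicators for one examinee
    q_row : tuple of 0/1 Q-matrix entries for one item (skills it taps)

    An examinee "masters" the item only if they possess every skill the
    item requires; masters answer correctly with probability 1 - slip,
    while non-masters succeed only by guessing.
    """
    mastered = all(a == 1 for a, q in zip(alpha, q_row) if q == 1)
    return 1.0 - slip if mastered else guess

# Item taps skills 1 and 2 (not 3); slip/guess values are illustrative.
p_master = dina_p_correct((1, 1, 0), (1, 1, 0), slip=0.1, guess=0.2)
p_nonmaster = dina_p_correct((1, 0, 1), (1, 1, 0), slip=0.1, guess=0.2)
```

Note the all-or-nothing gate: under DINA, every profile missing any required skill collapses into one "non-master" group, which is exactly the information loss that option-level models such as MC-DINA aim to reduce.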