
Psychometric properties of technology-enhanced item formats: an evaluation of construct validity and technical characteristics

Crabtree, Ashleigh R. 01 May 2016
The purpose of this research is to provide information about the psychometric properties of technology-enhanced (TE) items and the effects these items have on the content validity of an assessment. Specifically, this research investigated the impact that including TE items has on the construct of a mathematics test, the technical properties of these items, and the influence these item types have on test characteristics.

An empirical dataset was used to investigate the impact of adding TE items to a multiple-choice (MC) assessment, the Iowa End-of-Course Algebra I (IEOC-A) assessment. The sample included 3,850 students from the state of Iowa who took the IEOC-A in the spring of 2012. The base form of the IEOC-A consisted of 30 MC items. Sixty TE items were developed and aligned to the same blueprint as the MC items; these were appended to the base form in sets of five, yielding 12 different test forms, which were randomly assigned to students during the spring administration window.

Several methods were used to form a more complete understanding of the content characteristics and technical properties of TE items. This research first examined whether adding TE items to an established MC exam had an effect on the construct of the test. A factor analysis supported a two-factor model with separate latent factors for MC and TE items, indicating that TE items may add a new dimension to the test. Following these findings, a more thorough analysis of the item pool was conducted, and item response theory (IRT) analyses examined item information, test information, and relative efficiency. These analyses indicated that students may perform differently on MC and TE items; in this particular item pool, the difference between the two item types may manifest itself as an additional, perhaps unintended, construct on the exam. Additionally, TE items may perform differently depending on the ability level of the student: specifically, TE items may provide more information and measure the construct more efficiently than MC items at higher ability levels. Finally, the number of TE items included on a test has the potential to affect the relative efficiency of the instrument, underscoring the importance of selecting items that reinforce the purpose and uses of the test.
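The IRT findings above turn on item information, test information, and relative efficiency. As a rough illustration only (the abstract does not state which IRT model was fitted), the sketch below assumes a two-parameter logistic (2PL) model, where item information is I(θ) = a²P(θ)(1 − P(θ)), test information is the sum over items, and relative efficiency is the ratio of two test information functions. The item parameters are hypothetical, not the actual IEOC-A calibrations.

```python
import numpy as np

def p_2pl(theta, a, b):
    """Probability of a correct response under a 2PL IRT model."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information from one 2PL item: a^2 * P * (1 - P)."""
    p = p_2pl(theta, a, b)
    return a ** 2 * p * (1.0 - p)

def test_information(theta, items):
    """Test information: the sum of the item information functions."""
    return sum(item_information(theta, a, b) for a, b in items)

# Hypothetical (a = discrimination, b = difficulty) parameters; the TE set
# is given higher difficulty and discrimination to mimic the reported pattern.
mc_items = [(0.9, -0.5), (1.0, 0.0), (1.1, 0.4)]
te_items = [(1.3, 0.8), (1.4, 1.1), (1.2, 1.5)]

theta = np.linspace(-3.0, 3.0, 121)
rel_eff = test_information(theta, te_items) / test_information(theta, mc_items)

# Values above 1 mark ability levels where the TE set measures more
# efficiently; in this toy example that happens at higher theta.
print(rel_eff[theta >= 1.5][:3])
```

Under this toy calibration the relative efficiency exceeds 1 at the upper end of the ability scale, mirroring the abstract's finding that TE items may measure higher-ability students more efficiently; with other parameter values the picture could differ.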
