
Psychometric properties of technology-enhanced item formats: an evaluation of construct validity and technical characteristics

The purpose of this research is to provide information about the psychometric properties of technology-enhanced (TE) items and the effects these items have on the content validity of an assessment. Specifically, this research investigated the impact that the inclusion of TE items has on the construct of a mathematics test, the technical properties of these items, and the influence these item types have on test characteristics. An empirical dataset was used to investigate the impact of adding TE items to a multiple-choice (MC) assessment, the Iowa End-of-Course Algebra I (IEOC-A) assessment. The sample included 3,850 students from the state of Iowa who took the IEOC-A in the spring of 2012. The base form of the IEOC-A consisted of 30 MC items. Sixty TE items were developed and aligned to the same blueprint as the MC items. These items were appended in sets of five to the base form, resulting in 12 different test forms. The forms were randomly assigned to students during the spring administration window.
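The form design described above can be made concrete with a minimal sketch. The item identifiers, the assignment function, and the use of simple random selection are illustrative assumptions, not the study's actual assembly or spiraling procedure.

```python
# Illustrative sketch (not the study's actual code): 60 TE items split into
# 12 non-overlapping sets of five, each appended to a fixed 30-item MC base
# form, with forms assigned to examinees at random.
import random

mc_base = [f"MC{i:02d}" for i in range(1, 31)]    # hypothetical 30 MC item IDs
te_items = [f"TE{i:02d}" for i in range(1, 61)]   # hypothetical 60 TE item IDs

# Partition the TE pool into 12 sets of five items each.
te_sets = [te_items[i:i + 5] for i in range(0, 60, 5)]

# Each form = common MC base + one five-item TE set, giving 12 forms of 35 items.
forms = [mc_base + te_set for te_set in te_sets]

def assign_form(student_id: int) -> list[str]:
    """Stand-in for spiraled assignment: pick one of the 12 forms at random."""
    return random.choice(forms)

print(len(forms), len(forms[0]))  # 12 forms, 35 items per form
```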
Several methods were used to form a more complete understanding of the content characteristics and technical properties of TE items. This research first examined whether adding TE items to an established MC exam had an effect on the construct of the test. The factor analysis confirmed a two-factor model comprising separate MC and TE latent factors, indicating that TE items may add a new dimension to the test. Following these findings, a more thorough analysis of the item pool was conducted, and IRT analyses were used to investigate item information, test information, and relative efficiency. These analyses indicated that students may perform differently on MC and TE items; in this particular item pool there is evidence of a difference between the two item types. This difference may manifest as an additional, perhaps unintended, construct on the exam. Additionally, TE items may perform differently depending on the ability level of the student. Specifically, TE items may provide more information and measure the construct more efficiently than MC items at higher levels of ability. Finally, the quantity of TE items included on a test has the potential to affect the relative efficiency of the instrument, underscoring the importance of selecting items that reinforce the purpose and uses of the test.
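The IRT quantities named here (item information, test information, relative efficiency) can be illustrated with a short sketch. The abstract does not report the model or the estimated parameters, so the 2PL model and the item parameters below are assumptions chosen only to show how the quantities relate.

```python
# Sketch of item information, test information, and relative efficiency under
# an assumed 2PL model with made-up item parameters; not the study's estimates.
import numpy as np

def p_2pl(theta, a, b):
    """Probability of a correct response under the 2PL model."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information for a 2PL item: a^2 * P * (1 - P)."""
    p = p_2pl(theta, a, b)
    return a**2 * p * (1.0 - p)

theta = np.linspace(-3, 3, 121)

# Hypothetical (a, b) parameters standing in for MC and TE item sets.
mc_params = [(1.0, -0.5), (0.8, 0.0), (1.2, 0.5)]
te_params = [(1.4, 0.8), (1.1, 1.2), (1.6, 1.5)]

# Test information is the sum of the item information functions.
info_mc = sum(item_information(theta, a, b) for a, b in mc_params)
info_te = sum(item_information(theta, a, b) for a, b in te_params)

# Relative efficiency compares information at each ability level; values > 1
# mean the TE set measures more precisely than the MC set at that theta.
relative_efficiency = info_te / info_mc
print(relative_efficiency[theta >= 1.5][:3])
```

With these illustrative parameters, relative efficiency exceeds 1 at higher ability levels, mirroring the abstract's observation that TE items may provide more information than MC items for higher-ability students.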

Identifier: oai:union.ndltd.org:uiowa.edu/oai:ir.uiowa.edu:etd-6406
Date: 01 May 2016
Creators: Crabtree, Ashleigh R.
Contributors: Welch, Catherine J., Dunbar, Stephen B.
Publisher: University of Iowa
Source Sets: University of Iowa
Language: English
Detected Language: English
Type: dissertation
Format: application/pdf
Source: Theses and Dissertations
Rights: Copyright 2016 Ashleigh Crabtree
