
Characterizing Multiple-Choice Assessment Practices in Undergraduate General Chemistry

Assessment of student learning is ubiquitous in higher education chemistry courses because it is the mechanism by which instructors assign grades, alter teaching practice, and help their students succeed. One type of assessment that is popular in general chemistry courses, yet difficult to create effectively, is the multiple-choice assessment. Despite its popularity, little is known about the extent to which multiple-choice general chemistry exams adhere to accepted design practices, or about the processes that general chemistry instructors engage in while creating these assessments. A better understanding of multiple-choice assessment quality and of the design practices of general chemistry instructors could inform efforts to improve multiple-choice assessment practice in the future. This work characterized multiple-choice assessment practices in undergraduate general chemistry classrooms by (1) conducting a phenomenographic study of general chemistry instructors' assessment practices and (2) designing an instrument that can detect violations of item writing guidelines in multiple-choice chemistry exams.

The phenomenographic study of general chemistry instructors' assessment practices included 13 instructors from the United States who participated in a three-phase interview. They were asked to describe how they create multiple-choice assessments, to evaluate six multiple-choice exam items, and to create two multiple-choice exam items using a think-aloud protocol. It was found that the participating instructors considered many appropriate assessment design practices, yet did not use, or were not familiar with, all of the appropriate practices available to them.

Additionally, an instrument was developed to detect violations of item writing guidelines in multiple-choice exams. The instrument, known as the Item Writing Flaws Evaluation Instrument (IWFEI), was shown to be reliable across different users. Once developed, the IWFEI was used to analyze 1,019 general chemistry exam items. The instrument provides researchers with a tool for studying adherence to item writing guidelines, as well as a tool instructors can use to evaluate their own multiple-choice exams. It is hoped that use of the IWFEI will improve multiple-choice item writing practice and quality.
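To make the reliability claim above concrete: the abstract does not state which agreement statistic was used for the IWFEI, but one common way to check whether two users of such an instrument flag the same item writing flaws is Cohen's kappa. The short Python sketch below uses entirely hypothetical rater data and is an illustration of the general idea, not the dissertation's actual analysis.

from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical labels to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items where both raters gave the same label.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, estimated from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical flags (1 = flaw present, 0 = absent) for ten exam items.
rater_1 = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
rater_2 = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0]
print(f"Cohen's kappa: {cohen_kappa(rater_1, rater_2):.2f}")  # ~0.78

Kappa values near 1 indicate that the two raters apply a guideline consistently, while values near 0 indicate agreement no better than chance.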

The results of this work provide insight into the multiple-choice assessment design practices of general chemistry instructors, as well as an instrument that can be used to evaluate multiple-choice exams for adherence to item writing guidelines. Conclusions, recommendations for professional development, and recommendations for future research are discussed.

DOI: 10.25394/pgs.11317076.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/11317076
Date: 04 December 2019
Creators: Jared B Breakall (8080967)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/Characterizing_Multiple-Choice_Assessment_Practices_in_Undergraduate_General_Chemistry/11317076
