
Investigating Test-takers’ Use of Linguistic Tools in Second Language Academic Writing Assessment

Advances in technology have greatly influenced how students write, how they interact with readers, and the genres they create. To reflect real-world writing behaviors in the assessment setting and to allow test-takers' performance on the assessment to generalize to their true writing ability, the current study investigated test-takers' use of linguistic tools in second language academic writing assessment. The linguistic tools of interest comprised three frequently used tool types: spelling, grammar, and reference tools (i.e., dictionary and thesaurus). Three highly contextualized tasks reflecting those that second language learners may encounter in the academic domain of language use (i.e., writing an apologetic email, a negative online review, and an opinion on a discussion board) were used to elicit test-takers' writing ability. Additionally, as a means of measuring writing performance, writing ability was defined in terms of the accuracy and/or variety of grammatical forms, semantic meanings, and pragmatic meanings produced in the written responses (Purpura, 2004, 2014, 2017).

Using a mixed methods design, the current study first analyzed the quantitative data, which comprised 120 test-takers' scores on the writing test rated with an analytic rubric, through classical test theory, many-facet Rasch measurement, and multivariate generalizability theory. Test-takers' scores were compared across assessment conditions (i.e., access to no linguistic tools, a spelling tool, a grammar tool, or a reference tool), proficiency levels (i.e., intermediate, advanced, and proficient), and three tasks (i.e., email, online review, and discussion board post). To explain the similarities and differences across the assessment conditions, proficiency levels, and tasks found in the quantitative analyses, the qualitative data, which consisted of screen recordings of test-takers' text production processes, were then analyzed.

The results of the study were discussed as empirical evidence supporting the domain description, evaluation, generalizability, explanation, extrapolation, and utilization claims (Kane, 2006, 2013) with regard to allowing test-takers to use linguistic tools in second language writing assessment. Based on Kane's framework for validation, the findings suggested that allowing linguistic tools, especially spelling and reference tools, in writing assessment contexts is a viable possibility.

Identifier: oai:union.ndltd.org:columbia.edu/oai:academiccommons.columbia.edu:10.7916/D8B00HDQ
Date: January 2018
Creators: Oh, Sae Rhim
Source Sets: Columbia University
Language: English
Type: Theses
