Serious games show promise as an effective training method, but such games are complex and few guidelines exist for their effective evaluation. We draw on the design science literature to develop a serious game evaluation framework that emphasizes grounding evaluation in each of four key areas: theoretical, technical, empirical, and external. We further recommend that serious game developers adopt an iterative, adaptive approach to grounding an evaluation effort in these four areas, emphasizing some areas more than others at different stages of the development cycle. We illustrate our framework with a case study of a large-scale serious game development project, demonstrating a holistic approach to serious game evaluation that is valuable to both researchers and practitioners.
Identifier | oai:union.ndltd.org:arizona.edu/oai:arizona.openrepository.com:10150/626121 |
Date | 01 1900 |
Creators | Wilson, David W., Jenkins, Jeff, Twyman, Nathan, Jensen, Matthew, Valacich, Joe, Dunbar, Norah, Wilson, Scott, Miller, Claude, Adame, Bradly, Lee, Yu-Hao, Burgoon, Judee, Nunamaker, Jay F. |
Contributors | Univ. of Arizona, Tucson, AZ, USA |
Publisher | IEEE |
Source Sets | University of Arizona |
Language | English |
Detected Language | English |
Type | Article |
Rights | © 2016 IEEE. |
Relation | http://ieeexplore.ieee.org/document/7427261/ |