End-user-authored tutorials are increasingly becoming the norm for helping users learn software applications, but little is known about the quality of these tutorials. Using metrics derived from previous work, I characterize the quality of text- and image-based Photoshop tutorials available to users online. I compare tutorials from four sources, representing tutorials that are: i) written by a close-knit online community, ii) written by expert users, iii) most likely to be found by users, and iv) representative of the general population of tutorials. I found not only that expert users generally write higher-quality tutorials than the other authors, but also that many typical tutorials suffer from important limitations. Most notably, they often lack attempts to help users avoid common errors, and they seldom provide users with adequate reasoning for undertaking steps. I also examine a typical tutorial rating system and find that it does not sufficiently distinguish between tutorials of differing quality. Finally, I demonstrate the use of my findings by presenting two applications that I designed: a tutorial authoring tool and a tutorial presentation site.
Identifier | oai:union.ndltd.org:LACETR/oai:collectionscanada.gc.ca:MWU.1993/22059 |
Date | 22 August 2013 |
Creators | Lount, Matthew |
Contributors | Bunt, Andrea (Computer Science), Irani, Pourang (Computer Science), Jamieson, Randall (Psychology) |
Source Sets | Library and Archives Canada ETDs Repository / Centre d'archives des thèses électroniques de Bibliothèque et Archives Canada |
Detected Language | English |