Introduction: Simulation has an increasing role in medical education, offering the ability to learn and practice skills in a safe environment. Ultrasound is a key tool for many clinicians; however, it requires significant experience to gain expertise. The most common way to gain this experience is through training courses with volunteers, where experts provide one-on-one teaching, which is time- and labour-intensive. Commercial ultrasound simulators, with software capable of generating automated metrics, are increasingly available. We sought validity evidence to support the use of automated metrics as a tool for assessing learners completing a Focused Assessment with Sonography in Trauma (FAST) exam.
Methods: Three groups with differing expertise were recruited: novices with no ultrasound training, intermediates who had completed a formal course within the previous six months, and experts with at least five years of clinical experience. All participants were recorded while completing a FAST exam, and automated metrics of time, path length, angular movement, and percent area viewed were obtained. Each recording was then scored by two expert assessors using the Quality of Ultrasound Imaging and Competence (QUICk) tool. Participants were also asked to complete ten find-fluid exercises, for which automated metrics were generated. Automated metrics from the recorded FAST exam and QUICk scores were compared across expertise groups using the Kruskal-Wallis test. Correlations between QUICk scores and the automated metrics were assessed using Pearson's correlation coefficient. Performance on the find-fluid exercises was assessed using repeated-measures one-way ANOVA models.
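For readers unfamiliar with these tests, the following is a minimal, illustrative sketch of how the group comparison and correlation analyses described above could be carried out in Python with SciPy. The data values and variable names are hypothetical placeholders, not the thesis dataset or analysis code.

```python
# Illustrative sketch only: the metric values below are hypothetical placeholders.
from scipy import stats

# Hypothetical FAST exam scan times (seconds) for each expertise group
novice_time = [310, 295, 342, 288, 360]
intermediate_time = [240, 225, 250, 233, 218]
expert_time = [190, 175, 205, 182, 198]

# Kruskal-Wallis test for differences across the three expertise groups
h_stat, p_kw = stats.kruskal(novice_time, intermediate_time, expert_time)
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")

# Pearson correlation between QUICk scores and one automated metric
quick_scores = [12, 15, 18, 22, 25, 27, 30, 33]        # hypothetical QUICk totals
percent_area_luq = [58, 55, 50, 47, 42, 40, 36, 33]    # hypothetical % area viewed LUQ
r, p_corr = stats.pearsonr(quick_scores, percent_area_luq)
print(f"Pearson: r = {r:.2f}, p = {p_corr:.3f}")
```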
Results: Time, angular movement, and percent area viewed in the left upper quadrant (LUQ) differed significantly between groups, with novices requiring more time and angular movement and showing a higher percent area viewed LUQ than experts. QUICk scores were significantly higher for experts and intermediates than for novices. The overall QUICk and checklist scores did not correlate with any automated metric. The individual components of positioning and handling, probe handling, and image scrolling were negatively correlated with percent area viewed LUQ. Overall, the QUICk tool could differentiate novices from both intermediates and experts when using the VIMEDIX-AR simulator, and several automated metrics could differentiate expertise. Further work should develop a composite score of automated metrics to assess learners.

Thesis / Master of Science (MSc)

Lay abstract: Simulation has become ubiquitous in medical education, offering a safe environment to learn and practice new skills. With the increasing availability of point-of-care ultrasound and the significant training needed to generate and interpret images, simulation is becoming ever more important. We sought to assess an expert assessment tool for use with an ultrasound simulator and to validate the automated metrics generated by the VIMEDIX-AR simulator. The expert assessment tool reliably differentiated expertise levels, and three of our automated metrics could discern different levels of expertise. Further work is needed to assess whether a composite score of automated metrics could better differentiate skill.
Identifier | oai:union.ndltd.org:mcmaster.ca/oai:macsphere.mcmaster.ca:11375/27287
Date | January 2021 |
Creators | Ward, Mellissa |
Contributors | Engels, Paul, Health Science Education |
Source Sets | McMaster University |
Language | en_US |
Detected Language | English |
Type | Thesis |