31 |
Exploring the use of a web-based virtual patient to support learning through reflection / Chesher, Douglas. January 2004 (has links)
This thesis explores the support of learning through reflection, in the context of medical students and practitioners working through a series of simulated consultations involving the diagnosis and management of chronic illness. A model of the medical consultative process was defined, on which a web-based patient simulation was developed. This simulation can be accessed over the Internet using commonly available web browsers. It enables users to interact with a virtual patient by taking a history, examining the patient, requesting and reviewing investigations, and choosing appropriate management strategies. The virtual patient can be reviewed over a number of consultations, and the patient outcome is dependent on the management strategy selected by the user. A second model was also developed that adds a layer of reflection over the consultative process. While interacting with the virtual patient, users are asked to formulate and test their hypotheses. Simple tools are included to encourage users to record their observations and thoughts for further learning, as well as to provide links to web-based library resources. At the end of each consultation, users are asked to review their actions and indicate whether they think those actions were critical, relevant, or not relevant to the diagnosis and management of the patient in light of their current knowledge. Users also have the opportunity to compare their activity with that of their peers or of an expert in the case under study. Three formal cycles of evaluation were undertaken during the design and development of the software. A number of clinicians were involved in the initial design to ensure the structure appropriately matched clinical practice. Formative evaluation was conducted to review the usability of the application, and based on user feedback a number of changes were made to the user interface and structure of the application.
A third, end-user evaluation was undertaken using a single case concerning the diagnosis and management of hypertriglyceridaemia in the context of Type 1B Glycogen Storage Disease. This evaluation involved ten medical students, five general practitioners, and two specialists, and comprised observation using a simplified think-aloud protocol as well as administration of a questionnaire. Users were engaged by the simulation and were able to use the application after only a short period of training. Usability issues remain with respect to the processing of natural language input, especially when asking questions of the virtual patient. Until natural language recognition can provide satisfactory performance, alternative list-based methods of interaction will be required. The evaluation involving medical students, general practitioners, and specialist medical practitioners demonstrated that reflection can be supported and encouraged by providing appropriate tools, as well as by judiciously interrupting the consultative process to provide time for reflection to take place. Reflection could have been further enhanced if users had been educated on reflection as a learning modality prior to using SIMPRAC. Further work is also required to improve the simulation environment, improve the interfaces for supporting reflection, and further define the benefits of this approach for medical education and professional development with respect to learning outcomes and behavioural change.
|
32 |
An Evaluation of the Utility of a Hybrid Objective Structured Clinical Examination for the use of Assessing Residents Enrolled in McMaster University's Orthopaedic Surgery Residency Program / Gavranic, Vanja. 04 1900 (has links)
Introduction: McMaster University's orthopaedic surgery residency program implemented the OSCE as an assessment tool in 2010; this study evaluates the first four OSCEs administered to residents. The OSCEs were composed of performance-testing stations and knowledge-testing stations, the latter of which are not normally included in this testing format. Recruiting enough faculty evaluators was a challenge to implementing this examination format feasibly; knowledge-testing stations were incorporated because they do not require evaluators to be present. Reliability was assessed, and the correlation between knowledge-testing station scores and performance-testing station scores was determined. The ability of the OSCE to discriminate between residents in different post-graduate years (PGYs) was assessed, as was residents' acceptance of the OSCE. Methods: Reliability was assessed using generalizability theory. The correlation of knowledge-testing and performance-testing station scores was measured with Pearson's r. A two-way ANOVA was used to analyze whether the OSCE can discriminate between residents in different PGYs. An exit survey was administered after each OSCE to assess acceptability. Results: The generalizability estimates of each OSCE ranged from 0.71 to 0.87. The disattenuated correlation between knowledge- and performance-testing stations was 1.00 for senior residents and 0.89 for junior residents. A significant effect of year of residency was found for the October 2010 OSCE in the ANOVA (F(1,30) = 11.027, p = 0.005), but the remaining OSCEs did not replicate this finding. In general, residents felt that they were able to present an accurate portrayal of themselves in the OSCEs and that the examination covered appropriate topics. Discussion: The OSCEs were reliable and acceptable to residents. The high correlations between knowledge- and performance-testing station scores suggest that the examination can be made more feasible by including knowledge-testing stations. The small sample sizes made significant differences between levels of training difficult to detect, resulting in inconclusive evidence for this construct validation measure. / Master of Science (MSc)
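The disattenuated correlations reported above come from the classical correction-for-attenuation formula, which divides an observed correlation by the geometric mean of the two measures' reliabilities. A minimal sketch in Python, with illustrative inputs only: the reliability values are drawn from the reported G-coefficient range (0.71–0.87), but the observed correlation here is hypothetical, not a figure from the thesis.

```python
import math

def disattenuated_correlation(r_xy: float, rel_x: float, rel_y: float) -> float:
    """Correct an observed correlation for measurement error.

    r_xy  -- observed correlation between the two sets of scores
    rel_x -- reliability of measure x (e.g. a generalizability coefficient)
    rel_y -- reliability of measure y
    """
    return r_xy / math.sqrt(rel_x * rel_y)

# Illustrative values only: reliabilities from the reported G-coefficient
# range; the observed correlation of 0.78 is hypothetical.
r_true = disattenuated_correlation(0.78, 0.87, 0.78)
print(round(r_true, 2))  # prints 0.95
```

Note that when both reliabilities equal 1.0 the correction is a no-op, and that with imperfect reliabilities the corrected value can reach or exceed 1.00, as it did for the senior residents in this study.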
|
33 |
2010 July 06 - Medical Student Education Committee Minutes / Medical Student Education Committee, East Tennessee State University. 06 July 2010 (has links) (PDF)
No description available.
|
34 |
2010 August 03 - Medical Student Education Committee Minutes / Medical Student Education Committee, East Tennessee State University. 03 August 2010 (has links) (PDF)
No description available.
|
35 |
2018 June 12 - Medical Student Education Committee Annual Meeting Minutes / Medical Student Education Committee, East Tennessee State University. 12 June 2018 (has links) (PDF)
No description available.
|
36 |
2018 June 12 - Medical Student Education Committee Retreat Meeting Minutes / Medical Student Education Committee, East Tennessee State University. 12 June 2018 (has links) (PDF)
No description available.
|
37 |
2010 September 7 - Medical Student Education Committee Minutes / Medical Student Education Committee, East Tennessee State University. 07 September 2010 (has links) (PDF)
No description available.
|
38 |
2010 October 5 - Medical Student Education Committee Minutes / Medical Student Education Committee, East Tennessee State University. 05 October 2010 (has links) (PDF)
No description available.
|
39 |
2010 November 2 - Medical Student Education Committee Minutes / Medical Student Education Committee, East Tennessee State University. 02 November 2010 (has links) (PDF)
No description available.
|
40 |
2010 December 7 - Medical Student Education Committee Minutes / Medical Student Education Committee, East Tennessee State University. 07 December 2010 (has links) (PDF)
No description available.
|