
USING ARTIFICIAL INTELLIGENCE TO PROVIDE DIFFERENTIATED FEEDBACK AND INSTRUCTION IN INTRODUCTORY PHYSICS

Cognitive load theory (CLT) lays out a tripartite scheme describing how learners cognitively interact with instructional materials during learning and problem solving. Cognitive load refers to the utilization of working memory resources, and CLT designates three types: intrinsic, extraneous, and germane cognitive load. Intrinsic cognitive load arises from the inherent complexity of the material. Extraneous cognitive load is the unnecessary use of cognitive resources caused by suboptimal instructional design. Germane cognitive load results from processing the intrinsic load and acquiring schemas. The expertise reversal effect follows as a consequence of CLT.
The expertise reversal effect (ERE) states that instructional materials that are beneficial to low prior knowledge (LPK) learners may be detrimental to high prior knowledge (HPK) learners. Less guided materials have been shown to reduce extraneous cognitive load for HPK learners and therefore produce a greater benefit for them.
In this work we present the development of online instructional modules that deliver content in two distinct styles, differentiated by their use of guiding features. The high-level guidance (HLG) version uses guiding features, such as animations and voice narration, which have been shown to benefit LPK learners. Conversely, guiding features have been shown to be detrimental to the learning of HPK students. The low-level guidance (LLG) version therefore uses text in place of voice narration and pop-up content in place of continuous animations. Both versions led to a statistically significant improvement from pre-test to post-test. However, both HPK and LPK students showed a preference for the HLG version of the module, contrary to the ERE. Future work will focus on improving the ability to identify HPK and LPK students and on refining methods for providing optimal instructional materials for these cohorts.
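To make the differentiation concrete, the following is a minimal Python sketch of how a pre-test score might be used to route a learner to one version or the other; the function name, threshold, and scoring scale are illustrative assumptions, not taken from the modules themselves.

# Hypothetical sketch: assign the LLG version to high prior knowledge (HPK)
# learners and the HLG version to low prior knowledge (LPK) learners,
# using a pre-test score (0.0-1.0) as a proxy for prior knowledge.
def assign_module_version(pretest_score: float, threshold: float = 0.6) -> str:
    """Return 'LLG' for HPK learners, 'HLG' otherwise. The 0.6 cutoff is a placeholder."""
    return "LLG" if pretest_score >= threshold else "HLG"

print(assign_module_version(0.35))  # -> 'HLG'
print(assign_module_version(0.80))  # -> 'LLG'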
Meanwhile, the use of machine learning is an emerging trend in education. Machine learning has been used in roles such as automatically scoring essays in scientific argumentation tasks and providing feedback to students in real time. In this work we report our results on two projects using machine learning in education. In the first project we used machine learning to predict students' correctness on a physics problem given an essay outlining their approach to solving the problem. Our overall accuracy in predicting problem correctness from a student's strategy essay was 80%, and we detected students whose approach would lead to an incorrect solution at a rate of 87%. However, deploying this model to provide real-time feedback would require further improvement in performance. Planned future work on this problem includes hand-grading essays to produce a label that reflects the scientific merit of each essay, using more sophisticated models (such as Google's BERT), and generalizing to a larger set of problems.
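As a rough illustration of this kind of text classifier, the sketch below trains a bag-of-words baseline (TF-IDF features with logistic regression) to predict problem correctness from a strategy essay; the essays, labels, and choice of model are assumptions for illustration and do not reproduce the model evaluated in this work.

# Illustrative baseline only: predict problem correctness from a strategy essay.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder essays and labels (1 = approach led to a correct solution).
essays = [
    "I will use conservation of energy to relate the speed and the height.",
    "I will add the force magnitudes without drawing a free-body diagram.",
    "Apply Newton's second law to each block and solve the resulting system.",
    "Plug the given numbers into whichever formula has matching letters.",
]
correct = [1, 0, 1, 0]

# TF-IDF unigrams/bigrams feeding a logistic regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(essays, correct)

new_essay = "Find the acceleration first, then use the kinematics equations."
print(model.predict([new_essay]), model.predict_proba([new_essay]))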
In another study, we used data about students' prior academic behavior to predict academic risk in a first-year algebra-based physics course. The final course grade was used to define the risk category: B- and above was designated low risk, and C+ and below was designated high risk. Using a mix of numerical and categorical features such as high school GPA, ACT/SAT scores, gender, and ethnicity, we were able to predict student academic risk with 75% overall accuracy. Students with a very high grade (A) or a very low grade (D, F, W) were identified at rates of 92% and 88%, respectively.
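A comparable sketch for the risk model is given below, assuming a scikit-learn pipeline that one-hot encodes the categorical features and scales the numerical ones before fitting a classifier; the column names, toy data, and random-forest choice are placeholders rather than the study's actual implementation.

# Illustrative sketch: binary risk prediction from mixed pre-college features.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Placeholder records; 1 = high risk (C+ or below), 0 = low risk (B- or above).
df = pd.DataFrame({
    "hs_gpa":    [3.9, 2.8, 3.4, 2.5],
    "act_sat":   [1400, 1050, 1250, 990],
    "gender":    ["F", "M", "F", "M"],
    "ethnicity": ["A", "B", "B", "A"],
    "high_risk": [0, 1, 0, 1],
})

numeric = ["hs_gpa", "act_sat"]
categorical = ["gender", "ethnicity"]

# One-hot encode categorical columns and standardize numerical ones.
preprocess = ColumnTransformer([
    ("num", StandardScaler(), numeric),
    ("cat", OneHotEncoder(handle_unknown="ignore"), categorical),
])

clf = Pipeline([
    ("prep", preprocess),
    ("model", RandomForestClassifier(n_estimators=200, random_state=0)),
])
clf.fit(df[numeric + categorical], df["high_risk"])
print(clf.predict(df[numeric + categorical]))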
Prior work has shown that performance can be greatly improved by including in-class features in the model. Future work will focus on obtaining raw scores, rather than the curved scores reported to the university registrar, and on obtaining additional cohorts of data to improve the predictive power of the models developed in this study.

  1. 10.25394/pgs.19666623.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/19666623
Date: 27 April 2022
Creators: Jeremy M Munsell (12468648)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/USING_ARTIFICIAL_INTELLIGENCE_TO_PROVIDE_DIFFERENTIATED_FEEDBACK_AND_INSTRUCTION_IN_INTRODUCTORY_PHYSICS/19666623
