Accessible Real-time Eye-Gaze Tracking for Neurocognitive Health Assessments: A Multimodal Web-based Approach

We introduce a novel integration of real-time, predictive eye-gaze tracking models into a multimodal dialogue system tailored for remote health assessments. The system is designed to be highly accessible, requiring only a conventional webcam for video input along with minimal cursor interaction, and it utilizes engaging gaze-based tasks that can be performed directly in a web browser. We have crafted dynamic subsystems that capture high-quality data efficiently and maintain quality through instances of user attrition and incomplete calls. These subsystems are also designed with the foresight to allow future re-analysis with improved predictive models and to enable the creation and training of new eye-gaze tracking datasets. Exploring gaze patterns across various user-performed tasks, we developed generalizable eye-gaze metrics that capture and reflect the distinct gaze trends of different cohorts. Through testing various feature extraction and classification methods, we obtained promising results, classifying individuals with Mild Neurocognitive Disorder (MiNCD) / Mild Cognitive Impairment (MCI) in a crowdsourced pilot study (N = 35) with an average accuracy of 0.94 (F1 = 0.83). This work represents a first step towards establishing predictive eye-gaze tracking as an accessible and important modality for healthcare applications, with the potential to significantly impact remote screening and monitoring of neurocognitive health.
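The abstract names these components only at a high level. For concreteness, below is a minimal Python sketch of the two kinds of computation it alludes to: deriving a simple fixation metric from a webcam gaze trace via a dispersion threshold, and evaluating a binary MCI classifier with leave-one-out cross-validation, summarized by the same statistics quoted above (accuracy and F1). The dispersion threshold, feature set, and logistic-regression classifier are illustrative assumptions, not the thesis's actual methods, which this record does not specify.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def fixation_durations(xy, t, max_dispersion=35.0, min_duration=0.10):
    """Group consecutive gaze samples whose bounding-box dispersion stays
    under max_dispersion (pixels); runs lasting at least min_duration
    (seconds) count as fixations. A simple I-DT-style heuristic, not the
    thesis's metric definition."""
    durations, start = [], 0
    for i in range(1, len(xy)):
        window = xy[start:i + 1]
        dispersion = float((window.max(axis=0) - window.min(axis=0)).sum())
        if dispersion > max_dispersion:
            # Close out the run ending at the previous sample.
            if t[i - 1] - t[start] >= min_duration:
                durations.append(float(t[i - 1] - t[start]))
            start = i
    if t[-1] - t[start] >= min_duration:
        durations.append(float(t[-1] - t[start]))
    return durations


def evaluate_loocv(X, y):
    """Leave-one-out cross-validation with a standardized logistic
    regression, reporting accuracy and F1 over the held-out predictions."""
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    preds = cross_val_predict(model, X, y, cv=LeaveOneOut())
    return accuracy_score(y, preds), f1_score(y, preds)


if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Synthetic gaze trace: 300 samples at ~30 Hz drifting across the screen.
    t = np.arange(300) / 30.0
    xy = np.cumsum(rng.normal(scale=3.0, size=(300, 2)), axis=0) + 400.0
    durs = fixation_durations(xy, t)
    if durs:
        print(f"fixations: {len(durs)}, mean duration: {np.mean(durs):.2f}s")

    # Synthetic per-participant feature matrix standing in for gaze metrics
    # (N = 35, matching the pilot cohort size; features are placeholders).
    X = rng.normal(size=(35, 4))
    y = rng.integers(0, 2, size=35)
    acc, f1 = evaluate_loocv(X, y)
    print(f"accuracy = {acc:.2f}, F1 = {f1:.2f}")
```

Leave-one-out cross-validation is a natural fit here because with a cohort of 35 participants, any held-out test split would be too small to yield stable estimates; whether the thesis used this scheme is an open question from the record alone.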

Identifier: oai:union.ndltd.org:CALPOLY/oai:digitalcommons.calpoly.edu:theses-4491
Date: 01 June 2024
Creators: Tisdale, Daniel C
Publisher: DigitalCommons@CalPoly
Source Sets: California Polytechnic State University
Detected Language: English
Type: text
Format: application/pdf
Source: Master's Theses
