Our motivation for this work is to develop an autonomous robot system that can perform autism intervention therapy. Autism spectrum disorder (ASD) is a common neurodevelopmental disorder that affects millions of people in the United States alone. The best way to treat ASD and help people with ASD learn new skills is applied behavior analysis (ABA), i.e., autism intervention therapy. Because people with ASD tend to feel less stressed in a predictable, simple environment than when interacting with other people, and because intervention therapy provided by professional therapists is generally expensive and inaccessible, it would be beneficial to build robots that can conduct intervention therapy with children without a therapist or instructor present. In this research, we focus on detecting a child's engagement and disengagement during a therapy session as a first step toward designing a therapy robot. We mainly use an RGB-D camera, the Microsoft Kinect 2.0, to extract kinematic joint data from the therapy session. We also set up a child study with the Kid's Creek therapy center to recruit children with ASD and record their interactions with a therapist while they play a touch-screen game on a tablet. After carefully selecting features derived from skeletal movements and poses, we show that our system achieves 97% accuracy in detecting engagement and disengagement under cross-validation assessment.
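To make the evaluation concrete, the sketch below illustrates the general pattern of cross-validated engagement classification from skeleton-derived features. It is not the thesis implementation: the feature matrix, labels, data shapes, and the choice of a random-forest classifier are assumptions introduced only for illustration.

```python
# Minimal sketch (assumed setup, not the thesis code): cross-validated
# classification of engaged vs. disengaged windows from features derived
# from Kinect skeleton joints.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# X: one row per time window; columns are hypothetical features computed
# from joint positions (e.g., head orientation, torso lean, joint velocities).
# y: 1 = engaged, 0 = disengaged (in practice, labels come from annotated sessions).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))        # placeholder feature matrix
y = rng.integers(0, 2, size=500)      # placeholder engagement labels

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real annotated session data in place of the placeholders, the reported cross-validation accuracy would correspond to the kind of figure quoted in the abstract.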
Identifier | oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/55043 |
Date | 27 May 2016 |
Creators | Ge, Bi |
Contributors | Howard, Ayanna |
Publisher | Georgia Institute of Technology |
Source Sets | Georgia Tech Electronic Thesis and Dissertation Archive |
Detected Language | English |
Type | Thesis |
Format | application/pdf |