
Can You Read My Mind?: A Participatory Design Study of How a Humanoid Robot Can Communicate Its Intent and Awareness

Communication between humans and interactive robots will benefit if people have a clear mental model of the robots' intent and awareness. The aim of this thesis was to investigate how human-robot interaction is affected by manipulating social cues on the robot. The research questions were: how do social cues affect mental models of the Pepper robot, and how can a participatory design method be used to investigate how the Pepper robot could communicate intent and awareness? The hypothesis for the second question was that nonverbal cues would be preferred over verbal cues. An existing standard platform, SoftBank's Pepper, was used, along with state-of-the-art tasks from the RoboCup@Home challenge. The rule book and observations from the 2018 competition were thematically coded, and the resulting themes formed the basis for eight scenarios. A participatory design method called PICTIVE was used in a design study in which five student participants went through three phases (label, sketch, and interview) to create a design for how the robot should communicate intent and awareness. PICTIVE proved a suitable way to elicit a large number of design ideas, although not all scenarios were optimal for the task. The design study confirmed the use of mediating physical attributes to alter the mental model of a humanoid robot in order to reach common ground. It did not confirm the hypothesis that nonverbal cues would be preferred over verbal cues, though it did show that verbal cues alone would not be enough. This, however, needs to be further tested in live interactions.

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-158033
Date: January 2019
Creators: Thunberg, Sofia
Publisher: Linköpings universitet, Interaktiva och kognitiva system
Source Sets: DiVA Archive at Uppsala University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess
