21 |
Distance-Scaled Human-Robot Interaction with Hybrid Cameras. Pai, Abhishek, 24 October 2019.
No description available.
|
22 |
Are mindless robots less threatening? : The role of transparency about robots’ lack of human-like psychological capabilities. Willyams, Emma, January 2023.
The use of social robots is often seen as a solution for handling future challenges such as caring for a growing population of elderly people. However, previous research has shown that robots can be perceived as threatening, and a successful implementation of robots in society depends on the public’s acceptance of the technology. This thesis investigates whether transparency about robots’ lack of human-like psychological capabilities can reduce the perceived damage of robots to humans and human identity (henceforth “perceived damage”), and whether the effect of such transparency is moderated by the human-like appearance of the robot. Ninety-two study participants, randomly assigned to either a transparent or a neutral condition, were presented with pictures and descriptions of four robots that varied in human-likeness. The capabilities of the robots were described differently in the two conditions, using either non-psychological (e.g., “programmed responses”) or psychological terminology (e.g., “respond in a natural manner”). Participants subsequently filled in a scale measuring perceived damage. The results showed that the transparent condition was associated with lower perceived damage than the neutral condition for the most human-like robot, but there was no significant interaction effect between transparency and human-like appearance. The findings suggest that transparency about robots’ lack of human-like psychological capabilities can reduce perceived damage when the robot has a very human-like appearance. However, further research is needed to investigate whether transparency reduces perceived damage in real-world interactions with robots.
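To make the moderation question above concrete, the following is a minimal sketch of how a transparency-by-human-likeness interaction term might be tested in a linear model. The variable names, simulated numbers, and the simplified specification (which ignores the within-subjects structure of the four robots) are illustrative assumptions, not the thesis's actual analysis or data.

```python
# Hedged sketch: testing whether human-like appearance moderates the effect of
# transparency on perceived damage via an interaction term (illustrative data only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 92 * 4  # 92 participants x 4 robot pictures, treated as independent rows here
df = pd.DataFrame({
    "transparent": rng.integers(0, 2, n),   # 0 = neutral wording, 1 = transparent wording
    "humanlike": rng.uniform(0.0, 1.0, n),  # 0 = machine-like .. 1 = very human-like
})
# Simulated outcome with the hypothesised pattern: transparency lowers perceived
# damage mainly for the more human-like robots.
df["damage"] = (3.0 + 1.5 * df["humanlike"]
                - 1.0 * df["transparent"] * df["humanlike"]
                + rng.normal(0.0, 0.5, n))

model = smf.ols("damage ~ transparent * humanlike", data=df).fit()
print(model.summary().tables[1])  # the 'transparent:humanlike' row is the moderation test
```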
|
23 |
Supporting Flight Control for UAV-Assisted Wilderness Search and Rescue Through Human Centered Interface Design. Cooper, Joseph L., 15 November 2007.
Inexpensive, rapidly deployable, camera-equipped Unmanned Aerial Vehicle (UAV) systems can potentially assist with a huge number of tasks. However, in many cases such as wilderness search and rescue (WiSAR), the potential users of the system may not be trained as pilots. Simple interface concepts can be used to build an interaction layer that allows an individual with minimal operator training to use the system to facilitate a search or inspection task. We describe an analysis of WiSAR as currently accomplished and show how a UAV system might fit into the existing structure. We then discuss preliminary system design efforts for making UAV-enabled search possible and practical. Finally, we present both a carefully controlled experiment and partially structured field trials that illustrate principles for making UAV-assisted search a reality. Our experiments show that the traditional method for controlling a camera-enabled UAV is significantly more difficult than integrated methods. Successes and troubles during field trials illustrate several desiderata and information needs for a UAV search system.
|
24 |
Human-Swarm Interaction: Effects on Operator Workload, Scale, and Swarm Topology. Pendleton, Brian O., 04 September 2013.
Robots, including UAVs, have found increasing use in helping humans with dangerous and difficult tasks. The number of robots in use is increasing and is likely to continue increasing in the future. As the number of robots increases, human operators will need to coordinate and control the actions of large teams of robots. While multi-robot supervisory control has been widely studied, it requires that an operator divide his or her attention between robots. Consequently, the use of multi-robot supervisory control is limited by the number of robots that a human or team of humans can reasonably control. Swarm robotics -- large numbers of low-cost robots displaying collective behaviors -- offers an alternative approach by providing the operator with a small set of inputs and parameters that alter the behavior of a large number of autonomous or semi-autonomous robots. Researchers have asserted that this approach is more scalable and offers greater promise for managing huge numbers of robots. The emerging field of Human-Swarm Interaction (HSI) deals with the effective management of swarms by human operators. In this thesis we offer foundational work on the effect of HSI (a) on the individual robots, (b) on the group as a whole, and (c) on the workload of the human operator. We (1) show that existing general swarm algorithms are feasible on existing robots and can display collective behaviors as shown in simulations in the literature, (2) analyze the effect of interaction style and neighborhood type on the swarm's topology, (3) demonstrate that operator workload stays stable as the size of the swarm increases, but (4) find that operator workload is influenced by the interaction style. We also present considerations for swarm deployment on real robots.
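As a rough illustration of the "small set of inputs" style of swarm control the abstract describes, the sketch below implements a generic Boids-style update with a metric neighborhood, where a single operator-supplied goal heading nudges the whole swarm. It is an assumed toy model, not the specific algorithms, neighborhood types, or robots evaluated in the thesis.

```python
# Hedged sketch of a neighborhood-based swarm update steered by one operator input.
import numpy as np

def swarm_step(pos, vel, goal_dir, radius=2.0, w_align=0.5, w_cohere=0.3,
               w_separate=0.8, w_goal=0.2, speed=0.1):
    """One synchronous update; each agent reacts only to its metric neighborhood."""
    new_vel = np.zeros_like(vel)
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbrs = (d > 0) & (d < radius)                    # metric neighborhood rule
        if nbrs.any():
            align = vel[nbrs].mean(axis=0)               # match neighbors' velocity
            cohere = pos[nbrs].mean(axis=0) - pos[i]     # move toward local centroid
            separate = (pos[i] - pos[nbrs]).sum(axis=0)  # avoid crowding
        else:
            align = cohere = separate = np.zeros(2)
        v = (w_align * align + w_cohere * cohere
             + w_separate * separate + w_goal * goal_dir)  # goal_dir = operator input
        norm = np.linalg.norm(v)
        new_vel[i] = speed * v / norm if norm > 0 else vel[i]
    return pos + new_vel, new_vel

# Example: one operator command (a goal heading) steers 50 agents at once.
rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 5.0, size=(50, 2))
vel = rng.normal(size=(50, 2))
for _ in range(200):
    pos, vel = swarm_step(pos, vel, goal_dir=np.array([1.0, 0.0]))
print("mean position after 200 steps:", pos.mean(axis=0))
```

Because the operator sets only the goal heading (and a few weights), the control effort does not grow with the number of agents, which is the scalability argument made above.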
|
25 |
Investigating Augmented Reality for Improving Child-Robot Interaction. Hansson, Emmeli, January 2019.
Communication in HRI, both verbal and non-verbal, can be hard for a robot to interpret and to convey, which can lead to misinterpretations by both the human and the robot. In this thesis we address the question of whether AR can be used to improve the communication of a social robot’s intentions when interacting with children. We looked at behaviors such as getting children to pick up a cube, place a cube, give the cube to another child, tap the cube, and shake the cube. We found that picking up the cube was the most successful and reliable behavior and that most behaviors were slightly better with AR. Additionally, endorsement behavior was found to be necessary to engage the children; however, it needs to be quicker, more responsive, and clearer. In conclusion, there is potential for using AR to improve the intent communication of a robot, but in many cases the robot behavior alone was already quite clear. A larger study would need to be conducted to explore this further.
|
26 |
The Effects of a Humanoid Robot's Non-lexical Vocalization on Emotion Recognition and Robot Perception. Liu, Xiaozhen, 30 June 2023.
As robots have become more pervasive in our everyday lives, social aspects of robots have attracted researchers' attention. Because emotions play a key role in social interactions, research has been conducted on conveying emotions via speech, whereas little research has focused on the effects of non-speech sounds on users' robot perception. We conducted a within-subjects exploratory study with 40 young adults to investigate the effects of non-speech sounds (regular voice, characterized voice, musical sound, and no sound) and basic emotions (anger, fear, happiness, sadness, and surprise) on user perception. While listening to a fairy tale with the participant, a humanoid robot (Pepper) responded to the story with a recorded emotional sound and a gesture. Participants showed significantly higher emotion recognition accuracy for the regular voice than for the other sounds. The confusion matrix showed that happiness and sadness had the highest emotion recognition accuracy, which aligns with previous research. The regular voice also induced higher trust, naturalness, and preference compared to the other sounds. Interestingly, the musical sound mostly yielded lower perception ratings than no sound.
A further exploratory study was conducted with an additional 49 young people to investigate the effect of regular non-verbal voices (female and male voices) and basic emotions (happiness, sadness, anger, and relief) on user perception. We also explored the impact of participants' gender on emotional and social perception toward the robot Pepper. While listening to a fairy tale with the participants, a humanoid robot (Pepper) responded to the story with gestures and emotional voices. Participants showed significantly higher emotion recognition accuracy and social perception in the voice-plus-gesture condition than in the gesture-only condition. The confusion matrix showed that happiness and sadness had the highest emotion recognition accuracy, which aligns with previous research. Interestingly, participants felt more discomfort and anthropomorphism with male voices than with female voices. Male participants were more likely to feel uncomfortable when interacting with Pepper, whereas female participants were more likely to feel warmth. However, neither the gender of the robot voice nor the gender of the participant affected the accuracy of emotion recognition. Results are discussed along with social robot design guidelines for emotional cues and future research directions. / Master of Science / As robots increasingly appear in people's lives as functional assistants or for entertainment, there are more and more scenarios in which people interact with robots. More research on human-robot interaction is being proposed to help develop more natural ways of interacting. Our study focuses on the effects of emotions conveyed by a humanoid robot's non-speech sounds on people's perception of the robot and its emotions. The results of our experiments show that emotion recognition accuracy for regular voices is significantly higher than for musical and robot-like sounds, and that regular voices elicit higher trust, naturalness, and preference. The gender of the robot's voice and the gender of the participant did not affect the accuracy of emotion recognition. People are no longer drawn to the traditional stereotype of robotic voices (e.g., those of old movies), and expressing emotions with music and gestures mostly yielded lower perception ratings. Happiness and sadness were identified with the highest accuracy among the emotions we studied. Participants felt more discomfort and human-likeness in the male voices than in the female voices. Male participants were more likely to feel uncomfortable when interacting with the humanoid robot, while female participants were more likely to feel warmth. Our study discusses design guidelines and future research directions for emotional cues in social robots.
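The per-emotion accuracies reported in both studies are read off a confusion matrix; a minimal sketch of that computation follows. The emotion ordering and counts below are hypothetical placeholders, not data from either experiment.

```python
# Hedged sketch: per-emotion recognition accuracy from a confusion matrix.
# Rows: emotion the robot expressed; columns: emotion participants reported.
import numpy as np

emotions = ["anger", "fear", "happiness", "sadness", "surprise"]  # assumed order
conf_mat = np.array([   # hypothetical counts for one sound condition
    [22,  6,  2,  4,  6],
    [ 5, 18,  3,  8,  6],
    [ 1,  2, 33,  1,  3],
    [ 2,  5,  1, 30,  2],
    [ 4,  6,  5,  2, 23],
])

per_emotion_acc = conf_mat.diagonal() / conf_mat.sum(axis=1)  # hit rate per expressed emotion
overall_acc = conf_mat.trace() / conf_mat.sum()

for emo, acc in zip(emotions, per_emotion_acc):
    print(f"{emo:<10s} {acc:.2f}")
print(f"overall    {overall_acc:.2f}")
```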
|
27 |
From a Machine to a Collaborator. Bozorgmehrian, Shokoufeh, 05 January 2024.
This thesis book represents an exploration of the relationship between architecture and robotics, tailored to architecture students, professionals, and any other creative user. The investigation encompasses three distinct robotic arm applications for architecture students, introduces and evaluates an innovative 3D printing application with robotic arms, and presents projects focused on the design of human-robot interaction techniques and their system development. Furthermore, the thesis showcases the development of a more intuitive human-robot interaction system and explores various user interaction methods with robotic arms for rapid prototyping and fabrication. Each experiment describes the process, level of interaction, and key takeaways. The narrative of the thesis unfolds as a journey through different applications of robotic fabrication, emphasizing the creative human as the focal point of these systems. This thesis underscores the significance of user experience research and anticipates future innovations in the evolving landscape of the creative field. The discoveries made in this exploration lay a foundation for the study and design of interfaces and interaction techniques, fostering seamless collaboration between designers and robotic systems. Keywords: Robotic Fabrication - Human-Robot Interaction (HRI) - Human-Computer Interaction (HCI) - User Experience Research - Human-Centered Design - Architecture - Art - Creative Application / Master of Architecture
|
28 |
Layer Based Auditory Displays Of Robots’ Actions And Intentions. Orthmann, Bastian, January 2021.
Unintentional encounters between robots and humans will increase in the future and require concepts for communicating the robots’ internal states. Auditory displays can be used to convey the relevant information to people who share public spaces with social robots. Based on data gathered in a participatory design workshop with robot experts, a layer-based approach to real-time generated audio feedback is introduced, in which the information to be displayed is mapped to specific audio parameters. Initial exploratory sound designs were created and evaluated in an online study. The results show which audio parameter mappings should be examined further to display certain internal states, such as mapping amplitude modulation to the robot’s speed or boosting alarm frequencies to indicate urgent tasks. Features such as speed, urgency, and large size were correctly identified in more than 50% of evaluations, while information about the robot’s interactivity or its small size was not comprehensible to the participants.
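A minimal sketch of what such a layer-based mapping could look like in code is given below; the specific parameters, ranges, and scaling rules are illustrative assumptions rather than the sound designs evaluated in the study.

```python
# Hedged sketch: layer-based mapping from a robot's internal state to audio parameters,
# e.g. speed -> amplitude-modulation rate, urgency -> gain of an alarm frequency band.
from dataclasses import dataclass

@dataclass
class RobotState:
    speed: float    # 0..1, normalised fraction of maximum speed
    urgency: float  # 0..1, urgency of the current task
    size: float     # 0..1, normalised footprint of the robot

def to_audio_layers(state: RobotState) -> dict:
    """Map each state variable to one parameter of its own audio layer (assumed ranges)."""
    return {
        "am_rate_hz": 0.5 + 7.5 * state.speed,            # faster robot -> faster tremolo
        "alarm_band_gain_db": -30 + 30 * state.urgency,   # urgent task -> louder alarm band
        "base_pitch_hz": 220 * (2 ** (1 - state.size)),   # larger robot -> lower pitch
    }

print(to_audio_layers(RobotState(speed=0.8, urgency=0.2, size=0.9)))
```

Keeping each state variable on its own layer means individual mappings can be swapped or re-tuned, which mirrors the evaluation-driven refinement described above.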
|
29 |
Animism and Anthropomorphism in Living Spaces : Designing for 'Life' in spatial interactions. Menon, Arjun Rajendran, January 2020.
Integrating animism and anthropomorphism into technology and our interactions with that technology allows for the design of better affordances, easier comprehension, and more intricate interactions between humans and technological artefacts. Through qualitative research, this study seeks to understand the circumstances and contexts under which humans tend to form emotional bonds with nonhuman entities and ascribe life-like or human-like qualities to them. It also investigates whether animism and anthropomorphism apply to abstract entities such as a space, through ‘constructive design-based research’ and ‘thing-centered design’ methodologies. The investigations yield several general insights that are useful to designers attempting to incorporate animism and anthropomorphism into their work. The prototyping led to the creation of a prototype space that can serve as the foundation for future research.
|
30 |
Do Autistic Individuals Experience the Uncanny Valley Phenomenon?: The Role of Theory of Mind in Human-Robot Interaction. Jaramillo, Isabella, 01 August 2015.
Theory of Mind (ToM) has repeatedly been defined as the ability to understand that others hold their own beliefs, based on their own subjective interpretations and experiences, and that their thoughts are formed independently of one's own. In this study, we wanted to see whether individual differences in ToM can lead to different perceptions of an individual's interactions with human-like robotics, whether such differences account for how strongly individuals experience the so-called "uncanny valley" phenomenon, and whether having a fully developed theory of mind is essential to the perception of the interaction. This was assessed by asking whether individuals with Autism Spectrum Disorder (ASD) perceive robotics and artificially intelligent technology in the same ways that typically developed individuals do; we focused on the growing use of social robotics in ASD therapies. Studies have indicated that differences in ToM exist between individuals with ASD and those who are typically developed. We were also curious to see whether differences in empathy levels account for differences in ToM and thus for differences in the perception of human-like robotics. A robotic image rating survey was administered to a group of University of Central Florida students, along with two additional surveys, the Autism Spectrum Quotient (ASQ) and the Basic Empathy Scale (BES), which helped provide a measure of theory of mind. Although the results of this study did not support the claim that individuals with ASD do not experience the uncanny valley differently than typically developed individuals, the results were sufficient to suggest that different levels of empathy may account for individual differences in the uncanny valley. People with low empathy appeared to experience less of an uncanny valley feeling, while people with higher recorded empathy appeared to experience greater uncanny valley sensitivity.
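To illustrate the empathy finding in computational terms, the sketch below correlates hypothetical Basic Empathy Scale totals with simulated eeriness ratings of near-human robot images. The sample size, numbers, and the direction of the simulated effect are assumptions chosen to mirror the reported pattern, not the study's data.

```python
# Hedged sketch: relating empathy scores to uncanny-valley sensitivity
# via a simple Pearson correlation on simulated participant data.
import numpy as np

rng = np.random.default_rng(7)
n = 60                                      # assumed sample size
bes_total = rng.uniform(20, 100, n)         # hypothetical Basic Empathy Scale totals
# Simulate the reported pattern: higher empathy -> stronger eeriness response.
eeriness = 2 + 0.03 * bes_total + rng.normal(0, 0.5, n)

r = np.corrcoef(bes_total, eeriness)[0, 1]
print(f"Pearson r between empathy and eeriness ratings: {r:.2f}")
```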
|