1

Toward Enabling Safe & Efficient Human-Robot Manipulation in Shared Workspaces

Hayne, Rafi 01 September 2016 (has links)
"When humans interact, there are many avenues of physical communication available ranging from vocal to physical gestures. In our past observations, when humans collaborate on manipulation tasks in shared workspaces there is often minimal to no verbal or physical communication, yet the collaboration is still fluid with minimal interferences between partners. However, when humans perform similar tasks in the presence of a robot collaborator, manipulation can be clumsy, disconnected, or simply not human-like. The focus of this work is to leverage our observations of human-human interaction in a robot's motion planner in order to facilitate more safe, efficient, and human-like collaborative manipulation in shared workspaces. We first present an approach to formulating the cost function for a motion planner intended for human-robot collaboration such that robot motions are both safe and efficient. To achieve this, we propose two factors to consider in the cost function for the robot's motion planner: (1) Avoidance of the workspace previously-occupied by the human, so robot motion is safe as possible, and (2) Consistency of the robot's motion, so that the motion is predictable as possible for the human and they can perform their task without focusing undue attention on the robot. Our experiments in simulation and a human-robot workspace sharing study compare a cost function that uses only the first factor and a combined cost that uses both factors vs. a baseline method that is perfectly consistent but does not account for the human's previous motion. We find using either cost function we outperform the baseline method in terms of task success rate without degrading the task completion time. The best task success rate is achieved with the cost function that includes both the avoidance and consistency terms. Next, we present an approach to human-attention aware robot motion generation which attempts to convey intent of the robot's task to its collaborator. We capture human attention through the combined use of a wearable eye-tracker and motion capture system. Since human attention isn't static, we present a method of generating a motion policy that can be queried online. Finally, we show preliminary tests of this method."
2

Implementering av människa-robot samarbetscell i en labbmiljö : Implementation av människa-robot samarbete / Implementation of human-robot collaboration in a lab environment : Implementation of human-robot collaboration

Wiemann, Marcus January 2019 (has links)
In a previous final-year project, a study was carried out on how a human-robot collaborative cell would work in a laboratory environment. In Arvid Boberg's "HRC Implementation in laboratory environment", the cell was to be developed on behalf of Eurofins to work with chemical and microbiological analyses in agriculture, food and the environment. To verify the suggested solutions, an implementation needed to be designed in a physical environment. The main purpose of this project was to develop a collaborative cell that would perform tasks in a lab environment. For this purpose, a station, work operations and components have been developed and implemented at ASSAR. The station has been programmed to showcase the possibilities the robot has to offer in a collaborative cell, using ABB RobotStudio and online programming. The choice of robot was, if possible, to use ABB's YuMi robot, because it was the robot that the pre-study used in its model and built its theory on, and because that pre-study is the foundation of this project. The implementation of the station has been carried out in steps in order to test different layouts and obtain a better understanding of the robot's characteristics and what it is capable of performing in terms of reach and flexibility. To create the more advanced features of the program, offline programming in ABB RobotStudio was combined with online programming. The functions become too advanced to write on a TeachPendant, since long lines of code are needed to create the advanced functions the robot uses to perform its tasks. The work at ASSAR has led to several different solutions being developed and considered until a concept was chosen and implemented at ASSAR, in the form of a collaborative cell that showcases various functions for performing tasks in a lab environment using the YuMi robot from ABB and a worktable created during the project. The project has achieved several of its stated goals, but some have not been achieved because of delays that arose during the project. The delays changed the workflow, and the result the author tried to achieve was adjusted in order to develop a collaborative cell and deliver a result for the project.
3

Indoor robot localization and collaboration

Zaharans, Eriks January 2013 (has links)
The purpose of this thesis is to create an indoor rescue scenario with multiple self-localizing robots that are able to collaborate in a victim search. Victims are represented by RFID tags, and detecting a tag together with sufficiently accurate location data is considered a successful finding. This setup is created for use in a laboratory assignment at Linköping University. We consider the indoor localization problem while trying to use as few sensors as possible and implement three indoor localization methods: odometry-based, passive-RFID-based, and our approach of fusing both sensors' data with a particle filter. The results show that the particle filter based localization performs best in comparison to the two other implemented methods and satisfies the accuracy requirements stated for the scenario. The victim search problem is solved by an ant mobility (pheromone-based) approach which integrates our localization method and provides collaborative navigation through the rescue area. The purpose of the pheromone mobility approach is to achieve high coverage with acceptable resource consumption. Experiments show that the area is covered with approximately 30-40% overhead in traveled distance compared to an optimal path.
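The fusion idea described here can be illustrated with a minimal particle filter sketch. The noise levels, the Gaussian weighting, and the assumption that detected tags have known positions are illustrative choices, not the thesis' actual parameters.

```python
# A minimal sketch of fusing odometry with passive RFID readings in a particle filter:
# particles are propagated with the odometry estimate and reweighted whenever a
# floor tag at a known position is detected.
import numpy as np

def particle_filter_step(particles, odom_delta, rfid_reading=None,
                         motion_noise=0.02, rfid_sigma=0.3):
    """particles: (N, 2) array of x,y pose hypotheses.
    odom_delta: (2,) displacement reported by wheel odometry.
    rfid_reading: known (x, y) of a detected tag, or None if nothing was read."""
    n = len(particles)
    # Predict: apply odometry with added noise to model wheel slip and drift.
    particles = particles + odom_delta + np.random.normal(0, motion_noise, (n, 2))

    if rfid_reading is not None:
        # Update: weight particles by proximity to the detected tag position.
        d = np.linalg.norm(particles - np.asarray(rfid_reading), axis=1)
        w = np.exp(-0.5 * (d / rfid_sigma) ** 2)
        w /= w.sum()
        # Resample proportionally to the weights.
        idx = np.random.choice(n, size=n, p=w)
        particles = particles[idx]

    return particles
```

The pose estimate at each step is typically taken as the mean (or a weighted mean) of the particle set; odometry alone drifts, while the RFID updates periodically pull the estimate back toward ground truth.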
4

Facilitating Human-Robot Collaboration Using a Mixed-Reality Projection System

January 2017 (has links)
Human-robot collaboration can be a challenging exercise, especially when both the human and the robot want to work simultaneously on a given task. It becomes difficult for the human to understand the intentions of the robot and vice versa. To overcome this problem, a novel approach using the concept of Mixed Reality has been proposed, which uses the surrounding space as the canvas to augment projected information on and around 3D objects. A vision-based tracking algorithm precisely detects the pose and state of the 3D objects, and human-skeleton tracking is performed to create a system that is both human-aware and context-aware. Additionally, the system can warn humans about the intentions of the robot, thereby creating a safer environment to work in. An easy-to-use and universal visual language has been created which could form the basis for interaction in various human-robot collaborations in manufacturing industries. An objective and subjective user study was conducted to test the hypothesis that using this system to execute a human-robot collaborative task would result in higher performance than using other traditional methods such as printed instructions or mobile devices. Multiple measuring tools were devised to analyze the data, which finally led to the conclusion that the proposed mixed-reality projection system does improve the human-robot team's efficiency and effectiveness and hence will be a better alternative in the future. / Dissertation/Thesis / Masters Thesis Computer Science 2017
5

Using Augmented Reality technology to improve health and safety for workers in Human Robot Collaboration environment: A literature review

Chemmanthitta Gopinath, Dinesh January 2022 (has links)
Human Robot Collaboration (HRC) allows humans to operate more efficiently by reducing human effort. Robots can do the majority of difficult and repetitive activities with or without human input. There is a risk of accidents and crashes when people and robots operate closely together, so safety is extremely important in this area. There are various techniques to increase worker safety, and one of them is the use of Augmented Reality (AR). AR implementation in industry is still in its early stages. The goal of this study is to see how employees' safety may be enhanced when AR is used in an HRC setting. A literature review is carried out, as well as a case study in which managers and engineers from Swedish firms are asked about their experiences with AR-assisted safety. This is a qualitative exploratory study aimed at gathering extensive insight into the field, since the purpose is to explore approaches for AR to improve safety. Inductive qualitative analysis was used to examine the data. According to the studies, visualisation, awareness, ergonomics, and communication are the most critical areas where AR may improve safety. When doing a task, augmented reality aids the user in visualizing instructions and information, allowing them to complete the task more quickly and without mistakes. When working near robots, AR enhances awareness, helps predict mishaps, and increases worker trust in a collaborative atmosphere. When AR is utilized to engage with collaborative robots, it causes fewer physical and psychological challenges than traditional approaches. AR allows operators to communicate with robots and make adjustments without having to touch them. As a result, accidents are avoided and safety is ensured. There is a gap between the theoretical study findings and the data gathered from the interviews. Even though AR and HRC are not new topics, and many studies are being conducted on them, there are key aspects that influence their adoption in industry. Due to considerations such as education, experience, suitability, system complexity, time, and technology, HRC and AR are used less by managers in various firms for assuring safety. In this study, possible future solutions to these challenges are also presented.
6

Using Motion Capture and Virtual Reality to test the advantages of Human Robot Collaboration

Rivera, Francisco January 2019 (has links)
Nowadays, Virtual Reality (VR) and Human Robot Collaboration (HRC) are becoming more and more important in industry as well as in science. This investigation studies the applications of these two technologies in the field of ergonomics by developing a system able to visualise and present ergonomics evaluation results in real time during assembly tasks in a VR environment, and also by evaluating the advantages of Human Robot Collaboration through a Virtual Reality study of a specific operation carried out at Volvo Global Trucks Operation's factory in Skövde. Regarding the first part of this investigation, an innovative system was developed that is able to show ergonomic feedback in real time, as well as make ergonomic evaluations of the whole workload inside a VR environment. This system can be useful for future research in the field of virtual ergonomics on matters related to the ergonomic learning rate of workers when performing assembly tasks, the design of ergonomic workstations, the effect of different types of assembly instructions in VR, and a wide variety of other applications. The assembly operation, with and without the robot, was created in IPS to use its VR functionality in order to test the assembly task with real users and natural body movements. The posture data of the users performing the tasks in Virtual Reality was collected: the users performed the task first without the collaborative robot and then with the collaborative robot. Their posture data was collected using a Motion Capture system called Smart Textiles (developed at the University of Skövde), and the two ergonomic evaluations (using Smart Textiles' criteria) of the two tasks were compared. The results show that when the robot is implemented in this specific assembly task, the posture of the workers (especially the posture of the arms) improves greatly compared to the same task without the robot.
7

Affective Motivational Collaboration Theory

Shayganfar, Mohammad 25 January 2017 (has links)
Existing computational theories of collaboration explain some of the important concepts underlying collaboration, e.g., the collaborators' commitments and communication. However, the underlying processes required to dynamically maintain the elements of the collaboration structure are largely unexplained. Our main insight is that in many collaborative situations, acknowledging or ignoring a collaborator's affective state can facilitate or impede the progress of the collaboration. This implies that collaborative agents need to employ affect-related processes that (1) use the collaboration structure to evaluate the status of the collaboration, and (2) influence the collaboration structure when required. This thesis develops a new affect-driven computational framework to achieve these objectives and thus empower agents to be better collaborators. The contributions of this thesis are: (1) Affective Motivational Collaboration (AMC) theory, which incorporates appraisal processes into SharedPlans theory. (2) New computational appraisal algorithms based on collaboration structure. (3) Algorithms, such as goal management, that use the output of appraisal to maintain collaboration structures. (4) Implementation of a computational system based on AMC theory. (5) Evaluation of AMC theory via two user studies to (a) validate our appraisal algorithms, and (b) investigate the overall functionality of our framework within an end-to-end system with a human and a robot.
8

Virtual lead-through robot programming : Programming virtual robot by demonstration

Boberg, Arvid January 2015 (has links)
This report describes the development of an application which allows a user to program a robot in a virtual environment through hand motions and gestures. The application is inspired by robot lead-through programming, which is an easy and hands-on approach for programming robots; but instead of performing it online, which causes a loss in productivity, the strength of offline programming, where the user operates in a virtual environment, is used as well. Thus, this is a method which saves money and prevents contamination of the environment. To convey hand gesture information into the application, which will be implemented for RobotStudio, a Kinect sensor is used for entering the data into the virtual environment. Similar work has been performed before in which a physical robot's movement is manipulated through hand movements, but much less so for virtual robots. The results could simplify the process of programming robots and support the work towards Human-Robot Collaboration, as it allows people to interact and communicate with robots, a major focus of this work. The application was developed in the programming language C# and has two different functions that interact with each other: one for the Kinect and its tracking, and the other for installing the application in RobotStudio and implementing the calculated data into the robot. The Kinect's functionality is utilized through three simple hand gestures to jog and create targets for the robot: open, closed and "lasso". A prototype of this application was completed which, through motions, allowed the user to teach a virtual robot desired tasks by moving it to different positions and saving them with hand gestures. The prototype could be applied both to one-armed robots and to a two-armed robot such as ABB's YuMi. The robot's orientation while running was too complicated to be developed and implemented in time and became the application's main bottleneck, but it remains one of several suggestions for further work in this project.
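The gesture-driven control described here can be illustrated with a minimal sketch. The abstract does not state which gesture maps to which action, so the mapping below, the state names, and the robot wrapper (jog_to, save_target) are all hypothetical placeholders, not the actual C#/RobotStudio application's API.

```python
# A minimal sketch of mapping the three Kinect hand states (open, closed, "lasso")
# to jogging and target creation for a virtual robot.
from enum import Enum

class HandState(Enum):
    OPEN = "open"
    CLOSED = "closed"
    LASSO = "lasso"

def handle_gesture(hand_state, hand_position, robot):
    """hand_position: (x, y, z) of the tracked hand from skeleton tracking.
    robot: hypothetical wrapper exposing jog_to() and save_target()."""
    if hand_state is HandState.CLOSED:
        # Assumed mapping: closed hand jogs the virtual robot toward the hand position.
        robot.jog_to(hand_position)
    elif hand_state is HandState.LASSO:
        # Assumed mapping: "lasso" saves the current pose as a robot target.
        robot.save_target()
    # Open hand: no action; the robot holds its current pose.
```

In the actual application this loop would run on each Kinect frame, with the second component translating saved targets into RobotStudio robot targets.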
9

An Augmented Reality Human-Robot Collaboration System

Green, Scott Armstrong January 2008 (has links)
Although robotics is well established as a research field, there has been relatively little work on human-robot collaboration. This type of collaboration is going to become an increasingly important issue as robots work ever more closely with humans. Clearly, there is a growing need for research on human-robot collaboration and communication between humans and robotic systems. Research into human-human communication can be used as a starting point in developing a robust human-robot collaboration system. Previous research into collaborative efforts with humans has shown that grounding, situational awareness, a common frame of reference and spatial referencing are vital in effective communication. Therefore, these items comprise a list of required attributes of an effective human-robot collaborative system. Augmented Reality (AR) is a technology for overlaying three-dimensional virtual graphics onto the user's view of the real world. It also allows for real time interaction with these virtual graphics, enabling a user to reach into the augmented world and manipulate it directly. The internal state of a robot and its intended actions can be displayed through the virtual imagery in the AR environment. Therefore, AR can bridge the divide between human and robotic systems and enable effective human-robot collaboration. This thesis describes the work involved in developing the Augmented Reality Human-Robot Collaboration (AR-HRC) System. It first garners design criteria for the system from a review of communication and collaboration in human-human interaction, the current state of Human-Robot Interaction (HRI) and related work in AR. A review of research in multimodal interfaces is then provided highlighting the benefits of using such an interface design. Therefore, an AR multimodal interface was developed to determine if this type of design improved performance over a single modality design. Indeed, the multimodal interface was found to improve performance, thereby providing the impetus to use a multimodal design approach for the AR-HRC system. The architectural design of the system is then presented. A user study conducted to determine what kind of interaction people would use when collaborating with a mobile robot is discussed and then the integration of a mobile robot is described. Finally, an evaluation of the AR-HRC system is presented.
10

Evaluation of a human-robot collaboration in an industrial workstation

Gonzalez, Victoria, Ruiz Castro, Pamela January 2018 (has links)
The fast changes in industry require improved production workstations which ensure the workers' safety and improve the efficiency of production. Technology developments and revised legislation have increased the possibility of using collaborative robots. This allows for new types of industrial workstations where robots and humans cooperate in performing tasks. In addition to safety, the design of collaborative workstations needs to consider ergonomics and task allocation to ensure appropriate work conditions for the operators while providing overall system efficiency. In the design development process of such workstations, the use of software simulations can help gain quality and save time and money by supporting decision making and testing concepts before creating a physical workstation, which in turn is aimed at leading to better final solutions and a faster process of implementation or reconfiguration. The aim of this study is to investigate the possibility of having a human-robot collaboration in a workstation that is based on a use case from industry. The concept designs will be simulated and verified through a physical prototype, with which ergonomic analysis, time analysis, and risk assessments will be compared to validate the resultant collaborative workstation.
