1.
Toward Enabling Safe & Efficient Human-Robot Manipulation in Shared Workspaces. Hayne, Rafi, 01 September 2016
"When humans interact, there are many avenues of physical communication available, ranging from vocal cues to physical gestures. In our past observations, when humans collaborate on manipulation tasks in shared workspaces there is often minimal or no verbal or physical communication, yet the collaboration is still fluid, with minimal interference between partners. However, when humans perform similar tasks in the presence of a robot collaborator, manipulation can be clumsy, disconnected, or simply not human-like. The focus of this work is to leverage our observations of human-human interaction in a robot's motion planner in order to facilitate safer, more efficient, and more human-like collaborative manipulation in shared workspaces. We first present an approach to formulating the cost function for a motion planner intended for human-robot collaboration such that robot motions are both safe and efficient. To achieve this, we propose two factors to consider in the cost function of the robot's motion planner: (1) avoidance of the workspace previously occupied by the human, so that robot motion is as safe as possible, and (2) consistency of the robot's motion, so that the motion is as predictable as possible for the human, who can then perform their task without focusing undue attention on the robot. Our experiments, in simulation and in a human-robot workspace-sharing study, compare a cost function that uses only the first factor and a combined cost that uses both factors against a baseline method that is perfectly consistent but does not account for the human's previous motion. We find that either cost function outperforms the baseline method in terms of task success rate without degrading task completion time. The best task success rate is achieved with the cost function that includes both the avoidance and consistency terms. Next, we present an approach to human-attention-aware robot motion generation which attempts to convey the intent of the robot's task to its collaborator. We capture human attention through the combined use of a wearable eye-tracker and a motion capture system. Since human attention is not static, we present a method of generating a motion policy that can be queried online. Finally, we show preliminary tests of this method."
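The two cost factors described above can be sketched as a single trajectory cost. This is a toy illustration under assumptions of my own, not the thesis's actual planner: the occupancy model (spheres marking regions the human previously occupied), the weights, and the function names are all hypothetical.

```python
import numpy as np

def trajectory_cost(traj, human_occupancy, prev_traj, w_avoid=1.0, w_consist=0.5):
    """Toy combined cost (illustrative only, not the thesis's formulation).

    traj, prev_traj: (T, 3) arrays of waypoints.
    human_occupancy: list of (center, radius) spheres marking workspace
    regions the human occupied earlier in the task (assumed model).
    """
    # Factor 1: penalize waypoints inside previously human-occupied regions,
    # so the planned motion stays as safe as possible.
    avoid = 0.0
    for center, radius in human_occupancy:
        d = np.linalg.norm(traj - center, axis=1)
        avoid += np.sum(np.maximum(0.0, radius - d))  # total penetration depth
    # Factor 2: penalize deviation from the robot's previous trajectory,
    # so the motion stays predictable for the human partner.
    consist = np.sum(np.linalg.norm(traj - prev_traj, axis=1))
    return w_avoid * avoid + w_consist * consist
```

A planner minimizing this cost prefers paths that both stay clear of where the human has been and repeat earlier robot motion, matching the avoidance and consistency terms in the abstract.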
2.
Implementering av människa-robot samarbetscell i en labbmiljö : Implementation av människa-robot samarbete / Implementation of human-robot collaboration in a lab environment : Implementation of human-robot collaboration. Wiemann, Marcus, January 2019
In a previous final year project, a feasibility study was carried out on how a human-robot collaborative cell would work in a laboratory environment. In Arvid Boberg's "HRC Implementation in laboratory environment", the cell was to be developed on behalf of Eurofins to work with chemical and microbiological analyses in agriculture, food and environment. To verify the suggested solutions, an implementation needed to be designed in a physical environment. The main purpose of this project was to develop a collaborative cell that would perform tasks in a lab environment. For this purpose, a station, work operations and components were developed and implemented at ASSAR. The station was programmed to showcase the possibilities the robot offers in a collaborative cell, using ABB RobotStudio and online programming. ABB's YuMi robot was chosen where possible, because it was the robot that the preceding feasibility study, on which this project is based, used in its model and built its theory on. The implementation of the station was completed in steps, in order to test different setups and gain a better understanding of the robot's characteristics and of what it is capable of performing in terms of reach and flexibility.
The more advanced functions of the program were created with offline programming in ABB RobotStudio, combined with online programming. These functions are too complex to write on a TeachPendant, since the advanced functions the robot uses to perform its tasks require long stretches of code. The work at ASSAR led to several different solutions being developed and considered until one concept was chosen and implemented: a collaborative cell that showcases various functions for performing tasks in a lab environment, using the YuMi robot from ABB and a worktable created during the project. The project achieved several of its goals, but some were not reached because of delays that arose during the project. The delays changed the workflow, and the result the author aimed for was adjusted in order to deliver a collaborative cell and a result for the project.
3.
Facilitating Human-Robot Collaboration Using a Mixed-Reality Projection System. January 2017
abstract: Human-robot collaboration can be a challenging exercise, especially when both the human and the robot want to work simultaneously on a given task: it becomes difficult for the human to understand the intentions of the robot, and vice versa. To overcome this problem, a novel approach using the concept of Mixed Reality has been proposed, which uses the surrounding space as a canvas on which to augment projected information on and around 3D objects. A vision-based tracking algorithm precisely detects the pose and state of the 3D objects, and human-skeleton tracking is performed to create a system that is both human-aware and context-aware. Additionally, the system can warn humans about the intentions of the robot, thereby creating a safer environment to work in. An easy-to-use and universal visual language has been created which could form the basis for interaction in various human-robot collaborations in manufacturing industries.
An objective and subjective user study was conducted to test the hypothesis that executing a human-robot collaborative task with this system would result in higher performance than traditional methods such as printed instructions or instructions delivered through mobile devices. Multiple measuring tools were devised to analyze the data, leading to the conclusion that the proposed mixed-reality projection system does improve the human-robot team's efficiency and effectiveness and is hence a better alternative for the future. / Masters Thesis, Computer Science, 2017
4.
Using Augmented Reality technology to improve health and safety for workers in Human Robot Collaboration environment: A literature review. Chemmanthitta Gopinath, Dinesh, January 2022
Human Robot Collaboration (HRC) reduces human effort by letting robots take over most difficult and repetitive activities, with or without human input, allowing people to work more efficiently. However, when people and robots work closely together there is a risk of accidents and collisions, so safety is extremely important in this area. There are various techniques for increasing worker safety, and one of them is Augmented Reality (AR), whose industrial implementation is still in its early stages. The goal of this study is to examine how employees' safety may be enhanced when AR is used in an HRC setting. A literature review is carried out, together with a case study in which managers and engineers from Swedish firms are interviewed about their experiences with AR-assisted safety. This is a qualitative exploratory study aimed at gathering extensive insight into the field, since the goal is to explore approaches by which AR can improve safety; the data were examined using inductive qualitative analysis. According to the reviewed studies, visualisation, awareness, ergonomics, and communication are the most critical areas where AR may improve safety. When doing a task, augmented reality helps the user visualize instructions and information, allowing the task to be completed more quickly and without mistakes. When working near robots, AR enhances awareness and helps predict mishaps, and it strengthens worker trust in a collaborative atmosphere. Using AR to interact with collaborative robots causes fewer physical and psychological challenges than traditional approaches. AR also allows operators to communicate with robots, and to make adjustments, without having to touch them; as a result, accidents are avoided and safety is ensured. Nevertheless, there is a gap between the theoretical findings of the literature and the data gathered from the interviews.
Even though AR and HRC are not new topics, and many studies are being conducted on them, key aspects influence their industrial adoption. Due to considerations such as education, experience, suitability, system complexity, time, and technology, managers in various firms make limited use of HRC and AR for assuring safety. This study also presents possible future solutions to these challenges.
5.
Using Motion Capture and Virtual Reality to test the advantages of Human Robot Collaboration. Rivera, Francisco, January 2019
Nowadays, Virtual Reality (VR) and Human Robot Collaboration (HRC) are becoming increasingly important in both industry and science. This investigation studies the application of these two technologies in the field of ergonomics, first by developing a system able to visualise and present ergonomics evaluation results in real time during assembly tasks in a VR environment, and second by evaluating the advantages of human-robot collaboration through a VR study of a specific operation carried out at Volvo Global Trucks Operation's factory in Skövde. For the first part, an innovative system was developed that shows ergonomic feedback in real time and performs ergonomic evaluations of the whole workload inside a VR environment. This system can be useful for future research in the virtual ergonomics field on matters such as the ergonomic learning rate of workers performing assembly tasks, the design of ergonomic workstations, the effect of different types of assembly instructions in VR, and a wide variety of other applications. The assembly operation, with and without the robot, was created in IPS, whose VR functionality allowed the assembly task to be tested with real users making natural body movements. The users performed the task first without and then with the collaborative robot, and their posture data was collected using a motion capture system called Smart Textiles (developed at the University of Skövde). The two ergonomic evaluations of the two tasks, made using Smart Textiles' criteria, were then compared. The results show that when the robot is used in this specific assembly task, the posture of the workers (especially the posture of the arms) improves considerably compared with the same task performed without the robot.
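The kind of posture comparison described above can be sketched with a toy scoring function over motion-capture samples. The threshold, the scoring rule, and the sample values below are all illustrative assumptions, not the Smart Textiles criteria actually used in the study.

```python
def arm_elevation_score(angles_deg, threshold=45.0):
    """Toy ergonomic indicator (lower is better): the fraction of
    motion-capture samples in which upper-arm elevation exceeds a
    threshold. Illustrative only, not the study's actual criteria."""
    flagged = sum(1 for a in angles_deg if a > threshold)
    return flagged / len(angles_deg)

# Hypothetical elevation samples (degrees) for the same task,
# performed without and then with the collaborative robot.
without_robot = [60, 70, 50, 30, 80]
with_robot = [20, 30, 40, 25, 35]
```

Comparing the two scores mirrors the study's comparison of the two task conditions: a lower score for the collaborative condition would indicate the arm-posture improvement the abstract reports.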
6.
Affective Motivational Collaboration Theory. Shayganfar, Mohammad, 25 January 2017
Existing computational theories of collaboration explain some of the important concepts underlying collaboration, e.g., the collaborators' commitments and communication. However, the underlying processes required to dynamically maintain the elements of the collaboration structure are largely unexplained. Our main insight is that in many collaborative situations, acknowledging or ignoring a collaborator's affective state can facilitate or impede the progress of the collaboration. This implies that collaborative agents need to employ affect-related processes that (1) use the collaboration structure to evaluate the status of the collaboration, and (2) influence the collaboration structure when required. This thesis develops a new affect-driven computational framework to achieve these objectives and thus empower agents to be better collaborators. The contributions of this thesis are: (1) Affective Motivational Collaboration (AMC) theory, which incorporates appraisal processes into SharedPlans theory. (2) New computational appraisal algorithms based on collaboration structure. (3) Algorithms, such as goal management, that use the output of appraisal to maintain collaboration structures. (4) Implementation of a computational system based on AMC theory. (5) Evaluation of AMC theory via two user studies to (a) validate our appraisal algorithms, and (b) investigate the overall functionality of our framework within an end-to-end system with a human and a robot.
7.
Virtual lead-through robot programming : Programming virtual robot by demonstration. Boberg, Arvid, January 2015
This report describes the development of an application which allows a user to program a robot in a virtual environment by means of hand motions and gestures. The application is inspired by robot lead-through programming, an easy and hands-on approach for programming robots; but because performing lead-through online causes losses in productivity, it also draws on the strength of offline programming, where the user operates in a virtual environment. The method thus saves money and avoids disturbing the production environment. To convey hand-gesture information into the application, which is implemented for RobotStudio, a Kinect sensor is used to enter the data into the virtual environment. Similar work has been performed before in which hand movements manipulate a physical robot's motion, but much less so for virtual robots. The results could simplify the process of programming robots, and they support work towards Human-Robot Collaboration, a major focus of this work, as they allow people to interact and communicate with robots. The application was developed in the programming language C# and has two functions that interact with each other: one for the Kinect and its tracking, and one for installing the application in RobotStudio and passing the calculated data to the robot. The Kinect's functionality is utilized through three simple hand gestures used to jog the robot and create targets for it: open, closed and "lasso". A prototype of this application was completed which let the user teach a virtual robot desired tasks by moving it to different positions through hand motions and saving the positions with gestures. The prototype could be applied both to one-armed robots and to a two-armed robot such as ABB's YuMi.
Controlling the robot's orientation while running proved too complicated to develop and implement in time and became the application's main bottleneck, but it remains one of several suggestions for further work on this project.
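The three hand gestures described above suggest a simple dispatch from tracked hand state to jogging actions. The sketch below is a hypothetical Python rendering of that idea; the thesis's actual implementation is in C# against the Kinect and RobotStudio APIs, and the class and method names here are assumptions.

```python
class VirtualRobot:
    """Minimal stand-in for the virtual robot driven from RobotStudio
    (assumed interface, for illustration only)."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.targets = []      # saved robot targets
        self.jogging = True    # whether the robot follows the hand

    def jog_to(self, pos):
        # Follow the tracked hand while jogging is enabled.
        if self.jogging:
            self.position = pos

def handle_gesture(robot, state, hand_pos):
    """Dispatch the three Kinect hand states named in the abstract."""
    if state == "open":
        robot.jogging = True
        robot.jog_to(hand_pos)              # jog the robot with the hand
    elif state == "closed":
        robot.jogging = False               # pause jogging to reposition
    elif state == "lasso":
        robot.targets.append(robot.position)  # save current pose as a target
    else:
        raise ValueError(f"unknown hand state: {state}")
```

Closing the hand to pause jogging lets the user move their arm back without dragging the robot along, a common "clutching" pattern in gesture-based teleoperation; how the thesis actually assigns the three gestures may differ.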
8.
An Augmented Reality Human-Robot Collaboration System. Green, Scott Armstrong, January 2008
Although robotics is well established as a research field, there has been relatively little work on human-robot collaboration. This type of collaboration is going to become an increasingly important issue as robots work ever more closely with humans. Clearly, there is a growing need for research on human-robot collaboration and communication between humans and robotic systems.
Research into human-human communication can be used as a starting point in developing a robust human-robot collaboration system. Previous research into collaborative efforts with humans has shown that grounding, situational awareness, a common frame of reference and spatial referencing are vital in effective communication. Therefore, these items comprise a list of required attributes of an effective human-robot collaborative system.
Augmented Reality (AR) is a technology for overlaying three-dimensional virtual graphics onto the user's view of the real world. It also allows for real time interaction with these virtual graphics, enabling a user to reach into the augmented world and manipulate it directly. The internal state of a robot and its intended actions can be displayed through the virtual imagery in the AR environment. Therefore, AR can bridge the divide between human and robotic systems and enable effective human-robot collaboration.
This thesis describes the work involved in developing the Augmented Reality Human-Robot Collaboration (AR-HRC) System. It first garners design criteria for the system from a review of communication and collaboration in human-human interaction, the current state of Human-Robot Interaction (HRI) and related work in AR. A review of research in multimodal interfaces is then provided, highlighting the benefits of such an interface design. Accordingly, an AR multimodal interface was developed to determine whether this type of design improved performance over a single-modality design. Indeed, the multimodal interface was found to improve performance, providing the impetus to use a multimodal design approach for the AR-HRC system.
The architectural design of the system is then presented. A user study conducted to determine what kind of interaction people would use when collaborating with a mobile robot is discussed and then the integration of a mobile robot is described. Finally, an evaluation of the AR-HRC system is presented.
9.
Evaluation of a human-robot collaboration in an industrial workstation. Gonzalez, Victoria; Ruiz Castro, Pamela, January 2018
The fast pace of change in industry requires improved production workstations which ensure workers' safety and improve production efficiency. Technological developments and revised legislation have increased the possibility of using collaborative robots, allowing new types of industrial workstations where robots and humans cooperate in performing tasks. In addition to safety, the design of collaborative workstations needs to consider ergonomics and task allocation, to ensure appropriate working conditions for the operators while providing overall system efficiency. By facilitating the design development process of such workstations, software simulation can improve quality and save time and money, supporting decision making and allowing concepts to be tested before a physical workstation is built; this in turn should lead to better final solutions and a faster process of implementation or reconfiguration. The aim of this study is to investigate the possibility of human-robot collaboration in a workstation based on a use case from industry. The concept designs will be simulated and verified through a physical prototype, for which ergonomic analyses, time analyses, and risk assessments will be compared in order to validate the resulting collaborative workstation.
10.
HRC implementation in laboratory environment : Development of a HRC demonstrator. Boberg, Arvid, January 2018
Eurofins is one of the world's largest laboratory groups, offering, among other things, chemical and microbiological analyses in agriculture, food and environment. Several hundred thousand tests of various foods are executed each year at Eurofins' facility in Jönköping, and the current processes include many repetitive manual tasks which could cause ergonomic problems. The company therefore wants to investigate the possibilities of utilizing Human-Robot Collaboration (HRC) at its facility. Human-Robot Collaboration is a growing concept that has made a big impression in both robot development and Industry 4.0. An HRC approach allows humans and robots to share their workspaces and work side by side, without being separated by the protective fence that is common around traditional industrial robots. Human-Robot Collaboration is therefore believed to be able to optimize workflows and relieve human workers from unergonomic tasks. The overall aim of the research project presented here is to help the company gain a better understanding of existing HRC technologies. To achieve this goal, the state of the art of HRC was investigated, and the needs, possibilities and limitations of HRC applications at Eurofins' facility were identified. Once these had been addressed, a demonstrator could be built for evaluating the applicability and suitability of HRC at Eurofins. The project followed the design science research process. The state of the art of HRC was studied in a comprehensive literature review, which also covered sterile robots and mobile robotics and identified possible research gaps both in HRC in laboratory environments and in mobile solutions for HRC applications. Together, the areas studied in the literature review formed the basis of the observations and interviews used to generate the data needed to develop the design science research artefact, the demonstrator.
ABB's software for robotic simulation and offline programming, RobotStudio, was used in the development of the demonstrator, with the collaborative robot YuMi chosen for the HRC implementation. The demonstrator was built, tested and refined in accordance with the design science research process. Once the demonstrator could illustrate an applicable solution, its performance and quality were evaluated using a mixed-methods approach. Limitations were identified in both the performance and the quality of the demonstrator's HRC implementation, including adaptability and sterility constraints. The project concluded that an HRC application would be possible at a station of interest to the company, but could not recommend it due to the identified constraints. Instead, the company was recommended to look for stations which are more standardized and have less stringent hygiene requirements. By the end of the project, additional knowledge had been contributed to the company, including how HRC can affect today's working methods at Eurofins and in laboratory environments in general.