1. Human Interaction with Autonomous Machines: Visual Communication to Encourage Trust. Norstedt, Emil; Sahlberg, Timmy. January 2020
Ongoing development is happening within the construction industry: machines are being transformed from human-operated to autonomous. This project was a collaboration with Volvo Construction Equipment (Volvo CE) and their new autonomous wheel loader. The autonomous machine is supposed to operate in the same environment as people, so a well-developed safety system is required to eliminate accidents. The purpose has been to develop a system that increases safety for the workers and encourages trust in the autonomous machine. The system is based on visual communication to achieve trust between the machine and the people around it. An iterative process, with a focus on prototyping, testing, and analysing, was used to accomplish a successful result. By creating models with a variety of functions, a better understanding was developed of how to design a human-machine interface that encourages trust. The iterative process resulted in a concept that communicates through eyes. Eye contact is an essential factor for creating trust in unfamiliar and exposed situations. The solution mediates different expressions by changing the colour and shape of the eyes to create awareness and to inform people moving around in the same environment. Specific information can be conveyed in various situations by adapting the colour and shape of the eyes. In this way of communicating, trust in the autonomous machine can be encouraged.
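The concept above maps machine states to eye colour and shape. As a minimal illustration, the sketch below models such a mapping as a lookup table from state to expression; the states, colours, and shapes here are invented for illustration and are not taken from the thesis.

```python
# Minimal sketch of eye-based visual signalling for an autonomous machine.
# The states, colours, and shapes are illustrative assumptions, not the
# actual mapping used in the Volvo CE concept.
from dataclasses import dataclass
from enum import Enum, auto

class MachineState(Enum):
    IDLE = auto()            # machine stationary, no active task
    MOVING = auto()          # machine driving along its planned path
    HUMAN_DETECTED = auto()  # person detected inside the caution zone
    STOPPED_SAFE = auto()    # machine has yielded to a nearby person

@dataclass
class EyeExpression:
    colour: str  # display colour of the eyes
    shape: str   # rendered eye shape

# Hypothetical state-to-expression table: colour and shape change together
# so the same message stays readable from a distance.
EXPRESSIONS = {
    MachineState.IDLE: EyeExpression("white", "neutral"),
    MachineState.MOVING: EyeExpression("green", "focused"),
    MachineState.HUMAN_DETECTED: EyeExpression("yellow", "wide"),
    MachineState.STOPPED_SAFE: EyeExpression("blue", "relaxed"),
}

def render_eyes(state: MachineState) -> EyeExpression:
    """Return the expression the eye display should show for a state."""
    return EXPRESSIONS[state]

if __name__ == "__main__":
    print(render_eyes(MachineState.HUMAN_DETECTED))
    # EyeExpression(colour='yellow', shape='wide')
```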
2. Usability Criteria for Human-Machine Interaction with Automated Guided Vehicles: An exploratory study on user perceptions. Friebel, Victoria. January 2022
Logistics 4.0 describes the profound paradigm shift, driven by digital transformation, that poses new challenges for the logistics industry. Working conditions change significantly as Automated Guided Vehicles (AGVs) take over material handling tasks. However, Human-Machine Interaction (HMI) between AGVs and their users, and the requirements for the design of AGV user interfaces under the challenges of Logistics 4.0, have not yet been researched in depth. This qualitative study, in collaboration with the intralogistics company FlexQube, explores the perceived usability of AGV user interfaces, derives usability criteria, and investigates how usability affects the HMI. The research subject is the company's own AGV line. Six exploratory user interviews conducted with both customers and internal employees show the relevance of Nielsen's usability heuristics and identify overlaps with existing propositions for Human-Robot Interaction. The findings also highlight the impact of user demographics on perceived usability and on the use of the automated features of the AGV. The impact of the challenges of Logistics 4.0 is discussed as well. The study therefore proposes that these four main aspects should be considered in the design of AGV user interfaces and suggests further research on the influence of usability on technology acceptance of automation.
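The heuristics referenced above are Nielsen's ten usability heuristics. The sketch below shows one hypothetical way a heuristic evaluation of an AGV interface could be structured around them, using Nielsen's standard 0-4 severity scale; the structure and the example ratings are illustrative, not the study's instrument.

```python
# Sketch of a heuristic-evaluation checklist built on Nielsen's ten
# usability heuristics, which the study found relevant for AGV interfaces.
# The scoring structure and example ratings are illustrative assumptions.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def summarise(findings: dict[str, int]) -> list[tuple[str, int]]:
    """Rank heuristics by severity (0 = no problem .. 4 = catastrophe)."""
    return sorted(findings.items(), key=lambda kv: kv[1], reverse=True)

# Example: hypothetical severity ratings from one AGV interface review.
ratings = {h: 0 for h in NIELSEN_HEURISTICS}
ratings["Visibility of system status"] = 3  # e.g. AGV state hard to see
ratings["Error prevention"] = 2
print(summarise(ratings)[:3])
```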
3. Simulator-Based Design: Methodology and vehicle display application. Alm, Torbjörn. January 2007
Human-in-the-loop simulators have long been used in the research community as well as in industry. The aviation field pioneered the use of simulators for design purposes; in contrast, corresponding activities in the automotive area have been less widespread. Published reports on experimental activities based on human-in-the-loop simulations have focused on the methods used in the particular study, but nobody seems to have taken a step back and looked at the wider methodological picture of Simulator-Based Design. The purpose of this thesis is to fill this gap by drawing, in part, on the author's long experience in this field. In aircraft, and lately also in ground vehicles, there has been a technology shift from pure mechanics to computer-based systems. The physical interface has turned into screen-based solutions; this trend towards glass has just begun for ground vehicles. This development in vehicle technology has opened the door for new design approaches, not only for the design itself but also for the development process. Simulator-Based Design (SBD) is very compatible with this trend. The first part of this thesis proposes a structure for the process of SBD and links it to the corresponding methodology for software design.

In the second part of the thesis the focus shifts from methodology to application, specifically the design of three-dimensional situation displays. Such displays are supposed to support the human operator with a view of a situation beyond the more or less limited visual range. In the aircraft application, interest focuses on the surrounding air traffic in the light of the evolving free-flight concept, where responsibility for separation between aircraft will be (partly) transferred from ground-based flight controllers to air crews. This new responsibility must be supported by new technology, and the situational view must be displayed from the perspective of the aircraft. Some basic design questions for such 3D displays were investigated, resulting in an adaptive interface approach in which the current situation and task govern the details of information presentation.

The thesis also discusses work on situation displays for ground vehicles. The most prominent example may be the Night Vision system, where the road situation ahead is depicted on a screen in the cab. Existing systems are based on continuous presentation, an approach we have questioned, since there is strong evidence for negative behavioral adaptation. This means, for example, that the driver will drive faster, since vision has been enhanced, and thereby consume the safety margins the system was supposed to deliver. Our investigation supports a situation-dependent approach rather than continuous presentation. In conclusion, the results from our simulator-based studies showed advantages for adaptive interface solutions. Such design concepts are much more complicated than traditional static interfaces, which emphasizes the need for more dynamic design resources in order to fully understand the situation-related interface changes. The use of human-in-the-loop simulators and the deployment of Simulator-Based Design will satisfy this need.
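The situation-dependent presentation argued for above can be pictured as a simple trigger rule: show the Night Vision image only when a detected hazard falls within a time-to-encounter horizon, and keep the screen blank otherwise. The sketch below is a minimal version of that idea; the thresholds, hazard model, and function names are assumptions, not the thesis design.

```python
# Sketch of situation-dependent (non-continuous) Night Vision presentation:
# the image is shown only when a hazard would be reached within a warning
# horizon, instead of being displayed continuously. Thresholds and the
# detection interface are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Hazard:
    kind: str         # e.g. "pedestrian", "animal"
    distance_m: float

def should_display(hazards: list[Hazard], speed_kmh: float,
                   warn_time_s: float = 4.0) -> bool:
    """Show the display only when a hazard is within the warning horizon,
    so the driver cannot adapt behaviour to an always-on enhanced view."""
    speed_ms = speed_kmh / 3.6
    if speed_ms <= 0:
        return False
    return any(h.distance_m / speed_ms <= warn_time_s for h in hazards)

# At 90 km/h (25 m/s), a pedestrian 80 m ahead is ~3.2 s away -> display on.
print(should_display([Hazard("pedestrian", 80.0)], 90.0))  # True
```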
4. Collaborative Human and Computer Controls of Smart Machines. Hussein Bilal, 7 December 2023
Human-Machine Interaction (HMI) refers to a mechanism that supports direct interaction between humans and machines, with the objective of synthesizing machine intelligence and autonomy. The demand for advances in this field of intelligent controls is continuously growing. A Brain-Computer Interface (BCI) is one type of HMI that uses the human brain to enable direct communication between the human subject and a machine. This technology is widely explored in different fields to control external devices using brain signals.

This thesis is driven by two key observations. The first is the limited number of Degrees of Freedom (DoF) that existing BCI controls can command in an external device; it therefore becomes necessary to assess controllability when choosing a control instrument. The second is the difference between the decision spaces of human and machine when both try to control an external device. To fill the gaps in these two aspects, an additional functional module is needed that can translate the commands issued by the human into high-frequency control commands that machines can understand. These two aspects have not been investigated thoroughly in the literature.

This study focuses on training, detecting, and using human intents to control intelligent machines. Brain signals, recorded in the form of electroencephalography (EEG), are used to extract and classify human intents. A selected instrument, the Emotiv Epoc X, is used for pattern training and recognition, chosen for its controllability and features among comparable instruments. A functional module is then developed to bridge the gap in frequency between human intents and the motion commands of the machine. A selected robot, the TinkerKit Braccio, is then used to illustrate the feasibility of the developed module by fully controlling the robotic arm using human intents alone.

Multiple experiments were performed on the prototyped system to prove the feasibility of the proposed model. The accuracy of sending each command, and hence the accuracy of the system in extracting each intent, exceeded 75%. The feasibility of the proposed model was also tested by controlling the robot to follow pre-defined paths specified through a Graphical User Interface (GUI). The accuracy of each experiment exceeded 90%, which validated the feasibility of the proposed control model.
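The functional module described above bridges the frequency gap between sparse, low-rate human intents and the high-rate command stream a machine controller expects. A minimal sketch of that idea follows: hold the latest classified intent and republish it at a fixed control rate. The rates, command names, and class design are assumptions, not the thesis implementation.

```python
# Sketch of a rate-bridging module: classified intents arrive at a low,
# irregular rate from the BCI, while the robot controller expects commands
# at a fixed high rate. The module holds the most recent intent and
# republishes it at the control frequency. Names and rates are assumptions.
import time
import threading

class IntentBridge:
    def __init__(self, control_hz: float = 50.0):
        self._period = 1.0 / control_hz
        self._intent = "neutral"   # latest classified human intent
        self._lock = threading.Lock()

    def on_intent(self, intent: str) -> None:
        """Called whenever the EEG classifier emits a new intent."""
        with self._lock:
            self._intent = intent

    def run(self, send_command, duration_s: float = 1.0) -> None:
        """Stream the held intent to the machine at the control rate."""
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            with self._lock:
                send_command(self._intent)
            time.sleep(self._period)

bridge = IntentBridge()
bridge.on_intent("lift")                  # low-rate BCI event
bridge.run(lambda cmd: print(cmd), 0.1)   # ~5 high-rate "lift" commands
```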
5. Brain-Computer Interface for Supervisory Controls of Unmanned Aerial Vehicles. Abdelrahman Osama Gad, 15 February 2024
This research explored a solution to the high accident rate in remotely operating Unmanned Aerial Vehicles (UAVs) in complex environments; it presented a new Brain-Computer Interface (BCI) enabled supervisory control system that fuses human and machine intelligence seamlessly. The study was motivated by the critical need to enhance the safety and reliability of UAV operations, where accidents often stem from human error during manual control. Existing BCIs confront the challenge of trading off fully remote control by humans against automated control by computers. This study met that challenge with the proposed supervisory control system, which optimizes human-machine collaboration, prioritizing safety, adaptability, and precision in operation.

The research work included designing, training, and testing the BCI and the BCI-enabled control system. The system was customized to control a UAV, with the user's motion intents and cognitive states monitored to implement hybrid human and machine controls. A DJI Tello drone was used as the intelligent machine to illustrate the application of the proposed control system and to evaluate its effectiveness through two case studies. The first case study was designed to train a subject and assess the confidence level of the BCI in capturing and classifying the subject's motion intents. The second case study illustrated the application of the BCI in controlling the drone to fulfil its missions.

The proposed supervisory control system was at the forefront of cognitive-state monitoring, leveraging the power of a machine learning model. The model was innovative compared to conventional methods in that it could capture complicated patterns within raw EEG data and make decisions, adopting an ensemble learning strategy with XGBoost. A key innovation was capturing the user's intents and interpreting them into control commands using the EmotivBCI app. Despite the headset's predefined set of detectable features, the system could train the user's mind to generate control commands for all six degrees of freedom of the quadcopter by creatively combining and extending mental commands, particularly in the context of yaw rotation. This strategic manipulation of commands showcased the system's flexibility in accommodating the intricate control requirements of an automated machine.

Another innovation of the proposed system was its real-time adaptability. The supervisory control system continuously monitors the user's cognitive state, allowing instantaneous adjustments in response to changing conditions. This ensures that the control system is responsive to the user's intent and, through the arbitrating mechanism, able to prioritize safety when necessary.
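The arbitrating mechanism described above can be pictured as a per-cycle gate: the decoded BCI command is passed through only when classifier confidence is high and the monitored cognitive state is acceptable; otherwise a safe hold command is substituted. The sketch below is a minimal version under those assumptions; the thresholds, signal names, and command set are illustrative, not the thesis implementation.

```python
# Sketch of a supervisory arbitration gate: pass the user's BCI command
# through only when classifier confidence is high and the monitored
# cognitive state is acceptable; otherwise substitute a safe hold command.
# Thresholds and signal names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class BciSample:
    command: str         # decoded motion intent, e.g. "yaw_left"
    confidence: float    # classifier confidence in [0, 1]
    stress_level: float  # monitored cognitive-state score in [0, 1]

def arbitrate(sample: BciSample,
              min_confidence: float = 0.75,
              max_stress: float = 0.6) -> str:
    """Return the command the drone should execute this control cycle."""
    if sample.confidence < min_confidence:
        return "hover"        # intent too uncertain: hold position
    if sample.stress_level > max_stress:
        return "hover"        # user overloaded: machine takes over
    return sample.command     # human remains in control

print(arbitrate(BciSample("yaw_left", 0.82, 0.3)))  # yaw_left
print(arbitrate(BciSample("yaw_left", 0.60, 0.3)))  # hover
```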