61

User Testing/Co-Design of Current PIVOT Features

Katragadda, Monica 04 October 2021 (has links)
No description available.
62

Knowledge representation and stochastic multi-agent plan recognition

Suzic, Robert January 2005 (has links)
To incorporate new technical advances into the military domain and make its processes more efficient in accuracy, time and cost, the concept of Network Centric Warfare has been introduced in the US military forces. In Sweden a similar concept has been studied under the name Network Based Defence (NBD). Here we present one of the methodologies, called tactical plan recognition, that is intended to support NBD in the future. Advances in sensor technology and modelling produce large sets of data for decision makers. To achieve decision superiority, decision makers have to act with agility on proper, adequate and relevant information (data aggregates). Information fusion is a process aimed at supporting decision makers’ situation awareness. It combines data and information from disparate sources with prior information or knowledge to obtain an improved state estimate about an agent or phenomenon. Plan recognition is the term given to the process of inferring an agent’s intentions from a set of actions, and it is intended to support decision making. The aim of this work has been to introduce a methodology in which prior (empirical) knowledge (e.g. behaviour, environment and organization) is represented and combined with sensor data to recognize the plans/behaviours of an agent or group of agents. We call this methodology multi-agent plan recognition. It includes knowledge representation as well as imprecise and statistical inference issues. Successful plan recognition in large-scale systems depends heavily on the data that is supplied. Therefore we introduce a bridge between plan recognition and sensor management, in which the results of plan recognition are reused to control, and give focus of attention to, the sensors that are supposed to acquire the most important/relevant information. Here we combine different theoretical methods (Bayesian Networks, Unified Modeling Language and Plan Recognition) and apply them to tactical military situations for ground forces. The results obtained from several proof-of-concept models show that it is possible to model and recognize the behaviour of tank units. / QC 20101222
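The record does not include implementation detail, but the core idea of the abstract — combining a prior over possible plans with observed actions to infer the most likely plan — can be illustrated with a minimal Bayesian sketch. All plan labels, actions, and probabilities below are hypothetical and not taken from the thesis, which builds richer Bayesian-network models rather than this naive independent-observation update.

    # Minimal, hypothetical sketch of probabilistic plan recognition: infer a unit's
    # plan from a sequence of observed actions via Bayes' rule. Numbers are illustrative.
    PRIOR = {"attack": 0.3, "defend": 0.5, "retreat": 0.2}

    # P(observed action | plan), assuming conditionally independent observations.
    LIKELIHOOD = {
        "attack":  {"advance": 0.60, "hold": 0.20, "withdraw": 0.10, "regroup": 0.10},
        "defend":  {"advance": 0.10, "hold": 0.60, "withdraw": 0.10, "regroup": 0.20},
        "retreat": {"advance": 0.05, "hold": 0.15, "withdraw": 0.60, "regroup": 0.20},
    }

    def recognize(observed_actions):
        """Return the posterior over plans after a sequence of observed actions."""
        posterior = dict(PRIOR)
        for action in observed_actions:
            for plan in posterior:
                posterior[plan] *= LIKELIHOOD[plan][action]
            total = sum(posterior.values())
            posterior = {plan: p / total for plan, p in posterior.items()}
        return posterior

    print(recognize(["advance", "advance", "hold"]))

In the bridge to sensor management described above, such a posterior could then be used to direct sensors toward observations that best discriminate between the remaining plausible plans.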
63

Where do people direct their attention while cycling? A comparison of adults and children

Melin, M. C., Peltomaa, E., Schildt, L., Lehtonen, E. 18 November 2020 (has links)
Cycling in urban environments requires the ability to distinguish between relevant and irrelevant targets quickly and reliably, so that potential hazards can be anticipated and avoided. In two experiments, we investigated where adults and children direct their attention when viewing videos filmed from a cyclist’s perspective. We wanted to see if there were any differences in the responses given by experienced adult cyclists, inexperienced adult cyclists, and child cyclists. In Experiment 1, 16 adults (19–33 years) were asked to watch ten videos and to point out things they would pay attention to by tapping a touchscreen (pointed-out locations). Afterwards, they were asked to explain their answers. In Experiment 2, 17 adults (19–34 years) and 17 children (11–12 years) performed the same task with the same ten videos, but they were not asked to explain their answers afterwards. The data sets from these two experiments were pooled, creating three groups: ten experienced adult cyclists, 23 inexperienced adult cyclists and 17 children. A total of 23 clearly visible, traffic-relevant targets (pre-specified targets) had previously been identified in the videos. We investigated whether the participants’ pointed-out locations matched these targets (and if so, how fast they responded in pointing them out). We also investigated the number and vertical/horizontal dispersion of these pointed-out locations on the touchscreen. Adults pointed out more locations than children, especially pedestrians and cyclists. This result suggests that, while children focussed as well as adults on cars (arguably the most salient hazard), they were less able to identify other hazards (such as pedestrians or other cyclists). The children also had a larger vertical dispersion and a larger between-participant variation than the adults. Adults were faster at tapping the pre-specified targets and they missed them less often. Overall, the results suggest that 11–12-year-old cyclists have worse situation awareness in traffic than adults.
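A rough sketch of the kind of touchscreen analysis described in this abstract — checking whether taps match pre-specified targets and computing the dispersion of the pointed-out locations — is given below. The coordinates and the hit radius are hypothetical assumptions, not values from the study.

    # Hypothetical sketch of the touchscreen analysis: match pointed-out locations
    # to pre-specified targets and compute the dispersion of the taps.
    import math
    import statistics

    HIT_RADIUS = 80  # pixels; an assumed tolerance, not taken from the paper

    def matches_target(tap, target):
        """True if a tap (x, y) falls within HIT_RADIUS of a target location."""
        return math.dist(tap, target) <= HIT_RADIUS

    def hit_rate(taps, targets):
        """Fraction of pre-specified targets matched by at least one tap."""
        hits = sum(any(matches_target(t, target) for t in taps) for target in targets)
        return hits / len(targets)

    def dispersion(taps):
        """Standard deviation of tap coordinates, horizontally and vertically."""
        xs, ys = zip(*taps)
        return statistics.stdev(xs), statistics.stdev(ys)

    taps = [(412, 300), (890, 310), (640, 520)]       # pointed-out locations (illustrative)
    targets = [(400, 295), (900, 320), (200, 600)]    # pre-specified targets (illustrative)
    print(hit_rate(taps, targets), dispersion(taps))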
64

Exploratory Team Cognition and Resilience in Human Agent Teaming

January 2019 (has links)
Human-agent teams (HATs) are expected to play a larger role in future command and control systems where resilience is critical for team effectiveness. The question of how HATs interact to be effective in both normal and unexpected situations is worthy of further examination. Exploratory behaviors are one way that adaptive systems discover opportunities to expand and refine their performance. In this study, team interaction exploration is examined in a HAT composed of a human navigator, human photographer, and a synthetic pilot while they perform a remotely-piloted aerial reconnaissance task. Failures in automation and the synthetic pilot’s autonomy were injected throughout ten missions as roadblocks. Teams were clustered by performance into high-, middle-, and low-performing groups. It was hypothesized that high-performing teams would exchange more text messages containing unique content or sender-recipient combinations than middle- and low-performing teams, and that teams would exchange fewer unique messages over time. The results indicate that high-performing teams had more unique team interactions than middle-performing teams. Additionally, teams generally had more exploratory team interactions in the first session of missions than in the second session. Implications and suggestions for future work are discussed. / Dissertation/Thesis / Masters Thesis Human Systems Engineering 2019
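The interaction-exploration measure described above (messages with previously unseen content or sender-recipient combinations) could be computed along the following lines; the message fields and example log are assumptions for illustration, not the study's data format.

    # Hypothetical sketch of counting exploratory (previously unseen) team interactions
    # in a message log. Field names are illustrative, not from the study's data.
    def count_unique_interactions(messages):
        """Count messages whose content or sender->recipient pair has not occurred before."""
        seen_content = set()
        seen_pairs = set()
        unique = 0
        for msg in messages:
            pair = (msg["sender"], msg["recipient"])
            text = msg["text"].strip().lower()
            if text not in seen_content or pair not in seen_pairs:
                unique += 1
            seen_content.add(text)
            seen_pairs.add(pair)
        return unique

    log = [
        {"sender": "navigator", "recipient": "pilot", "text": "new target coordinates"},
        {"sender": "pilot", "recipient": "photographer", "text": "camera ready?"},
        {"sender": "navigator", "recipient": "pilot", "text": "new target coordinates"},
    ]
    print(count_unique_interactions(log))  # first two messages are unique, the third is not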
65

Probabilistic Clustering Ensemble Evaluation for Intrusion Detection

McElwee, Steven M. 01 January 2018 (has links)
Intrusion detection is the practice of examining information from computers and networks to identify cyberattacks. It is an important topic in practice, since the frequency and consequences of cyberattacks continue to increase and to affect organizations. It is also an important research topic, since many open problems remain for intrusion detection systems. Intrusion detection systems monitor large volumes of data and frequently generate false positives. This results in additional effort for security analysts to review and interpret alerts. After long hours spent reviewing alerts, security analysts become fatigued and make bad decisions. There is currently no approach to intrusion detection that reduces the workload of human analysts by providing a probabilistic prediction that a computer is experiencing a cyberattack. This research addressed this problem by estimating the probability that a computer system was being attacked, rather than alerting on individual events. This research combined concepts from cyber situation awareness by applying clustering ensembles, probability analysis, and active learning. The unique contribution of this research is that it provides a higher level of meaning for intrusion alerts than traditional approaches. Three experiments were conducted in the course of this research to demonstrate the feasibility of these concepts. The first experiment evaluated cluster generation approaches that provided multiple perspectives of network events using unsupervised machine learning. The second experiment developed and evaluated a method for detecting anomalies from the clustering results. This experiment also determined the probability that a computer system was being attacked. Finally, the third experiment integrated active learning into the anomaly detection results and evaluated its effectiveness in improving accuracy. This research demonstrated that clustering ensembles with probabilistic analysis were effective for identifying normal events. Abnormal events remained uncertain and were assigned a belief. By aggregating the belief to find the probability that a computer system was under attack, the resulting probability was highly accurate for the source IP addresses and reasonably accurate for the destination IP addresses. Active learning, which simulated feedback from a human analyst, eliminated the residual error for the destination IP addresses with a low number of events that required labeling.
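As a simplified illustration of the clustering-ensemble idea — several clusterings of the same network events, with events in small clusters treated as anomalous and the resulting beliefs aggregated per source IP — consider the sketch below. The features, cluster counts, and small-cluster threshold are assumptions for illustration and do not reproduce the dissertation's method.

    # Hypothetical sketch of a clustering ensemble for network events: run several
    # clusterings, treat membership in very small clusters as evidence of anomaly,
    # and aggregate the resulting belief per source IP. Thresholds are illustrative.
    import numpy as np
    from collections import defaultdict
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    features = rng.normal(size=(200, 4))        # stand-in for per-event network features
    src_ips = [f"10.0.0.{i % 20}" for i in range(200)]

    SMALL_CLUSTER = 10                          # clusters smaller than this look anomalous

    def anomaly_votes(X, n_clusters, seed):
        """Return a 0/1 vote per event: 1 if the event sits in a small cluster."""
        labels = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit_predict(X)
        sizes = np.bincount(labels)
        return (sizes[labels] < SMALL_CLUSTER).astype(float)

    # Ensemble: vary cluster count and seed to get multiple perspectives on the events.
    votes = np.mean([anomaly_votes(features, k, s) for s, k in enumerate([5, 8, 12])], axis=0)

    # Aggregate event-level beliefs into a per-source-IP probability of attack.
    belief = defaultdict(list)
    for ip, v in zip(src_ips, votes):
        belief[ip].append(v)
    per_ip = {ip: float(np.mean(vs)) for ip, vs in belief.items()}
    print(sorted(per_ip.items(), key=lambda kv: kv[1], reverse=True)[:3])

Analyst feedback (the active-learning step in the third experiment) could then be simulated by relabeling the highest-belief addresses and folding those labels back into the scoring.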
66

Mediating ICU patient situation-awareness with visual and tactile notifications

Srinivas, Preethi 29 March 2016 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Healthcare providers in hospital intensive care units (ICUs) maintain patient situation awareness by following task management and communication practices. They create and manipulate several paper-based and digital information sources, with the overall aim of constantly informing themselves and their colleagues of dynamically evolving patient conditions. However, when increased communication means that healthcare providers potentially interrupt each other, enhanced patient situation awareness comes at a price. Prior research discusses both the use of technology to support increased communication and its unintended consequence of (wanted and unwanted) notification interruptions. Using qualitative research techniques, I investigated work practices that enhance the patient situation awareness of physicians, fellows, residents, nurses, students, and pharmacists in a medical ICU. I used the Locales Framework to understand the observed task management and communication work practices. In this study, paper notes were observed to act as transitional artifacts that are later digitized to organize and coordinate tasks, goals, and patient-centric information at a team and organizational level. Non-digital information is often not immediately digitized, and only select information is communicated between certain ICU team members through synchronous mechanisms such as face-to-face or telephone conversations. Thus, although ICU providers are exceptionally skilled at working together to improve a critically ill patient’s condition, the use of paper-based artifacts and synchronous communication mechanisms induces several interruptions while contextually situating a clinical team for patient care. In this dissertation, I also designed and evaluated a mobile health technology tool, known as PANI (Patient-centered Notes and Information Manager), guided by the Locales Framework and the participatory involvement of ICU healthcare providers as co-designers. PANI-supported task management induces minimal interruptions by: (1) rapidly generating, managing, and sharing clinical notes and action-items among clinicians and (2) supporting the collaboration and communication needs of clinicians through a novel visual and tactile notification system. The long-term contribution of this research suggests guidelines for designing mobile health technology interventions that enhance ICU patient situation awareness and reduce unwanted interruptions to clinical workflow.
67

Human Robot Interaction Concepts for Human Supervisory Control and Telemaintenance Applications in an Industry 4.0 Environment / Mensch-Roboter-Interaktionskonzepte für Fernsteuerungs- und Fernwartungsanwendungen in einer Industrie 4.0 Umgebung

Aschenbrenner, Doris January 2017 (has links) (PDF)
While the teleoperation of highly sophisticated technical systems has long been a broad field of research, especially for space and robotics applications, the automation industry has not yet benefited from its results. Beyond the established fields of application, production lines with industrial robots and the surrounding plant components also need to be remotely accessible. This is especially critical for maintenance, or if an unexpected problem cannot be solved by the local specialists. Special machine manufacturers, especially robotics companies, sell their technology worldwide. Some factories, for example in emerging economies, lack qualified personnel for repair and maintenance tasks. When a severe failure occurs, an expert from the manufacturer needs to fly there, which leads to long downtimes of the machine or even the whole production line. With the development of data networks, a large share of this travel could be avoided if appropriate teleoperation equipment were provided. This thesis describes the development of a telemaintenance system, which was established in an active production line for research purposes. The customer production site of Braun in Marktheidenfeld, a factory which belongs to Procter & Gamble, consists of a six-axis Cartesian industrial robot by KUKA Industries, a two-component injection molding system and an assembly unit. The plant produces plastic parts for electric toothbrushes. In the research projects "MainTelRob" and "Bayern.digital", during which this plant was utilised, the Zentrum für Telematik e.V. (ZfT) and its project partners develop novel technical approaches and procedures for modern telemaintenance. The term "telemaintenance" here refers to the integration of computer science and communication technologies into the maintenance strategy. It is particularly interesting for high-grade, capital-intensive goods like industrial robots. Typical telemaintenance tasks are, for example, the analysis of a robot failure or difficult repair operations. The service department of KUKA Industries is responsible for the customers distributed worldwide who own more than one robot. Currently such tasks are offered via phone support and service staff who travel abroad. They want to expand their service activities to telemaintenance but struggle with the high demands of teleoperation, especially regarding security infrastructure. In addition, the facility in Marktheidenfeld has to keep up with the high international standards of Procter & Gamble and wants to minimize machine downtimes. Like 71.6 % of all German companies, P&G sees huge potential in early information about its production system, but complains about the insufficient quality and timeliness of the data. The main research focus of this work lies on the human-machine interface for all human tasks in a telemaintenance setup. This thesis presents original work on the use of a mobile device in the context of maintenance, describes new tools for asynchronous remote analysis, and puts all parts together in an integrated telemaintenance infrastructure. With the help of Augmented Reality, user performance and satisfaction could be raised. Special attention is paid to the situation awareness of the remote expert, realized by different camera viewpoints.
In detail the work consists of:
- Support of maintenance tasks with a mobile device
- Development and evaluation of a context-aware inspection tool
- Comparison of a new touch-based mobile robot programming device to the former teach pendant
- Study on Augmented Reality support for repair tasks with a mobile device
- Condition monitoring for a specific plant with industrial robot
- Human computer interaction for remote analysis of a single plant cycle
- A big data analysis tool for a multitude of cycles and similar plants
- 3D process visualization for a specific plant cycle with additional virtual information
- Network architecture in hardware, software and network infrastructure
- Mobile device computer supported collaborative work for telemaintenance
- Motor exchange telemaintenance example in running production environment
- Augmented reality supported remote plant visualization for better situation awareness
/ The teleoperation of highly sophisticated technical systems has been a broad field of research for many years, above all in the area of space and robotics applications. So far, however, the automation industry has benefited too little from the results of this research field. Production lines with industrial robots and further plant components also have to be accessible remotely, especially in maintenance cases or when unforeseen problems cannot be solved by the local specialists. Manufacturers of special machinery, such as robotics companies, sell their technology worldwide. Customers of these companies own factories, for example in emerging economies, where qualified personnel for repair and maintenance are lacking. When a serious failure occurs, an expert from the machine manufacturer therefore has to fly to the customer, which leads to long machine downtimes. Through the further development of data networks, a large part of this travel could be avoided if a suitable telemaintenance infrastructure were in place. This thesis describes the development of a telemaintenance system which was set up in an active production environment for research purposes. The customer's production plant was provided by Procter & Gamble in Marktheidenfeld and consists of a six-axis Cartesian industrial robot by KUKA Industries, a two-component injection molding system and an assembly unit. The plant produces plastic parts for electric toothbrushes. This plant was used within the research projects "MainTelRob" and "Bayern.digital", in which the Zentrum für Telematik e.V. (ZfT) and its project partners develop new approaches and procedures for modern telemaintenance technologies. For us, telemaintenance means the comprehensive integration of computer science and communication technologies into the maintenance strategy. This is particularly interesting for highly developed, capital-intensive goods such as industrial robots. Typical telemaintenance tasks are, for example, the analysis of robot error messages or difficult repair measures. The service department of KUKA Industries is responsible for customers distributed worldwide, some of whom own more than one robot. Currently such tasks are handled by telephone support or by mobile service staff who travel to the customer. If these complicated tasks are to be replaced by telemaintenance in order to expand the service activities, the high demands of teleoperation have to be met, especially with regard to security infrastructure.
Such a comprehensive approach to telemaintenance also offers local added value for the customer: the factory in Marktheidenfeld has to follow the high international standards of Procter & Gamble and therefore wants to reduce downtimes further. Like 71.6 percent of all German companies, P&G Marktheidenfeld sees great potential in early information from its production system, but currently still has problems with the timeliness and quality of these data. The main focus of the research presented here lies on the human-machine interface for all tasks of a comprehensive telemaintenance context. This thesis presents the author's own work on the use of mobile devices in the context of maintenance and new software tools for asynchronous remote analysis, and integrates these aspects into a telemaintenance infrastructure. In this context it can be shown that the use of Augmented Reality can increase user performance and, at the same time, satisfaction. Particular emphasis is placed on the so-called situation awareness of the remote expert.
In detail, the work consists of:
- Support of maintenance tasks with mobile devices
- Development and evaluation of context-aware inspection software
- Comparison of touch-based robot programming with the previous version of the teach pendant
- Studies on the support of repair tasks through Augmented Reality
- Condition monitoring for a specific plant with an industrial robot
- Human-machine interaction for the remote analysis of a production cycle
- Graphical big data analysis of a multitude of production cycles
- 3D process visualization and enrichment with virtual information
- Hardware, software and network architecture for telemaintenance
- Computer-supported collaborative work using mobile devices for telemaintenance
- Telemaintenance example: carrying out a motor exchange in running production
- Augmented Reality supported visualization of the plant context to increase situation awareness
68

An Evidence-Based Strategy for the Use of Simulation to Assess Situation Awareness in Applicants to Nurse Anesthesia Programs

Lee, Angela January 2024 (has links)
No description available.
69

Supporting Flight Control for UAV-Assisted Wilderness Search and Rescue Through Human Centered Interface Design

Cooper, Joseph L. 15 November 2007 (has links) (PDF)
Inexpensive, rapidly deployable, camera-equipped Unmanned Aerial Vehicle (UAV) systems can potentially assist with a huge number of tasks. However, in many cases such as wilderness search and rescue (WiSAR), the potential users of the system may not be trained as pilots. Simple interface concepts can be used to build an interaction layer that allows an individual with minimal operator training to use the system to facilitate a search or inspection task. We describe an analysis of WiSAR as currently accomplished and show how a UAV system might fit into the existing structure. We then discuss preliminary system design efforts for making UAV-enabled search possible and practical. Finally, we present both a carefully controlled experiment and partially structured field trials that illustrate principles for making UAV-assisted search a reality. Our experiments show that the traditional method for controlling a camera-enabled UAV is significantly more difficult than integrated methods. Success and troubles during field trials illustrate several desiderata and information needs for a UAV search system.
70

The Effects Of Diagnostic Aiding On Situation Awareness Under Robot Unreliability

Schuster, David 01 January 2013 (has links)
In highly autonomous robotic systems, human operators are able to attend to their own, separate tasks, but robots still need occasional human intervention. In this scenario, it may be difficult for human operators to determine the status of the system and environment when called upon to aid the robot. The resulting lack of situation awareness (SA) is a problem common to other automated systems, and it can lead to poor performance and compromised safety. Existing research on this problem suggested that reliable automation of information processing, called diagnostic aiding, leads to better operator SA. The effects of unreliable diagnostic aiding, however, were not well understood. These effects are likely to depend on the ability of the operator to perform the task unaided. That is, under conditions in which the operator can reconcile their own sensing with that of the robot, the influence of unreliable diagnostic aiding may be more pronounced. When the robot is the only source of information for a task, these effects may be weaker or may reverse direction. The purpose of the current experiment was to determine if SA is differentially affected by unreliability at different levels of unaided human performance and at different stages of diagnostic aiding. This was accomplished by experimentally manipulating the stage of diagnostic aiding, robot reliability, and the ability of the operator to build SA unaided. Results indicated that while reliable diagnostic aiding is generally useful, unreliable diagnostic aiding has effects that depend on the amount of information available to operators in the environment. This research improves understanding of how robots can support operator SA and can guide the development of future robots so that humans are most likely to use them effectively.
