161

Stressors, Quality of Life, and Psychosocial Outcomes: Managing Communication Uncertainty for Caregivers of Patients with End Stage Renal Disease

SHERWANI, SHARIQ I. 10 September 2021 (has links)
No description available.
162

Dealing with uncertainty

Clausen Mork, Jonas January 2012 (has links)
Uncertainty is, it seems, more or less constantly present in our lives. Even so, grasping the concept philosophically is far from trivial. In this doctoral thesis, uncertainty and its conceptual companion information are studied. Axiomatic analyses are provided and numerical measures suggested. In addition to these basic conceptual analyses, the widespread practice of so-called safety factor use in societal regulation is analyzed along with the interplay between science and policy in European regulation of chemicals and construction. / QC 20120202
163

A multi-fidelity analysis selection method using a constrained discrete optimization formulation

Stults, Ian Collier 17 August 2009 (has links)
The purpose of this research is to develop a method for selecting the fidelity of contributing analyses in computer simulations. Model uncertainty is a significant component of result validity, yet it is neglected in most conceptual design studies; when it is considered, it is treated only in a limited fashion, which calls into question the validity of selections based on these results. Neglecting model uncertainty can cause costly redesigns of concepts later in the design process or even program cancellation. If, instead, the model uncertainty of the tools being used were not only recognized but also used to select the tools for a contributing analysis, studies could be conducted more efficiently and trust in the results could be quantified. Existing methods for doing this are generally neither rigorous nor traceable, and in many cases the improvement from additional time spent on enhanced calculations is washed out by less accurate calculations performed downstream. The intent of this research is to resolve this issue by providing a method that minimizes the time spent conducting computer simulations while meeting accuracy and concept resolution requirements for the results.
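To make the selection idea concrete, the following is a minimal sketch (not taken from the thesis) of fidelity selection posed as a constrained discrete optimization: each contributing analysis offers a few candidate fidelity levels with an assumed model uncertainty and run time, and a brute-force search picks the combination that minimizes the combined uncertainty within a run-time budget. The analysis names, uncertainty values, times, and the root-sum-square combination rule are all illustrative assumptions.

```python
from itertools import product

# Hypothetical catalogue: for each contributing analysis, candidate fidelity
# levels as (model uncertainty, run time in hours). Values are illustrative.
candidates = {
    "aerodynamics": [(0.30, 0.1), (0.12, 1.0), (0.05, 8.0)],
    "structures":   [(0.25, 0.2), (0.10, 2.0)],
    "propulsion":   [(0.20, 0.1), (0.08, 1.5)],
}

def select_fidelities(candidates, time_budget):
    """Brute-force search over discrete fidelity choices: minimize the combined
    model uncertainty (root-sum-square) subject to a total run-time budget."""
    best, best_u = None, float("inf")
    names = list(candidates)
    for choice in product(*(candidates[n] for n in names)):
        total_time = sum(t for _, t in choice)
        if total_time > time_budget:
            continue  # infeasible combination
        u = sum(u_i ** 2 for u_i, _ in choice) ** 0.5
        if u < best_u:
            best, best_u = dict(zip(names, choice)), u
    return best, best_u

print(select_fidelities(candidates, time_budget=4.0))
```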
164

An empirical approach to modeling uncertainty in intrusion analysis

Sakthivelmurugan, Sakthiyuvaraja January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / Xinming (Simon) Ou / A well-known problem in current intrusion detection tools is that they create too many low-level alerts, and system administrators find it hard to cope with the huge volume. Also, when they have to combine multiple sources of information to confirm an attack, the complexity increases dramatically. Attackers use sophisticated techniques to evade detection, and current system monitoring tools can only observe the symptoms or effects of malicious activities. When mingled with similar effects from normal or non-malicious behavior, these symptoms lead intrusion analysis to conclusions of varying confidence and high false positive/negative rates. In this thesis we present an empirical approach to the problem of modeling uncertainty, in which inferred security implications of low-level observations are captured in a simple logical language augmented with uncertainty tags. We have designed an automated reasoning process that enables us to combine multiple sources of system monitoring data and extract highly confident attack traces from the numerous possible interpretations of low-level observations. We have developed our model empirically: the starting point was a true intrusion that happened on a campus network we studied to capture the essence of the human reasoning process that led to conclusions about the attack. We then used a Datalog-like language to encode the model and a Prolog system to carry out the reasoning process. Our model and reasoning system reached the same conclusions as the human administrator on the question of which machines were certainly compromised. We then automatically generated the reasoning model needed for handling Snort alerts from the natural-language descriptions in the Snort rule repository, and developed a Snort add-on to analyze Snort alerts. Keeping the reasoning model unchanged, we applied our reasoning system to two third-party data sets and one production network. Our results showed that the reasoning model is effective on these data sets as well. We believe such an empirical approach has the potential of codifying the seemingly ad hoc human reasoning about uncertain events, and can yield useful tools for automated intrusion analysis.
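The thesis itself encodes its model in a Datalog-like language and runs it in Prolog; the sketch below is only a loose Python illustration of the underlying idea of uncertainty tags, not the thesis's model: observations map to internal conditions with a confidence level, and independent low-confidence observations about the same condition are strengthened to high confidence. The rule names, host address, and the specific strengthening rule are hypothetical.

```python
# Observations carry an uncertainty tag; rules map observations to internal
# conditions, and two independent low-confidence assertions about the same
# condition are strengthened to high confidence.
LEVELS = {"l": 1, "m": 2, "h": 3}  # low / medium / high confidence

# Hypothetical rule base: observation name -> (condition about a host, confidence)
RULES = {
    "snort_irc_alert":     ("compromised", "l"),
    "netflow_scan_burst":  ("compromised", "l"),
    "av_trojan_detection": ("compromised", "h"),
}

def infer(observations):
    """observations: list of (host, observation_name).
    Returns {host: {condition: confidence}} after applying the rules and the
    simple strengthening step (two independent 'l' facts -> 'h')."""
    facts = {}
    for host, obs in observations:
        if obs not in RULES:
            continue
        cond, conf = RULES[obs]
        facts.setdefault(host, {}).setdefault(cond, []).append((obs, conf))
    names = {v: k for k, v in LEVELS.items()}
    result = {}
    for host, conds in facts.items():
        for cond, support in conds.items():
            best = max(LEVELS[c] for _, c in support)
            distinct_low = {o for o, c in support if c == "l"}
            if len(distinct_low) >= 2:  # independent corroboration
                best = max(best, LEVELS["h"])
            result.setdefault(host, {})[cond] = names[best]
    return result

print(infer([("172.16.9.20", "snort_irc_alert"),
             ("172.16.9.20", "netflow_scan_burst")]))
```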
165

Subdimensional Expansion: A Framework for Computationally Tractable Multirobot Path Planning

Wagner, Glenn 01 December 2015 (has links)
Planning optimal paths for large numbers of robots is computationally expensive. In this thesis, we present a new framework for multirobot path planning called subdimensional expansion, which initially plans for each robot individually, and then coordinates motion among the robots as needed. More specifically, subdimensional expansion initially creates a one-dimensional search space embedded in the joint configuration space of the multirobot system. When the search space is found to be blocked during planning by a robot-robot collision, the dimensionality of the search space is locally increased to ensure that an alternative path can be found. As a result, robots are only coordinated when necessary, which reduces the computational cost of finding a path. Subdimensional expansion is a flexible framework that can be used with multiple planning algorithms. For discrete planning problems, subdimensional expansion can be combined with A* to produce the M* algorithm, a complete and optimal multirobot path planning algorithm. When the configuration space of individual robots is too large to be explored effectively with A*, subdimensional expansion can be combined with probabilistic planning algorithms to produce sRRT and sPRM. M* is then extended to solve variants of the multirobot path planning problem. We present the Constraint Manifold Subsearch (CMS) algorithm to solve problems where robots must dynamically form and dissolve teams with other robots to perform cooperative tasks. Uncertainty M* (UM*) is a variant of M* that handles systems with probabilistic dynamics. Finally, we apply M* to multirobot sequential composition. Results are validated with extensive simulations and experiments on multiple physical robots.
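The following is a minimal sketch of the decoupled-first idea behind subdimensional expansion, not an implementation of M* itself: robots are first planned individually, and a conflict check over the individually optimal timed paths identifies which robots would need a locally coupled (higher-dimensional) joint search. The grid paths and robot names are illustrative.

```python
from itertools import combinations

def collide(path_a, path_b):
    """True if two timed grid paths meet in the same cell or swap cells."""
    T = max(len(path_a), len(path_b))
    pad = lambda p, t: p[t] if t < len(p) else p[-1]  # robots wait at their goals
    for t in range(T):
        if pad(path_a, t) == pad(path_b, t):
            return True  # vertex collision
        if t > 0 and pad(path_a, t) == pad(path_b, t - 1) \
                 and pad(path_b, t) == pad(path_a, t - 1):
            return True  # swap (edge) collision
    return False

def coupling_sets(individual_paths):
    """individual_paths: {robot: [cell, cell, ...]} planned independently.
    Returns the pairs of robots whose individually optimal paths conflict and
    therefore require a locally coupled, higher-dimensional search."""
    return [(a, b) for a, b in combinations(individual_paths, 2)
            if collide(individual_paths[a], individual_paths[b])]

paths = {
    "r1": [(0, 0), (0, 1), (0, 2)],
    "r2": [(1, 2), (0, 2), (0, 1)],  # crosses r1's path
    "r3": [(3, 3), (3, 2), (3, 1)],
}
print(coupling_sets(paths))  # only r1 and r2 need joint planning
```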
166

A critical investigation into how independent and incubate entrepreneurs perceive their role and performance success

McGowan, Carmel Teresa January 2012 (has links)
No description available.
167

MULTIVARIATE SYSTEMS ANALYSIS

Wolting, Duane 10 1900 (has links)
International Telemetering Conference Proceedings / October 28-31, 1985 / Riviera Hotel, Las Vegas, Nevada / In many engineering applications, a systems analysis is performed to study the effects of random error propagation throughout a system. Often these errors are not independent, and have joint behavior characterized by arbitrary covariance structure. The multivariate nature of such problems is compounded in complex systems, where overall system performance is described by a q-dimensional random vector. To address this problem, a computer program was developed which generates Taylor series approximations for multivariate system performance in the presence of random component variability. A summary of an application of this approach is given in which an analysis was performed to assess simultaneous design margins and to ensure optimal component selection.
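As an illustration of the kind of computation described, first-order (Taylor series) propagation of an input covariance matrix through a system function can be sketched as below; the example function and covariance values are made up, and the paper's program is not reproduced here.

```python
import numpy as np

def propagate_covariance(f, x0, cov_x, eps=1e-6):
    """First-order (Taylor series) propagation of input covariance through f.
    f maps an n-vector to a q-vector; returns f(x0) and the approximate
    covariance of f(x), J @ cov_x @ J.T, with J estimated numerically."""
    x0 = np.asarray(x0, float)
    y0 = np.asarray(f(x0), float)
    J = np.empty((y0.size, x0.size))
    for j in range(x0.size):  # numerical Jacobian via central differences
        dx = np.zeros_like(x0)
        dx[j] = eps
        J[:, j] = (np.asarray(f(x0 + dx)) - np.asarray(f(x0 - dx))) / (2 * eps)
    return y0, J @ np.asarray(cov_x, float) @ J.T

# Example: power P = V^2 / R with correlated errors in V and R (values made up)
f = lambda x: np.array([x[0] ** 2 / x[1]])
cov = np.array([[0.01, 0.002],
                [0.002, 0.04]])
print(propagate_covariance(f, [12.0, 3.0], cov))
```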
168

Reducing Uncertainty in Production System Design through Discrete Event Simulation : A case study at Volvo Construction Equipment

Etxagibel Larrañaga, Asier, Loschkin, Julia January 2016 (has links)
In a market environment that is subject to continuous changes, companies need to adapt their production systems in order to maintain the competitive edge. Current literature shows that with a successful production system design, higher levels of output, efficiency and quality can be achieved. However, designing a production system is done infrequently and therefore tends to lack experience. As a result, design decisions have to be made under uncertainty due to a lack of information, structure and knowledge. In fact, the success of a design process is directly linked to the level of uncertainty. The purpose of this thesis is to reduce uncertainty in production system design through Discrete Event Simulation before an assembly system is implemented. Therefore, a theoretical study was carried out defining types and sources of uncertainty in production system design. Parallel to the theoretical study, a case study at Volvo Construction Equipment Operations Hallsberg was conducted, in which Discrete Event Simulation was tested as a tool to reduce uncertainty in production system design. The analysis illustrates that the observed sources of uncertainty in production system design cover a process, organizational, corporate, market and cultural context. The relevant uncertainty types identified in the case study at Volvo Construction Equipment Operations Hallsberg were environmental, system, technical, structural, temporal, lack of knowledge and lack of information. The information provided by the Discrete Event Simulation in order to reduce uncertainty is in the form of KPIs, process structure and visualization. The provided information had a positive impact on the degree of technical uncertainty, the lack of knowledge and the lack of information. As a result, the level of uncertainty in the future line design process at Volvo Construction Equipment Operations Hallsberg was reduced.
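As a rough illustration of how Discrete Event Simulation produces KPIs such as throughput and waiting time before a line physically exists, here is a minimal single-station event-queue sketch; the arrival and service rates are arbitrary assumptions and the model is far simpler than a real assembly-line simulation.

```python
import heapq
import random

def simulate_station(arrival_rate, service_rate, horizon, seed=0):
    """Minimal event-driven simulation of one station with exponential
    inter-arrival and service times. Returns (throughput, mean queue wait)."""
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]  # (time, kind) heap
    waiting = []            # arrival times of parts queued for the machine
    server_busy = False
    completed, total_wait = 0, 0.0
    while events:
        now, kind = heapq.heappop(events)
        if now > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (now + rng.expovariate(arrival_rate), "arrival"))
            if server_busy:
                waiting.append(now)
            else:
                server_busy = True
                heapq.heappush(events, (now + rng.expovariate(service_rate), "departure"))
        else:  # departure: record the finished part, pull the next one if any
            completed += 1
            if waiting:
                total_wait += now - waiting.pop(0)
                heapq.heappush(events, (now + rng.expovariate(service_rate), "departure"))
            else:
                server_busy = False
    return completed / horizon, (total_wait / completed if completed else 0.0)

print(simulate_station(arrival_rate=0.8, service_rate=1.0, horizon=10_000))
```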
169

Closed-loop prosthetic hand : understanding sensorimotor and multisensory integration under uncertainty

Saunders, Ian January 2012 (has links)
To make sense of our unpredictable world, humans use sensory information streaming through billions of peripheral neurons. Uncertainty and ambiguity plague each sensory stream, yet remarkably our perception of the world is seamless, robust and often optimal in the sense of minimising perceptual variability. Moreover, humans have a remarkable capacity for dexterous manipulation. Initiation of precise motor actions under uncertainty requires awareness of not only the statistics of our environment but also the reliability of our sensory and motor apparatus. What happens when our sensory and motor systems are disrupted? Upper-limb amputees fitted with state-of-the-art prostheses must learn to both control and make sense of their robotic replacement limb. Tactile feedback is not a standard feature of these open-loop limbs, fundamentally limiting the degree of rehabilitation. This thesis introduces a modular closed-loop upper-limb prosthesis, a modified Touch Bionics ilimb hand with a custom-built linear vibrotactile feedback array. To understand the utility of the feedback system in the presence of multisensory and sensorimotor influences, three fundamental open questions were addressed: (i) What are the mechanisms by which subjects compute sensory uncertainty? (ii) Do subjects integrate an artificial modality with visual feedback as a function of sensory uncertainty? (iii) What are the influences of open-loop and closed-loop uncertainty on prosthesis control? To optimally handle uncertainty in the environment, people must acquire estimates of the mean and uncertainty of sensory cues over time. A novel visual tracking experiment was developed in order to explore the processes by which people acquire these statistical estimators. Subjects were required to simultaneously report their evolving estimate of the mean and uncertainty of visual stimuli over time. This revealed that subjects could accumulate noisy evidence over the course of a trial to form an optimal continuous estimate of the mean, hindered only by natural kinematic constraints. Although subjects had explicit access to a measure of their continuous objective uncertainty, acquired from sensory information available within a trial, this was limited by a conservative margin for error. In the Bayesian framework, sensory evidence (from multiple sensory cues) and prior beliefs (knowledge of the statistics of sensory cues) are combined to form a posterior estimate of the state of the world. Multiple studies have revealed that humans behave as optimal Bayesian observers when making binary decisions in forced-choice tasks. In this thesis these results were extended to a continuous spatial localisation task. Subjects could rapidly accumulate evidence presented via vibrotactile feedback (an artificial modality), and integrate it with visual feedback. The weight attributed to each sensory modality was chosen so as to minimise the overall objective uncertainty. Since subjects were able to combine multiple sources of sensory information with respect to their sensory uncertainties, it was hypothesised that vibrotactile feedback would benefit prosthesis wearers in the presence of either sensory or motor uncertainty. The closed-loop prosthesis served as a novel manipulandum to examine the role of feed-forward and feed-back mechanisms for prosthesis control, known to be required for successful object manipulation in healthy humans.
Subjects formed economical grasps in idealised (noise-free) conditions and this was maintained even when visual, tactile and both sources of feedback were removed. However, when uncertainty was introduced into the hand controller, performance degraded significantly in the absence of visual or tactile feedback. These results reveal the complementary nature of feed-forward and feed-back processes in simulated prosthesis wearers, and highlight the importance of tactile feedback for control of a prosthesis.
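The inverse-variance weighting that underlies the optimal cue-combination result mentioned above can be sketched in a few lines; this is the standard rule for fusing independent Gaussian cues, not code from the thesis, and the example cue values are hypothetical.

```python
import numpy as np

def fuse_cues(means, sigmas):
    """Inverse-variance weighted fusion of independent Gaussian cues.
    Returns the fused estimate and its standard deviation (always no larger
    than the most reliable single cue)."""
    means = np.asarray(means, float)
    precisions = 1.0 / np.asarray(sigmas, float) ** 2
    fused_var = 1.0 / precisions.sum()
    fused_mean = fused_var * (precisions * means).sum()
    return fused_mean, np.sqrt(fused_var)

# e.g. a visual cue at 10.0 cm (sigma 1.0) and a vibrotactile cue at 12.0 cm (sigma 2.0)
print(fuse_cues([10.0, 12.0], [1.0, 2.0]))
```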
170

DATA VALIDATION: A PREREQUISITE TO PERFORMING DATA UNCERTAINTY ANALYSIS

Walter, Patrick L. 10 1900 (has links)
ITC/USA 2005 Conference Proceedings / The Forty-First Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2005 / Riviera Hotel & Convention Center, Las Vegas, Nevada / There are increasing demands, particularly from government agencies, to perform uncertainty analysis in order to assign accuracy bounds to telemetered data from environmental measuring transducers (pressure, acceleration, force, strain, temperature, etc.). Several requirements must be fulfilled before measurement uncertainty analysis is justified. These requirements include good measurement system design practices such as adequate low- and high-frequency response and data-sampling rates, appropriate anti-aliasing filter selection^(1), proper grounding and shielding, and many more. In addition, there are applications (e.g., flight test) in which the environment of the transducer varies with time and/or location. In these applications, it is a requisite that data-validation be performed to establish that an individual transducer responds only to the environmental stimulus that it is intended to measure. Without this validation component designed into the telemetry system, assigned accuracy bounds can be totally meaningless. This paper presents examples and describes techniques for data validation of signals from environmental measuring transducers.
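As a rough illustration of the kind of screening argued for above, the sketch below runs two simple validity checks on a telemetered channel before any uncertainty analysis is attempted: the fraction of signal energy outside the transducer's valid band, and the zero shift between pre- and post-event segments (a common symptom of a transducer responding to a stimulus it is not intended to measure). The thresholds, band limits, and test signal are illustrative assumptions, not values from the paper.

```python
import numpy as np

def validate_channel(x, fs, band, max_out_of_band=0.05, max_zero_shift=0.02):
    """Two simple screening checks on a sampled channel x (sample rate fs):
    (1) fraction of energy outside the valid band (band = (f_lo, f_hi) in Hz),
    (2) normalized zero shift between the first and last 10% of the record.
    Thresholds here are illustrative, not a standard."""
    x = np.asarray(x, float)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    out_frac = power[~in_band].sum() / power.sum()
    n = x.size // 10
    zero_shift = abs(x[-n:].mean() - x[:n].mean()) / (np.ptp(x) or 1.0)
    return {"out_of_band_fraction": float(out_frac),
            "zero_shift": float(zero_shift),
            "valid": out_frac <= max_out_of_band and zero_shift <= max_zero_shift}

# Synthetic example: a 50 Hz signal with a small 3 kHz contaminant
fs = 10_000.0
t = np.arange(0, 1.0, 1 / fs)
sig = np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 3_000 * t)
print(validate_channel(sig, fs, band=(10, 1_000)))
```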
