Risk perception during conditionally automated driving in a low-fidelity simulator

D’Alessandro, Carmine January 2020
This work focuses on level 3 self-driving cars: partially autonomous vehicles that can control themselves most of the time but may ask the driver to take control in specific situations. The purpose of this study is to evaluate participants' perception of the simulated risk they face in a low-fidelity simulation in relation to their background, namely their gaming and driving experience. Participants drove in the simulator and answered a questionnaire covering both the driving session and their background. The simulated risk was assessed and compared with the questionnaire responses. Both each participant's performance, represented by the level of risk experienced while driving the simulation, and their correct identification of the risks faced were evaluated. The results highlighted a positive correlation between driving performance and videogame experience.

MSThesis_twitzig.pdf

Tyler Alexander Witzig (14215754) 08 December 2022
Knot tying boards are low-fidelity surgical simulators used to practice tying suture, but devices currently on the market provide no feedback and no way of changing out bands. A simple-to-use knot tying board with interchangeable bands capable of measuring force was designed; it is comparable in cost to products currently available on the market. The knot tying board was then prototyped and tested: four MD students completed trials of one-handed and two-handed knot tying with three throws per trial. In testing, the knot tying board was capable of measuring force data, such as the peak force during knot tying and the final force the knot exerts on the bands. Used in conjunction with experienced surgical skills coaches, the device could prove a powerful tool for providing feedback to trainees, and a similar approach could be used with other low-fidelity surgical simulators to improve feedback.

Aesthetics of precariousness: the agency of technical equipment in Miroslav Tichý's photography

Rezende, Paula Davies 02 June 2017
This dissertation investigates the agency of the camera in photographic creation and the present-day emergence of an aesthetics of precariousness, taking the work of the Czech photographer Miroslav Tichý (1926-2011) as a case study. Starting from the description and classification of "low fidelity" cameras, it seeks to understand how they contributed to the subversion of the canons of traditional photography and to the consolidation of an aesthetic permeated by flaws, termed the "aesthetics of precariousness". Tichý's photographs, with their noisy and deteriorated appearance resulting from his peculiar photographic process, proved particularly suited to demonstrating, in concrete terms, the agency of human and non-human elements in shaping this aesthetics of precariousness. The dissertation also investigates how Miroslav Tichý's work and the aesthetics of precariousness underwent a process of artification and were elevated to the status of art by various instances of legitimation, such as curators and art historians. The research further reflects on the appropriation, pasteurization, and commodification of the aesthetics of precariousness by the digital photography industry through image-editing applications and software. Finally, these phenomena are approached from a critical standpoint, with the aim of identifying their contradictions and their ramifications in the field of contemporary culture.

Developing Efficient Strategies for Automatic Calibration of Computationally Intensive Environmental Models

Razavi, Seyed Saman January 2013
Environmental simulation models have played a key role in civil and environmental engineering decision-making processes for decades. The utility of an environmental model depends on how well the model is structured and calibrated. Model calibration is typically automated: the simulation model is linked to a search mechanism (e.g., an optimization algorithm) that iteratively generates many parameter sets (e.g., thousands) and evaluates them by running the model, in an attempt to minimize differences between observed data and the corresponding model outputs. The challenge arises when the environmental model is computationally intensive to run (with run-times of minutes to hours, for example), as any automatic calibration attempt then imposes a large computational burden. Such a challenge may force model users to accept sub-optimal solutions and forgo the best model performance. The objective of this thesis is to develop innovative strategies to circumvent the computational burden associated with automatic calibration of computationally intensive environmental models. The first main contribution of this thesis is a strategy called “deterministic model preemption”, which opportunistically evades unnecessary model evaluations in the course of a calibration experiment and can save a significant portion of the computational budget (as much as 90% in some cases). Model preemption monitors the intermediate simulation results while the model is running and terminates (i.e., pre-empts) the simulation early if it recognizes that running the model further would not guide the search mechanism. This strategy is applicable to a range of automatic calibration algorithms (i.e., search mechanisms) and is deterministic in that it leads to exactly the same calibration results as when preemption is not applied. Another main contribution of this thesis is the concept of “surrogate data”: a reasonably small but representative subset of the full set of calibration data. This concept is inspired by existing surrogate modelling strategies, in which a surrogate model (also called a metamodel) is developed and used as a fast-to-run substitute for an original computationally intensive model. A framework is developed to efficiently calibrate hydrologic models to the full set of calibration data while running the original model only on surrogate data for the majority of candidate parameter sets, a strategy that leads to considerable computational savings. To this end, mapping relationships are developed to approximate the model performance on the full data from the model performance on the surrogate data. This framework is applicable to the calibration of any environmental model for which appropriate surrogate data and mapping relationships can be identified. As another main contribution, this thesis critically reviews and evaluates the large body of literature on surrogate modelling strategies from various disciplines, as these are the most commonly used methods to relieve the computational burden associated with computationally intensive simulation models. To evaluate these strategies reliably, a comparative assessment and benchmarking framework is developed that presents a clear, computational-budget-dependent definition of the success or failure of surrogate modelling strategies.
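The preemption idea is concrete enough to sketch. Below is a minimal illustration, assuming a sum-of-squared-errors objective that accumulates monotonically over simulation time steps and a search mechanism that only needs to know whether a candidate beats the incumbent; the function names and the toy model are invented for the example, not taken from the thesis:

    import numpy as np

    def sse_with_preemption(params, observed, best_so_far, simulate_step):
        # Accumulate a sum-of-squared-errors objective one simulation step at
        # a time. SSE can only grow as steps are added, so once it exceeds the
        # best objective found so far, this candidate is already guaranteed to
        # lose and the remaining (expensive) steps can be skipped.
        sse = 0.0
        for t, obs in enumerate(observed):
            sse += (simulate_step(params, t) - obs) ** 2
            if sse > best_so_far:
                return np.inf  # pre-empted: flagged as inferior, run cut short
        return sse

    # Toy demo: the "model" is a damped exponential, one step per observation.
    observed = np.exp(-0.3 * np.arange(50))
    step = lambda k, t: np.exp(-k * t)
    best = sse_with_preemption(0.3, observed, np.inf, step)  # full run
    print(sse_with_preemption(0.9, observed, best, step))    # pre-empted early

Since every pre-empted candidate would have been rejected anyway, a search that only accepts improvements visits exactly the same sequence of parameter sets with and without preemption, which is what makes the strategy deterministic.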
Two large families of surrogate modelling strategies are critically scrutinized and evaluated: “response surface surrogate” modelling, which involves statistical or data-driven function approximation techniques (e.g., kriging, radial basis functions, and neural networks), and “lower-fidelity physically-based surrogate” modelling, which develops and utilizes simplified models of the original system (e.g., a groundwater model with a coarse mesh). This thesis raises fundamental concerns about response surface surrogate modelling and demonstrates that, although they might be less efficient, lower-fidelity physically-based surrogates are generally more reliable, as they preserve, to some extent, the physics involved in the original model. Five different surface water and groundwater models are used across this thesis to test the performance of the developed strategies and to ground the discussions. The strategies developed are, however, typically simulation-model-independent and can be applied to the calibration of any computationally intensive simulation model that has the required characteristics. This thesis leaves the reader with a suite of strategies for efficient calibration of computationally intensive environmental models, along with guidance on how to select, implement, and evaluate the appropriate strategy for a given environmental model calibration problem.
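As a concrete illustration of the response surface idea, here is a minimal Gaussian radial-basis-function surrogate (one of the approximation techniques named above); the design points and the toy stand-in for an expensive model are fabricated for the example:

    import numpy as np

    def fit_rbf(X, y, eps=1.0):
        # Gaussian RBF interpolation: solve Phi @ w = y, where
        # Phi[i, j] = exp(-eps * ||x_i - x_j||^2).
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        return np.linalg.solve(np.exp(-eps * d2), y)

    def predict_rbf(X_train, w, X_new, eps=1.0):
        d2 = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
        return np.exp(-eps * d2) @ w

    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(30, 2))   # design of experiments
    y = np.sum(X ** 2, axis=1)             # stand-in for a costly model run
    w = fit_rbf(X, y)
    X_query = rng.uniform(-1, 1, size=(5, 2))
    print(predict_rbf(X, w, X_query))      # fast approximate responses

A search mechanism can then interrogate the cheap surrogate thousands of times and reserve the original model for a small number of carefully chosen evaluations; the reliability concern raised above is that nothing in such a fit preserves the physics of the original system.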

Entertainment for faster driving takeovers: Designing games for faster and safer takeovers in level 3 self-driving cars

Di Luccio, Luca January 2020
The upcoming level 3 generation of self-driving vehicles will be characterized by the freedom of not having the driver's hands on the steering wheel. This freedom poses new challenges to the traditional passenger comfort paradigm, as drivers will spend more time on non-driving-related tasks (NDRT). Certain constraints must still be imposed, because level 3 systems will not be able to drive at all times without active feedback from the user: the driver needs to stay alert enough to take over when required. What effect will different NDRT have on the behavior of a driver in a self-driving car? In our low-fidelity driving simulator, we tested different simple activities (e.g., playing a simple 2D game) and evaluated them on accident avoidance and situation awareness in the post-transition period. The results show a significant difference between drivers' reaction speeds before and after an active task.

Bypass of N²-Deoxyguanosinyl Adducts by DNA Polymerases and Kinetic Implications for Polymerase Switching

Efthimiopoulos, Georgia 06 August 2013
No description available.

Construct validity of situational judgment tests: An examination of the effects of agreeableness, organizational leadership culture, and experience on SJT responses

Shoemaker, Jonathan Adam 01 June 2007
Numerous factors are likely to influence response patterns on situational judgment tests, including agreeableness, leadership style, impression management, and job and organizational experience. This research presents background information and research on situational judgment tests and several constructs hypothesized to influence responses to them. A situational judgment test, together with manipulations intended to influence response patterns, was developed and piloted with a small sample of management professionals and undergraduate students; larger samples of both groups participated in the experimental research. Participants were asked to imagine that they were applying for a job. Each participant was presented with background information about a fictitious company, describing the company as either highly Participative/Supportive or highly Directive/Achieving in its leadership culture; a third description provided no information about leadership culture and served as a control. Participants responded to a situational judgment test consisting of some commercially developed items and some new items, and then to an inventory of items measuring the factors hypothesized to influence response patterns, specifically Agreeableness and Experience. Significant differences in response patterns were attributable to the Agreeableness and Experience variables and the Leadership Culture manipulations, as well as to the interaction between Experience and the Leadership Culture manipulations. No significant differences were clearly attributable to the Agreeableness by Leadership Culture interaction. The ramifications of these findings are discussed and recommendations for future research are presented.

Optimizing the safety margins governing a deterministic design process while considering the effect of a future test and redesign on epistemic model uncertainty

Price, Nathaniel Bouton 15 July 2016
At the initial design stage, engineers often rely on low-fidelity models that carry high uncertainty. In a deterministic, safety-margin-based design approach, uncertainty is implicitly compensated for by using fixed conservative values in place of aleatory variables and by ensuring the design satisfies a safety margin with respect to the design constraints. After an initial design is selected, high-fidelity modeling is performed to reduce epistemic uncertainty and ensure the design achieves the targeted levels of safety. High-fidelity modeling is used to calibrate the low-fidelity models and to prescribe redesign when tests are not passed. After calibration, the reduced epistemic model uncertainty can be leveraged through redesign to restore safety or improve design performance; however, redesign may involve substantial costs or delays. In this work, the possible effects of a future test and redesign are considered while the initial design is optimized using only a low-fidelity model. The context of the work and a literature review make up Chapters 1 and 2 of this manuscript. Chapter 3 analyzes the dilemma of whether to start with a more conservative initial design and possibly redesign for performance, or to start with a less conservative initial design and risk redesigning to restore safety. Chapter 4 develops a generalized method for simulating a future test and possible redesign that accounts for spatial correlations in the epistemic model error. Chapter 5 discusses the application of the method to the design of a sounding rocket under mixed epistemic model uncertainty and aleatory parameter uncertainty. Chapter 6 concludes the work.
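A minimal Monte Carlo sketch of the idea of simulating the future test, under strong simplifying assumptions (a scalar design, a fixed safety margin, and additive Gaussian epistemic model error; the thesis itself treats spatially correlated errors); all names and numbers are illustrative:

    import numpy as np

    rng = np.random.default_rng(1)

    def g_low(x):
        # Low-fidelity limit-state estimate: the design is deemed safe if g >= 0.
        return x - 1.0

    safety_margin = 0.3
    x_design = 1.0 + safety_margin      # deterministic design: g_low = margin

    # Epistemic model error: the true high-fidelity response is only known to
    # be the low-fidelity value plus an uncertain offset.
    n_samples = 100_000
    model_error = rng.normal(0.0, 0.2, n_samples)
    g_high = g_low(x_design) + model_error  # possible outcomes of the future test

    p_redesign = np.mean(g_high < 0.0)  # a failed test prescribes redesign
    print(f"probability the future test triggers redesign: {p_redesign:.3f}")

Choosing the margin then becomes a trade-off between the expected cost of redesign and the performance given up to conservativeness, which is the dilemma analyzed in Chapter 3.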
