131

Optimisation-based verification process of obstacle avoidance systems for unmanned vehicles

Thedchanamoorthy, Sivaranjini January 2014 (has links)
This thesis deals with safety verification analysis of collision avoidance systems for unmanned vehicles. The safety of the vehicle depends on the collision avoidance algorithms and associated control laws, and it must be shown that these function correctly under all nominal conditions, under various failure conditions, and in the presence of possible variations in the vehicle and its operational environment. The exhaustive-search-based approaches in wide use today are not suitable for safety analysis of autonomous vehicles, owing to the large number of possible variations and the complexity of the algorithms and systems. To address this, a new optimisation-based verification method is developed to verify the safety of collision avoidance systems. The proposed method formulates the worst-case analysis problem arising in the verification of collision avoidance systems as an optimisation problem and employs optimisation algorithms to search automatically for the worst cases. The minimum distance to the obstacle during the collision avoidance manoeuvre is defined as the objective function, and a realistic simulation comprising the detailed vehicle dynamics, the operational environment, the collision avoidance algorithm and the low-level control laws is embedded in the optimisation process. This enables the verification process to take into account parameter variations in the vehicle, changes in the environment, uncertainties in sensors, and in particular the mismatch between the model used for developing the collision avoidance algorithms and the real vehicle. The resulting simulation-based optimisation problem is shown to be non-convex, with potentially many local optima. To illustrate and investigate the proposed optimisation-based verification process, the potential field method and a decision-making collision avoidance method are chosen as candidate obstacle avoidance techniques for the verification study. Five benchmark case studies are investigated in this thesis: static obstacle avoidance for a simple unicycle robot, moving obstacle avoidance for a Pioneer 3DX robot, and a six-degrees-of-freedom fixed-wing Unmanned Aerial Vehicle with static and moving collision avoidance algorithms. It is shown that although a local method for nonlinear optimisation is quite efficient, it is unable to find the most dangerous situations. The results in this thesis show that, among all the global optimisation methods investigated, the DIviding RECTangles (DIRECT) method provides the most promising performance for verification of collision avoidance functions in terms of its guaranteed capability to search out worst-case scenarios.
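The core framing is easy to see in miniature: treat the closed-loop simulation as a black-box function of the uncertain parameters and minimise the clearance it reports. The following sketch uses SciPy's DIRECT implementation with a hypothetical `simulate()` standing in for the full vehicle/environment/controller simulation; the parameters, bounds, and toy objective are assumptions for illustration only.

```python
# Minimal sketch of optimisation-based verification: the worst case is the
# parameter combination minimising the minimum vehicle-obstacle distance.
import numpy as np
from scipy.optimize import direct  # DIviding RECTangles global optimiser


def simulate(params):
    """Hypothetical stand-in for the embedded simulation (vehicle dynamics,
    environment, avoidance algorithm, low-level control laws). Returns the
    minimum distance to the obstacle over the manoeuvre."""
    heading_err, obs_speed, sensor_bias = params
    # Toy, deliberately non-convex surrogate of the min-distance landscape.
    return 2.0 + np.cos(3 * heading_err) * np.exp(-obs_speed) + 0.5 * sensor_bias**2


# Assumed bounds on the uncertain parameters.
bounds = [(-0.5, 0.5), (0.0, 2.0), (-0.2, 0.2)]
result = direct(simulate, bounds, maxfun=2000)
print("worst-case parameters:", result.x)
print("minimum distance to obstacle:", result.fun)
```

A guaranteed-coverage sampler like DIRECT fits this setting precisely because, as the abstract notes, the landscape is non-convex and a purely local search can miss the most dangerous region entirely.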
132

Surrogate-assisted optimisation-based verification & validation

Kamath, Atul Krishna January 2014 (has links)
This thesis deals with the application of optimisation-based Validation and Verification (V&V) analysis to aerospace vehicles in order to determine their worst-case performance metrics. To this end, three aerospace models relating to satellite and launcher vehicles, provided by the European Space Agency (ESA) on various projects, are utilised. To speed up the optimisation-based V&V analysis, surrogate models are developed using the polynomial chaos method. Surrogate models provide a quick way to ascertain the worst-case directions, as the computation time required to evaluate them is very small: a single evaluation of a surrogate model takes less than a second. Another contribution of this thesis is the evaluation of an operational safety margin metric with the help of surrogate models. The operational safety margin is a metric defined in the uncertain parameter space, related to the distance between the nominal parameter value and the first instance of performance criteria violation. This metric can help gauge the robustness of the controller, but it requires evaluating the model in the constraint function and hence can be computationally intensive. As surrogate models are computationally very cheap, they are utilised to rapidly compute the operational safety margin metric. This metric, however, focuses only on finding a safe region around the nominal parameter value, leaving unexplored the possibility of other, disjoint safe regions. To find further safe or failure regions in the parameter space, the Bernstein expansion method is applied to the surrogate polynomial models to help characterise the uncertain parameter space into safe and failure regions. Furthermore, binomial failure analysis is used to assign failure probabilities to failure regions, which can help the designer determine whether a re-design of the controller is required. The methodologies of optimisation-based V&V, surrogate modelling, operational safety margin, Bernstein expansion and risk assessment have been combined to form the WCAT-II MATLAB toolbox.
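The surrogate step can be illustrated in one dimension: fit a polynomial chaos expansion (probabilists' Hermite basis for a Gaussian uncertain parameter) to an expensive model by least squares, then evaluate the cheap polynomial instead of the simulation. This is a minimal sketch, not the thesis's toolbox; `expensive_model()`, the sample count, and the degree are assumptions.

```python
# Minimal 1-D polynomial chaos surrogate via least-squares regression.
import numpy as np
from numpy.polynomial.hermite_e import hermevander, hermeval


def expensive_model(xi):
    # Hypothetical stand-in for a costly simulation of a performance metric.
    return np.sin(1.5 * xi) + 0.1 * xi**2


rng = np.random.default_rng(0)
xi_train = rng.standard_normal(200)   # samples of the Gaussian uncertain parameter
y_train = expensive_model(xi_train)

degree = 6
Phi = hermevander(xi_train, degree)   # Hermite basis evaluated at the samples
coeffs, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# The surrogate is a single polynomial evaluation -- far cheaper than the
# simulation, which is what makes safety-margin sweeps tractable.
surrogate = lambda xi: hermeval(xi, coeffs)
print(surrogate(0.5), expensive_model(0.5))
```

Because the surrogate is a polynomial, its coefficients can be converted to the Bernstein basis over a box, whose coefficient bounds are what allow the safe/failure characterisation of the parameter space.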
133

Using Simulated Annealing for Robustness in Coevolutionary Genetic Algorithms

Weldon, Ruth 01 January 2014 (has links)
Simulated annealing is a useful heuristic for finding good solutions to difficult combinatorial optimization problems. In some engineering applications the quality of a solution is based upon how tolerant it is to changes in the environment. The concept of simulated annealing is based upon the metallurgical process of annealing, in which a material is tempered by heating and cooling. Genetic algorithms have been used to evolve solutions to complex problems by imitating the biological process of evolution, using crossover and mutation to modify the candidate solutions. In coevolution a candidate solution is composed of multiple species, each of which provides a portion of the candidate solution. Those individuals of a species that, in collaboration with individuals from the other species, are evaluated as providing the fittest solution are the preferred individuals of that species. This work investigated whether robustness, defined as the ability of a solution to tolerate changes to the problem environment, could be improved by defining a neighborhood of fitness functions centered on the nominal objective function. Simulated annealing was used to manage the subsequent narrowing of this neighborhood. Two robustness measures were developed that used samples from the neighborhood of objective functions: one employed the minimum fitness value, the other the average fitness value. Coevolutionary genetic algorithms were used to generate candidate solutions employing the robustness measures. This study used three benchmark functions to evaluate the effects of the robustness measures. The results indicated that the robustness measures could produce solutions that were robust and, often, globally optimal for the benchmark functions employed in the testing. Future work includes applying this framework to a broader class of optimization problems, investigating new neighborhood strategies, and devising new robustness measures.
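A minimal sketch of the two robustness measures follows, assuming the neighborhood of objective functions is induced by perturbing a candidate within a radius that shrinks with the annealing temperature; the benchmark, schedule, and parameters are hypothetical, not those used in the study.

```python
# Minimum-fitness and average-fitness robustness measures over a shrinking
# neighborhood, with the radius tied to an annealing temperature.
import numpy as np


def robust_fitness(objective, x, radius, n_samples=20, mode="min", rng=None):
    """Evaluate `objective` at points sampled around `x` within `radius`;
    return the minimum (worst-case) or the average sampled fitness."""
    rng = rng if rng is not None else np.random.default_rng()
    samples = x + rng.uniform(-radius, radius, size=(n_samples, len(x)))
    values = np.array([objective(s) for s in samples])
    return values.min() if mode == "min" else values.mean()


sphere = lambda x: -np.sum(x**2)          # toy maximisation benchmark
x = np.array([0.3, -0.2])
for temperature in np.geomspace(1.0, 0.01, 5):
    radius = 0.5 * temperature            # annealing narrows the neighborhood
    score = robust_fitness(sphere, x, radius, rng=np.random.default_rng(1))
    print(f"T={temperature:.3f}  worst-case fitness={score:.4f}")
```

Rewarding the worst sampled fitness pushes the search toward plateaus rather than sharp peaks, which is exactly the tolerance-to-change property the work targets.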
134

Innovative configurable and collaborative approach to automation systems engineering for automotive powertrain assembly

Haq, Izhar Ul January 2009 (has links)
Presently the automotive industry is facing enormous pressure due to global competition and ever-changing legislative, economic and customer demands. Both agility and reconfigurability are widely recognised as important attributes for manufacturing systems that must satisfy the needs of competitive global markets. To facilitate and accommodate unforeseen business changes within the automotive industry, a new proactive methodology is urgently required for the design, build, assembly and reconfiguration of automation systems. There is also a need to promote new technologies and engineering methods that enable true engineering concurrency between product and process development. Virtual construction and testing of new automation systems prior to build is now identified as a crucial requirement, enabling system verification and the investigation of design alternatives before physical systems are built and tested. The main focus of this research was to design and develop reconfigurable assembly systems within the powertrain sector of the automotive industry by capturing and modelling the relevant business and engineering processes. This research has proposed and developed a more process-efficient and robust approach to automation system design, build and implementation via new engineering services and a standard library of reusable mechanisms. Existing research at Loughborough had created the basic technology for a component-based approach to automation; however, no research had previously been undertaken on applying this approach in a user engineering and business context. The objective of this research was therefore to utilise this prototype method and its associated engineering tools, and to devise novel business and engineering processes enabling the component-based approach to be applied in industry. This new approach has been named Configurable and Collaborative Automation Systems (COAS). In particular, this research has studied the implications of migration to a COAS approach in terms of 1) the necessary changes to end-users' business processes, 2) the potential to improve the robustness of the resultant system, and 3) the potential for improved efficiency and greater collaboration across the supply chain.
135

Assessments of phenotypic variations and variability as a tool for understanding evolutionary processes in echinoids

Schlüter, Nils 14 April 2016 (has links)
No description available.
136

From Timed Models to Timed Implementations

De Wulf, Martin 20 December 2006 (has links)
Computer Science is currently facing a grand challenge: finding good design practices for embedded systems. Embedded systems are essentially computers interacting with some physical process; you could find one in a braking system or in a nuclear power plant, for example. They present several design difficulties: first, they are reactive systems, interacting indefinitely with their environment. Second, they must satisfy real-time constraints specifying when they should respond, not only how. Finally, their environment is often deeply continuous, presenting complex dynamics. The formal models of choice for specifying such systems are timed and hybrid automata, for which model checking is well studied.

In the first part of this thesis, we study a complete design approach, including verification and code generation, for timed automata. To do so, we define a new semantics for timed automata, the AASAP semantics, which preserves the decidability properties for model checking and at the same time is implementable. Our notion of implementability is completely novel and relies on the simulation of a semantics that is obviously implementable on a real platform. We wrote tools for the analysis and code generation, and we exemplify them on a case study of the well-known Philips Audio Control Protocol.

In the second part of this thesis, we study the problem of controller synthesis for an environment specified as a hybrid automaton. We give a new solution for discrete controllers having only imperfect information about the state of the system. In the process, we define a new algorithm, based on the monotonicity of the controllable-predecessors operator, for efficiently finding a controller, and we show some promising applications on a classical problem: the universality test for finite automata.
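One intuition behind implementable semantics for timed automata can be shown with a toy guard-enlargement example: an idealised guard on a clock is relaxed by a platform-dependent delay Δ, so a controller that samples and reacts within Δ still satisfies the relaxed guard. This is a hedged illustration of the general idea only, not the formal AASAP construction from the thesis; all values are hypothetical.

```python
# Toy illustration: relaxing a clock guard [a, b] by a platform delay Delta.
def enlarged_guard(a, b, delta):
    """Relax the guard a <= t <= b by delta on both sides."""
    return (a - delta, b + delta)


def satisfies(t, guard):
    lo, hi = guard
    return lo <= t <= hi


a, b, delta = 2.0, 3.0, 0.05      # hypothetical guard and reaction bound
guard = enlarged_guard(a, b, delta)
# A platform sampling the clock and reacting late by at most delta still
# hits the enlarged guard, whereas the idealised guard would be missed:
print(satisfies(1.97, guard))     # True  (within the relaxed lower bound)
print(satisfies(3.04, guard))     # True  (within the relaxed upper bound)
print(satisfies(3.20, guard))     # False (beyond any reasonable relaxation)
```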
137

Robustness analysis with integral quadratic constraints, application to space launchers.

Chaudenson, Julien 04 December 2013 (has links) (PDF)
The introduction of analytical techniques throughout the development of a space launcher would allow significant reductions in cost and manpower and, by providing a more systematic way of tuning and assessing control laws, would yield flyable designs much faster. In this scope, IQC-based tools already show promising results and may be the most appropriate for the robustness analysis of large complex systems. They account for the system structure and allow each subsystem to be treated specifically, meaning that the representation contained in the multipliers can easily be improved and the setup reused to assess the improvements. The flexibility of the method is a huge advantage, as we experienced during two phases of work. The first was dedicated to the analysis of the three-degree-of-freedom uncertain nonlinear equations of motion of a rigid body. In the second, we studied the influence of the pulse-width-modulator behaviour of the attitude control system on the launcher's stability. IQC-based stability analysis allowed us to estimate the stability domain with respect to uncertainties and system parameters. Moreover, the results obtained with IQC can go well beyond stability analysis, extending to performance analysis in which the field's particular performance criteria are described by appropriate multipliers. Later on, controller synthesis and the merging of the IQC method with worst-case search algorithms could greatly extend the scope of this analytical tool and give it the influence it deserves.
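The simplest instance of IQC-style reasoning is the small-gain test: a nominal system in feedback with a norm-bounded uncertainty is robustly stable if the product of their gains stays below one. The sketch below is that special case only, with an illustrative second-order model (not the launcher dynamics) and the H-infinity norm estimated on a frequency grid rather than computed exactly.

```python
# Small-gain check: ||G||_inf * delta_max < 1 certifies robust stability of
# G in feedback with any uncertainty Delta satisfying ||Delta|| <= delta_max.
import numpy as np
from scipy import signal

# Hypothetical nominal SISO model: lightly damped second-order system.
G = signal.TransferFunction([1.0], [1.0, 0.4, 4.0])

w = np.logspace(-2, 3, 5000)               # frequency grid (rad/s)
_, mag_db, _ = signal.bode(G, w)           # magnitude in dB on the grid
hinf_estimate = 10 ** (mag_db.max() / 20)  # peak gain ~ H-infinity norm

delta_max = 0.2                            # assumed uncertainty bound
print("||G||_inf ≈", hinf_estimate)
print("robustly stable (small gain):", hinf_estimate * delta_max < 1.0)
```

Full IQC analysis generalises this by replacing the single gain bound with frequency-dependent multipliers tailored to each subsystem, which is what makes the approach scale to structured uncertainty and nonlinearities such as the pulse-width modulator.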
138

Regularization Using a Parameterized Trust Region Subproblem

Grodzevich, Oleg January 2004 (has links)
We present a new method for the regularization of ill-conditioned problems that extends the traditional trust-region approach. Such problems arise, for example, in image restoration and in the mathematical processing of medical data, and involve very ill-conditioned matrices. The method makes use of the L-curve and the L-curve maximum-curvature criterion, a recently proposed strategy for finding a good regularization parameter. We describe the method and show its application to an image restoration problem. We also provide MATLAB code for the algorithm. Finally, a comparison to the CGLS approach is given and analyzed, and future research directions are proposed.
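The L-curve criterion the method builds on can be shown compactly: sweep the regularization parameter, plot residual norm against solution norm on log-log axes, and pick the parameter at the corner of maximum curvature. The following sketch uses plain Tikhonov regularization on a Hilbert-matrix test problem as an assumed stand-in for the thesis's trust-region formulation and image-restoration data.

```python
# L-curve corner selection for Tikhonov regularization on an
# ill-conditioned test problem.
import numpy as np
from scipy.linalg import hilbert

n = 12
A = hilbert(n)                                  # classic ill-conditioned matrix
x_true = np.ones(n)
b = A @ x_true + 1e-4 * np.random.default_rng(0).standard_normal(n)

lambdas = np.logspace(-8, 0, 200)
res_norms, sol_norms = [], []
for lam in lambdas:
    # Tikhonov solution via the stacked least-squares system:
    # minimise ||Ax - b||^2 + lam^2 ||x||^2
    A_aug = np.vstack([A, lam * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    x = np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]
    res_norms.append(np.log(np.linalg.norm(A @ x - b)))
    sol_norms.append(np.log(np.linalg.norm(x)))

# Discrete curvature of the log-log L-curve; the corner balances data fit
# against solution size.
r, s = np.array(res_norms), np.array(sol_norms)
dr, ds = np.gradient(r), np.gradient(s)
ddr, dds = np.gradient(dr), np.gradient(ds)
curvature = (dr * dds - ds * ddr) / (dr**2 + ds**2) ** 1.5
print("lambda at maximum curvature:", lambdas[np.nanargmax(curvature)])
```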
139

A Formal Approach for Designing Distributed Self-Adaptive Systems

Gil de la Iglesia, Didac January 2014 (has links)
Engineering contemporary distributed software applications is a challenging task due to the dynamic operating conditions in which these systems have to function. Examples are the dynamic availability of resources, errors that are difficult to predict, and changing user requirements. These dynamics can affect a number of quality concerns of a system, such as robustness, openness, and performance. The challenges of engineering software systems with such dynamics have motivated the need for self-adaptation. Self-adaptation is based on the principle of separation of concerns, distinguishing two well-defined systems: a managed system that deals with domain-specific concerns and a managing system that deals with particular quality concerns of the managed system through adaptation with a feedback loop. The state of the art in self-adaptation advocates the use of formal methods to specify and verify the system's behavior in order to provide evidence that the system's goals are satisfied. However, little work has been done on consolidating design knowledge to model and verify self-adaptation behaviors. To support designers, this thesis contributes a set of formally specified templates for the specification and verification of self-adaptive behaviors of a family of distributed self-adaptive systems. The templates are based on the MAPE-K reference model (Monitor-Analyze-Plan-Execute plus Knowledge) and comprise: (1) behavior specification patterns for modeling the different MAPE components of a feedback loop, and (2) property specification patterns that support verification of the correctness of the adaptation behaviors. The target domain is distributed applications in which self-adaptation is used to manage resources for robustness and openness requirements. The templates are derived from expertise in developing several self-adaptive systems, including a collaborative mobile learning application in which we applied self-adaptation to make the system robust to degrading GPS accuracy, and a robotic system in which we apply self-adaptation to support different types of openness requirements. We demonstrate the reusability of the templates in a number of case studies. / AMULETS
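To make the MAPE-K decomposition concrete, here is a minimal sketch of a feedback loop specialised to the GPS-robustness example from the abstract: the managing system monitors GPS accuracy and switches the managed system to a fallback positioning mode when accuracy degrades. The thresholds, mode names, and structure are hypothetical illustrations, not the thesis's formal templates.

```python
# Minimal MAPE-K loop: a managing system adapting a managed system.
class Knowledge:
    """Shared knowledge (the K in MAPE-K): sensed data, goals, plans."""
    gps_accuracy_m = None   # latest sensed GPS accuracy (metres)
    threshold_m = 15.0      # assumed robustness goal
    action = None           # current adaptation plan


def monitor(k, reading):           # Monitor: collect data from probes
    k.gps_accuracy_m = reading


def analyze(k):                    # Analyze: detect violation of the goal
    return k.gps_accuracy_m is not None and k.gps_accuracy_m > k.threshold_m


def plan(k):                       # Plan: choose an adaptation action
    k.action = "dead_reckoning"


def execute(k, managed):           # Execute: apply the action via effectors
    managed["positioning_mode"] = k.action


managed_system = {"positioning_mode": "gps"}
k = Knowledge()
for reading in [4.0, 6.0, 22.0]:   # degrading GPS accuracy over time
    monitor(k, reading)
    if analyze(k):
        plan(k)
        execute(k, managed_system)
print(managed_system)              # -> {'positioning_mode': 'dead_reckoning'}
```

The thesis's contribution is precisely to replace ad hoc loops like this with formally specified behavior and property templates, so each MAPE component can be model-checked against its correctness properties.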
140

On the significance of neutral spaces in adaptive evolution

Schaper, Steffen January 2012 (has links)
Evolutionary dynamics arise from the interplay of mutation and selection. Fundamentally, these two processes operate at different levels: mutations modify genetic information (the genotype), which is passed from parent to offspring; selection is triggered by variation in reproductive success, which depends on the physical properties (the phenotype) of an organism and its environment. Thus the genotype-phenotype map determines if and how mutations can lead to selection. The aim of this dissertation is to incorporate this map explicitly into a theoretical description of evolutionary dynamics. The first part of the analysis presented here is concerned with the static properties of simple models of these maps, which are studied using exhaustive enumeration. The two most important observations are phenotypic bias – some phenotypes are realized by many more genotypes than most other phenotypes – and the existence of neutral spaces – genotypes with the same phenotype can often be reached from each other by single mutational steps. The remainder of the dissertation provides a theoretical description of evolutionary dynamics on and across neutral spaces. Two different mean-field approximations lead to simple analytic results for the first discovery of alternative phenotypes, highlighting the importance of phenotypic bias: rare phenotypes are hard to find by evolutionary search. These results are used to discuss the relationship of robustness, the ability to withstand mutational change, and evolvability, the ability to create variation through mutation. Several types of fluctuations beyond the mean-field limit are studied, both theoretically and in simulations. The discrete structure of genotype spaces can lead to strong correlations in the spectra of phenotypes produced, increasing the probability that a particular phenotype is fixed in the population quickly after its discovery. Structural correlations between genotypes can increase the effect of phenotypic bias, while the qualitative features of the mean-field description remain valid. All these results highlight that neutral spaces impact evolutionary dynamics in many non-trivial ways, in particular by favouring phenotypes of high accessibility but comparatively low fitness over phenotypes that are highly fit but very hard to discover.
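Both key observations — phenotypic bias and mutational robustness via neutral spaces — can be reproduced by exhaustively enumerating a toy genotype-phenotype map. The map below (binary strings classified by their number of ones, capped at two) is purely illustrative and assumed for this sketch; it is not one of the biophysical maps studied in the dissertation.

```python
# Exhaustive enumeration of a toy genotype-phenotype map: phenotypic bias
# and mutational robustness (fraction of neutral single-point mutations).
from collections import Counter
from itertools import product

L = 8
phenotype = lambda g: min(sum(g), 2)          # toy many-to-one map

genotypes = list(product([0, 1], repeat=L))
bias = Counter(phenotype(g) for g in genotypes)
print("genotypes per phenotype:", dict(bias))  # strongly unequal counts


def neutral_fraction(g):
    """Fraction of single-point mutations of g that preserve the phenotype."""
    p = phenotype(g)
    flips = (g[:i] + (1 - g[i],) + g[i + 1:] for i in range(L))
    return sum(phenotype(m) == p for m in flips) / L


# Mean mutational robustness per phenotype: common phenotypes tend to sit
# on larger neutral spaces and are therefore more robust.
for p in sorted(bias):
    members = [g for g in genotypes if phenotype(g) == p]
    avg = sum(neutral_fraction(g) for g in members) / len(members)
    print(f"phenotype {p}: mean mutational robustness {avg:.2f}")
```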
