51

Parallel genetic algorithms : an efficient model and applications in control systems

Muhammad, Ali January 1997 (has links)
Optimisation is an adaptive process and it is widely applied in science and engineering, from scheduling a manufacturing process to the control of a spacecraft. Genetic algorithms are search and optimisation methods based on the mechanics of natural evolution. They provide promising results in non-linear and complex search problems and have been proven to be parallel and global in nature. Genetic algorithms run slowly on sequential machines, which is their major drawback. Most applications of genetic algorithms in engineering are in the area of design and schedule optimisation, where enough time is usually available to simulate the algorithm. The computer architecture is a main bottleneck, since sequential computation does not reflect the true spatial structure of the algorithm. A number of parallel models and implementations are available which improve the performance of these algorithms. The aim of this research is to develop a new model and/or to improve existing parallel models for real-time application of these methods in system identification and intelligent control. The desired features of this new model are that it should be independent of the optimisation problem, so that it can cope with black-box problems; that it can be used in real-time applications, where the exact model of the system is unknown; and that it should be implementable within the current technological framework. An extensive study of the current literature on genetic algorithms has been carried out, and a detailed review of the underlying theory of genetic algorithms is presented. A parallel model of the genetic algorithm has been proposed and implemented on a transputer-based system using the ANSI C toolset for transputers. It has been tested with different strategies on the traditional suite of optimisation problems, i.e. De Jong's functions and deceptive functions. The results are compared using the performance measures proposed by De Jong. Performance and efficiency measures of the algorithm have been defined and worked out for the simulation results. The research has advanced the understanding of genetic algorithms as stochastic processes. A Markov chain based mathematical model has been developed. An informal study of the convergence properties of the algorithm is presented from different points of view, i.e. time series, real analysis, Markov chains and metric topology. Gradient-like information has been integrated into the genetic search in order to improve the performance and efficiency of the algorithm. A novel directional search method has been developed, tested on the same set of problems and compared using the same performance and efficiency measures as those reported in recent publications. Unlike neural networks and fuzzy systems, genetic algorithms do not provide any general logic for system modelling. Therefore, system identification is achieved by using a fuzzy network for the general logic and a genetic algorithm for parameter estimation, giving as a result an evolving fuzzy network. This novel method has been applied to the modelling of chaotic time series and has been used to control a highly non-linear system, i.e. the inverted pendulum. It is expected that with the advance of re-configurable electronics, evolutionary chips will be realised in the near future. They will play an important role in the development of genetic algorithm based control systems.
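The thesis implements its parallel model in ANSI C on transputer hardware, which is not reproduced here. As a loose illustration of the underlying idea that fitness evaluation can be farmed out to parallel workers while selection and recombination remain global, the following Python sketch runs a minimal master-slave genetic algorithm on De Jong's sphere function; all parameter values are illustrative assumptions, not figures from the thesis.

```python
# Minimal sketch (assumed parameters) of a master-slave parallel GA:
# fitness evaluation is distributed across worker processes while
# selection, crossover and mutation remain global.
import random
from multiprocessing import Pool

def sphere(x):
    """De Jong's first test function (sphere); smaller is better."""
    return sum(v * v for v in x)

def make_individual(dim, lo=-5.12, hi=5.12):
    return [random.uniform(lo, hi) for _ in range(dim)]

def tournament(pop, fits, k=3):
    best = min(random.sample(range(len(pop)), k), key=lambda i: fits[i])
    return pop[best]

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(x, rate=0.1, sigma=0.3):
    return [v + random.gauss(0, sigma) if random.random() < rate else v for v in x]

def run_ga(pop_size=40, dim=10, generations=100):
    pop = [make_individual(dim) for _ in range(pop_size)]
    with Pool() as pool:
        for _ in range(generations):
            fits = pool.map(sphere, pop)          # parallel fitness evaluation
            pop = [mutate(crossover(tournament(pop, fits), tournament(pop, fits)))
                   for _ in range(pop_size)]
    return min(pop, key=sphere)

if __name__ == "__main__":
    best = run_ga()
    print("best fitness found:", sphere(best))
```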
52

The role of domain decomposition in the parallelisation of genetic search for multi-modal numerical optimisation

Vincent, Jonathan January 2001 (has links)
This thesis is concerned with the optimisation of multi-modal numerical problems using genetic algorithms. Genetic algorithms are an established technique, inspired by principles of genetics and evolution, and have been successfully utilised in a wide range of applications. However, they are computationally intensive, and consequently, addressing problems of increasing size and complexity has led to research into parallelisation. This thesis is concerned with coarse-grained parallelism because of the growing importance of cluster computing. Current parallel genetic algorithm technology offers one coarse-grained approach, usually referred to as the island model. Parallelisation is concerned with the division of a computational system into components which can be executed concurrently on multiple processing nodes. It can be based on a decomposition of either the process or the domain on which it operates. The island model is a process-based approach, which divides the genetic algorithm population into a number of co-operating sub-populations. This research examines an alternative approach based on domain decomposition: the search space is divided into a number of regions which are separately optimised. The aims of this research are to determine whether domain decomposition is useful in terms of search performance, and whether it is feasible when there is no a priori knowledge of the search space. It is established empirically that domain decomposition offers a more robust sampling of the search space. It is further shown that the approach is beneficial when there is an element of deception in the problem. However, domain decomposition is non-trivial when the domain is irregular. The irregularities of the search space result in a computational load imbalance which would reduce the efficiency of a parallel implementation. To address this, a dynamic load-balancing algorithm is developed which adjusts the decomposition of the search space at run time according to the fitness distribution. Using this algorithm, it is shown that domain decomposition is feasible, and offers significant search advantages in the field of multi-modal numerical optimisation. The performance is compared with a serial implementation and an island model parallel implementation on a number of non-trivial problems. It is concluded that the domain decomposition approach offers superior performance on these problems in terms of rapid convergence and final solution quality. Approaches to the extension and generalisation of the approach are suggested for further work.
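The thesis's dynamic load-balancing algorithm is not described here in enough detail to reproduce, so the following Python sketch only illustrates the basic domain decomposition idea it builds on: the search domain is partitioned (here along a single dimension, a simplifying assumption) and an independent genetic algorithm is confined to each region, each region being a candidate for a separate cluster node. The Rastrigin test function and all parameters are illustrative, not taken from the thesis.

```python
# Minimal sketch (assumed parameters) of domain-decomposed genetic search:
# dimension 0 of the domain is partitioned into equal regions and an
# independent GA is confined to each region.
import math
import random

def rastrigin(x):
    """A standard multi-modal test function, used here purely for illustration."""
    return 10 * len(x) + sum(v * v - 10 * math.cos(2 * math.pi * v) for v in x)

def ga_in_region(region, dim=5, lo=-5.12, hi=5.12, pop_size=30, generations=150):
    """Run an independent GA whose first coordinate is confined to `region`."""
    r_lo, r_hi = region

    def bounds(d):
        return (r_lo, r_hi) if d == 0 else (lo, hi)

    def clip(v, d):
        b_lo, b_hi = bounds(d)
        return max(b_lo, min(b_hi, v))

    pop = [[random.uniform(*bounds(d)) for d in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [rastrigin(ind) for ind in pop]

        def select():
            i, j = random.sample(range(pop_size), 2)
            return pop[i] if fits[i] < fits[j] else pop[j]

        new_pop = []
        for _ in range(pop_size):
            a, b = select(), select()
            cut = random.randrange(1, dim)
            child = a[:cut] + b[cut:]
            child = [clip(v + random.gauss(0, 0.2), d) if random.random() < 0.1 else v
                     for d, v in enumerate(child)]
            new_pop.append(child)
        pop = new_pop
    return min(pop, key=rastrigin)

def domain_decomposed_search(num_regions=4, lo=-5.12, hi=5.12):
    """Partition dimension 0 into regions and optimise each independently."""
    width = (hi - lo) / num_regions
    regions = [(lo + i * width, lo + (i + 1) * width) for i in range(num_regions)]
    # Each call is independent, so in a coarse-grained parallel implementation
    # the regions could be farmed out to separate cluster nodes.
    candidates = [ga_in_region(r) for r in regions]
    return min(candidates, key=rastrigin)

print("best solution:", domain_decomposed_search())
```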
53

Investigation of intelligent processing methods for surface electromyogram signals

Wu, Pihong January 2001 (has links)
Some electromyogram (EMG) signals include information about limb functions and have been used to control the movement of prostheses. To develop a practical multifunction myoelectrically controlled prosthesis using an EMG signal, the best identification model for limb functions has to be established. The thesis presents an investigation and comparison of different identification methods for limb functions based on the use of statistical processing, neural networks and fuzzy methods. The research focused on the establishment of robust identification models of limb functions in order to obtain objective comparison results. Therefore, some factors affecting the robustness of an identification model were investigated in detail, such as how to choose the length of the signal, how to extract and select the optimal features, and how to establish and select a model. Analysis of the power spectrum and the eigenvalues of the autocorrelation matrix of a surface EMG signal was used to decide the length of the signals. A selection method for optimal features was presented through analysis of the advantages and disadvantages of the Jeffries-Matusita distance measurement and the cluster separation index (CSI). The confidence interval of the recognition rate of a model was used to measure the robustness of the model and to decide the size of the training and test sets. Different identification models were established from the optimal features using different pattern recognition methods based on statistical, neural network and fuzzy approaches. During the neural network modelling process, the Bayesian technique was used to avoid the model tending to the idiosyncrasies of the test set. During the fuzzy logic modelling process, methods for extracting fuzzy rules were investigated in order to establish a good fuzzy identification model for limb functions. In the model selection process, different selection methods were used according to the practical situation. Finally, noise analyses were completed to assess the sensitivity of some methods to noise. Comparisons were made between the different identification methods based on the identification results, their confidence intervals and their modelling methods, and conclusions were drawn on the best processing methods.
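The Jeffries-Matusita distance mentioned above has a closed form when each feature is modelled as Gaussian within a class, which is how it is typically used for feature ranking. The following Python sketch computes that form for a single candidate feature over two hypothetical limb-function classes; the feature values are invented, and the thesis's full selection procedure (including the cluster separation index) is not reproduced.

```python
# Minimal sketch of ranking a candidate feature by the Jeffries-Matusita (JM)
# distance between two classes, assuming the feature is roughly Gaussian
# within each class.
import math
import statistics

def jeffries_matusita(class_a, class_b):
    """JM distance between two univariate samples under a Gaussian assumption.

    Returns a value in [0, 2]; larger means the feature separates the classes better.
    """
    mu_a, mu_b = statistics.mean(class_a), statistics.mean(class_b)
    var_a, var_b = statistics.variance(class_a), statistics.variance(class_b)
    mean_var = (var_a + var_b) / 2
    # Bhattacharyya distance for two Gaussians with the above moments.
    bhatt = ((mu_a - mu_b) ** 2) / (8 * mean_var) \
            + 0.5 * math.log(mean_var / math.sqrt(var_a * var_b))
    return 2 * (1 - math.exp(-bhatt))

# Hypothetical feature samples (e.g. mean absolute value of an EMG window)
# for two limb functions; real values would come from recorded signals.
flexion = [0.42, 0.47, 0.45, 0.50, 0.44, 0.48]
extension = [0.61, 0.58, 0.66, 0.63, 0.60, 0.64]
print("JM distance:", round(jeffries_matusita(flexion, extension), 3))
```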
54

Finite element analysis using FE-based neural networks

Xu, Guohe January 2001 (has links)
The thesis is based on research work carried out to investigate Finite Element Analysis (FEA) using Artificial Neural Networks (ANN). A novel ANN model, the Finite Element-based Neural Network (FE-based NN), was proposed and applied to dynamic problems in mechanics. Firstly, the variational approach to a functional in solid mechanics and the structural analogies between FEM and ANN were introduced. The computational energy functional of the FE-based NN was defined. Furthermore, the architecture of the FE-based NN was constructed and its algorithm was derived with the variational approach to the computational energy functional. The convergence of the FE-based NN was proved and the range of the main parameters was determined. The dynamic analysis of a beam element structure was considered as an application evaluator. The index of speedup was investigated as a measure of the computational efficiency of the FE-based NN. The simulation results were promising and were verified against experimental results and against computations with the commercial finite element analysis package ANSYS. Finally, the conclusions and recommendations for further work were discussed.
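The FE-based NN architecture itself is not detailed here, but the energy analogy it rests on can be illustrated with standard finite element theory: the potential energy Pi(u) = 0.5 u^T K u - f^T u is minimised by a simple gradient (relaxation) dynamic, which is the role a network plays in such schemes. The Python sketch below does this for a one-dimensional bar with its left end fixed; it is an illustration of the principle under that assumption, not the thesis's beam-dynamics formulation.

```python
# Minimal sketch of the energy analogy: assemble K and f for a 1D bar and
# minimise Pi(u) = 0.5*u^T K u - f^T u by gradient relaxation, then compare
# with the direct linear solve. Parameters are illustrative assumptions.
import numpy as np

def assemble_1d_bar(n_elems, EA=1.0, length=1.0):
    """Assemble the stiffness matrix for a uniform 1D bar, left node fixed."""
    h = length / n_elems
    k_e = (EA / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K = np.zeros((n_elems + 1, n_elems + 1))
    for e in range(n_elems):
        K[e:e + 2, e:e + 2] += k_e
    return K[1:, 1:]                         # drop the fixed degree of freedom

def relax(K, f, lr=0.1, steps=5000):
    """Minimise Pi(u) by gradient descent; grad Pi = K u - f."""
    u = np.zeros_like(f)
    for _ in range(steps):
        u -= lr * (K @ u - f)
    return u

K = assemble_1d_bar(n_elems=4)
f = np.array([0.0, 0.0, 0.0, 1.0])           # unit load at the free tip
print("relaxation:", relax(K, f))
print("direct solve:", np.linalg.solve(K, f))  # should agree closely
```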
55

Artificial intelligence for identifying impacts on smart composites

Shan, Qingshan January 2004 (has links)
Identification of low-velocity impacts on composite structures has become increasingly important in the aerospace industry. Knowing when impacts have occurred would allow inspections to be scheduled only when necessary, and knowing the approximate impact location would allow for a localised search, saving time and expense. Additionally, an estimation of the impact magnitude could be used for damage prediction. This study experimentally investigated a methodology for impact identification. To achieve this, the following issues were covered: impact detection, signal processing, feature extraction and impact identification. In impact detection, the smart structures, with two piezoelectric sensors embedded in the composite structure, are designed to measure impact signals caused by foreign object impact events. The impact signals were stored in computer system memory through the impact monitoring system developed in this study. In signal processing, the cross-correlation method was used to process the measured impact signals. This processing built the correlation between the impact signals and the impact location as well as the impact magnitude. In feature extraction, the initial feature data were obtained from the cross-correlation results through point and segmentation processing. The final feature data were selected from the initial feature data with a fuzzy clustering method. In impact identification, adaptive neuro-fuzzy inference systems (ANFIS) were built with the feature data to identify the abscissas of the impact location, the ordinates of the impact location and the impact magnitude. The parameters of the ANFISs were refined with a hybrid learning rule, i.e. a combination of least squares estimation and the steepest descent algorithm. Real-time software developed in Visual Basic manipulated the monitoring and control system for the impact experiments. A software package developed with MATLAB implemented the impact identification and the system simulation.
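A typical use of cross-correlation with two embedded sensors is to estimate the difference in arrival time of the impact wave, which, together with the wave speed, constrains the impact location. The Python sketch below shows that step on synthetic burst signals; the sample rate, burst shape and delay are assumptions for illustration, and the thesis instead feeds its cross-correlation results into ANFIS models.

```python
# Minimal sketch: estimate the arrival-time difference between two sensor
# signals from the lag at the peak of their cross-correlation.
import numpy as np

fs = 100_000                     # assumed sample rate, Hz
t = np.arange(0, 2e-3, 1 / fs)   # 2 ms capture window

def burst(t, t0, f0=15_000):
    """A decaying sinusoidal burst, standing in for a measured impact response."""
    env = np.exp(-((t - t0) * 8_000) ** 2)
    return env * np.sin(2 * np.pi * f0 * (t - t0))

true_delay = 1.2e-4                                    # 120 microseconds
sensor_1 = burst(t, 0.4e-3) + 0.05 * np.random.randn(t.size)
sensor_2 = burst(t, 0.4e-3 + true_delay) + 0.05 * np.random.randn(t.size)

xcorr = np.correlate(sensor_2, sensor_1, mode="full")
lags = np.arange(-t.size + 1, t.size)                  # lag in samples
estimated_delay = lags[np.argmax(xcorr)] / fs
print(f"true delay {true_delay*1e6:.0f} us, estimated {estimated_delay*1e6:.0f} us")
```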
56

Evaluating and integrating software process improvement models and security engineering principles

Li, Haiwen January 2005 (has links)
The research is concerned with the management of software quality and information system security in rapidly changing business environments. Project development life cycles are becoming more complex and e-commerce is growing rapidly. Suppliers will offer new and exciting services, but decision makers are faced with the challenge of identifying information security solutions and reducing business risks. Both customers and suppliers are interested in improving the development of security products, systems and services. The field of security engineering has several generally accepted principles, but it currently lacks a comprehensive framework for evaluating security engineering practices and integrating security engineering approaches with software quality improvement models. The aims of this research are: 1) to evaluate existing security engineering principles and software process improvement models (such as ISO 15504, CMM and ISO 17799) and to identify weaknesses through a comparison; 2) to analyse and investigate current security management practices in different organisations, and to explore and identify potential security risks; 3) to integrate, and set up a bridge between, software quality improvement processes and security engineering principles; 4) to design a model which can provide organisations with guidance on how to gain control of their processes for developing software quality improvement and information security management, and how to evolve towards a culture of security management process by overcoming the weaknesses in the above models. A literature review has been conducted to study the existing software process assessment and information security management models. The well-known software process assessment models CMM, ISO 15504 and BOOTSTRAP, the information security management standard ISO 17799, the US Generally Accepted System Security Principles (GASSP) and SSE-CMM have been analysed. The strengths and weaknesses of these models have been highlighted in terms of model structure, major functions and framework analysis. Additionally, journals and conference proceedings provide information and a comprehensive knowledge background for information security management in rapidly changing and e-business environments. In this study, surveys on information security management in rapidly changing and e-business environments have been conducted, focusing on exploring and investigating the security management processes and the usage of the ISO 17799 information security standard in different kinds of organisations. The differences between UK and non-UK organisations have been analysed. Some major activities for information security management and the current status of ISO 17799 are highlighted, and the most important security risk management processes and potential weaknesses have also been analysed. Based on these results, recommendations and further considerations are presented for software houses, e-business companies, and financial and security consultancy organisations. To provide valuable input for the development of such an approach, an in-depth analysis of special issues and best practices in information security management has been carried out. This research also integrates the security engineering process into a project life cycle. A new Security Engineering Process Improvement Approach (SEPIA) has been developed as a major contribution to the software industry, filling an important gap between software quality improvement modelling and security engineering principles. It includes more than 120 detailed process improvement and control areas. The SEPIA model has been validated and verified in a global organisation; details of five projects have been presented and analysed, and the existing problems in the organisation have been highlighted based on the SEPIA model. After the verification and validation activities, further inputs were gained to arrive at the final SEPIA model. The new model provides organisations with guidance and an additional audit reference on how to gain control of their processes for developing software security management, and how to evolve towards a culture of security management process by overcoming the weaknesses in the existing guidelines.
57

Performance improvement of quality of service routing under inaccurate link-state information

Wang, Qi January 2004 (has links)
It has been observed that the current best-effort IP packet delivery service in the global Internet is sometimes not good enough for emerging real-time multimedia applications. These resource-intensive applications normally have more stringent requirements on bandwidth, delay, delay jitter, etc. The quality of service (QoS) requirements of these applications raise new challenges for the development of new routing mechanisms. QoS routing can provide increased network utilisation compared to best-effort routing by efficiently regulating and managing resource sharing across a network. However, the benefit of QoS routing comes with complex routing computation costs and increased routing protocol overhead. It is impractical to collect detailed global state information and keep it up to date in large-scale dynamic networks such as the Internet. As a result, inaccurate link-state information increases the flow blocking probability and makes source nodes select non-optimal paths. To maximise link utilisation and meet application QoS requirements, routing algorithms need accurate link-state information to make routing decisions. This thesis investigates the statistical properties of time series of link utilisation. In particular, the evaluation focuses on the presence of autocorrelation in the time series. Further study under various link-state update policies, network and traffic configurations identifies the factors that may affect the statistical properties of the time series. Based on this analysis, a prediction-based link-state update policy is proposed to reduce the effect of inaccurate link-state information. The approach predicts the link utilisation trend based on past values. By advertising the trend rather than the instantaneous link utilisation, the routing algorithms have more valuable information with which to make routing decisions, instead of being affected by short-lived sudden changes. An appropriate model that can satisfactorily fit the actual data is identified, estimated and validated. Finally, the performance of the proposed prediction-based link-state update policy is validated by simulation and compared with conventional update policies under a variety of network configurations. The results show that this approach is effective in improving routing performance.
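As a loose illustration of the idea of a prediction-based update policy, the Python sketch below fits a short autoregressive predictor to past utilisation samples and advertises a new link-state value only when the prediction drifts from the last advertised value by more than a threshold. The window, AR order and threshold are assumptions for illustration, not the model identified in the thesis.

```python
# Minimal sketch (assumed parameters) of a prediction-based link-state update
# policy: advertise only when the AR prediction drifts from the last
# advertised utilisation value by more than a threshold.
import numpy as np

def ar_predict(history, order=3):
    """One-step-ahead AR(order) prediction fitted by ordinary least squares."""
    h = np.asarray(history, dtype=float)
    n = h.size
    if n <= order + 1:
        return float(h[-1])
    # Regression h[t] ~ a1*h[t-1] + ... + ap*h[t-p]; column k holds lag k+1.
    X = np.column_stack([h[order - k - 1: n - k - 1] for k in range(order)])
    y = h[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    recent = h[-1:-order - 1:-1]             # [h[t-1], h[t-2], ..., h[t-p]]
    return float(coeffs @ recent)

def update_policy(samples, threshold=0.1, order=3):
    """Return (time, advertised_value) events generated by the policy."""
    advertised = samples[0]
    events = [(0, advertised)]
    for t in range(order + 2, len(samples)):
        predicted = ar_predict(samples[:t], order)
        if abs(predicted - advertised) > threshold:
            advertised = samples[t - 1]      # advertise the latest measurement
            events.append((t, advertised))
    return events

# Synthetic utilisation trace: a slow upward trend plus measurement noise.
rng = np.random.default_rng(0)
trace = np.clip(0.3 + 0.002 * np.arange(200) + 0.05 * rng.standard_normal(200), 0, 1)
events = update_policy(list(trace))
print(f"{len(events)} updates instead of {len(trace)} per-sample advertisements")
```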
58

Computing with meaning by operationalising socio-cognitive semantics

McArthur, Robert James January 2007 (has links)
This thesis is motivated by the desire to provide technological solutions to enhance human awareness in information processing tasks. The need is pressing. Paradoxically, as information piles up, people become less and less aware due to perceived scarce cognitive resources. As a consequence, specialisations become ever more specialised, and projects and individuals in organisations become ever more insular. Technology can enhance awareness by informing the individual about what is happening outside their speciality. Systems which can assist people in these ways need to make sense of human communication. The computer system must know about what it is that it is processing; it must follow a socio-cognitive framework and reason with it. It must compute with meanings, not symbolic surface structures. The hypothesis of the thesis is that knowledge potentially useful for enhancing awareness can be derived from interactions between people using computational models based on socio-cognitive semantics. The goals are to determine whether an appreciable approximation of conceptual spaces can be realised through semantic spaces, and whether such semantic spaces can develop representations of meaning which have the potential to enhance the awareness of users. The two thesis questions are how well the socio-cognitive framework of Gärdenfors can be brought into operational reality and, if such a bridge can be made, what practical issues are involved. The theory of conceptual spaces of Peter Gärdenfors is combined with methods from cognitive science for creating geometric spaces to represent meaning. Hyperspace Analogue to Language and Latent Semantic Analysis are used as exemplars of the cognitive science algorithms. The algorithms are modified by a variety of syntactic processing schemes to overcome a paucity of data, and hence a lack of expressivity in representations of meaning: part-of-speech tagging, index expressions and anaphora resolution are effected and incorporated into the semantic space. The practical element of the thesis consists of five case studies. These are developed in two parts: studies describing how meaning changes and evolves in semantic spaces, and studies describing semantic space applications featuring knowledge discovery. These studies are in a variety of domains with a variety of data: online communities of interest using a mailing list, a health-based mailing list, organisational blogs, "hallway chatter", and organisational email. The data is real-world utterances that provide the situational factors that cognitive systems need to answer queries and provide context. The amounts of data are significantly less than previously used by semantic space methods, hence the need for syntactic assistance. The particular problems examined in the case studies are corporate expertise management, social network discovery, tracking ebbs and flows of topics, and noticing the change in a person's sense of self over time. These are significantly different from those usually examined using semantic spaces. The key differentiator of this work stems from its focus on the geometrically-based computational realisation of meaning. This thesis takes semantic spaces out of the closet and into real-world information technology applications, with a road test in real life.
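The Latent Semantic Analysis exemplar mentioned above reduces a term-document matrix with a truncated singular value decomposition so that terms can be compared in a low-dimensional "semantic space". The Python sketch below shows that core step on an invented toy corpus; it does not include the HAL-style co-occurrence model or the syntactic preprocessing (tagging, index expressions, anaphora resolution) the thesis adds.

```python
# Minimal sketch of an LSA-style semantic space: a term-document count matrix
# is reduced by truncated SVD and terms are compared by cosine similarity.
import numpy as np

corpus = [
    "genetic algorithm optimisation search",
    "parallel genetic algorithm cluster",
    "fuzzy network control system",
    "neural network control system",
]
terms = sorted({w for doc in corpus for w in doc.split()})
index = {w: i for i, w in enumerate(terms)}

# Term-document count matrix.
A = np.zeros((len(terms), len(corpus)))
for j, doc in enumerate(corpus):
    for w in doc.split():
        A[index[w], j] += 1

# Rank-k approximation: each row of term_vectors is a term's reduced vector.
k = 2
U, s, Vt = np.linalg.svd(A, full_matrices=False)
term_vectors = U[:, :k] * s[:k]

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print("genetic~algorithm:", cosine(term_vectors[index["genetic"]],
                                    term_vectors[index["algorithm"]]))
print("genetic~control:  ", cosine(term_vectors[index["genetic"]],
                                    term_vectors[index["control"]]))
```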
59

InfoSpaces : a ubiquitous application for decentralised data exchange and anonymous peer-to-peer access control

Voigt, Sebastian. January 2008 (has links)
Doctoral dissertation, University of Hannover, 2007.
60

Management of distributed engineering applications in heterogeneous grid environments

Lindner, Peggy January 2007 (has links)
Doctoral dissertation, University of Stuttgart, 2007.
