
Active database behaviour : the REFLEX approach

Naqvi, Waseem Hadder January 1995 (has links)
Modern and new-generation applications have more demanding requirements than traditional database management systems (DBMSs) are able to support. Two of these requirements, timely responses to changes of database state and application domain knowledge stored within the database, are embodied in active database technology. Currently, there are a number of research prototype active database systems throughout the world. For an organisation to use any such prototype system, it may have to forsake existing products and resources and embark on substantial reinvestment in new database products, associated resources, and retraining. This approach is clearly unfavourable, as it is expensive in both time and money. A more suitable approach is to allow active behaviour to be added onto existing systems. This scenario is addressed within this research, which investigates how active behaviour can best be added to existing DBMSs, so as to preserve an organisation's investment in its resources, by examining the following issues: (i) what form the knowledge model should take; (ii) whether rules and events should be modelled as first-class objects; (iii) how triggering events will be specified; and (iv) how the user will interact with the system. Various design decisions were taken and investigated through the implementation of a series of working prototypes on the ONTOS DBMS platform. The resultant REFLEX model was successfully ported and adapted to a second platform, POET. The porting process uncovered some interesting issues regarding preconceived ideas about the portability of open systems.
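The idea of layering event-condition-action (ECA) rules, with rules and events as first-class objects, onto an existing passive store can be sketched as follows. This is an illustrative sketch only, not the REFLEX design; the names `Event`, `Rule`, and `ActiveLayer` are hypothetical.

```python
class Event:
    """A named database event, e.g. an update to a given key."""
    def __init__(self, name):
        self.name = name

class Rule:
    """An ECA rule: when `event` occurs and `condition` holds, run `action`."""
    def __init__(self, event, condition, action):
        self.event, self.condition, self.action = event, condition, action

class ActiveLayer:
    """Wraps a passive key-value store and adds active behaviour on top."""
    def __init__(self):
        self.store = {}
        self.rules = []

    def subscribe(self, rule):
        self.rules.append(rule)

    def put(self, key, value):
        self.store[key] = value
        fired = Event(f"update:{key}")       # updating raises an event
        for rule in self.rules:              # matching rules may react
            if rule.event.name == fired.name and rule.condition(self.store):
                rule.action(self.store)

log = []
layer = ActiveLayer()
layer.subscribe(Rule(Event("update:stock"),
                     lambda db: db["stock"] < 10,
                     lambda db: log.append("reorder")))
layer.put("stock", 50)   # condition false: no action
layer.put("stock", 5)    # condition true: rule fires
print(log)               # ['reorder']
```

Because rules are ordinary objects, they can themselves be stored, queried, and updated like any other data, which is one motivation for modelling them as first-class objects.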

Reducing deadline miss rate for grid workloads running in virtual machines : a deadline-aware and adaptive approach

Khalid, Omer January 2011 (has links)
This thesis explores three major areas of research: integration of virtualization into scientific grid infrastructures, evaluation of the virtualization overhead on HPC grid jobs' performance, and optimization of job execution times to increase throughput by reducing the job deadline miss rate. Integrating virtualization into the grid to deploy on-demand virtual machines for jobs, in a way that is transparent to end users and has minimum impact on the existing system, poses a significant challenge. It involves creating virtual machines, decompressing the operating system image, adapting the virtual environment to satisfy the software requirements of the job, constantly updating the job state once it is running without modifying the batch system or existing grid middleware, and finally bringing the host machine back to a consistent state. To facilitate this research, an existing production pilot job framework was modified to deploy virtual machines on demand on the grid, using the virtualization administrative domain to handle all I/O in order to increase network throughput. This approach limits the change impact on the existing grid infrastructure while leveraging the execution and performance isolation capabilities of virtualization for job execution. This work led to an evaluation of the various scheduling strategies used by the Xen hypervisor, measuring the sensitivity of job performance to the amount of CPU and memory allocated under various configurations. However, virtualization overhead is also a critical factor in determining job execution times. Grid jobs have a diverse set of requirements for machine resources such as CPU, memory and network, and have inter-dependencies on other jobs in meeting their deadlines, since the input of one job can be the output of a previous job. A novel resource provisioning model was devised to decrease the impact of virtualization overhead on job execution.
Finally, dynamic deadline-aware optimization algorithms were introduced, using exponential smoothing and rate limiting to predict job failure rates based on static and dynamic virtualization overhead. Statistical techniques were also integrated into the optimization algorithm to flag jobs that are at risk of missing their deadlines and to take preventive action, increasing overall job throughput.
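The combination of exponential smoothing with a deadline-risk check can be sketched in a few lines. This is a toy illustration of the general technique, not the thesis's algorithm; the smoothing factor, risk margin, and job numbers are assumed values.

```python
def smooth(previous, observation, alpha=0.3):
    """Simple exponential smoothing: blend the new observation into the
    running estimate, with recent observations weighted by alpha."""
    return alpha * observation + (1 - alpha) * previous

def at_risk(remaining_work_s, deadline_s, overhead_estimate, margin=1.1):
    """Flag a job whose predicted completion time, inflated by the
    estimated virtualization overhead and a safety margin, exceeds
    its deadline."""
    predicted = remaining_work_s * (1 + overhead_estimate) * margin
    return predicted > deadline_s

estimate = 0.05                      # initial overhead guess (5%)
for observed in [0.08, 0.12, 0.10]:  # measured per-job overheads
    estimate = smooth(estimate, observed)

print(round(estimate, 4))
print(at_risk(remaining_work_s=100, deadline_s=105,
              overhead_estimate=estimate))
```

A job flagged by such a check could then be given a larger CPU share or migrated, which is the "preventive action" referred to above.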

Model updating of modal parameters from experimental data and applications in aerospace

Keye, Stefan January 2003 (has links)
The research in this thesis is associated with different aspects of the experimental analysis of structural dynamic systems and the correction of the corresponding mathematical models using the results of experimental investigations as a reference. A comprehensive finite-element model updating software technology is assembled and various novel features are implemented. The software is integrated into an experimental test facility for structural dynamic identification and used in a number of real-life aerospace applications which illustrate the advantages of the new features. To improve the quality of the experimental reference data, a novel non-iterative method for the computation of optimised multi-point excitation force vectors for Phase Resonance Testing is introduced. The method is unique in that it is based entirely on experimental data, allows both the excitation locations and the force components yielding the highest phase purity to be determined, and enables the corresponding mode indicator to be predicted. A minimisation criterion for the real-part response of the test structure with respect to the total response is utilised and, unlike with other methods, no further information, such as a mass matrix from a finite-element model or assumptions about the structure's damping characteristics, is required. Performance in comparison to existing methods is assessed in a numerical study using an analytical eleven-degrees-of-freedom model. Successful applications to a simple laboratory satellite structure, and under realistic test conditions during the Ground Vibration Test on the European Space Agency's Polar Platform, are described. Considerable improvements are achieved with respect to the phase purity of the identified mode shapes as compared to other methods or manual tuning strategies, as well as to the time and effort involved during Ground Vibration Testing.
Various aspects of the application of iterative model updating methods to aerospace-related test structures and live experimental data are discussed. A new iterative correction parameter selection technique that enables a physically correct updated analytical model to be created, and a novel approach for the correction of structural components with viscous material properties, are proposed. A finite-element model of the GARTEUR SM-AG19 laboratory test structure is updated using experimental modal data from a Ground Vibration Test. To assess the accuracy and physical consistency of the updated model, a novel approach is applied in which only a fraction of the mode shapes and natural frequencies from the experimental database is used in the model correction process, and analytical and experimental modal data beyond the range utilised for updating are correlated. To evaluate the influence of experimental errors on the accuracy of finite-element model corrections, a numerical simulation procedure is developed. The effects of measurement uncertainties on the substructure correction factors, natural frequency deviations, and mode shape correlation are investigated using simulated experimental modal data. Various numerical models are generated to study the effects of modelling error magnitudes and locations. The results show that the correction parameter uncertainty increases with the magnitude of the experimental errors and decreases with the number of modes involved in the updating process. Frequency errors, however, since they are not averaged during updating, must be measured with adequately high precision. Next, the updating procedure is applied to an authentic industrial aerospace structure. The finite-element model of the EC 135 helicopter is utilised, and a novel technique for the parameterisation of substructures with non-isotropic material properties is suggested.
Experimental modal parameters are extracted from frequency responses recorded during a Shake Test on the EC 135-S01 prototype. In this test case, the correction process involves handling a high degree of modal and spatial incompleteness in the experimental reference data. Accordingly, new effective strategies for the selection of updating parameters are developed which are both physically significant and sufficiently sensitive with respect to the analytical modal parameters. Finally, the possible advantages of model updating in association with a model-based method for the identification and localisation of structural damage are investigated. A new technique for identifying and locating delamination damage in carbon-fibre-reinforced polymers is introduced. The method is based on correlating damage-induced modal damping variations of an elasto-mechanical structure with the corresponding data from a numerical model in order to derive information on the damage location. Using a numerical model enables damage in a three-dimensional structure to be located from experimental data obtained with only a single response sensor. To acquire sufficiently accurate experimental data, a novel criterion for the determination of the most appropriate actuator and sensor positions and a polynomial curve-fitting technique are suggested. It is shown that, in order to achieve good location precision, the numerical model must retain a high degree of accuracy and physical consistency.
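A standard way to correlate analytical and experimental mode shapes, of the kind referred to above, is the Modal Assurance Criterion (MAC): a normalised squared inner product that is close to 1 for matching shapes and close to 0 for independent ones. The sketch below illustrates the criterion in general; the numerical mode shapes are invented, and nothing here is taken from the thesis software.

```python
def mac(phi_a, phi_e):
    """Modal Assurance Criterion between two real mode shape vectors:
    |phi_a . phi_e|^2 / ((phi_a . phi_a) * (phi_e . phi_e))."""
    dot = sum(a * e for a, e in zip(phi_a, phi_e))
    return dot * dot / (sum(a * a for a in phi_a) * sum(e * e for e in phi_e))

analytical   = [1.0, 0.8, 0.3, -0.2]
experimental = [0.98, 0.83, 0.28, -0.24]   # noisy measurement of the same shape
other_mode   = [1.0, -0.9, 0.8, -0.7]      # a physically different mode

print(round(mac(analytical, experimental), 3))  # high: same physical mode
print(round(mac(analytical, other_mode), 3))    # low: distinct modes
```

Correlating modes beyond the frequency range used for updating, as described above, amounts to computing such a criterion for mode pairs the correction process never saw.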

A programming system for end-user functional programming

Alam, Abu S. January 2015 (has links)
This research involves the construction of a programming system, HASKEU, to support end-user programming in a purely functional programming language. An end-user programmer is someone who may program a computer to get their job done, but has no interest in becoming a computer programmer. A purely functional programming language is one that does not require the expression of statement sequencing or variable updating. The end-user is offered two views of their functional program. The primary view is a visual one, in which the program is presented as a collection of boxes (representing processes) and lines (representing data flow). The secondary view is a textual one, in which the program is presented as a collection of written function definitions. It is expected that the end-user programmer will begin with the visual view, perhaps later moving on to the textual view. The task of the programming system is to ensure that the visual and textual views are kept consistent as the program is constructed. The foundation of the programming system is an implementation of the Model-View-Controller (MVC) design pattern as a reactive program using the elegant Functional Reactive Programming (FRP) framework. Human-Computer Interaction (HCI) principles and methods are considered in all design decisions. A usability study was conducted to find out the effectiveness of the new system.
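The dual-view consistency described above is essentially the MVC pattern: one model of the program, with every registered view re-rendered on each change. The observer-style sketch below illustrates that idea only; it is not the thesis's FRP implementation, and the class names are hypothetical.

```python
class ProgramModel:
    """The single source of truth: a set of function definitions."""
    def __init__(self):
        self.definitions = {}   # function name -> body (as text)
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def define(self, name, body):
        self.definitions[name] = body
        for view in self.views:          # keep every view consistent
            view.render(self.definitions)

class TextualView:
    """Presents the program as written function definitions."""
    def __init__(self): self.text = ""
    def render(self, defs):
        self.text = "\n".join(f"{n} = {b}" for n, b in defs.items())

class VisualView:
    """Presents the program as boxes, one per definition."""
    def __init__(self): self.boxes = []
    def render(self, defs):
        self.boxes = sorted(defs)

model = ProgramModel()
textual, visual = TextualView(), VisualView()
model.attach(textual)
model.attach(visual)
model.define("double", "\\x -> x + x")
print(visual.boxes)    # ['double']
print(textual.text)    # double = \x -> x + x
```

In an FRP formulation the explicit render loop disappears: the views become time-varying values derived from a stream of model edits, but the consistency guarantee is the same.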

Development of a novel hybrid field and zone fire model

Burton, Daniel John January 2011 (has links)
This thesis describes the design and implementation of a novel hybrid field/zone fire model, linking a fire field model to a zone model. The concept was implemented using SMARTFIRE (a fire field model produced at the University of Greenwich) and two different zone models (CFAST, produced by NIST, and FSEG-ZONE, produced by the author during the course of this work). The intention of the hybrid model is to reduce the amount of computation incurred in using field models to simulate multi-compartment geometries, and it is implemented so that users can employ the zone component without further technical considerations, in line with the existing paradigm of the SMARTFIRE suite. With the hybrid model, only the most important or complex parts of the geometry are fully modelled using the field model; other suitable, less important parts are modelled using the zone model. From the field model's perspective, the zone model is represented as an accurate pressure boundary condition. From the zone model's perspective, the energy and mass fluxes crossing the interface between the models are seen as point sources. The models are fully coupled and iterate towards a solution, ensuring global conservation along with conservation between the regions of different computational method. With this approach, a significant proportion of the computational cells can be replaced by a relatively simple zone model, saving computational time. The hybrid model can be used in a wide range of situations but is especially applicable to large geometries, such as hotels, prisons, factories or ships, where the domain size typically proves extremely computationally expensive for treatment with a field model.
The capability to model such geometries without the associated mesh overheads could eventually permit simulations to be run in 'faster-than-real-time', allowing the spread of fire and effluents to be modelled and, through close coupling with evacuation software, providing a tool not just for research but for real-time incident management in emergency situations. Initial proof-of-concept work began with the development of one-way coupling regimes to demonstrate that a valid link between the models could allow communication and conservation of the respective variables. This was extended to a two-way coupling regime using the CFAST zone model, and the results of this implementation are presented. Fundamental differences between the SMARTFIRE and CFAST models led to the development of the FSEG-ZONE model to address several issues; this implementation and numerous results are discussed at length. Finally, several additions necessary for the accurate treatment of fire simulations were made to the FSEG-ZONE model. The test cases presented in this thesis show that good agreement with full-field results can be obtained through use of the hybrid model, while the reduction in computational time realised is approximately equivalent to the percentage of domain cells that are replaced by the zone calculations of the hybrid model.
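The two-way coupling described above can be caricatured as a fixed-point iteration: the field region hands an energy flux to the zone region, the zone returns a boundary state, and the exchange repeats until the two are mutually consistent. The sketch below is purely illustrative; the linear response functions and all constants are invented, and nothing here reflects the actual SMARTFIRE or FSEG-ZONE equations.

```python
def field_flux(boundary_temp_c):
    """Toy model: energy flux (kW) crossing the interface out of the field
    region falls as the zone side heats up."""
    return 50.0 - 0.1 * boundary_temp_c

def zone_temp(flux_kw):
    """Toy model: zone layer temperature (C) from a steady energy balance
    on the incoming point-source flux."""
    return 20.0 + 4.0 * flux_kw

temp, flux = 20.0, 0.0
for _ in range(100):                   # iterate the coupled exchange
    new_flux = field_flux(temp)
    new_temp = zone_temp(new_flux)
    if abs(new_temp - temp) < 1e-9:    # converged: globally consistent state
        break
    temp, flux = new_temp, new_flux

print(round(temp, 2), round(flux, 2))
```

Convergence here is guaranteed because the composed update contracts (its slope has magnitude 0.4); in the real model, conservation between the regions plays the role that this simple consistency check plays in the toy.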

Efficiency evaluation of external environments control using bio-signals

Kawala-Janik, Aleksandra January 2013 (has links)
There are many types of bio-signals with various prospective control applications. This dissertation concerns the possible application domain of the electroencephalographic (EEG) signal. The implementation of EEG signals as a source of information for controlling external devices has recently become a growing concern in the scientific world. The application of EEG signals in Brain-Computer Interfaces (BCIs), a variant of Human-Computer Interfaces (HCIs), as a means of enabling direct and fast communication between the human brain and an external device, has recently become very popular. The BCI solutions currently available on the market require complex signal processing methodology, which results in the need for expensive equipment with high computing power. In this work, a study of various types of EEG equipment was conducted in order to select the most appropriate one. The analysis of EEG signals is very complex due to the presence of various internal and external artifacts. The signals are also sensitive to disturbances and are non-stochastic, which makes the analysis a complicated task. The research was performed on customised equipment built by the author of this dissertation, on a professional medical device, and on the Emotiv EPOC headset. The work concentrated on the application of the inexpensive, easy-to-use Emotiv EPOC headset as a tool for acquiring EEG signals. The project also involved the application of an embedded system platform, the TS-7260. That choice constrained the selection of an appropriate signal processing method, as embedded platforms are characterised by limited efficiency and low computing power; this aspect was the most challenging part of the whole work. Using an embedded platform makes it possible to extend the future applications of the proposed BCI. It also gives more flexibility, as the platform is able to simulate various environments. The study did not involve the use of traditional statistical or complex signal processing methods.
The novelty of the solution lies in its reliance on basic mathematical operations; the efficiency of this method is also presented in this dissertation. Another important aspect of the study is that the research was carried out not only in a laboratory but also in an environment reflecting real-life conditions. The results proved the efficiency and suitability of the proposed solution in real-life environments. Further study will focus on improving the signal processing method and on the application of other bio-signals, in order to extend the possible applicability and improve its effectiveness.
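To illustrate the kind of computationally cheap processing an embedded platform favours, the sketch below smooths a signal with a moving average and flags samples where the smoothed value crosses a threshold. This is a generic illustration only; the window size, threshold, and signal values are assumptions, not the dissertation's actual parameters.

```python
from collections import deque

def detect_events(samples, window=4, threshold=30.0):
    """Return the indices at which the moving average of the last
    `window` samples exceeds `threshold` -- additions and one division
    per sample, cheap enough for a low-power embedded CPU."""
    buf, events = deque(maxlen=window), []
    for i, sample in enumerate(samples):
        buf.append(sample)
        if len(buf) == window and sum(buf) / window > threshold:
            events.append(i)
    return events

# A quiet baseline with a burst of activity mid-stream.
signal = [5, 6, 4, 5, 40, 42, 41, 43, 6, 5, 4, 6]
print(detect_events(signal))   # indices where the smoothed burst is detected
```

The averaging suppresses single-sample spikes, which matters for EEG-like signals where isolated artifacts are common; only a sustained change in level triggers an event.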

Anytime deliberation for computer game agents

Hawes, Nicholas Andrew January 2004 (has links)
This thesis presents an approach to generating intelligent behaviour for agents in computer game-like worlds. Designing and implementing such agents is a difficult task because they are required to act in real time and respond immediately to unpredictable changes in their environment. Such requirements have traditionally caused problems for AI techniques. To enable agents to generate intelligent behaviour in real-time, complex worlds, research has been carried out into two areas of agent construction. The first is the method used by the agent to plan future behaviour. To allow an agent to make efficient use of its processing time, a planner is presented that behaves as an anytime algorithm. This anytime planner is a hierarchical task network (HTN) planner which allows a planning agent to interrupt its planning process at any time and trade off planning time against plan quality. The second area of research is the design of agent architectures. This has resulted in an agent architecture with the functionality to support an anytime planner in a dynamic, complex world. A proof-of-concept implementation of this design is presented which plays Unreal Tournament and displays behaviour that varies intelligently as it is placed under pressure.
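The defining property of an anytime planner is that it refines its plan in small interruptible steps and can always return the best plan found so far. The sketch below shows only that control structure; the trivial refinement model and the quality measure are invented for illustration and bear no relation to the thesis's HTN planner.

```python
import time

def anytime_plan(refine, initial, deadline_s):
    """Repeatedly refine `initial` until the deadline expires or no
    improvement is possible; always holds a usable best-so-far plan."""
    best, quality = initial, 0.0
    start = time.monotonic()
    while time.monotonic() - start < deadline_s:
        candidate, q = refine(best)
        if q <= quality:            # refinement no longer improves the plan
            break
        best, quality = candidate, q
    return best, quality

def refine(plan):
    """Toy refinement: expand one more abstract task into a concrete step.
    Quality saturates at 1.0 once five steps have been planned."""
    expanded = plan + ["step"]
    return expanded, min(1.0, len(expanded) / 5)

plan, quality = anytime_plan(refine, [], deadline_s=0.05)
print(len(plan), quality)
```

The key trade-off in the abstract above is visible in the signature: a shorter `deadline_s` yields a coarser plan sooner, while more time buys quality, and the agent can act on the partial plan at any interruption point.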

GCS approximation

Cross, Benjamin January 2014 (has links)
The discipline of Computer Aided Geometric Design (CAGD) deals with the computational aspects of geometric objects. This thesis is concerned with the construction of one of the most primitive geometric objects: curves. More specifically, it relates to the construction of high-quality planar curves. The Generalised Cornu Spiral (GCS) is a high-quality planar curve that is beginning to show value in Computer Aided Design (CAD) and Computer Aided Manufacture (CAM) applications. However, in its current form it is incompatible with current CAD/CAM systems. This thesis addresses this issue by developing a robust and efficient polynomial replacement for the GCS.
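The GCS is commonly characterised by a rational linear curvature profile, k(s) = (k0 + c·s)/(1 + d·s), which is what makes it awkward for polynomial-based CAD/CAM kernels. One route to a polynomial replacement is to approximate that curvature with a polynomial; the sketch below interpolates it with a quadratic through three sample points and measures the worst-case deviation. All constants are illustrative assumptions, and this is not the approximation scheme developed in the thesis.

```python
def gcs_curvature(s, k0=1.0, c=0.5, d=0.2):
    """Rational linear curvature profile of a Generalised Cornu Spiral."""
    return (k0 + c * s) / (1 + d * s)

def quadratic_through(f, s0, s1, s2):
    """Lagrange quadratic interpolating f at the three nodes s0, s1, s2."""
    def p(s):
        return (f(s0) * (s - s1) * (s - s2) / ((s0 - s1) * (s0 - s2))
              + f(s1) * (s - s0) * (s - s2) / ((s1 - s0) * (s1 - s2))
              + f(s2) * (s - s0) * (s - s1) / ((s2 - s0) * (s2 - s1)))
    return p

p = quadratic_through(gcs_curvature, 0.0, 0.5, 1.0)
error = max(abs(p(s / 100) - gcs_curvature(s / 100)) for s in range(101))
print(error < 1e-2)   # the quadratic tracks the rational profile closely here
```

A practical replacement must of course approximate the curve itself, not just its curvature, and control the error robustly over the whole parameter range, which is the harder problem the thesis addresses.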

Software based solutions for mobile positioning

Hamani, Sadek January 2013 (has links)
This thesis is concerned with the development of purely software-based solutions for cellular positioning. The proposed self-positioning solutions rely solely on the available network infrastructure and do not require additional hardware or any modifications to the cellular network. The main advantage of using received signal strength (RSS) rather than timing measurements is that it overcomes the need for synchronisation between base stations. By exploiting the availability of RSS observations, the self-positioning methods presented in this thesis have been implemented as mobile software applications and tested in real-world positioning experiments. The well-known Extended Kalman Filter can be used as a static positioning process while modelling the uncertainty in signal strength observations. Range estimation is performed using an empirical propagation model calibrated with RSS measurements from the same trial areas where the positioning process is applied. To overcome the need for a priori maps of the GSM network, a novel cellular positioning method is proposed in this thesis. It is based on the concept of Simultaneous Localisation And Mapping (SLAM), which represents one of the greatest successes of autonomous navigation research. By merging target localisation and the mapping of unknown base stations into a single problem, Cellular SLAM allows a mobile phone to build a map of its environment and concurrently use this map to determine its position.
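A common empirical propagation model for turning RSS into range is the log-distance path-loss model, RSS(d) = RSS(1 m) − 10·n·log10(d), which can be inverted for d. The sketch below shows that inversion; the reference power and path-loss exponent are assumed round numbers, not the calibrated values from the thesis's trial areas.

```python
def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exponent=3.0):
    """Invert RSS = RSS(1m) - 10 * n * log10(d) to estimate distance d
    in metres from a received signal strength in dBm."""
    return 10 ** ((rss_at_1m - rss_dbm) / (10 * path_loss_exponent))

for rss in (-40.0, -70.0, -100.0):
    print(rss, "dBm ->", round(rss_to_distance(rss), 1), "m")
```

In a filter-based positioning process, estimates like these become range observations with a noise model, which is exactly the role the calibrated propagation model plays ahead of the Extended Kalman Filter described above.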

Business Process Access Control (BPAC) : workflow-based authorisation for complex systems

Newton, Derrick January 2012 (has links)
Segregation of duties and least privilege are two business principles that protect an organisation's valuable data from information leaks. In this thesis we demonstrate how these business principles can be addressed through workflow-based access control. We present Business Process Access Control (BPAC), a workflow-based access control modelling environment that properly enacts the key business principles through constraints, and we implement BPAC in the applied pi calculus. We ensure that constraints are correctly applied within our BPAC implementation by introducing the concept of stores. We propose a selection of security properties in respect of the business principles and develop tests for these properties. The collusion metric is introduced as a simple indicator of a workflow-based access control policy's resistance to fraudulent collusion. We identify an anonymity property for workflows, the inability of an outside observer to correctly match agents to workflow tasks, and propose that anonymity provides protection against collusion. We introduce a lightweight version of labelled bisimilarity, the abstraction test, and apply this test to workflow security properties. We develop a test for anonymity using labelled bisimilarity and demonstrate its application through simple examples.
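Enforcing segregation of duties amounts to a constraint on the task-to-agent assignment: no single agent may perform both tasks of a conflicting pair. The sketch below shows that check in its simplest form; the task names are illustrative, and this is not the BPAC pi-calculus formulation.

```python
def violates_sod(assignment, conflicting_pairs):
    """assignment maps task -> agent; return True if any agent is
    assigned both tasks of some conflicting pair."""
    return any(assignment.get(a) is not None and
               assignment.get(a) == assignment.get(b)
               for a, b in conflicting_pairs)

# A classic conflict: the agent who raises an invoice must not approve it.
pairs = [("raise_invoice", "approve_invoice")]

print(violates_sod({"raise_invoice": "alice",
                    "approve_invoice": "bob"}, pairs))    # compliant
print(violates_sod({"raise_invoice": "alice",
                    "approve_invoice": "alice"}, pairs))  # violation
```

The harder problems the thesis addresses sit on top of such checks: two colluding agents can satisfy this per-agent constraint while jointly defeating its intent, which is what the collusion metric and the anonymity property are designed to capture.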
