131

Evaluating configuration management tools for high assurance software development projects

Ziegenhagen, Lynzi, 06 1900
Approved for public release; distribution is unlimited / This thesis establishes a framework for evaluating automated configuration management tools for use in high assurance software development projects and uses the framework to evaluate eight tools. The evaluation framework identifies a dozen feature areas that affect a high assurance project team's ability to achieve its configuration management goals and evaluates the different methods that existing tools use to implement each feature area. Each implementation method is assigned a risk rating that approximates the relative risk that the method adds to the overall configuration management process. The tools with the lowest total ratings minimize risk to high assurance projects. The results of the evaluation show that although certain tools are less risky to use than other tools for high assurance projects, no tool minimizes risk in all feature areas. Furthermore, none of the existing tools are designed to leverage high assurance environments; i.e., none run on operating systems that have themselves been evaluated as meeting high assurance requirements. Thus, high assurance development projects that want to leverage the benefits of configuration management tools and achieve a sufficiently strong configuration management solution must employ existing tools in a protected environment that specifically addresses the risks created by the tools' implementation methods. / Civilian, United States Department of Defense
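To illustrate the scoring idea behind this framework, the sketch below tallies per-feature-area risk ratings and ranks tools by total risk. The feature areas, implementation methods, tools, and numeric ratings are all invented for illustration; they are not the thesis's actual evaluation data.

```python
# Hypothetical illustration of the scoring idea: each tool is rated per
# feature area by the risk of the implementation method it uses, and the
# lowest total indicates the least risky tool. All values are invented.
RISK_BY_METHOD = {
    "cryptographic_audit_trail": 1,
    "plaintext_audit_trail": 3,
    "fine_grained_access_control": 1,
    "os_level_access_control": 2,
    "no_access_control": 4,
}

TOOLS = {  # hypothetical tools and the implementation methods they use
    "ToolA": ["cryptographic_audit_trail", "fine_grained_access_control"],
    "ToolB": ["plaintext_audit_trail", "os_level_access_control"],
    "ToolC": ["plaintext_audit_trail", "no_access_control"],
}

def total_risk(methods):
    """Sum the per-feature-area risk ratings for one tool."""
    return sum(RISK_BY_METHOD[m] for m in methods)

for name in sorted(TOOLS, key=lambda t: total_risk(TOOLS[t])):
    print(name, total_risk(TOOLS[name]))
```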
132

Evaluation of program specification and verification systems

Ubhayakar, Sonali S., 06 1900
Computer systems that earn a high degree of trust must be backed by rigorous verification methods. A verification system is an interactive environment for writing formal specifications and checking formal proofs. Verification systems allow large complicated proofs to be managed and checked interactively. We desire evaluation criteria that indicate which verification system is suitable for a specific research environment and which needs of a particular project a given tool satisfies. Therefore, the purpose of this thesis is to develop a methodology and set of evaluation criteria to evaluate verification systems for their suitability to improve the assurance that systems meet security objectives. A specific verification system is evaluated with respect to the defined methodology. The main goals are to evaluate whether the verification system has the capability to express the properties of software systems and to evaluate whether the verification system can provide inter-level mapping, a feature required for understanding how a system meets security objectives. / Naval Postgraduate School author (civilian).
133

Development of a novel hybrid field and zone fire model

Burton, Daniel John, January 2011
This thesis describes the design and implementation of a novel hybrid field/zone fire model, linking a fire field model to a zone model. This novel concept was implemented using SMARTFIRE (a fire field model produced at the University of Greenwich) and two different zone models (CFAST which is produced by NIST and FSEG-ZONE which has been produced by the author during the course of this work). The intention of the hybrid model is to reduce the amount of computation incurred in using field models to simulate multi-compartment geometries, and it will be implemented to allow users to employ the zone component without having to make further technical considerations, in line with the existing paradigm of the SMARTFIRE suite. In using the hybrid model only the most important or complex parts of the geometry are fully modelled using the field model. Other suitable and less important parts of the geometry are modelled using the zone model. From the field model's perspective the zone model is represented as an accurate pressure boundary condition. From the zone model's perspective the energy and mass fluxes crossing the interface between the models are seen as point sources. The models are fully coupled and iterate towards a solution ensuring both global conservation along with conservation between the regions of different computational method. By using this approach a significant proportion of the computational cells can be replaced by a relatively simple zone model, saving computational time. The hybrid model can be used in a wide range of situations but will be especially applicable to large geometries, such as hotels, prisons, factories or ships, where the domain size typically proves to be extremely computationally expensive for treatment using a field model. The capability to model such geometries without the associated mesh overheads could eventually permit simulations to be run in 'faster-real-time', allowing the spread of fire and effluents to be modelled, along with a close coupling with evacuation software, to provide a tool not just for research objectives, but to allow real time incident management in emergency situations. Initial 'proof of concept' work began with the development of one way coupling regimes to demonstrate that a valid link between models could allow communication and conservation of the respective variables. This was extended to a two-way coupling regime using the CFAST zone model and results of this implementation are presented. Fundamental differences between the SMARTFIRE and CFAST models resulted in the development of the FSEG-ZONE model to address several issues; this implementation and numerous results are discussed at length. Finally, several additions were made to the FSEG-ZONE model that are necessary for an accurate consideration of fire simulations. The test cases presented in this thesis show that a good agreement with full-field results can be obtained through use of the hybrid model, while the reduction in computational time realised is approximately equivalent to the percentage of domain cells that are replaced by the zone calculations of the hybrid model.
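The coupling described above amounts to a fixed-point iteration between the two solvers. The sketch below is schematic only: ToyField and ToyZone are invented stand-ins (not SMARTFIRE, CFAST, or FSEG-ZONE), and the numbers are arbitrary. It shows the exchange of a pressure boundary condition in one direction and interface mass/energy fluxes in the other, iterated until the exchanged quantities stop changing.

```python
# Schematic two-way coupling loop, not the actual SMARTFIRE/FSEG-ZONE code.
class ToyZone:
    """Stand-in zone model: pressure responds to the energy crossing the interface."""
    def initial_pressure(self):
        return 101325.0
    def solve(self, mass_source, energy_source):
        return 101325.0 + 0.001 * energy_source  # arbitrary linear response

class ToyField:
    """Stand-in field model: interface fluxes depend weakly on the boundary pressure."""
    def solve(self, pressure_bc):
        mass_flux = 0.05                                        # kg/s (invented)
        energy_flux = 120.0 + 1e-4 * (pressure_bc - 101325.0)   # kW (invented)
        return mass_flux, energy_flux

def couple(field_model, zone_model, tol=1e-6, max_iter=100):
    """Iterate field and zone solutions until the shared pressure converges."""
    p = zone_model.initial_pressure()
    for _ in range(max_iter):
        mass_flux, energy_flux = field_model.solve(pressure_bc=p)
        new_p = zone_model.solve(mass_source=mass_flux, energy_source=energy_flux)
        if abs(new_p - p) < tol:
            return new_p
        p = new_p
    raise RuntimeError("hybrid coupling did not converge")

print(couple(ToyField(), ToyZone()))
```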
134

Efficiency evaluation of external environments control using bio-signals

Kawala-Janik, Aleksandra, January 2013
There are many types of bio-signals with various control application prospects. This dissertation concerns possible application domains of the electroencephalographic (EEG) signal. The use of EEG signals as a source of information for controlling external devices has recently attracted growing interest in the scientific community. The application of EEG signals in Brain-Computer Interfaces (BCIs), a variant of Human-Computer Interfaces (HCIs), which enable direct and fast communication between the human brain and an external device, has recently become very popular. BCI solutions currently available on the market require complex signal processing, which in turn requires expensive equipment with high computing power. In this work, various types of EEG equipment were studied in order to select the most appropriate one. The analysis of EEG signals is very complex due to the presence of various internal and external artifacts. The signals are also sensitive to disturbances and non-stochastic, which makes the analysis a complicated task. The research was performed on customised equipment built by the author of this dissertation, on a professional medical device, and on the Emotiv EPOC headset. This work concentrated on the application of the inexpensive, easy-to-use Emotiv EPOC headset as a tool for acquiring EEG signals. The project also involved the application of an embedded system platform, the TS-7260. That choice limited the selection of an appropriate signal processing method, as embedded platforms are characterised by limited efficiency and low computing power. That aspect was the most challenging part of the whole work. Using the embedded platform extends the possible future applications of the proposed BCI. It also gives more flexibility, as the platform is able to simulate various environments. The study did not involve the use of traditional statistical or complex signal processing methods; the novelty of the solution lay in the implementation of basic mathematical operations. The efficiency of this method is also presented in this dissertation. Another important aspect of the conducted study is that the research was carried out not only in a laboratory, but also in an environment reflecting real-life conditions. The results demonstrated the efficiency and suitability of the proposed solution in real-life environments. Further study will focus on improving the signal processing method and applying other bio-signals, in order to extend the applicability of the approach and improve its effectiveness.
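As a purely hypothetical illustration of what basic mathematical operations on an EEG channel could look like on a low-power platform, the sketch below smooths a signal with a running mean and flags windows that exceed a fixed amplitude threshold. It is not the dissertation's actual method, and the sample values and threshold are invented.

```python
# Illustration only: a running mean plus a fixed threshold, the kind of
# lightweight arithmetic a low-power embedded platform can afford.
def moving_average(samples, window=8):
    """Smooth a raw channel with a simple running mean."""
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

def detect_events(samples, threshold, window=8):
    """Return indices of smoothed windows whose amplitude exceeds the threshold."""
    return [i for i, v in enumerate(moving_average(samples, window))
            if abs(v) > threshold]

raw = [3.1, 4.0, 60.2, 71.5, 65.0, 5.2, 4.8, 3.9, 2.7, 58.3, 62.1, 4.4]  # invented
print(detect_events(raw, threshold=30.0, window=3))
```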
135

User participation during the implementation of software packages

Coetzee, Jean-Pierre, January 1996
A research report submitted to the Faculty of Commerce, University of the Witwatersrand, Johannesburg, in partial fulfillment of the requirements for the degree of Master of Commerce. / User participation is widely regarded as being important in the development of systems. Research has largely focused on describing participation in the context of traditional systems environments, such as transaction processing, management information and decision support. Very little research has concentrated on the implementation of software packages. The implementation of packaged software, as an alternative to traditional software development, has become increasingly popular and further research into this specific area could provide valuable insights into the user participation construct. [Abbreviated Abstract. Open document to view full version] / AC2017
136

Anytime deliberation for computer game agents

Hawes, Nicholas Andrew, January 2004
This thesis presents an approach to generating intelligent behaviour for agents in computer game-like worlds. Designing and implementing such agents is a difficult task because they are required to act in real-time and respond immediately to unpredictable changes in their environment. Such requirements have traditionally caused problems for AI techniques. To enable agents to generate intelligent behaviour in real-time, complex worlds, research has been carried out into two areas of agent construction. The first of these areas is the method used by the agent to plan future behaviour. To allow an agent to make efficient use of its processing time, a planner is presented that behaves as an anytime algorithm. This anytime planner is a hierarchical task network planner which allows a planning agent to interrupt its planning process at any time and trade-off planning time against plan quality. The second area of agent construction that has been researched is the design of agent architectures. This has resulted in an agent architecture with the functionality to support an anytime planner in a dynamic, complex world. A proof-of-concept implementation of this design is presented which plays Unreal Tournament and displays behaviour that varies intelligently as it is placed under pressure.
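A generic anytime-planning loop, such as the one sketched below, captures the trade-off the abstract describes: the caller can stop at any point and take the best plan found so far. The candidate generator and quality function here are toy stand-ins, not the thesis's hierarchical task network planner.

```python
import random
import time

def anytime_plan(generate_candidate, quality, time_budget):
    """Keep generating candidates until the budget runs out, always
    retaining the best plan found so far (interruptible at any iteration)."""
    best_plan, best_quality = None, float("-inf")
    deadline = time.monotonic() + time_budget
    while time.monotonic() < deadline:
        plan = generate_candidate()   # e.g. one more round of task refinement
        q = quality(plan)
        if q > best_quality:
            best_plan, best_quality = plan, q
    return best_plan, best_quality

# Toy usage: "plans" are random numbers and quality is the number itself.
best, score = anytime_plan(lambda: random.random(), lambda p: p, time_budget=0.05)
print(best, score)
```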
137

GCS approximation

Cross, Benjamin, January 2014
The discipline of Computer Aided Geometric Design (CAGD) deals with the computational aspects of geometric objects. This thesis is concerned with the construction of one of the most primitive geometric objects, curves. More specifically, it relates to the construction of a high quality planar curve. The Generalised Cornu Spiral (GCS) is a high quality planar curve that is beginning to show value in Computer Aided Design (CAD) and Computer Aided Manufacture (CAM) applications. However in its current form it is incompatible with current CAD/CAM systems. This thesis addresses the issue with the development of a robust and efficient polynomial replacement for the GCS.
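As a loose illustration of replacing a curve segment with a polynomial, the sketch below fits a least-squares polynomial to sampled points and reports the maximum deviation. The sampled curve is an arbitrary stand-in, not an actual Generalised Cornu Spiral, and the degree is chosen for illustration only.

```python
# Generic least-squares polynomial replacement of a sampled curve segment.
# The "reference" curve here is invented; it is not a GCS.
import numpy as np

x = np.linspace(0.0, 1.0, 50)
y = np.sin(1.5 * x) + 0.1 * x**2          # placeholder reference curve

coeffs = np.polyfit(x, y, deg=5)          # degree-5 polynomial approximation
approx = np.polyval(coeffs, x)
print("max deviation:", np.max(np.abs(approx - y)))
```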
138

Software based solutions for mobile positioning

Hamani, Sadek, January 2013
This thesis is concerned with the development of pure software-based solutions for cellular positioning. The proposed self-positioning solutions rely solely on the available network infrastructure and do not require additional hardware or any modifications in the cellular network. The main advantage of using received signal strength (RSS) rather than timing measurements is to overcome the need for synchronisation between base stations. By exploiting the availability of RSS observations, the self-positioning methods presented in this thesis have been implemented as mobile software applications and tested in real-world positioning experiments. The well-known Extended Kalman Filter can be used as a static positioning process while modelling the uncertainty in signal strength observations. The range estimation is performed using an empirical propagation model that has been calibrated using RSS measurements in the same trial areas where the positioning process is applied. In order to overcome the need for a priori maps of the GSM network, a novel cellular positioning method is proposed in this thesis. It is based on the concept of Simultaneous Localisation And Mapping (SLAM) which represents one of the greatest successes of autonomous navigation research. By merging target localisation and the mapping of unknown base stations into a single problem, Cellular SLAM allows a mobile phone to build a map of its environment and concurrently use this map to determine its position.
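Range estimation from RSS is commonly done with a log-distance path-loss model; the sketch below inverts that model to recover a distance estimate. The reference power and path-loss exponent are illustrative defaults, not the calibrated parameters of the thesis's empirical propagation model.

```python
def rss_to_distance(rss_dbm, p0_dbm=-40.0, n=3.0, d0=1.0):
    """Invert RSS = P0 - 10*n*log10(d/d0) to estimate distance d in metres.
    p0_dbm: RSS at reference distance d0; n: path-loss exponent (illustrative)."""
    return d0 * 10 ** ((p0_dbm - rss_dbm) / (10.0 * n))

print(rss_to_distance(-70.0))   # about 10 m with these illustrative parameters
```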
139

Business Process Access Control (BPAC): workflow-based authorisation for complex systems

Newton, Derrick, January 2012
Segregation of duties and least privilege are two business principles that protect an organisation’s valuable data from information leak. In this thesis we demonstrate how these business principles can be addressed through workflow-based access control. We present Business Process Access Control (BPAC), a workflow-based access control modelling environment that properly enacts the key business principles through constraints, and we implement BPAC in the applied pi calculus. We ensure that constraints are correctly applied within our BPAC implementation by introducing the concept of stores. We propose a selection of security properties in respect of the business principles and we develop tests for these properties. The collusion metric is introduced as a simple indicator of the resistance of a workflow-based access control policy to fraudulent collusion. We identify an anonymity property for workflows as the inability of an outside observer to correctly match agents to workflow tasks and we propose that anonymity provides protection against collusion. We introduce a lightweight version of labelled bisimilarity, the abstraction test, and we apply this test to workflow security properties. We develop a test for anonymity using labelled bisimilarity and we demonstrate its application through simple examples.
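A segregation-of-duties constraint can be checked directly against a task-to-agent assignment, as in the sketch below. The assignment format, task names, and conflicting pairs are hypothetical; BPAC itself is formalised in the applied pi calculus, which this toy check does not attempt to reproduce.

```python
# Toy segregation-of-duties check over a hypothetical task-to-agent assignment.
def violates_sod(assignments, conflicting_pairs):
    """assignments: {task: agent}; conflicting_pairs: pairs of tasks that
    must never be performed by the same agent."""
    for t1, t2 in conflicting_pairs:
        if assignments.get(t1) is not None and assignments.get(t1) == assignments.get(t2):
            return True
    return False

workflow = {"raise_invoice": "alice", "approve_invoice": "alice", "pay_invoice": "bob"}
print(violates_sod(workflow, {("raise_invoice", "approve_invoice")}))  # True
```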
140

Prediction learning in robotic manipulation

Kopicki, Marek, January 2010
This thesis addresses an important problem in robotic manipulation, which is the ability to predict how objects behave under manipulative actions. This ability is useful for planning of object manipulations. Physics simulators can be used to do this, but they model many kinds of object interactions poorly, and unless there is a precise description of an object’s properties their predictions may be unreliable. An alternative is to learn a model for objects by interacting with them. This thesis specifically addresses the problem of learning to predict the interactions of rigid bodies in a probabilistic framework, and demonstrates results in the domain of robotic push manipulation. During training, a robotic manipulator applies pushes to objects and learns to predict their resulting motions. The learning does not make explicit use of physics knowledge, nor is it restricted to domains with any particular physical properties. The prediction problem is posed in terms of estimating probability densities over the possible rigid body transformations of an entire object as well as parts of an object under a known action. Density estimation is useful in that it enables predictions with multimodal outcomes, but it also enables compromise predictions for multiple combined expert predictors in a product of experts architecture. It is shown that a product of experts architecture can be learned and that it can produce generalization with respect to novel actions and object shapes, outperforming in most cases an approach based on regression. An alternative, non-learning, method of prediction is also presented, in which a simplified physics approach uses the minimum energy principle together with a particle-based representation of the object. A probabilistic formulation enables this simplified physics predictor to be combined with learned predictors in a product of experts. The thesis experimentally compares the performance of product of densities, regression, and simplified physics approaches. Performance is evaluated through a combination of virtual experiments in a physics simulator, and real experiments with a 5-axis arm equipped with a simple, rigid finger and a vision system used for tracking the manipulated object.
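The product-of-experts combination mentioned above can be illustrated over a small discretised outcome set: each expert assigns a probability to every candidate outcome, the probabilities are multiplied pointwise, and the result is renormalised. The experts, outcome labels, and probabilities below are invented; the thesis works with densities over rigid-body transformations rather than discrete labels.

```python
# Toy product of experts over a discrete set of candidate outcomes.
def product_of_experts(expert_densities):
    """Multiply the experts' probabilities pointwise and renormalise."""
    candidates = list(expert_densities[0].keys())
    combined = {c: 1.0 for c in candidates}
    for density in expert_densities:
        for c in candidates:
            combined[c] *= density[c]
    total = sum(combined.values())
    return {c: v / total for c, v in combined.items()}

expert_whole_object = {"slide": 0.6, "tip_over": 0.3, "stay": 0.1}  # invented
expert_object_part = {"slide": 0.2, "tip_over": 0.7, "stay": 0.1}   # invented
print(product_of_experts([expert_whole_object, expert_object_part]))
```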
