151

Modelling of liquid droplet dynamics in a high DC magnetic field

Easter, Stuart January 2012 (has links)
The oscillating droplet technique is an experimental technique used to measure the surface tension and viscous damping coefficients of a liquid droplet. This technique has been the subject of much analysis (theoretical, numerical and experimental), with a number of different external forces used to confine the droplet. These external forces are found to modify the oscillation frequencies and damping rates, which need to be quantified in order for the measurement technique to be used. The dynamics of the droplet are three-dimensional, but previous numerical work has largely focused on axisymmetric cases. This work uses numerical techniques to extend the previous analysis to include the full three-dimensional effects. In this work a three-dimensional numerical model is designed, developed and applied to study the dynamics of a liquid droplet both in free space and with a high DC magnetic field used to balance gravitational forces. The numerical model is a grid-point formulation of the pseudo-spectral collocation method discretised in a spherical coordinate system, with the implicit Euler method used to advance the solution in time. A coordinate transformation method is used to ensure the direct surface tracking required for modelling the surface shape oscillations. The study covers the laminar fluid flow regime within a droplet exhibiting translational and surface shape oscillations, providing a greater understanding of the physical behaviour of the droplet along with a qualitative and quantitative comparison with theoretical results. Initially a droplet oscillating in free space is considered, with a range of surface oscillation modes used to demonstrate the three-dimensional dynamics. Then the influence of electromagnetic forces on a diamagnetic droplet is studied, including the field from a solenoid magnet used to levitate a diamagnetic droplet. Finally the dynamics of an electrically conducting droplet in an external static magnetic field are modelled. In each case a number of methods are used to analyse the surface displacement in order to determine the surface tension and viscous damping coefficients. The numerical study of a freely oscillating droplet shows good agreement with the low-order theoretical results for droplets in the limit of low viscosity. The high accuracy of the surface tracking method allows the non-linear effects of mode coupling and frequency shift with amplitude to be observed. There is good agreement with the theoretical values available for inviscid axisymmetric oscillations, and the numerical study provides the opportunity to determine these effects for three-dimensional viscous oscillations. The magnetic field from a solenoid is used to study the levitation of a diamagnetic droplet, and the oscillation frequencies of the droplet are compared with a theoretical model. The magnetic field is analysed, and the accuracy of the field calculation used when determining the modification to the oscillation frequencies is assessed against a theoretical model. The splitting of the frequency spectrum due to the magnetic field is also analysed. The theoretical model available for an electrically conducting droplet in a static magnetic field predicts modified fluid flow within the droplet and changes to the oscillation frequencies and damping rates. These changes are compared qualitatively and quantitatively with the numerical model results, with good agreement.
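
For context, the low-order theoretical results referred to above are presumably the classical Rayleigh and Lamb expressions for small-amplitude shape oscillations of a free droplet of radius R, density rho, surface tension sigma and kinematic viscosity nu (mode number l >= 2); a sketch of those standard formulas, not taken from the thesis itself, is:

```latex
% Classical low-order results for small-amplitude shape oscillations of a
% free liquid droplet (radius R, density \rho, surface tension \sigma,
% kinematic viscosity \nu); mode number l >= 2.
\begin{align}
  \omega_l^2 &= \frac{l\,(l-1)(l+2)\,\sigma}{\rho R^3}
    && \text{(inviscid oscillation frequency, Rayleigh)} \\
  \Gamma_l   &= \frac{(l-1)(2l+1)\,\nu}{R^2}
    && \text{(small-viscosity damping rate, Lamb)}
\end{align}
% Fundamental l = 2 mode: \omega_2^2 = 8\sigma/(\rho R^3), \Gamma_2 = 5\nu/R^2.
```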
152

Investigation of a teleo-reactive approach for the development of autonomic manager systems

Hawthorne, James January 2013 (has links)
As the demand for more capable and more feature-rich software increases, the complexity of design, implementation and maintenance also increases exponentially. This becomes a problem when the complexity prevents developers from writing, improving, fixing or otherwise maintaining software to meet specified demands whilst still reaching an acceptable level of robustness. When complexity becomes too great, the software becomes impossible to manage effectively, even by large teams of people. One way to address the problem is an autonomic approach to software development. Autonomic software aims to tackle complexity by allowing the software to manage itself, thus reducing the need for human intervention and allowing it to reach a maintainable state. Many techniques have been investigated for the development of autonomic systems, including policy-based designs, utility functions and advanced architectures. A unique approach to the problem is the teleo-reactive programming paradigm. This paradigm offers a robust and simple structure on which to develop systems. It allows developers the freedom to express their intentions in a logical manner whilst the increased robustness reduces the maintenance cost. Teleo-reactive programming is an established solution to low-level agent-based problems such as robot navigation and obstacle avoidance, but the technique exhibits behaviour consistent with higher-level autonomic solutions. This project therefore investigates the extent of the applicability of teleo-reactive programming as an autonomic solution. Can the technique be adapted to provide a better 'fitness for purpose' for autonomics whilst causing minimal changes to the tried and tested original structure and meaning? Does the technique introduce any additional problems, and can these be addressed with improvements to the teleo-reactive framework? Teleo-reactive programming is an interesting approach to autonomic computing because the state of a teleo-reactive program is not predetermined at any moment in time; instead, rules execute according to a priority system driven by the current environmental context (i.e. not in any strict procedural order) whilst still aiming at the intended goal. This method has been shown to be very robust and exhibits some of the qualities of autonomic software.
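
As a rough illustration of the teleo-reactive paradigm described above (and not of the thesis' own framework): a TR program can be read as an ordered list of condition-action rules in which, on every sense-act cycle, the first rule whose condition currently holds fires. The names and the self-managing-service example below are invented for illustration.

```python
# Minimal sketch of a teleo-reactive (TR) control loop: an ordered list of
# (condition, action) rules; on each cycle the first rule whose condition
# holds in the current state fires. Illustrative only; not thesis code.
from typing import Callable, NamedTuple

class Rule(NamedTuple):
    condition: Callable[[dict], bool]   # predicate over the sensed state
    action: Callable[[dict], None]      # action to perform while it applies

def tr_step(program: list[Rule], state: dict) -> None:
    """Fire the highest-priority rule whose condition is currently true."""
    for rule in program:
        if rule.condition(state):
            rule.action(state)
            return
    raise RuntimeError("No applicable rule: TR program is not complete")

# Hypothetical self-managing service expressed as a TR program: the top
# rule's condition is the goal (healthy and not overloaded), lower rules
# regress the system towards that goal.
program = [
    Rule(lambda s: s["healthy"] and s["load"] < 0.8, lambda s: None),  # goal reached
    Rule(lambda s: s["load"] >= 0.8,
         lambda s: s.update(replicas=s["replicas"] + 1, load=s["load"] / 2)),  # scale out
    Rule(lambda s: not s["healthy"], lambda s: s.update(healthy=True)),        # repair
]

state = {"healthy": False, "load": 0.9, "replicas": 1}
for _ in range(3):
    tr_step(program, state)   # rules fire by priority until the goal holds
print(state)
```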
153

A framework for social BPM based on social tagging

Rangiha, M. E. January 2016 (has links)
Traditional Business Process Management (BPM) has a number of limitations. The first is the typical separation between process design and execution, which often causes discrepancies between the processes as they are designed and the way in which they are actually executed. Additionally, because of this separation, valuable first-hand knowledge generated during process execution may remain unused during process design and may also fail to be shared within the organisation. Social BPM, which advocates integrating social software into the BPM lifecycle, has emerged as an answer to such limitations. Although there have been a number of approaches to Social BPM, they have not been able to address all the issues of traditional BPM. This thesis proposes a novel Social BPM framework in which social tagging is used to capture process knowledge emerging during the enactment and design of the processes. Process knowledge concerns both the type of activities chosen to fulfil a certain goal (i.e. what needs doing) and the skills and experience of users in executing specific tasks (i.e. the skills needed to do it). Such knowledge is exploited by recommendation tools to support the design and enactment of future process instances. This framework overcomes the limitations of traditional BPM systems, as it removes the barrier between the design and execution of the processes and also enables all users to be part of the different phases of the BPM lifecycle. We first provide an analysis of the literature to position our research area, and then we provide an overview of our framework, discussing its specification and introducing a static conceptual model of its main entities. This framework is then elaborated further with a more dynamic model of its behaviour and, in particular, of the role and task recommendations, which are supported by social tagging. These mechanisms are then applied in a running example. Finally the framework is evaluated through the implementation of a prototype and its application in a case study. The thesis ends with a discussion of the different evaluation approaches for the proposed framework, the limitations of our framework and future research.
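
As a purely illustrative sketch of how social tags might drive the role and task recommendations described above (the thesis' actual mechanisms are not reproduced here), tags attached to past tasks and to users can be matched against the tags of the current goal, for example by Jaccard similarity; all names and tag data below are hypothetical.

```python
# Illustrative tag-based recommendation for a Social BPM engine: tags on
# past tasks capture "what needs doing", tags on users capture "who has
# done it"; candidates are ranked by tag overlap with the current goal.
def jaccard(a: set[str], b: set[str]) -> float:
    """Tag-set similarity in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(candidates: dict[str, set[str]], query_tags: set[str], top_n: int = 3):
    """Rank candidate tasks or users by similarity of their tags to the query."""
    ranked = sorted(candidates.items(),
                    key=lambda kv: jaccard(kv[1], query_tags),
                    reverse=True)
    return ranked[:top_n]

# Hypothetical tag data gathered during earlier process enactments.
past_tasks = {
    "review-invoice": {"finance", "approval", "invoice"},
    "draft-contract": {"legal", "drafting", "contract"},
    "check-supplier": {"finance", "supplier", "vetting"},
}
users = {
    "alice": {"finance", "invoice", "approval"},
    "bob":   {"legal", "contract"},
}

goal_tags = {"finance", "invoice", "approval"}
print(recommend(past_tasks, goal_tags))   # task recommendation for the goal
print(recommend(users, goal_tags))        # role/performer recommendation
```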
154

Development of a structured method for knowledge elicitation

Swaffield, Gail January 1990 (has links)
The subject of this thesis is, broadly, knowledge elicitation for knowledge-based, or expert, systems. The aims of the research were to investigate the transfer of techniques from systems analysis to the knowledge elicitation process and, in so doing, to develop a structured method for knowledge elicitation. The main contributions made by the research to the area of knowledge elicitation are: i) the development of a method which has, as a central element, the definition of an explicitness boundary, across which and within which all data and processes must be explicit (it is argued that, in order to be explicit, the data must be in the form of limited data sets as opposed to continuous data); ii) the development of a method which forces the use of an intermediate representation, thus forcing a logical/physical design split, as in systems analysis for conventional data processing systems; and iii) a concern for user independence in the resulting systems, which is enhanced by the use of limited data sets, and also by the involvement of designated users of the expert system, and testing of the intermediate representation, during knowledge elicitation. The starting point of the research is the lack of methods for knowledge elicitation and the pitfalls of existing techniques. Many of the techniques that have emerged from other disciplines, such as cognitive psychology, are discussed with respect to the concerns of this thesis and the proposed method. The specific techniques from systems analysis which are applied to knowledge elicitation are data flow analysis, entity-relationship analysis and entity life cycle modelling. These three techniques form the framework of the method, which starts with a high-level analysis of the domain and results in an implementation-independent representation of the expert domain, equivalent to a logical model in systems analysis and design. The final part of the thesis shows the ease with which the resulting model is translated to two of the most commonly used knowledge representation schemes: production systems and frames.
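
For readers unfamiliar with the two target representation schemes mentioned in the final sentence, the toy sketch below contrasts a production-system rule base with a frame hierarchy over a hypothetical fragment of elicited knowledge; it is not derived from the thesis itself.

```python
# Toy contrast of the two knowledge representation schemes named above,
# over invented diagnostic knowledge.

# 1) Production-system style: condition -> action rules over working memory.
rules = [
    ({"symptom": "no_power", "fuse": "blown"}, {"diagnosis": "replace_fuse"}),
    ({"symptom": "no_power", "fuse": "ok"},    {"diagnosis": "check_supply"}),
]

def fire(working_memory: dict) -> dict:
    """Fire the first rule whose condition matches the working memory."""
    for condition, action in rules:
        if all(working_memory.get(k) == v for k, v in condition.items()):
            working_memory.update(action)
            break
    return working_memory

# 2) Frame style: slots with defaults, organised by kind-of inheritance.
frames = {
    "device":  {"is_a": None,     "power": "mains", "status": "unknown"},
    "printer": {"is_a": "device", "paper": "A4"},
}

def slot(frame: str, name: str):
    """Look a slot up in the frame, falling back to its parent frame."""
    while frame is not None:
        if name in frames[frame]:
            return frames[frame][name]
        frame = frames[frame].get("is_a")
    return None

print(fire({"symptom": "no_power", "fuse": "blown"}))  # adds a diagnosis
print(slot("printer", "power"))                        # "mains", inherited
```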
155

An investigation into the feasibility, problems and benefits of re-engineering a legacy procedural CFD code into an event driven, object oriented system that allows dynamic user interaction

Ewer, John Andrew Clark January 2000 (has links)
This research started with questions about how the overall efficiency, reliability and ease of use of Computational Fluid Dynamics (CFD) codes could be improved using any available software engineering and Human Computer Interaction (HCI) techniques. Much of this research has been driven by the difficulties experienced by novice CFD users in the area of Fire Field Modelling, where the introduction of performance-based building regulations has led to a situation in which non-CFD experts increasingly make use of CFD techniques, with varying degrees of effectiveness, for safety-critical research. Historically, such modelling was not helped by the mode of use, the high degree of expertise required from the user and the complexity of specifying a simulation case. Many of the early stages of this research were channelled by perceived limitations of the original legacy CFD software that was chosen as a framework for these investigations. These limitations included poor code clarity, poor overall efficiency due to the use of batch-mode processing, weak assurance that the final results presented by the CFD code were correct, and the requirement for considerable expertise on the part of users. The innovative incremental re-engineering techniques developed to reverse-engineer, re-engineer and improve the internal structure and usability of the software were arrived at as a by-product of the research into overcoming the problems discovered in the legacy software. The incremental re-engineering methodology was considered to be of sufficient importance to warrant inclusion in this thesis. Various HCI techniques were employed to attempt to overcome the efficiency and solution-correctness problems. These investigations have demonstrated that the quality, reliability and overall run-time efficiency of CFD software can be significantly improved by the introduction of run-time monitoring and interactive solution control. It should be noted that the re-engineered CFD code runs more slowly than the original FORTRAN legacy code due, mostly, to the changes in the calling architecture of the software and differences in compiler optimisation; but it is argued that the overall effectiveness, reliability and ease of use of the prototype software are all greatly improved. Investigations into dynamic solution control (made possible by the open software architecture and the interactive control interface) have demonstrated considerable savings when using solution control optimisation. Such investigations have also demonstrated the potential for improved assurance of correct simulation when compared with the batch mode of processing found in most legacy CFD software. Investigations have also been conducted into the efficiency implications of using unstructured group solvers. These group solvers are a derivation of the simple point-by-point Jacobi Over-Relaxation (JOR) and Successive Over-Relaxation (SOR) solvers [CROFT98], and using group solvers allows the computational processing to be more effectively targeted on regions or logical collections of cells that require more intensive computation. Considerable savings have been demonstrated for both static and dynamic group membership when using these group solvers for a complex 3-dimensional fire modelling scenario. Furthermore, the improvements in the system architecture (brought about as a result of software re-engineering) have helped to create an open framework that is both easy to comprehend and extend. This is in spite of the underlying unstructured nature of the simulation mesh, with all of the associated complexity that this brings to the data structures. The prototype CFD software framework has recently been used as the core processing module in a commercial Fire Field Modelling product (called "SMARTFIRE" [EWER99-1]). This CFD framework is also being used by researchers to investigate many diverse aspects of CFD technology, including Knowledge Based Solution Control, Gaseous and Solid Phase Combustion, Adaptive Meshing and CAD file interpretation for ease of case specification.
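
A minimal sketch of the group-solver idea discussed above, assuming a plain point-by-point SOR sweep restricted to a dynamically chosen group of cells (here, those whose residual is still large); this is a simplified illustration, not the SMARTFIRE implementation.

```python
import numpy as np

def sor_sweep(A: np.ndarray, b: np.ndarray, x: np.ndarray,
              group: np.ndarray, omega: float = 1.5) -> np.ndarray:
    """One SOR relaxation sweep over the cell indices in `group`."""
    for i in group:
        sigma = A[i, :] @ x - A[i, i] * x[i]       # off-diagonal contribution
        x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x

def solve(A, b, tol=1e-8, max_sweeps=500):
    """Concentrate effort on the group of cells that have not yet converged."""
    x = np.zeros_like(b)
    for _ in range(max_sweeps):
        residual = np.abs(b - A @ x)
        group = np.flatnonzero(residual > tol)     # dynamic group membership
        if group.size == 0:
            break
        x = sor_sweep(A, b, x, group)
    return x

# Small diagonally dominant test system.
A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
b = np.array([3.0, 2.0, 3.0])
print(solve(A, b), np.linalg.solve(A, b))   # the two answers should agree
```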
156

Computational modelling of agent based path planning and the representation of human wayfinding behaviour within egress models

Veeraswamy, Anand January 2011 (has links)
The focus of this thesis is on wayfinding within buildings from an evacuation/circulation modelling perspective. The majority of existing evacuation models simplify the process of wayfinding by assigning the shortest path to all agents. This is not a realistic representation of the actual route choices made by people under circulation/evacuation conditions. Wayfinding is a dynamic process and cannot be modelled as a static process by assigning pre-determined routes to the agents. Wayfinding is therefore a very important aspect to model accurately within evacuation/circulation models in order to simulate more realistic human behaviour. The main goal of this thesis is to develop an agent-based wayfinding model for the buildingEXODUS evacuation/circulation model. There were four major problems to be solved: spatial representation of the environment; implementation of graph search algorithms to generate a choice set of routes for the agents to choose from; determination of the factors that influence people's wayfinding behaviour; and the development/integration of the agent-based wayfinding model within the buildingEXODUS evacuation/circulation model. The existing spatial representation technique in buildingEXODUS was modified to best suit the requirements of the wayfinding model. Various graph search algorithms, such as A*, Dijkstra and Yen's algorithm, were studied. Alternative algorithms were developed to generate routes quickly and were compared with the performance of Yen's algorithm. Two surveys were then developed and published online. A total of 1200 participants from various countries took the survey. The survey results were statistically analysed and utilised to model the decision-making behaviour of the agents in the wayfinding model. An agent-based wayfinding model was then developed incorporating features such as: spatial representation in terms of a graph; application of route-choice-set generating algorithms; agents with individual attributes using multi-criteria decision analysis methods to choose routes; and dynamic route changes on encountering congestion or gaining new exit knowledge. This wayfinding model was then integrated within the buildingEXODUS model. The buildingEXODUS model passes spatial information and agent locations to the wayfinding model at the start of the simulation. The wayfinding model applies the graph search algorithms to generate routes and assigns routes (a set of target locations) to the agents. The buildingEXODUS model generates events under certain circumstances: when agents reach a target location, encounter congestion or learn the location of a new exit. The wayfinding model listens to these events and assigns a new route to the agents if an alternative route is more favourable than the initially chosen one. There is therefore constant communication between the fine node buildingEXODUS model and the coarse node wayfinding model, with the latter responsible for assigning routes to the agents and the former responsible for navigating the agents from one target location to the next. Thus, a sophisticated wayfinding model incorporating data from the surveys has been developed using C++ and integrated into the buildingEXODUS evacuation model. The introduction of the wayfinding model brought about significant changes to the evacuation statistics produced by the buildingEXODUS model. The difference was more significant in buildings where there was more than one path to an exit. The default option of existing evacuation models is to assign the shortest path to all the agents in the simulation, whereas with the wayfinding model agents choose alternative paths based on other wayfinding criteria as well, such as time, number of turns, etc.
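
The sketch below illustrates, with invented data, two of the ingredients described above: a Dijkstra search over a coarse node graph and a weighted multi-criteria score used to choose between candidate routes. The graph, criteria and weights are hypothetical and not taken from buildingEXODUS.

```python
import heapq

def dijkstra(graph: dict[str, dict[str, float]], start: str, goal: str):
    """Return (cost, path) for the cheapest route from start to goal."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, w in graph[node].items():
            if nbr not in visited:
                heapq.heappush(queue, (cost + w, nbr, path + [nbr]))
    return float("inf"), []

def route_score(route_attrs: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted-sum multi-criteria score; lower is better."""
    return sum(weights[c] * route_attrs[c] for c in weights)

graph = {"room": {"corridor": 5}, "corridor": {"exitA": 10, "exitB": 14},
         "exitA": {}, "exitB": {}}
print(dijkstra(graph, "room", "exitA"))

candidates = {  # hypothetical attributes for two routes to different exits
    "via exitA": {"time": 90, "turns": 4, "congestion": 0.8},
    "via exitB": {"time": 120, "turns": 2, "congestion": 0.1},
}
weights = {"time": 1.0, "turns": 5.0, "congestion": 60.0}
print(min(candidates, key=lambda r: route_score(candidates[r], weights)))
```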
157

Investigation into the interaction of people with signage systems and its implementation within evacuation models

Xie, Hui January 2011 (has links)
Signage systems are widely used in buildings in accordance with safety legislation and building standards. They aim to provide general information and safety messages to occupants and to assist them in wayfinding during both circulation and evacuation. Despite the fact that signage systems are an important component of building wayfinding systems, there is a lack of relevant data concerning how occupants perceive, interpret and use the information conveyed by emergency signage. The effectiveness of signage systems is therefore difficult to assess and is not correctly represented in any existing evacuation model. In this dissertation, this issue is addressed through two experiments and the modelling of the interaction with emergency signage based on the empirical findings. The first experiment involved measuring the maximum viewing distance of standard signs at various angles to produce an empirical representation of the signage catchment area. The second experiment involved measuring the impact of a signage system on a population of 68 test subjects who were instructed to individually vacate a building by their own efforts. The evacuation path involved a number of decision points at which emergency signage was available to identify the appropriate path. Through analysis of data derived from questionnaires and video footage, the number of people who perceived and utilised the signage information to assist their egress was determined. The experimental results are utilised to enhance the capability of the buildingEXODUS software. Firstly, the signage catchment area is revised to represent the visibility limits of signage more accurately than the previous model, which was based on the regulatory definition of signage visibility. Secondly, the impact of smoke on signage visibility is introduced, and the representation of the impact of smoke on occupant evacuation performance is improved based on existing published data. Finally, the signage detection and compliance probabilities are assigned values based on the experimental data rather than the ideal values previously assumed. The impact that the enhanced signage model has on evacuation analysis is demonstrated in hypothetical evacuation scenarios. The new signage model is shown to produce a more representative and realistic estimate of expected egress times than previously possible. It is hoped that this dissertation will improve our understanding of a key phenomenon, the interaction of people with signage, and allow interested parties (e.g. engineers, safety managers and designers) to more effectively and credibly examine the impact of signage systems upon pedestrian and evacuee movement.
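
One possible way to operationalise the signage catchment area described above is sketched below: the maximum viewing distance shrinks as the observer moves off the sign's normal, and an agent perceives the sign only inside that boundary. The fall-off curve and distances are placeholders, not the measured experimental data.

```python
import math

def max_viewing_distance(angle_deg: float, d0: float = 20.0,
                         cutoff_deg: float = 85.0) -> float:
    """Placeholder catchment boundary: d0 on the normal, 0 beyond the cutoff."""
    if abs(angle_deg) >= cutoff_deg:
        return 0.0
    return d0 * math.cos(math.radians(angle_deg))

def sign_visible(sign_xy, sign_normal_deg, agent_xy) -> bool:
    """True if the agent lies inside the sign's catchment area."""
    dx, dy = agent_xy[0] - sign_xy[0], agent_xy[1] - sign_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx))
    off_normal = (bearing - sign_normal_deg + 180) % 360 - 180
    return distance <= max_viewing_distance(off_normal)

# Sign at the origin facing along +x; one agent ahead, one far off to the side.
print(sign_visible((0, 0), 0.0, (15, 0)))   # True: on the normal, within 20 m
print(sign_visible((0, 0), 0.0, (1, 10)))   # False: ~84 deg off the normal and too far
```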
158

Towards a general model for the design of virtual reality learning environments

Mason, Carol Ann January 2002 (has links)
Virtual reality (VR) has been described as a new and unique type of learning medium, primarily because it encourages active participation. However, a large number of VR worlds are barely more than passive 3D graphic visualisations. This might be due to the lack of guidelines for the design of interactive worlds, or to the learning preferences of the designers themselves. The literature indicates a number of principles, especially in the areas of VR design and learning theory, that could form the basis of appropriate design guidelines, and this thesis presents these as a set of guidelines for VR designers. There is a lack of information about the learning preferences of VR designers and about the design of appropriate help systems for VR learning media, so four additional fieldwork studies were carried out to investigate the learning styles, communication styles, attitudes towards the use of VR in learning and training situations, and preferences for the design and use of VR help systems, using a sample of VR designers and VR design students. The results indicated that the learning style and communication profiles of VR designers may not be suitable for the design of active learning material. It was also found that VR designers had positive attitudes towards the development of VR in general, but less so for learning situations. VR designers tended to provide mainly text-based (visual) instruction in their designs, which may be linked to their predominantly visual learning modalities. However, the results suggested that visual-dominant VR design students were equally likely to prefer voiced (auditory) instructions when used naturally within a VR world. The findings from these four studies were incorporated into a broad set of top-level guidelines that form the first step towards a general model for the design of active, participatory VR learning environments.
159

A strategy for mapping unstructured mesh computational mechanics programs onto distributed memory parallel architectures

McManus, Kevin January 1996 (has links)
The motivation of this thesis was to develop strategies that would enable unstructured-mesh-based computational mechanics codes to exploit the computational advantages offered by distributed memory parallel processors. Strategies that successfully map structured mesh codes onto parallel machines have been developed over the previous decade and used to build a toolkit for automation of the parallelisation process. Extending the capabilities of this toolkit to include unstructured mesh codes requires new strategies to be developed. This thesis examines the method of parallelisation by geometric domain decomposition using the single program multiple data (SPMD) programming paradigm with explicit message passing. This technique involves splitting (decomposing) the problem definition into P parts that may be distributed over P processors in a parallel machine. Each processor runs the same program and operates only on its part of the problem. Messages passed between the processors allow data exchange to maintain consistency with the original algorithm. The strategies developed to parallelise unstructured mesh codes should meet a number of requirements: the algorithms are faithfully reproduced in parallel; the code is largely unaltered in the parallel version; parallel efficiency is maximised; the techniques scale to highly parallel systems; and the parallelisation process can be automated. Techniques and strategies that meet these requirements are developed and tested in this dissertation using a state-of-the-art integrated computational fluid dynamics and solid mechanics code. The results presented demonstrate the importance of the problem partition in the definition of inter-processor communication and hence parallel performance. The classical measure of partition quality, based on the number of cut edges in the mesh partition, can be inadequate for real parallel machines. Consideration of the topology of the parallel machine in the mesh partition is demonstrated to be a more significant factor in the achieved parallel efficiency than the number of cut edges. It is shown to be advantageous to allow an increase in the volume of communication in order to achieve an efficient mapping dominated by localised communication. The limitation on parallel performance resulting from communication start-up latency is clearly revealed, together with strategies to minimise its effect. The generic application of the techniques to other unstructured mesh codes is discussed in the context of automation of the parallelisation process. Automation of parallelisation based on the developed strategies is presented as possible through the use of run-time inspector loops to accurately determine the dependencies that define the necessary inter-processor communication.
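
The contrast drawn above between the classical edge-cut measure and a topology-aware view of partition quality can be illustrated with a toy example in which each cut edge is weighted by the hop distance between the processors its two sub-domains are mapped to; the graph, mapping and hop counts are invented.

```python
def edge_cut(edges, part):
    """Number of mesh edges whose endpoints lie in different sub-domains."""
    return sum(1 for u, v in edges if part[u] != part[v])

def mapped_cost(edges, part, hops):
    """Cut edges weighted by inter-processor distance on the parallel machine."""
    return sum(hops[part[u]][part[v]] for u, v in edges if part[u] != part[v])

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]   # small unstructured mesh graph
hops = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]           # 3 processors in a chain

# Two partitions with the same edge cut but different processor mappings.
part_a = {0: 0, 1: 0, 2: 1, 3: 2}
part_b = {0: 0, 1: 0, 2: 2, 3: 1}

for name, part in [("A", part_a), ("B", part_b)]:
    print(name, edge_cut(edges, part), mapped_cost(edges, part, hops))
# Equal edge cuts can hide different communication costs once the machine
# topology is taken into account.
```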
160

Mesh generation by domain bisection

Lawrence, Peter James January 1994 (has links)
The research reported in this dissertation was undertaken to investigate efficient computational methods of automatically generating three-dimensional unstructured tetrahedral meshes. The work on two-dimensional triangular unstructured grid generation by Lewis and Robinson [LeR76] is first examined, in which a recursive bisection technique of computational order n log(n) was implemented. This technique is then extended to incorporate new methods of geometry input and the automatic handling of multi-connected regions. The method of two-dimensional recursive mesh bisection is then further modified to incorporate an improved strategy for the selection of bisections. This enables an automatic nodal placement technique to be implemented in conjunction with the grid generator. The dissertation then investigates methods of generating triangular grids over parametric surfaces. This includes a new definition of surface Delaunay triangulation, with the extension of grid improvement techniques to surfaces. Based on the assumption that all surface grids of objects form polyhedral domains, a three-dimensional mesh generation technique is derived. This technique is a hybrid of recursive domain bisection coupled with a min-max heuristic triangulation algorithm, chosen to achieve a computationally efficient and reliable algorithm coupled with a fast nodal placement technique. The algorithm generates three-dimensional unstructured tetrahedral grids over polyhedral domains with multi-connected regions in an average computational order of less than n log(n).
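
A toy sketch of the recursive-bisection idea (applied here, for simplicity, to a point cloud rather than to the geometric domains handled in the thesis): each level splits along the longest axis at the median until the pieces are small enough to mesh directly, giving roughly n log(n) work overall.

```python
import random

def bisect(points: list[tuple[float, float]], leaf_size: int = 4) -> list[list]:
    """Recursively bisect a point set along its longest axis at the median."""
    if len(points) <= leaf_size:
        return [points]                       # small enough to mesh directly
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    axis = 0 if (max(xs) - min(xs)) >= (max(ys) - min(ys)) else 1
    pts = sorted(points, key=lambda p: p[axis])
    mid = len(pts) // 2                       # bisect at the median coordinate
    return bisect(pts[:mid], leaf_size) + bisect(pts[mid:], leaf_size)

random.seed(1)
cloud = [(random.random(), random.random()) for _ in range(40)]
leaves = bisect(cloud)
print(len(leaves), [len(leaf) for leaf in leaves])   # 40 points split into small leaves
```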
