141

Computer-aided design, synthesis and evaluation of novel antiviral compounds

Cancellieri, Michela January 2014 (has links)
RNA viruses are a major cause of disease and over the last fifteen years have been responsible for frequent outbreaks, infecting both humans and animals. Examples of emerging or re-emerging viral pathogens are Foot-and-Mouth disease virus (FMDV) in animals, and Chikungunya virus (CHIKV), Coxsackievirus B3 (CVB3) and Respiratory Syncytial virus (RSV) in humans, all responsible for infections associated with mild to severe complications. Although both vaccines and small-molecule compounds are at various stages of development, no selective antiviral drugs have been approved so far, so improved treatment strategies are required for all four of these viruses. Promising targets are the viral non-structural proteins, which are commonly evaluated for the identification of new antivirals. Starting from the study of different viral proteins, several computer-aided techniques were applied, aiming first to identify hit molecules and then to synthesise new series of potential antiviral compounds. The available crystal structures of some of the proteins that play a role in viral replication were used for structure- and ligand-based virtual screenings of commercially available compounds against CVB3, FMDV and RSV. New families of potential anti-CHIKV compounds were rationally designed and synthesised in order to establish a structure-activity relationship study on a lead structure previously found in our group. Finally, a de novo drug design approach was applied to find a suitable scaffold for the synthesis of a series of zinc-ejecting compounds against RSV. Inhibition of virus replication was evaluated for all the new compounds, several of which showed antiviral potential.
142

Mapping unstructured mesh codes onto local memory parallel architectures

Jones, Beryl Wyn January 1994 (has links)
Initial work on mapping CFD codes onto parallel systems focused upon software which employed structured meshes. Increasingly, many large scale CFD codes are being based upon unstructured meshes. One of the key problems when implementing such large scale unstructured problems on a distributed memory machine is the question of how to partition the underlying computational domain efficiently. It is important that all processors are kept busy for as large a proportion of the time as possible and that the amount, level and frequency of communication should be kept to a minimum. Proposed techniques for solving the mapping problem have separated the solution into two distinct phases. The first phase is to partition the computational domain into cohesive sub-regions. The second phase consists of embedding these sub-regions onto the processors. However, it has been shown that performing these two operations in isolation can lead to poor mappings and substantially longer communication times. In this thesis we develop a technique which simultaneously takes account of the processor topology whilst identifying the cohesive sub-regions. Our approach is based on an unstructured mesh decomposition method originally developed by Sadayappan et al [SER90] for a hypercube. This technique forms the basis for a method which enables decomposition to an arbitrary number of processors on a specified processor network topology. Whilst partitioning the mesh, the optimisation method takes the processor topology into account by minimising the total interprocessor communication. The drawback of this technique is that it is not suitable for very large meshes, since the calculations often require prodigious amounts of processing power.
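The topology-aware objective described above can be sketched as a cost function over a partitioning of the mesh's dual graph: every mesh edge cut by the partition is charged the hop distance between the two processors in the target topology. All names and data here are illustrative assumptions, not the thesis's actual formulation:

```python
def comm_cost(mesh_edges, assignment, hop_dist):
    """Total communication cost of a partitioning: every mesh edge whose
    endpoints are assigned to different processors contributes the hop
    distance between those processors in the target network topology."""
    cost = 0
    for u, v in mesh_edges:
        pu, pv = assignment[u], assignment[v]
        if pu != pv:
            cost += hop_dist[pu][pv]
    return cost

# Toy example: 4 mesh cells joined in a ring, mapped onto 2 of 3
# processors with unit hop distance between every processor pair.
hop = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(comm_cost(edges, {0: 0, 1: 0, 2: 1, 3: 1}, hop))  # 2 cut edges -> 2
```

A partitioner of the kind described would search over `assignment` to minimise this cost while also balancing the number of cells per processor.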
143

Ontology-based semantic reminiscence support system

Shi, Lei January 2012 (has links)
This thesis addresses the needs of people who find reminiscence helpful, focusing on the development of a computerised reminiscence support system which facilitates access to and retrieval of stored memories used as the basis for positive interactions between elderly and young people, and also between people with cognitive impairment and members of their family or caregivers. To model users' background knowledge, this research defines a lightweight user-oriented ontology and its building principles. The ontology is flexible, and has a simplified knowledge structure populated with semantically homogeneous ontology concepts. The user-oriented ontology is different from generic ontology models, as it does not rely on knowledge experts. Its structure enables users to browse, edit and create new entries on their own. To solve the semantic gap problem in personal information retrieval, this thesis proposes a semantic ontology-based feature matching method. It involves natural language processing and semantic feature extraction/selection using the user-oriented ontology. It comprises four stages: (i) user-oriented ontology building, (ii) semantic feature extraction for building vectors representing information objects, (iii) semantic feature selection using the user-oriented ontology, and (iv) measuring the similarity between the information objects. To facilitate personal information management and dynamic generation of content, the system uses ontologies and advanced algorithms for semantic feature matching. An algorithm named Onto-SVD is also proposed, which uses the user-oriented ontology to automatically detect the semantic relations within the stored memories. It combines semantic feature selection with matrix factorisation and k-means clustering to achieve topic identification based on semantic relations. The thesis further proposes an ontology-based personalised retrieval mechanism for the system. It aims to assist people to recall, browse and re-discover events from their lives by considering their profiles and background knowledge, and providing them with customised retrieval results. Furthermore, a user profile space model is defined, and its construction method is also described. The model combines multiple user-oriented ontologies and has a self-organised structure based on relevance feedback. The identification of a person's search intentions in this mechanism operates on the conceptual level and involves the person's background knowledge. Based on the identified search intentions, knowledge spanning trees are automatically generated from the ontologies or user profile spaces. The knowledge spanning trees are used to expand and reform queries, enhancing the queries' semantic representations by applying domain knowledge. The crowdsourcing-based system evaluation measures users' satisfaction with the generated content of Sem-LSB. It compares the advantages and disadvantages of three types of content presentation (i.e. unstructured, LSB-based and semantic/knowledge-based). Based on users' feedback, the semantic/knowledge-based presentation is considered to offer higher overall satisfaction and stronger reminiscing support than the others.
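The Onto-SVD step described above combines matrix factorisation with k-means clustering for topic identification. A minimal sketch of that kind of pipeline, with a hypothetical function name and toy data (not the thesis's algorithm or dataset), might look like:

```python
import numpy as np

def topic_clusters(term_doc, k=2, n_dims=2, iters=20):
    """Hypothetical sketch of an Onto-SVD-style step: factorise a
    term-document matrix with SVD, keep the top singular dimensions,
    then cluster the documents in the reduced space with k-means."""
    U, s, Vt = np.linalg.svd(term_doc, full_matrices=False)
    docs = (np.diag(s[:n_dims]) @ Vt[:n_dims]).T  # documents in latent space
    # Deterministic initialisation: evenly spaced documents as centres
    centres = docs[np.linspace(0, len(docs) - 1, k).astype(int)].copy()
    for _ in range(iters):
        labels = ((docs[:, None] - centres) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = docs[labels == j].mean(axis=0)
    return labels

# Two obvious topics: documents 0-1 share terms, documents 2-3 share terms
M = np.array([[3, 2, 0, 0],
              [2, 3, 0, 0],
              [0, 0, 4, 3],
              [0, 0, 3, 4]], float)
labels = topic_clusters(M)
print(labels)  # [0 0 1 1]
```

In the thesis's method the feature dimensions would first be filtered through the user-oriented ontology before factorisation; that selection step is omitted here.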
144

Analysis of musical structures: an approach utilising monadic parser combinators

Anderson, Alasdair J. January 2011 (has links)
The work of this thesis seeks to further the use of computation in musical analysis. To a lesser extent, it is hoped that it will suggest a new angle on creating analytic elements through inference, and cast light on areas where analysis may be applied anew. Parsers for musical information are few in number; none has been implemented in a functional language, nor using monadic combinator techniques. Few analytic systems are capable of representing semantic ambiguity, or even consider it necessary to do so, and this is even more true of parsing systems. The work herein presented provides a system of monadic parsers, built on combinators, that is capable of delivering several different types and depths of results. Many computational-analytic systems are based on theories of similarity. The work presented here instead allows analytic structures to be created through inference, i.e. in the absence of known structures. This is believed to be the first instance of this type of structure generation in the field of music.
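The core idea of monadic parser combinators, including the representation of ambiguity the abstract emphasises, can be sketched in a few lines. Returning a list of (result, remainder) pairs is the classic list-monad encoding in which several surviving pairs are several valid parses; the grammar and names below are illustrative, not those of the thesis:

```python
class Parser:
    """A parser maps an input string to a list of (result, remainder)
    pairs; more than one pair means the input is ambiguous."""
    def __init__(self, run):
        self.run = run

    def bind(self, f):            # monadic >>= : sequence two parsers
        return Parser(lambda s: [pair
                                 for (a, rest) in self.run(s)
                                 for pair in f(a).run(rest)])

    def __or__(self, other):      # alternation keeps EVERY parse
        return Parser(lambda s: self.run(s) + other.run(s))

def unit(a):                      # monadic return: consume nothing
    return Parser(lambda s: [(a, s)])

def char(c):                      # match one literal character
    return Parser(lambda s: [(c, s[1:])] if s[:1] == c else [])

# An ambiguous grammar: "ab" parsed either as one token or as 'a' alone
ab = char('a').bind(lambda x: char('b').bind(lambda y: unit(x + y)))
token = ab | char('a').bind(lambda x: unit(x))
print(token.run("ab"))   # [('ab', ''), ('a', 'b')]
```

Both parses survive, which is exactly the property that lets an analytic system carry semantic ambiguity forward rather than committing to one reading.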
145

Modelling trauma hip fracture hospital activities

Voake, Cheryl January 2012 (has links)
Hip fracture is the most common reason for an elderly person to be admitted to an acute orthopaedic ward. The main aim of this research is to provide a statistical evaluation of a hip fracture database, and then to use Operational Research (OR) techniques, drawing on the statistical output, to model activities associated with the care of hip fracture patients. OR techniques employed in this thesis include simulation and queuing theory. This research focuses on hip fracture admissions to the University Hospital of Wales in Cardiff, with a primary aim of ascertaining whether the time between admission and surgical intervention has any impact upon patient outcome. Outcome is considered in terms of mortality, hospital length of stay and discharge destination. Statistical analyses are performed, via regression and CART analysis, to investigate length of stay and mortality variables. The results from these statistical tests are compiled, compared and investigated in more depth. Additionally, a principal component analysis is performed to investigate whether it would be feasible to reduce the dimensionality of the dataset, and subsequently principal component regression methodology is used to complement the output. Simulation is used to model activities in both the hip fracture ward and the trauma theatre. These models incorporate output from the statistical analysis and encompass complexities within the patient group and theatre process. The models are then used to test a number of 'what-if' type scenarios, including the future anticipated increase in demand. Finally, results from queuing theory are applied to the trauma theatre in order to determine a desired daily theatre allocation for these patients. Specifically, the M|G|1 queuing system and results from queues with vacations are utilised. The thesis concludes with some discussion of how this research could be further expanded; in particular, two areas are considered: risk scoring systems and the Fenton-Wilkinson approximation.
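The M|G|1 results mentioned above rest on the standard Pollaczek-Khinchine formula for the mean time spent waiting in the queue. A quick sketch with illustrative theatre numbers (not the thesis's data):

```python
def mg1_wait(lam, es, es2):
    """Pollaczek-Khinchine mean waiting time in an M|G|1 queue:
    W_q = lam * E[S^2] / (2 * (1 - rho)), where rho = lam * E[S].
    lam is the Poisson arrival rate, es = E[S] and es2 = E[S^2] are the
    first two moments of the general service-time distribution."""
    rho = lam * es
    if rho >= 1:
        raise ValueError("unstable queue: rho >= 1")
    return lam * es2 / (2 * (1 - rho))

# Illustrative only: deterministic half-hour operations (so E[S^2] = E[S]^2)
# and one arrival per hour on average -> rho = 0.5, W_q = 0.25 h
print(mg1_wait(lam=1.0, es=0.5, es2=0.25))  # 0.25
```

Queues with vacations extend this by adding a term for the server's absence periods, which is the natural model for a theatre that is only allocated to trauma for part of each day.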
146

An intelligent system for facility management

Dibley, Michael James January 2011 (has links)
A software system has been developed that monitors and interprets temporally changing (internal) building environments and generates related knowledge that can assist in facility management (FM) decision making. The use of the multi agent paradigm renders a system that delivers demonstrable rationality and is robust within the dynamic environment in which it operates. Agent behaviour directed at working toward goals is rendered intelligent with semantic web technologies. The capture of semantics through formal expression to model the environment adds a richness that the agents exploit to intelligently determine behaviours to satisfy goals that are flexible and adaptable. The agent goals are to generate knowledge about building space usage as well as environmental conditions by elaborating and combining near real time sensor data and information from conventional building models. Additionally, further inferences are facilitated, including those about wasted resources such as unnecessary lighting and heating. In contrast, current FM tools, lacking automatic synchronisation with the domain and rich semantic modelling, are limited to the simpler querying of manually maintained models.
147

Modelling of liquid droplet dynamics in a high DC magnetic field

Easter, Stuart January 2012 (has links)
The oscillating droplet technique is an experimental technique used to measure the surface tension and viscous damping coefficients of a liquid droplet. This technique has been the subject of much theoretical, numerical and experimental analysis, with a number of different external forces used to confine the droplet. These external forces are found to modify the oscillation frequency and damping rates, which need to be quantified in order for the measurement technique to be used. The dynamics of the droplet are three-dimensional, but previous numerical work has largely focused on axisymmetric cases. This work uses numerical techniques to extend the previous analysis to include the full three-dimensional effects. In this work a three-dimensional numerical model is designed, developed and applied to study the dynamics of a liquid droplet both in free space and with a high DC magnetic field used to balance gravitational forces. The numerical model is a grid point formulation of the pseudo-spectral collocation method discretised in a spherical coordinate system, with the implicit Euler method used to advance the solution in time. A coordinate transformation method is used to ensure the direct surface tracking required for modelling the surface shape oscillations. The study covers the laminar fluid flow regime within a droplet exhibiting translational and surface shape oscillations, providing a greater understanding of the physical behaviour of the droplet along with a qualitative and quantitative comparison with theoretical results. Initially a droplet oscillating in free space is considered, with a range of surface oscillation modes used to demonstrate the three-dimensional dynamics. Then the influence of electromagnetic forces on a diamagnetic droplet is studied, including the field from a solenoid magnet used to levitate the droplet. Finally the dynamics of an electrically conducting droplet in an external static magnetic field are modelled. In each case a number of methods are used to analyse the surface displacement in order to determine the surface tension and viscous damping coefficients. The numerical study of a freely oscillating droplet shows good agreement with the low order theoretical results for droplets in the limit of low viscosity. The high accuracy of the surface tracking method allows the non-linear effects of mode coupling and frequency shift with amplitude to be observed. There is good agreement with the theoretical values available for inviscid axisymmetric oscillations, and the numerical study provides the opportunity to determine these effects for three-dimensional viscous oscillations. The magnetic field from a solenoid is used to study the levitation of a diamagnetic droplet, and the oscillation frequencies of the droplet are compared with a theoretical model. The magnetic field is analysed, and the accuracy of the field calculation used when determining the modification to the oscillation frequencies is assessed against a theoretical model. The splitting of the frequency spectrum due to the magnetic field is also analysed. The theoretical model available for an electrically conducting droplet in a static magnetic field predicts different fluid flow within the droplet together with changes to the oscillation frequency and damping rate. These changes are compared qualitatively and quantitatively with the numerical model results, with good agreement.
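For reference, the low-viscosity theoretical results that such numerical studies are usually compared against are Rayleigh's inviscid frequency and Lamb's damping rate for surface mode l. The droplet parameters below are illustrative assumptions (a 1 mm water droplet), not values from the thesis:

```python
import math

def rayleigh_freq(l, sigma, rho, R):
    """Rayleigh's inviscid frequency (rad/s) of surface mode l:
    omega_l = sqrt(l*(l-1)*(l+2)*sigma / (rho * R**3))."""
    return math.sqrt(l * (l - 1) * (l + 2) * sigma / (rho * R**3))

def lamb_damping(l, nu, R):
    """Lamb's low-viscosity damping rate of mode l: (l-1)*(2l+1)*nu / R**2."""
    return (l - 1) * (2 * l + 1) * nu / R**2

# Illustrative values: water (sigma = 0.072 N/m, rho = 1000 kg/m^3,
# nu = 1e-6 m^2/s), radius R = 1 mm, fundamental l = 2 mode
sigma, rho, nu, R = 0.072, 1000.0, 1.0e-6, 1.0e-3
print(rayleigh_freq(2, sigma, rho, R))  # ~759 rad/s
print(lamb_damping(2, nu, R))           # ~5 s^-1
```

The thesis's contribution is quantifying how three-dimensional, viscous and magnetic-field effects shift these baseline frequencies and damping rates.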
148

Investigation of a teleo-reactive approach for the development of autonomic manager systems

Hawthorne, James January 2013 (has links)
As the demand for more capable and more feature-rich software increases, the complexity of design, implementation and maintenance also increases exponentially. This becomes a problem when the complexity prevents developers from writing, improving, fixing or otherwise maintaining software to meet specified demands whilst still reaching an acceptable level of robustness. When complexity becomes too great, the software becomes impossible to manage effectively by even large teams of people. One way to address the problem is an autonomic approach to software development. Autonomic software aims to tackle complexity by allowing the software to manage itself, thus reducing the need for human intervention and allowing it to reach a maintainable state. Many techniques have been investigated for the development of autonomic systems, including policy-based designs, utility functions and advanced architectures. A unique approach to the problem is the teleo-reactive programming paradigm. This paradigm offers a robust and simple structure on which to develop systems. It allows the developer the freedom to express their intentions in a logical manner whilst the increased robustness reduces the maintenance cost. Teleo-reactive programming is an established solution to low-level agent based problems such as robot navigation and obstacle avoidance, but this technique shows behaviour which is consistent with higher-level autonomic solutions. This project therefore investigates the extent of the applicability of teleo-reactive programming as an autonomic solution. Can the technique be adapted to provide a better 'fitness for purpose' for autonomics whilst causing minimal changes to the tried and tested original structure and meaning? Does the technique introduce any additional problems, and can these be addressed with improvements to the teleo-reactive framework? Teleo-reactive programming is an interesting approach to autonomic computing because a teleo-reactive program's state is not predetermined at any moment in time: rules execute according to a priority ordering and the current environmental context (i.e. not in a strict procedural sequence) whilst still aiming at the intended goal. This method has been shown to be very robust and exhibits some of the qualities of autonomic software.
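The teleo-reactive execution model described above (an ordered rule list in which, on every cycle, the first rule whose condition holds in the current environment fires) can be sketched in a few lines. The rules and percepts here are hypothetical robot-navigation examples, not from the thesis:

```python
def tr_step(rules, percepts):
    """One cycle of a teleo-reactive program: scan the ordered rule list
    and fire the action of the FIRST rule whose condition holds in the
    current environment. Rules nearer the top are nearer the goal."""
    for condition, action in rules:
        if condition(percepts):
            return action
    raise RuntimeError("no rule applicable (the last rule should be a catch-all)")

# Hypothetical program: conditions nearest goal satisfaction come first
rules = [
    (lambda p: p["at_goal"],      "stop"),
    (lambda p: p["goal_visible"], "move_forward"),
    (lambda p: True,              "rotate"),       # catch-all
]
print(tr_step(rules, {"at_goal": False, "goal_visible": True}))   # move_forward
print(tr_step(rules, {"at_goal": False, "goal_visible": False}))  # rotate
```

Because the fired action is re-selected from the environment on every cycle, the program recovers automatically when the world changes underneath it, which is the self-managing behaviour the thesis relates to autonomic computing.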
149

A framework for social BPM based on social tagging

Rangiha, M. E. January 2016 (has links)
Traditional Business Process Management (BPM) has a number of limitations. The first is the typical separation between process design and execution, which often causes discrepancies between the processes as they are designed and the way in which they are actually executed. Additionally, because of this separation, valuable first-hand knowledge generated during process execution may remain unused during process design and may be prevented from being shared within the organisation. Social BPM, which advocates integrating social software into the BPM lifecycle, has emerged as an answer to such limitations. Although there have been a number of approaches to Social BPM, they have not been able to address all the issues of traditional BPM. This thesis proposes a novel Social BPM framework in which social tagging is used to capture process knowledge emerging during the enactment and design of the processes. Process knowledge concerns both the type of activities chosen to fulfil a certain goal (i.e. what needs doing) and the skills and experience of users in executing specific tasks (i.e. the skills needed to do it). Such knowledge is exploited by recommendation tools to support the design and enactment of future process instances. This framework overcomes the limitations of traditional BPM systems, as it removes the barrier between the design and execution of the processes and also enables all users to be part of the different phases of the BPM lifecycle. We first provide an analysis of the literature to position our research area, and then we provide an overview of our framework, discussing its specification and introducing a static conceptual model of its main entities. This framework is then elaborated further with a more dynamic model of the behaviour and, in particular, of the role and task recommendations, which are supported by social tagging. These mechanisms are then applied in a running example. Finally, the framework is evaluated through the implementation of a prototype and its application in a case study. The thesis ends with a discussion of the different evaluation approaches for the proposed framework, the limitations of our framework, and future research.
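One simple way a tag-based role recommendation of the kind described could work is to rank users by the overlap between a task's tags and the tags accumulated from each user's previously executed tasks. The similarity measure, names and data below are illustrative assumptions, not the framework's actual mechanism:

```python
def jaccard(a, b):
    """Jaccard overlap between two tag sets: |a & b| / |a | b|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend_performer(task_tags, user_tags):
    """Hypothetical sketch: suggest the user whose accumulated tags
    best overlap the tags attached to the new task."""
    return max(user_tags, key=lambda u: jaccard(task_tags, user_tags[u]))

# Tags accumulated by each user from tasks they executed earlier
users = {
    "ann": {"invoice", "approval", "finance"},
    "bob": {"design", "ui", "review"},
}
print(recommend_performer({"finance", "invoice"}, users))  # ann
```

In the framework proper, the same tag store would also drive task recommendations, so design-time knowledge grows directly out of execution-time behaviour.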
150

Development of a structured method for knowledge elicitation

Swaffield, Gail January 1990 (has links)
The subject of this thesis is, broadly, knowledge elicitation for knowledge-based, or expert, systems. The aims of the research were to investigate the transference of techniques of systems analysis to the knowledge elicitation process and, in so doing, to develop a structured method for knowledge elicitation. The main contributions made by the research to the area of knowledge elicitation are: i) The development of a method which has at its centre the definition of an explicitness boundary, across which and within which all data and processes must be explicit. It is argued that in order to be explicit, the data must be in the form of limited data sets as opposed to continuous data. ii) The development of a method which forces the use of an intermediate representation, thus forcing a logical/physical design split, as in systems analysis for conventional data processing systems. iii) A concern for user independence in the resulting systems. The ability to increase user independence is enhanced by the use of limited data sets, and also by the involvement of designated users of the expert system, and testing of the intermediate representation, during knowledge elicitation. The starting point of the research is the lack of methods for knowledge elicitation and the pitfalls of existing techniques. Many of the techniques that have emerged from other disciplines such as cognitive psychology are discussed with respect to the concerns of this thesis and the proposed method. The specific techniques from systems analysis applied to knowledge elicitation are data flow analysis, entity-relationship analysis and entity life cycle modelling. These three techniques form the framework of the method, which starts with a high-level analysis of the domain and results in an implementation-independent representation of the expert domain, equivalent to a logical model in systems analysis and design. The final part of the thesis shows the ease with which the resulting model is translated to two of the most commonly used knowledge representation schemes: production systems and frames.
