About: The Global ETD Search service is a free service for researchers to find electronic theses and dissertations, provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Its metadata is collected from universities around the world.
21

An open source HPC-enabled model of cardiac defibrillation of the human heart

Bernabeu Llinares, Miguel Oscar January 2011 (has links)
Sudden cardiac death following cardiac arrest is a major killer in the industrialised world. The leading causes of sudden cardiac death are disturbances in the normal electrical activation of cardiac tissue, known as cardiac arrhythmias, which severely compromise the ability of the heart to fulfil the body's demand for oxygen. Ventricular fibrillation (VF) is the most deadly form of cardiac arrhythmia, and electrical defibrillation through the application of strong electric shocks to the heart is the only effective therapy against it. Over the past decades, a large body of research has dealt with the study of the mechanisms underpinning the success or failure of defibrillation shocks. The main mechanism of shock failure involves shocks terminating VF but leaving an electrical substrate suitable for new VF episodes to follow rapidly (i.e. shock-induced arrhythmogenesis). A large number of models have been developed for the in silico study of shock-induced arrhythmogenesis, ranging from single-cell models to three-dimensional ventricular models of small mammalian species. However, the results of these studies have not been extrapolated to human models of ventricular electrophysiology. The main reason is the large computational cost of solving the bidomain equations of cardiac electrophysiology over large anatomically accurate geometrical models that include representations of fibre orientation and transmembrane kinetics. In this thesis we develop simulation technology for the study of cardiac defibrillation in the human heart in the framework of the open source simulation environment Chaste. The advances include the development of novel computational and numerical techniques for the solution of the bidomain equations on large-scale high performance computing resources.
More specifically, we have considered the implementation of effective domain decomposition, the development of new numerical techniques for the reduction of communication in Chaste's finite element method (FEM) solver, and the development of mesh-independent preconditioners for the solution of the linear system arising from the FEM discretisation of the bidomain equations. The developments presented in this thesis have brought Chaste to the level of performance and functionality required to perform bidomain simulations with large three-dimensional cardiac geometries made of tens of millions of nodes, including accurate representation of fibre orientation and membrane kinetics. These advances have enabled the in silico study of shock-induced arrhythmogenesis for the first time in the human heart, thereby bridging an important gap in the field of cardiac defibrillation research.
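For reference, the bidomain model mentioned in this abstract couples the transmembrane potential and the extracellular potential through two coupled PDEs; a standard statement (whose notation may differ from the thesis's own) is:

```latex
% Bidomain equations: \chi is the surface-to-volume ratio, C_m the membrane
% capacitance, \sigma_i / \sigma_e the intra- and extracellular conductivity
% tensors, and I_ion the transmembrane ionic current from the cell model.
\chi \left( C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m, \mathbf{u}) \right)
  = \nabla \cdot \left( \sigma_i \, \nabla (V_m + \phi_e) \right),
\qquad
\nabla \cdot \left( \sigma_i \, \nabla V_m \right)
  + \nabla \cdot \left( (\sigma_i + \sigma_e) \, \nabla \phi_e \right) = 0 .
```

The second, elliptic equation must be solved at every time step, which on meshes of tens of millions of nodes is what motivates the preconditioner and communication-reduction work described above.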
22

Structural modelling of transmembrane domains

Kelm, Sebastian January 2011 (has links)
Membrane proteins represent about one third of all known vertebrate proteins and over half of current drug targets. Knowledge of their three-dimensional (3D) structure is worth millions of pounds to the pharmaceutical industry, yet experimental structure elucidation of membrane proteins is a slow and expensive process. In the absence of experimental data, computational modelling tools can be used to close the gap between the numbers of known protein sequences and structures. However, currently available structure prediction tools were developed with globular soluble proteins in mind and perform poorly on membrane proteins. This thesis describes the development of a modelling approach able to accurately predict the structure of transmembrane domains of proteins: a template-based modelling framework built especially for membrane proteins, which uses membrane protein-specific information to inform the modelling process. First, we develop a tool to accurately determine a given membrane protein structure's orientation within the membrane. We then offer an analysis of the preferred substitution patterns within the membrane, as opposed to non-membrane environments, and of how these differences influence the structures observed. This information is used to build a set of tools that produce better sequence alignments of membrane proteins than previously available methods, as well as more accurate predictions of their 3D structures. Each chapter describes one new piece of software or information and uses the tools and knowledge described in previous chapters to build up to a complete, accurate model of a transmembrane domain.
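To illustrate the kind of membrane-specific signal such tools exploit, here is a minimal sketch of a classic Kyte-Doolittle hydropathy sliding window that flags likely transmembrane stretches in a sequence. This shows the general principle only; it is not the orientation method developed in the thesis, and the window width and threshold are conventional defaults, not values taken from it.

```python
# Kyte-Doolittle hydropathy values per amino acid (standard published scale).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def hydropathy_windows(seq: str, width: int = 19, threshold: float = 1.6):
    """Return start indices of windows whose mean hydropathy exceeds the
    threshold, a classic heuristic for candidate transmembrane helices."""
    hits = []
    for i in range(len(seq) - width + 1):
        window = seq[i:i + width]
        if sum(KD[a] for a in window) / width > threshold:
            hits.append(i)
    return hits

tm_like = "L" * 10 + "I" * 9      # strongly hydrophobic, helix-length stretch
soluble = "DEKR" * 5              # charged, soluble-like stretch
print(hydropathy_windows(tm_like))   # [0]
```

Real predictors refine this signal with topology constraints and evolutionary information, which is where membrane-specific substitution patterns of the kind analysed in the thesis come in.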
23

A decision support system for the reading of ancient documents

Roued-Cunliffe, Henriette January 2011 (has links)
The research presented in this thesis is based in the Humanities discipline of Ancient History and begins by attempting to understand the interpretation process involved in reading ancient documents and how this process can be aided by computer systems such as Decision Support Systems (DSS). The thesis strikes a balance between using IT tools to aid Humanities research and the understanding that Humanities research must involve human beings. It does not attempt to develop a system that can automate the reading of ancient documents. Instead it seeks to demonstrate and develop tools that can support this process in five areas: remembering complex reasoning, searching huge datasets, international collaboration, publishing editions, and image enhancement. This research contains a large practical element involving the development of a DSS prototype. The prototype is used to illustrate how a DSS, by remembering complex reasoning, can aid the interpretation process involved in reading ancient documents. It is based on the idea that the interpretation process moves through a network of interpretation. The network of interpretation illustrates a recursive process in which scholars move between reading levels such as ‘these strokes look like the letter c’ or ‘these five letters must be the word primo’. Furthermore, the thesis demonstrates how technology such as Web Services and XML can be used to make a DSS even more powerful, through the development of the APPELLO word search Web Service. Finally, the conclusion includes a suggestion for the future development of a working DSS that incorporates the idea of a layer-based system and focuses strongly on user interaction.
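One way to picture the "network of interpretation" described above is as a recursive data structure in which each reading rests on lower-level readings, so a scholar's chain of reasoning can be recorded and revisited. The sketch below is purely illustrative; the class names, fields, and confidence arithmetic are invented here and are not taken from the thesis prototype.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Interpretation:
    level: str                       # e.g. "stroke", "letter", "word"
    reading: str                     # e.g. "c", "primo"
    confidence: float                # scholar's stated confidence, 0..1
    based_on: List["Interpretation"] = field(default_factory=list)

    def support(self) -> float:
        """Combined confidence: own confidence times the weakest support,
        so a word reading is never stronger than its shakiest letter."""
        if not self.based_on:
            return self.confidence
        return self.confidence * min(i.support() for i in self.based_on)

# "These strokes look like the letter c", resting on two stroke readings:
s1 = Interpretation("stroke", "curved stroke", 0.9)
s2 = Interpretation("stroke", "short tail", 0.8)
letter_c = Interpretation("letter", "c", 0.7, based_on=[s1, s2])
print(round(letter_c.support(), 3))   # 0.7 * 0.8 = 0.56
```

The recursive structure mirrors the thesis's point that scholars move between reading levels rather than proceeding strictly bottom-up.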
24

Kompetence a jejich rozvoj v novém skautském výchovném programu / Competences and their development in the new scout educational program

Kuberová, Kateřina January 2011 (has links)
The thesis deals with the development of key competences through Scout training. It focuses on the transformation of the educational program of Junák - Association of Scouts and Guides of the Czech Republic. The thesis is divided into two parts, theoretical and practical. The first part describes the history, mission and ideals of Scouting. It deals with the development of Scouting in Czechoslovakia (later the Czech Republic) after 1990 and also explains how Junák is perceived by the public, drawing on studies conducted in 2003. The thesis compares the historical and current educational documents that are used to support the development of children's and young people's personalities, and describes the initial impulses that led to the program's modification. The system of competences, which is essential for the new program, is also developed in the thesis. The practical part contains instructions and suggestions for the effective development of skills and deals with the further education of scout leaders. The work is accompanied by attachments that contain examples of previous and modern materials and a table summarizing the development of competences.
25

Modelling embankment breaching due to overflow

van Damme, Myron January 2014 (has links)
Correct modelling of embankment breach formation is essential for an accurate assessment of the associated flood risk. Modelling breach formation due to overflow requires a thorough understanding of the geotechnical processes in unsaturated soils as well as of erosion processes under supercritical flow conditions. This thesis describes a 1D slope stability analysis performed for unsaturated soils whose moisture content changes with time. The analysis shows that sediment-laden gravity flows play an important role in the erosion behaviour of embankments. The thesis also describes a practical, fast breach model based on a simplified description of the physical processes, which can be used in modelling and decision support frameworks for flooding. To predict the breach hydrograph, the rapid model distinguishes between breach formation due to headcut erosion and due to surface erosion in the case of failure by overflow; the model also predicts the breach hydrograph in the case of failure due to piping. The assumptions with respect to breach flow modelling are reviewed, resulting in a new set of breadth-integrated Navier-Stokes equations that account for wall shear stresses and a variable breadth geometry. The vertical 2D flow field described by the equations can be used to accurately calculate the stresses on the embankment during the early stages of breach formation. Pressure-correction methods are given for solving the 2D Navier-Stokes equations for a variable breadth, and good agreement is found when validating the flow model against analytical solutions.
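Pressure-correction methods of the kind mentioned above are typically projection schemes. A minimal sketch of one classical (Chorin-type) variant, with density absorbed into the pressure, follows; the thesis's own variable-breadth scheme will differ in its details:

```latex
% Chorin-type projection: advance an intermediate velocity ignoring pressure,
% then project onto the divergence-free space via a pressure Poisson solve.
\mathbf{u}^{*} = \mathbf{u}^{n}
  + \Delta t \left( - (\mathbf{u}^{n} \cdot \nabla)\mathbf{u}^{n}
  + \nu \nabla^{2} \mathbf{u}^{n} \right),
\qquad
\nabla^{2} p^{\,n+1} = \frac{1}{\Delta t} \, \nabla \cdot \mathbf{u}^{*},
\qquad
\mathbf{u}^{n+1} = \mathbf{u}^{*} - \Delta t \, \nabla p^{\,n+1}.
```

The Poisson solve enforces \(\nabla \cdot \mathbf{u}^{n+1} = 0\), which is what makes such schemes attractive for breach flow where the geometry (here, the breadth) evolves between steps.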
26

Towards a more versatile dynamic-music for video games : approaches to compositional considerations and techniques for continuous music

Davies, Huw January 2015 (has links)
This study contributes to practical discussions on the composition of dynamic music for video games from the composer’s perspective. Creating greater levels of immersion in players is used as a justification for the proposals of the thesis. It lays down foundational aesthetic elements in order to proceed with a logical methodology. The aim of this thesis is to build upon, and further hybridise, two techniques used by composers and by video game designers to further increase the reactive agility and memorability of the music for the player. Each chapter explores a different technique for joining two (possibly disparate) types of gameplay, or gamestates, with appropriate continuous music. In each, I discuss a particular musical engine capable of implementing continuous music. Chapter One discusses a branching-music engine, which uses a precomposed musical mosaic (or musical pixels) to create a linear score with the potential to diverge at appropriate moments to accompany onscreen action. I use the case study of the Final Fantasy battle system to show how the implementation of a branching-music engine could help maintain a continuity of gameplay experience that the disjointed scores found in many current games disrupt. To aid this argument I have implemented a branching-music engine, using the graphical object-oriented programming environment MaxMSP, in the style of the battle music composed by Nobuo Uematsu, the composer of the early Final Fantasy series. The reader can find this in the accompanying demonstrations patch. In Chapter Two I consider how a generative-music engine can likewise implement continuous music while addressing some of the limitations of the branching-music engine. Further, I describe a technique for effective generative video-game music that creates musical ‘personalities’ able to mimic a particular style of music for a limited period of time.
Crucially, this engine is able to transition between any two personalities to create musical coincidence with the game. GMGEn (Game Music Generation Engine) is a program I have created in MaxMSP to act as an example of this concept. GMGEn is available in the Demonstrations_Application. Chapter Three discusses potential limitations of the branching-music engine described in Chapter One and the generative-music engine described in Chapter Two, and highlights how these issues can be solved by way of a third engine, which hybridises both. As this engine has an indeterminate musical state, it is termed the intermittent-music engine. I go on to discuss the implementation of this engine in two different game scenarios and how emergent structures of this music will appear. The final outcome is a new compositional approach to delivering dynamic music that accompanies the onscreen action with greater agility than currently present in the field, increasing the memorability and therefore the immersive effect of video-game music.
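The core logic of a branching-music engine of the kind described in Chapter One can be sketched in a few lines: precomposed segments (the "musical pixels") play in sequence, and at each segment boundary the engine may branch, optionally through a bridge segment, when the gamestate changes. The segment names, gamestates, and bridge table below are invented for illustration and do not come from the accompanying MaxMSP patch.

```python
import random

# Pools of precomposed segments per gamestate (illustrative names).
SEGMENTS = {
    "explore": ["exp_a", "exp_b", "exp_c"],
    "battle":  ["bat_a", "bat_b"],
}

# Bridge segments smooth the join when the gamestate changes.
BRIDGES = {("explore", "battle"): "bridge_to_battle",
           ("battle", "explore"): "bridge_to_calm"}

def next_segment(current_state: str, new_state: str) -> str:
    """Pick the next segment; insert a bridge at a state change if one exists."""
    if new_state != current_state and (current_state, new_state) in BRIDGES:
        return BRIDGES[(current_state, new_state)]
    return random.choice(SEGMENTS[new_state])

# Simulate a short play session: branching happens only at segment boundaries.
playlist = []
state = "explore"
for event in ["explore", "explore", "battle", "battle", "explore"]:
    playlist.append(next_segment(state, event))
    state = event
```

Branching only at boundaries is what keeps the score linear and memorable; the generative and intermittent engines discussed later trade some of that memorability for finer-grained reactivity.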
27

Integration of rationale management with multi-criteria decision analysis, probabilistic forecasting and semantics : application to the UK energy sector

Hunt, Julian David January 2013 (has links)
This thesis presents a new integrated tool and decision support framework for approaching complex problems resulting from the interaction of many multi-criteria issues. The framework is embedded in an integrated tool called OUTDO (Oxford University Tool for Decision Organisation). OUTDO integrates Multi-Criteria Decision Analysis (MCDA), decision rationale management with a modified Issue-Based Information Systems (IBIS) representation, and probabilistic forecasting to effectively capture the essential reasons why decisions are made and to dynamically re-use the rationale. In doing so, it allows exploration of how changes in external parameters affect complicated and uncertain decision making processes in the present and in the future. Once the decision maker constructs his or her own decision process, OUTDO checks whether the process is consistent and coherent and looks for possible ways to improve it using three new semantic-based decision support approaches. To this end, two ontologies (the Decision Ontology and the Energy Ontology) were integrated into OUTDO to provide it with these semantic capabilities. The Decision Ontology keeps a record of the decision rationale extracted from OUTDO, and the Energy Ontology describes the energy generation domain, focusing on the water requirements of thermoelectric power plants. A case study, with the objective of recommending electricity generation and steam condensation technologies for ten different regions in the UK, is used to verify OUTDO’s features and reach conclusions about the overall work.
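At its simplest, the MCDA component of such a framework scores each candidate option as a weighted sum over criteria. The sketch below illustrates only this generic weighted-sum step; the criteria, weights, and normalised scores are invented for illustration and are not OUTDO's actual data or method.

```python
# Criterion weights (must sum to 1); purely illustrative values.
CRITERIA = {"cost": 0.4, "co2": 0.35, "water_use": 0.25}

# Normalised scores per criterion (1 = best, 0 = worst); invented numbers.
OPTIONS = {
    "ccgt":    {"cost": 0.8, "co2": 0.4, "water_use": 0.6},
    "nuclear": {"cost": 0.5, "co2": 0.9, "water_use": 0.3},
    "wind":    {"cost": 0.6, "co2": 1.0, "water_use": 1.0},
}

def score(option: dict) -> float:
    """Weighted-sum aggregate across all criteria."""
    return sum(CRITERIA[c] * option[c] for c in CRITERIA)

ranking = sorted(OPTIONS, key=lambda name: score(OPTIONS[name]), reverse=True)
print(ranking)   # ['wind', 'ccgt', 'nuclear']
```

The point of coupling this with rationale management, as the thesis does, is that the weights and scores embody reasons that can be recorded, challenged, and re-used when external parameters change.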
28

A high order Discontinuous Galerkin - Fourier incompressible 3D Navier-Stokes solver with rotating sliding meshes for simulating cross-flow turbines

Ferrer, Esteban January 2012 (has links)
This thesis details the development, verification and validation of an unsteady unstructured high order (≥ 3) h/p Discontinuous Galerkin - Fourier solver for the incompressible Navier-Stokes equations on static and rotating meshes in two and three dimensions. This general purpose solver is used to provide insight into cross-flow (wind or tidal) turbine physical phenomena. Simulation of this type of turbine for renewable energy generation needs to account for the rotational motion of the blades with respect to the fixed environment. This rotational motion implies azimuthal changes in blade aero/hydro-dynamics that result in complex flow phenomena such as stalled flows, vortex shedding and blade-vortex interactions. Simulation of these flow features necessitates the use of a high order code exhibiting low numerical errors. This thesis presents the development of such a high order solver, which has been conceived and implemented from scratch by the author during his doctoral work. To account for the relative mesh motion, the incompressible Navier-Stokes equations are written in arbitrary Lagrangian-Eulerian form and a non-conformal Discontinuous Galerkin (DG) formulation (i.e. Symmetric Interior Penalty Galerkin) is used for spatial discretisation. The DG method, together with a novel sliding mesh technique, allows direct linking of rotating and static meshes through the numerical fluxes. This technique shows spectral accuracy and no degradation of temporal convergence rates if rotational motion is applied to a region of the mesh. In addition, analytical mappings are introduced to account for curved external boundaries representing circular shapes and NACA foils. To simulate 3D flows, the 2D DG solver is parallelised and extended using Fourier series. This extension allows for laminar and turbulent regimes to be simulated through Direct Numerical Simulation and Large Eddy Simulation (LES) type approaches. Two LES methodologies are proposed. 
Various 2D and 3D cases are presented for laminar and turbulent regimes. Among others, solutions are presented for Stokes flows, the Taylor vortex problem, flows around square and circular cylinders, flows around static and rotating NACA foils, and flows through rotating cross-flow turbines.
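The arbitrary Lagrangian-Eulerian form referred to in this abstract replaces the convective velocity with the velocity relative to the moving mesh. Schematically, for the incompressible equations (notation may differ from the solver's own formulation):

```latex
% Incompressible Navier-Stokes in ALE form: \mathbf{w} is the prescribed mesh
% velocity, so on the rotating region the convective term sees only the
% velocity relative to the mesh; the time derivative is taken at fixed
% reference coordinates \chi.
\left. \frac{\partial \mathbf{u}}{\partial t} \right|_{\chi}
  + \left( (\mathbf{u} - \mathbf{w}) \cdot \nabla \right) \mathbf{u}
  = - \nabla p + \nu \, \nabla^{2} \mathbf{u},
\qquad
\nabla \cdot \mathbf{u} = 0 .
```

On the static part of the mesh \(\mathbf{w} = 0\) and the standard Eulerian equations are recovered, which is why the sliding-mesh interface only has to reconcile the numerical fluxes between the two regions.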
29

Medical relevance and functional consequences of protein truncating variants

Rivas Cruz, Manuel A. January 2015 (has links)
Genome-wide association studies have greatly improved our understanding of the contribution of common variants to the genetic architecture of complex traits. However, two major limitations have been highlighted. First, common variant associations typically do not identify the causal variant and/or the gene through which it exerts its effect on a trait. Second, common variant associations usually consist of variants with small effects. As a consequence, it is more challenging to harness their translational impact. Association studies of rare variants and complex traits may be able to help address these limitations. Empirical population genetic data show that deleterious variants are rare. More specifically, there is a very strong depletion of common protein truncating variants (PTVs, commonly referred to as loss-of-function variants) in the genome, a group of variants that have been shown to have large effects on gene function, are enriched for severe disease-causing mutations, but in other instances may actually be protective against disease. This thesis is divided into three parts dedicated to the study of protein truncating variants, their medical relevance, and their functional consequences. First, I present statistical, bioinformatic, and computational methods developed for the study of protein truncating variants, their association with complex traits, and their functional consequences. Second, I present applications of the methods to a number of case-control and quantitative trait studies, discovering new variants and genes associated with breast and ovarian cancer, type 1 diabetes, lipids, and metabolic traits measured with NMR spectroscopy. Third, I present work on improving the annotation of protein truncating variants by studying their functional consequences. Taken together, these results highlight the utility of interrogating protein truncating variants in medical and functional genomic studies.
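Because individual PTVs are rare, association methods commonly collapse all PTVs in a gene into a single carrier indicator before testing, a so-called burden or collapsing test. The sketch below shows only this generic collapsing step with a Haldane-corrected odds ratio on toy genotypes; the thesis's own statistical methods are more sophisticated, and the data here are invented.

```python
def burden_table(case_genotypes, control_genotypes):
    """Collapse per-gene PTV genotypes (0/1/2 alternate-allele counts per
    person) into carrier / non-carrier counts for a 2x2 table."""
    def carriers(genos):
        return sum(1 for g in genos if g > 0)
    a = carriers(case_genotypes)
    b = len(case_genotypes) - a
    c = carriers(control_genotypes)
    d = len(control_genotypes) - c
    return a, b, c, d

def odds_ratio(a, b, c, d, pseudo=0.5):
    """Odds ratio for carrier status in cases vs controls, with a Haldane
    continuity correction so zero cells do not blow up."""
    return ((a + pseudo) * (d + pseudo)) / ((b + pseudo) * (c + pseudo))

cases    = [0, 1, 0, 2, 1, 0, 0, 1]   # 4 carriers out of 8 (toy data)
controls = [0, 0, 1, 0, 0, 0, 0, 0]   # 1 carrier out of 8 (toy data)
a, b, c, d = burden_table(cases, controls)
print(a, b, c, d, odds_ratio(a, b, c, d))
```

Collapsing trades variant-level resolution for power, which is exactly the bargain that makes rare, large-effect PTVs tractable in case-control studies of the sizes mentioned above.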
30

Exploiting whole-PDB analysis in novel bioinformatics applications

Ramraj, Varun January 2014 (has links)
The Protein Data Bank (PDB) is the definitive electronic repository for experimentally-derived protein structures, composed mainly of those determined by X-ray crystallography. Approximately 200 new structures are added weekly to the PDB, and at the time of writing, it contains approximately 97,000 structures. This represents an expanding wealth of high-quality information, yet few bioinformatics tools consider and analyse these data as an ensemble. This thesis explores the development of three efficient, fast algorithms and software implementations to study protein structure using the entire PDB. The first project is a crystal-form matching tool that takes a unit cell and quickly (< 1 second) retrieves the most related matches from the PDB. The unit cell matches are combined with sequence alignments using a novel Family Clustering Algorithm to display the results in a user-friendly way. The software tool, Nearest-cell, has been incorporated into the X-ray data collection pipeline at the Diamond Light Source, and is also available as a public web service. The bulk of the thesis is devoted to the study and prediction of protein disorder. An initial attempt to update and extend an existing predictor, RONN, exposed the method's limitations, and a novel predictor (called MoreRONN) was developed that incorporates a novel sequence-based clustering approach to disorder data inferred from the PDB and DisProt. MoreRONN is now clearly the best-in-class disorder predictor and will soon be offered as a public web service. The third project explores the development of a clustering algorithm for protein structural fragments that can work on the scale of the whole PDB. While protein structures have long been clustered into loose families, there has to date been no comprehensive analytical clustering of short (~6 residue) fragments.
A novel fragment clustering tool was built that is now leading to a public database of fragment families and representative structural fragments that should prove extremely helpful for both basic understanding and experimentation. Together, these three projects exemplify how cutting-edge computational approaches applied to extensive protein structure libraries can provide user-friendly tools that address critical everyday issues for structural biologists.
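The crystal-form matching idea behind a tool like Nearest-cell can be illustrated by ranking stored unit cells by a simple distance over the six cell parameters (a, b, c in Å; alpha, beta, gamma in degrees). The metric and the reference cells below are invented for illustration; the real tool's similarity measure, symmetry handling, and search strategy are more involved.

```python
def cell_distance(q, r):
    """Relative difference on the three edge lengths plus a scaled absolute
    difference on the three angles (both tuples: a, b, c, alpha, beta, gamma)."""
    edges = sum(abs(q[i] - r[i]) / r[i] for i in range(3))
    angles = sum(abs(q[i] - r[i]) / 90.0 for i in range(3, 6))
    return edges + angles

# A tiny stand-in for a whole-PDB index of unit cells (invented entries).
REFERENCE = {
    "1abc": (78.0, 78.0, 37.0, 90.0, 90.0, 90.0),
    "2xyz": (52.5, 60.1, 88.9, 90.0, 101.3, 90.0),
}

def nearest_cell(query):
    """Return the reference entry whose unit cell is closest to the query."""
    return min(REFERENCE, key=lambda pdb_id: cell_distance(query, REFERENCE[pdb_id]))

match = nearest_cell((78.2, 77.9, 37.1, 90.0, 90.0, 90.0))
print(match)   # 1abc
```

Scaling this to the whole PDB is mostly an indexing problem, which is what makes the sub-second lookup quoted in the abstract notable.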
