
Cell fate mechanisms in colorectal cancer

Kay, Sophie Kate January 2014
Colorectal cancer (CRC) arises in part from the dysregulation of cellular proliferation, associated with the canonical Wnt pathway, and differentiation, effected by the Notch signalling network. In this thesis, we develop a mathematical model of ordinary differential equations (ODEs) for the coupled interaction of the Notch and Wnt pathways in cells of the human intestinal epithelium. Our central aim is to understand the role of such crosstalk in the genesis and treatment of CRC. An embedding of this model in cells of a simulated colonic tissue enables computational exploration of the cell fate response to spatially inhomogeneous growth cues in the healthy intestinal epithelium. We also examine an alternative, rule-based model from the literature, which employs a simple binary approach to pathway activity, in which the Notch and Wnt pathways are constitutively on or off. Comparison of the two models demonstrates the substantial advantages of the equation-based paradigm, through its delivery of stable and robust cell fate patterning, and its versatility for exploring the multiscale consequences of a variety of subcellular phenomena. Extension of the ODE-based model to include mutant cells facilitates the study of Notch-mediated therapeutic approaches to CRC. We find a marked synergy between the application of γ-secretase inhibitors and Hath1 stabilisers in the treatment of early-stage intestinal polyps. This combined treatment is an efficient means of inducing mitotic arrest in the cell population of the intestinal epithelium through enforced conversion to a secretory phenotype and is highlighted as a viable route for further theoretical, experimental and clinical study.
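The abstract does not reproduce the model equations; as a rough, self-contained illustration of the equation-based paradigm it advocates, the toy ODE pair below couples two pathway activities through mutual inhibition, a common motif in Notch-Wnt crosstalk models. All parameter values and the forward-Euler integrator are illustrative choices, not taken from the thesis.

```python
def simulate(n0, w0, dt=0.01, steps=5000):
    """Forward-Euler integration of a toy mutual-inhibition model:

        dN/dt = a / (1 + W^h) - k * N   (Notch-like activity, repressed by Wnt)
        dW/dt = b / (1 + N^h) - k * W   (Wnt-like activity, repressed by Notch)

    Parameters are hypothetical; they are chosen so the symmetric state is
    unstable, giving two stable 'fates' (Notch-high or Wnt-high).
    """
    a, b, h, k = 2.0, 2.0, 4, 1.0
    n, w = n0, w0
    for _ in range(steps):
        dn = a / (1 + w**h) - k * n
        dw = b / (1 + n**h) - k * w
        n, w = n + dt * dn, w + dt * dw
    return n, w
```

Starting the system with opposite initial biases drives it to opposite stable states, a minimal cartoon of how neighbouring cells adopt distinct fates.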

Analysis of 3D echocardiography

Chykeyuk, Kiryl January 2014
Heart disease is the major cause of death in the developed world. Because it offers a fast, portable, low-cost and harmless way of imaging the heart, echocardiography has become the most frequent tool for diagnosing cardiac function in clinical routine. However, visual assessment of heart function from echocardiography is challenging, highly operator-dependent and subject to intra- and inter-observer errors. The development of automated methods for echocardiography analysis is therefore an important step towards accurate assessment of cardiac function. In this thesis we develop new ways to model echocardiography data using Bayesian machine learning methods, addressing three problems: (i) wall motion analysis in 2D stress echocardiography, (ii) segmentation of the myocardium in 3D echocardiography, and (iii) standard view extraction from 3D echocardiography. Firstly, we propose and compare four discriminative methods for feature extraction and wall motion classification in 2D stress echocardiography (images of the heart taken at rest and after exercise or pharmacological stress). The four methods are based on (i) Support Vector Machines, (ii) Relevance Vector Machines, (iii) the Lasso algorithm and Regularised Least Squares, and (iv) Elastic Net regularisation and Regularised Least Squares. Although all four methods are shown to outperform the state of the art, one conclusion is that good segmentation of the myocardium in echocardiography is key to accurate assessment of cardiac wall motion. We therefore investigate the application of one of the most promising current machine learning techniques, Decision Random Forests, to segmenting the myocardium in 3D echocardiograms. We demonstrate that more reliable, ultrasound-specific descriptors are needed to achieve the best results.
Specifically, we introduce two sets of new features to improve the segmentation results: (i) LoCo and GloCo features, which place a local and a global shape constraint on the coupled endo- and epicardial boundaries, and (ii) FA features, which use the Feature Asymmetry measure to highlight step-like edges in echocardiographic images. We also reinforce traditional features, such as Haar and Rectangular features, by aligning the 3D echocardiograms. To this end we develop a new registration technique based on aligning the centre lines of the left ventricles, and show that this alignment boosts performance by approximately 15%. Finally, a novel approach to detecting planes in 3D images using regression voting is proposed. To the best of our knowledge, we are the first to use a one-step regression approach for plane detection in 3D images. We apply it to standard view extraction from 3D echocardiography to facilitate efficient clinical inspection of cardiac abnormalities and diseases. We further develop a new method, the Class-Specific Regression Forest, in which class label information is incorporated into the training phase to reinforce learning from classes semantically relevant to the problem; during testing, votes from irrelevant classes are excluded to maximise the confidence of the output predictors. We demonstrate that the Class-Specific Regression Forest outperforms the classic Regression Random Forest and produces results comparable to manual annotations.
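The regression-voting idea, including the class-specific filtering step, can be caricatured in a few lines: each image patch casts a vote for the plane parameters, and votes from classes deemed irrelevant are discarded before aggregation. The data layout and the mean-based aggregation below are hypothetical simplifications; the thesis learns its votes with Regression Forests.

```python
import numpy as np

def aggregate_plane_votes(votes, classes, relevant):
    """Aggregate per-patch regression votes for a plane.

    votes    : (N, 4) array of predicted plane parameters (nx, ny, nz, d).
    classes  : (N,) array of predicted class labels, one per voter.
    relevant : set of class labels allowed to vote (the 'class-specific'
               filtering step); all other votes are dropped.

    Returns the consensus plane as the mean of the unit-normalised votes.
    Assumes all voters use the same sign convention for (n, d).
    """
    mask = np.isin(classes, list(relevant))
    v = votes[mask].astype(float)
    # Scale each vote so its normal is unit length (d rescales with it).
    v /= np.linalg.norm(v[:, :3], axis=1, keepdims=True)
    consensus = v.mean(axis=0)
    consensus[:3] /= np.linalg.norm(consensus[:3])
    return consensus
```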

Compositional distributional semantics with compact closed categories and Frobenius algebras

Kartsaklis, Dimitrios January 2014
The provision of compositionality in distributional models of meaning, where a word is represented as a vector of co-occurrence counts with every other word in the vocabulary, offers a solution to the fact that no text corpus, regardless of its size, can provide reliable co-occurrence statistics for anything but very short text constituents. The purpose of a compositional distributional model is to provide a function that composes the vectors of the words within a sentence into a single vectorial representation that reflects its meaning. Using the abstract mathematical framework of category theory, Coecke, Sadrzadeh and Clark showed that this function can depend directly on the grammatical structure of the sentence, providing an elegant mathematical counterpart to the formal semantics view. The framework is general and compositional, but remains abstract to a large extent. This thesis contributes to ongoing research on this categorical model in three ways. Firstly, I propose a concrete instantiation of the abstract framework based on Frobenius algebras (joint work with Sadrzadeh). The theory remedies shortcomings of previous proposals, extends the coverage of the language, and is supported by experimental work that improves on existing results. The proposed framework describes a new class of compositional models that find intuitive interpretations for a number of linguistic phenomena. Secondly, I propose and evaluate in practice a new compositional methodology which explicitly deals with the different levels of lexical ambiguity (joint work with Pulman). A concrete algorithm is presented, based on separating vector disambiguation from composition in an explicit prior step. Extensive experimental work shows that the proposed methodology indeed results in more accurate composite representations, for the framework of Coecke et al. in particular and for other classes of compositional models in general.
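One concrete Frobenius instantiation from this line of work composes a transitive sentence by letting the verb matrix act on one argument and "copying" the other via a pointwise product. The sketch below uses toy vectors; real models build them from corpus co-occurrence counts.

```python
import numpy as np

def copy_subject(subj, verb, obj):
    """Frobenius 'copy-subject' composition for a transitive sentence.

    subj, obj : distributional vectors for the subject and object nouns.
    verb      : a matrix encoding the transitive verb.
    The Frobenius copying map turns one tensor contraction into a
    pointwise (Hadamard) product with the copied argument.
    """
    return subj * (verb @ obj)

def copy_object(subj, verb, obj):
    """The dual instantiation: copy the object wire instead."""
    return obj * (verb.T @ subj)
```

The two variants keep different information about the verb's action on each argument, which is one reason the thesis can tailor the model to linguistic phenomena.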
As a last contribution, I formalize the explicit treatment of lexical ambiguity within the categorical framework by resorting to categorical quantum mechanics (joint work with Coecke). In the proposed extension, the concept of a distributional vector is replaced with that of a density matrix, which compactly represents a probability distribution over the different potential meanings of a word. Composition takes the form of quantum measurements, leading to interesting analogies between quantum physics and linguistics.
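The density-matrix representation can be sketched directly: an ambiguous word becomes a probability-weighted mixture of projectors onto its sense vectors, and the purity Tr(ρ²) indicates how ambiguous the word is. The sense vectors and weights below are toy stand-ins; in practice they come from a disambiguation step.

```python
import numpy as np

def density_matrix(sense_vectors, probs):
    """Build rho = sum_i p_i |v_i><v_i| for an ambiguous word,
    with each sense vector v_i normalised to unit length."""
    dim = len(sense_vectors[0])
    rho = np.zeros((dim, dim))
    for v, p in zip(sense_vectors, probs):
        v = np.asarray(v, float)
        v = v / np.linalg.norm(v)
        rho += p * np.outer(v, v)
    return rho

def purity(rho):
    """Tr(rho^2): 1 for an unambiguous word, < 1 under ambiguity."""
    return float(np.trace(rho @ rho))
```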

Contextuality and noncommutative geometry in quantum mechanics

de Silva, Nadish January 2015
It is argued that the geometric dual of a noncommutative operator algebra represents a notion of quantum state space which differs from existing notions by representing observables as maps from states to outcomes, rather than from states to distributions on outcomes. A program is presented for solving for an explicitly geometric manifestation of quantum state space by adapting the spectral presheaf, a construction meant to analyze contextuality in quantum mechanics, to derive simple reconstructions of noncommutative topological tools from their topological prototypes. We associate to each unital C*-algebra A a geometric object (a diagram of topological spaces representing quotient spaces of the noncommutative space underlying A) meant to serve the role of a generalized Gel'fand spectrum. After showing that any functor F from compact Hausdorff spaces to a suitable target category C can be applied directly to these geometric objects to automatically yield an extension F~ which acts on all unital C*-algebras, we compare a novel formulation of the operator K_0 functor to the extension K~ of the topological K-functor. We then conjecture that the extension of the functor assigning to a topological space its topological lattice assigns to a unital C*-algebra the topological lattice of its primary ideal spectrum, and prove the von Neumann algebraic analogue of this conjecture.
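The extension mechanism alluded to can be sketched in notation of my own choosing (the thesis's precise diagrammatic construction may differ): Gel'fand duality identifies commutative unital C*-algebras with their spectra, and a functor on compact Hausdorff spaces then extends to an arbitrary unital C*-algebra through the diagram of its commutative unital subalgebras.

```latex
% Gel'fand duality for a commutative unital C*-algebra C:
C \;\cong\; C\bigl(\Sigma(C)\bigr),
% so a functor F on compact Hausdorff spaces can be pushed through the
% diagram of commutative unital subalgebras of a unital C*-algebra A.
% One natural candidate for the extension (the direction of the (co)limit
% depends on the variance of F) is
\widetilde{F}(A) \;=\; \varinjlim_{\substack{C \subseteq A \\ C\ \text{commutative}}} F\bigl(\Sigma(C)\bigr).
```

When A is itself commutative, the subalgebra A sits at the top of the diagram, so the extension agrees with F on the classical case, which is the minimal sanity check such a construction must pass.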

Iterative Local Model Selection for tracking and mapping

Segal, Aleksandr V. January 2014
The past decade has seen great progress in research on large-scale mapping and perception in static environments. Real-world perception requires handling uncertain situations with multiple possible interpretations: changing appearances, dynamic objects, and varying motion models, for example. These aspects of perception have largely been avoided through the use of heuristics and preprocessing. This thesis is motivated by the challenge of including discrete reasoning directly in the estimation process. We approach the problem by using Conditional Linear Gaussian Networks (CLGNs) as a generalization of least-squares estimation that allows the inclusion of discrete model selection variables. CLGNs are a powerful framework for modeling sparse multi-modal inference problems, but are difficult to solve efficiently. We propose the Iterative Local Model Selection (ILMS) algorithm as a general approximation strategy geared specifically towards the large-scale problems encountered in tracking and mapping. Chapter 4 introduces the ILMS algorithm and compares its performance to traditional approximate inference techniques for Switching Linear Dynamical Systems (SLDSs). These evaluations validate the characteristics that make the algorithm particularly attractive for applications in robot perception: chief among these are reliable convergence, consistent performance, and a reasonable trade-off between accuracy and efficiency. In Chapter 5, we show how the data association problem in multi-target tracking can be formulated as an SLDS and effectively solved using ILMS. The SLDS formulation allows the addition of discrete variables that model outliers and clutter in the scene. Evaluations on standard pedestrian tracking sequences demonstrate performance competitive with the state of the art. Chapter 6 applies the ILMS algorithm to robust pose graph estimation. A non-linear CLGN is constructed by introducing outlier indicator variables for all loop closures.
The standard Gauss-Newton optimization algorithm is modified to use ILMS as the inference algorithm between linearizations. Experiments demonstrate a large improvement over state-of-the-art robust techniques. The ILMS strategy presented in this thesis is simple and general, yet works surprisingly well. We argue that these properties are encouraging for wider applicability to problems in robot perception.
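A toy version of the alternation at the heart of this style of inference: fix the discrete indicators and solve a least-squares problem, then revisit each indicator locally. The scalar "robust mean" below is a deliberately minimal stand-in for the CLGN machinery, with an assumed constant outlier penalty; it is not the thesis's algorithm, only the shape of the iteration.

```python
def robust_mean(ys, outlier_penalty=4.0, iters=10):
    """Alternate between (1) a local discrete step, where each outlier
    indicator picks the cheaper hypothesis (squared residual vs. a fixed
    penalty), and (2) a continuous least-squares step, re-estimating the
    mean from the currently selected inliers.

    Returns the robust estimate and the final inlier indicators.
    """
    mu = sum(ys) / len(ys)          # initialise from all data
    inlier = [True] * len(ys)
    for _ in range(iters):
        # Discrete model selection, one indicator at a time.
        inlier = [(y - mu) ** 2 < outlier_penalty for y in ys]
        kept = [y for y, keep in zip(ys, inlier) if keep]
        if not kept:
            break
        # Least-squares step over the selected inliers.
        mu = sum(kept) / len(kept)
    return mu, inlier
```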

Vasculature reconstruction from 3D cryomicrotome images

Goyal, Ayush January 2013
Background: Research in heart disease can be aided by modelling myocardial hemodynamics with knowledge of coronary pressure and vascular resistance, measured from the geometry and morphometry of the coronary vasculature. This study presents methods to automatically reconstruct accurate, detailed coronary vascular anatomical models from high-resolution three-dimensional optical fluorescence cryomicrotomography image volumes, for simulating blood flow in coronary arterial trees. Methods: Images of fluorescent cast and bead particles perfused into the same heart comprise the vasculature and microsphere datasets, employed in a novel combined approach that measures the vasculature and simulates a flow model on the extracted coronary vascular tree to estimate regional myocardial perfusion. The microspheres are used in two capacities: as fiducial biomarker point sources for measuring the optical image formation, enabling accurate measurement of the vasculature dataset, and as flowing particles for measuring regional myocardial perfusion through the reconstructed vasculature. A new model-based template-matching method of vascular radius estimation is proposed that incorporates a model of the optical fluorescent image formation, measured from the microspheres, and a template of the vessels' tubular geometry. Results: The new method reduced the error in vessel radius estimation from 42.9% to 0.6% in a 170 micrometer vessel, compared with the Full-Width Half-Maximum method. Whole-organ porcine coronary vascular trees, automatically reconstructed with the proposed method, contained over 92,000 vessel segments in the 0.03-1.9 mm radius range. The discrepancy between the microsphere perfusion measurements and the regional flow estimated with a 1-D steady-state linear static blood flow simulation on the reconstructed vasculature was modelled with daughter-to-parent area ratio and branching angle as parameters.
Correcting the flow simulation with this model of the disproportionate distribution of microspheres reduced the error in the estimated fractional microsphere distribution in oblique branches with angles of 100°-120° from 24% to 7.4%.
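A 1-D steady-state flow model of this kind typically rests on Poiseuille's law, under which a segment's hydraulic resistance scales as 1/r⁴, so small radius errors translate into large flow errors; this is why the radius-estimation accuracy reported above matters. A minimal sketch (illustrative viscosity and lengths; rigid, equal-length daughter vessels draining to a common pressure — the thesis's simulation on full trees is more elaborate):

```python
import math

def poiseuille_resistance(radius_m, length_m, viscosity=3.5e-3):
    """Hydraulic resistance of a rigid cylindrical segment,
    R = 8 * mu * L / (pi * r^4), with viscosity in Pa*s
    (default is a rough value for blood)."""
    return 8 * viscosity * length_m / (math.pi * radius_m ** 4)

def parent_flow_split(r_left, r_right, length=1e-3):
    """Fraction of parent flow entering the left daughter when both
    daughters have equal length and drain to the same pressure:
    flow divides in inverse proportion to resistance."""
    g_left = 1 / poiseuille_resistance(r_left, length)
    g_right = 1 / poiseuille_resistance(r_right, length)
    return g_left / (g_left + g_right)
```

With equal lengths the split reduces to r_left⁴ / (r_left⁴ + r_right⁴), making the r⁴ sensitivity explicit.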

Biomimetic and autonomic server ensemble orchestration

Nakrani, Sunil January 2005
This thesis addresses the orchestration of servers amongst multiple co-hosted internet services, such as e-Banking, e-Auction and e-Retail, in hosting centres. The hosting paradigm entails levying fees for hosting third-party internet services on servers at guaranteed levels of service performance. The orchestration of the server ensemble in hosting centres is considered in the context of maximising the hosting centre's revenue over a lengthy time horizon. The inspiration for the server orchestration approach proposed in this thesis is drawn from nature and is generally classed as swarm intelligence: the sophisticated collective behaviour that social insects build out of primitive interactions amongst group members, solving problems beyond the capability of individual members. Consequently, the approach is self-organising, adaptive and robust. A new scheme for server ensemble orchestration is introduced in this thesis, exploiting the many similarities between server orchestration in an internet hosting centre and forager allocation in a honeybee (Apis mellifera) colony. The scheme mimics the way a honeybee colony distributes foragers amongst flower patches to maximise nectar influx, orchestrating servers amongst hosted internet services to maximise revenue. The scheme is then extended, by further exploiting inherent feedback loops within the colony, to introduce self-tuning and energy-aware server ensemble orchestration. To evaluate the new scheme, a collection of orchestration methods is developed for comparison: a classical technique that relies on past history to make time-varying orchestration decisions, and two theoretical techniques that omnisciently make either optimal time-varying orchestration decisions or an optimal static orchestration decision based on complete knowledge of the future. The efficacy of the new biomimetic scheme is assessed in terms of adaptiveness and versatility.
The performance study uses representative classes of internet traffic stream behaviour, service users' behaviour, demand intensity, multiple co-hosted services, and differentiated hosting fee schedules. The biomimetic orchestration scheme is compared with the classical and theoretical optimal orchestration techniques in terms of revenue stream. The study reveals that the new server ensemble orchestration approach is adaptive in widely varying external internet environments, and highlights the versatility of the biomimetic approach over the classical technique. The self-tuning scheme improves on the original's performance, and the energy-aware scheme conserves significant energy with minimal degradation in revenue performance. The simulation results also indicate that the new scheme is competitive with, or better than, the classical and static methods.
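A cartoon of the forager-allocation analogy: servers abandon a service more readily when its per-server revenue is low, and are recruited in proportion to advertised yield, the waggle-dance analogue. The probabilities below are invented for illustration; the thesis derives its rules from observed honeybee colony behaviour, not from these formulas.

```python
import random

def orchestrate(yields, servers, rounds=200, seed=0):
    """Toy honeybee-style server allocation.

    Each round, every service may lose one server ('forager') with
    probability a / (a + y), which rises as allocation a crowds a
    service of yield y; the freed server is recruited to a service
    chosen with probability proportional to its advertised yield.
    """
    rng = random.Random(seed)
    n = len(yields)
    alloc = [servers // n] * n
    alloc[0] += servers - sum(alloc)   # hand any remainder to service 0
    for _ in range(rounds):
        for i in range(n):
            if alloc[i] == 0:
                continue
            if rng.random() < alloc[i] / (alloc[i] + yields[i]):
                alloc[i] -= 1
                j = rng.choices(range(n), weights=yields)[0]
                alloc[j] += 1
    return alloc
```

Run with two services of unequal yield, the allocation drifts toward the more profitable service without any central controller, which is the qualitative behaviour the abstract describes.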

Methods, rules and limits of successful self-assembly

Williamson, Alexander James January 2011
The self-assembly of structured particles into monodisperse clusters is a challenge on the nano-, micro- and even macro-scale. While biological systems are able to self-assemble with comparative ease, many aspects of this self-assembly are not fully understood. In this thesis, we look at the strategies and rules that can be applied to encourage the formation of monodisperse clusters. Though much of the inspiration is biological in nature, the simulations use a simple minimal patchy particle model and are thus applicable to a wide range of systems. The topics that this thesis addresses include:
Encapsulation: We show how clusters can be used to encapsulate objects, and demonstrate that such 'templates' can be used to control the assembly mechanisms and enhance the formation of more complex objects.
Hierarchical self-assembly: We investigate the use of hierarchical mechanisms in enhancing the formation of clusters. We find that, while a hierarchical assembly pathway extends the ranges in which we see successful assembly, it does not straightforwardly provide a route to enhance the complexity of the structures that can be formed.
Pore formation: We use our simple model to investigate a particular biological example, the self-assembly and formation of heptameric alpha-haemolysin pores, and show that pore insertion is key to rationalising experimental results on this system.
Phase re-entrance: We use a variety of techniques to compute equilibrium phase diagrams for self-assembling systems, focusing in particular on the possible presence of an unusual liquid-vapour phase re-entrance suggested by dynamical simulations.
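A minimal ingredient of patchy-particle models of this kind is the bonding criterion: two particles interact only when their patches face each other within an angular tolerance. A stdlib-only sketch (the tolerance value and the single-patch geometry are illustrative, not the thesis's parametrisation):

```python
import math

def patches_bond(r_ij, patch_i, patch_j, cos_max=0.95):
    """Return True when particles i and j can bond: patch_i must point
    along the separation vector r_ij (from i to j) and patch_j must
    point back along -r_ij, each within the angular tolerance set by
    cos_max. All vectors are 3-tuples and need not be pre-normalised.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def unit(v):
        n = math.sqrt(dot(v, v))
        return tuple(x / n for x in v)

    r = unit(r_ij)
    return (dot(unit(patch_i), r) > cos_max and
            dot(unit(patch_j), tuple(-x for x in r)) > cos_max)
```

Narrowing the tolerance (raising cos_max) makes bonding more specific, which is one of the knobs such models use to steer assembly toward monodisperse clusters.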

Stratagems for effective function evaluation in computational chemistry

Skone, Gwyn S. January 2010
In recent years, the potential benefits of high-throughput virtual screening to the drug discovery community have been recognized, bringing an increase in the number of tools developed for this purpose. These programs have to process large quantities of data, searching for an optimal solution in a vast combinatorial range. This is particularly the case for protein-ligand docking, since proteins are sophisticated structures with complicated interactions for which either molecule might reshape itself. Even the very limited flexibility model to be considered here, using ligand conformation ensembles, requires six dimensions of exploration - three translations and three rotations - per rigid conformation. The functions for evaluating pose suitability can also be complex to calculate. Consequently, the programs being written for these biochemical simulations are extremely resource-intensive. This work introduces a pure computer science approach to the field, developing techniques to improve the effectiveness of such tools. Their architecture is generalized to an abstract pattern of nested layers for discussion, covering scoring functions, search methods, and screening overall. Based on this, new stratagems for molecular docking software design are described, including lazy or partial evaluation, geometric analysis, and parallel processing implementation. In addition, a range of novel algorithms are presented for applications such as active site detection with linear complexity (PIES) and small molecule shape description (PASTRY) for pre-alignment of ligands. The various stratagems are assessed individually and in combination, using several modified versions of an existing docking program, to demonstrate their benefit to virtual screening in practical contexts. In particular, the importance of appropriate precision in calculations is highlighted.
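The lazy or partial evaluation stratagem can be illustrated with an additive score that is abandoned as soon as it can no longer beat the incumbent pose. The sketch assumes non-negative penalty terms, so the running sum only grows; real scoring functions mix signed terms and need bounds rather than this simple monotone test.

```python
def score_pose(pairwise_terms, best_so_far):
    """Lazily evaluate an additive docking score.

    pairwise_terms : iterable of per-atom-pair penalty contributions,
                     assumed non-negative for this sketch.
    best_so_far    : score of the best (lowest-penalty) pose found yet.

    Stops summing, and stops consuming the iterable, the moment the
    partial sum proves this pose cannot win.
    """
    total = 0.0
    for term in pairwise_terms:
        total += term
        if total >= best_so_far:
            return None  # pruned: cannot beat the incumbent pose
    return total
```

Because the terms are consumed lazily, pairing this with a generator means the pruned pairwise energies are never even computed, which is where the saving comes from.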

Stochastic modelling and simulation in cell biology

Szekely, Tamas January 2014
Modelling and simulation are essential to modern research in cell biology. This thesis follows a journey that starts from the construction of new stochastic methods for discrete biochemical systems and ends with using them to simulate a population of interacting haematopoietic stem cell lineages. The first part of this thesis concerns discrete stochastic methods. We develop two new methods, the stochastic extrapolation framework and the Stochastic Bulirsch-Stoer method. These are based on the Richardson extrapolation technique, which is widely used in ordinary differential equation solvers; we believed it would also be useful in the stochastic regime, and this turned out to be true. The stochastic extrapolation framework is a scheme that admits any stochastic method with a fixed stepsize and a known global error expansion. It can improve the weak order of the moments of these methods by cancelling the leading terms in the global error. Using numerical simulations, we demonstrate that this is the case up to second order, and postulate that it also holds for higher orders. Our simulations show that extrapolation can greatly improve the accuracy of a numerical method. The Stochastic Bulirsch-Stoer method is another highly accurate stochastic solver. Furthermore, using numerical simulations we find that it retains its high accuracy for larger timesteps than competing methods, meaning it remains accurate even when simulation time is sped up. This is a useful property for simulating the complex systems that researchers are often interested in today. The second part of the thesis is concerned with modelling a haematopoietic stem cell system, which consists of many interacting niche lineages. We use a vectorised tau-leap method to examine the differences between a deterministic and a stochastic model of the system, and investigate how coupling niche lineages affects the dynamics of the system at the homeostatic state as well as after a perturbation.
We find that larger coupling allows the system to find the optimal steady-state blood cell levels. In addition, when a perturbation is applied randomly to the entire system, larger coupling also results in smaller post-perturbation cell fluctuations compared with non-coupled cells. In brief, this thesis contains four main sets of contributions: two new high-accuracy discrete stochastic methods that have been numerically tested; an improvement, applicable to any leaping method, that introduces vectorisation along with a common stepsize-adaptation scheme; and an investigation of the effects of coupling lineages in a heterogeneous population of haematopoietic stem cell niche lineages.
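The extrapolation idea can be demonstrated on the simplest discrete system, pure decay, where tau-leaping with stepsize dt has an O(dt) error in the mean that the Richardson combination 2*E[X_{dt/2}] - E[X_dt] cancels. Everything below (the decay model, parameter values, the stdlib Poisson sampler) is an illustrative reconstruction, not code from the thesis.

```python
import math
import random

def tau_leap_decay(x0, k, t_end, dt, rng):
    """Tau-leaping for pure decay X -> 0 at rate k: in each leap the
    number of decay events is Poisson(k * X * dt), capped at X."""
    steps = round(t_end / dt)
    x = x0
    for _ in range(steps):
        lam = k * x * dt
        # Poisson sample by inversion (stdlib-only; fine for modest lam).
        events, p, acc, u = 0, math.exp(-lam), math.exp(-lam), rng.random()
        while u > acc and p > 0:
            events += 1
            p *= lam / events
            acc += p
        x = max(x - events, 0)
    return x

def extrapolated_mean(x0=100, k=1.0, t_end=1.0, dt=0.1, n=20000, seed=1):
    """Richardson extrapolation of the weak (mean) error: the combination
    2 * E[X_{dt/2}] - E[X_dt] cancels the leading O(dt) term, leaving a
    much smaller bias than either estimator alone."""
    rng = random.Random(seed)
    m_h = sum(tau_leap_decay(x0, k, t_end, dt, rng) for _ in range(n)) / n
    m_h2 = sum(tau_leap_decay(x0, k, t_end, dt / 2, rng) for _ in range(n)) / n
    return 2 * m_h2 - m_h
```

For these parameters the plain dt = 0.1 estimator is biased low (its mean follows (1 - k*dt)^steps rather than e^(-k*t)), while the extrapolated estimate lands close to the exact mean x0 * e^(-k*t_end).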
