121. Mathematical and computational modelling of tissue engineered bone in a hydrostatic bioreactor. Leonard, Katherine H. L. January 2014.
In vitro tissue engineering is a method for developing living and functional tissues external to the body, often within a device called a bioreactor, which controls the chemical and mechanical environment. However, the quality of bone tissue engineered products is currently inadequate for clinical use because the implant cannot bear weight. In an effort to improve the quality of the construct, hydrostatic pressure, the pressure in a fluid at equilibrium that is required to balance the force exerted by the weight of the fluid above, has been investigated as a mechanical stimulus for promoting extracellular matrix deposition and mineralisation within bone tissue. Thus far, however, little research has been performed into understanding the response of bone tissue cells to such mechanical stimulation. In this thesis we investigate an in vitro bone tissue engineering experimental setup in which human mesenchymal stem cells are seeded within a collagen gel and cultured in a hydrostatic pressure bioreactor. In collaboration with experimentalists, a suite of mathematical models of increasing complexity is developed and appropriate numerical methods are used to simulate these models. Each of the models investigates different aspects of the experimental setup, from global quantities of interest through to their detailed local spatial distribution. The aim of this work is to increase understanding of the underlying physical processes which drive the growth and development of the construct, and to identify which factors contribute to the highly heterogeneous spatial distribution of the mineralised extracellular matrix seen experimentally. The first model considered is a purely temporal model, in which we examine the evolution, in response to the applied pressure, of the cells, the fluid, and the solid substrate, which accounts for the initial collagen scaffold and the deposited extracellular matrix along with its attendant mineralisation. We demonstrate that including the history of the mechanical loading of the cells is important in determining the quantity of deposited substrate. The second and third models extend this non-spatial model, and examine biochemically and biomechanically induced spatial patterning separately. The first of these spatial models demonstrates that nutrient diffusion, along with nutrient-dependent mass transfer terms, qualitatively reproduces the heterogeneous spatial effects seen experimentally. The second, multiphase, model is used to investigate whether the magnitude of the shear stresses generated by fluid flow can qualitatively explain the heterogeneous mineralisation seen in the experiments. Numerical simulations reveal that the spatial distribution of the fluid shear stress magnitude is highly heterogeneous, which could be related to the spatial heterogeneity in the mineralisation seen experimentally.
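For illustration only, a minimal sketch of a temporal model with a loading-history effect is given below: two ordinary differential equations for cell and substrate volume fractions, with substrate deposition enhanced by an accumulated pressure stimulus. The functional forms, parameter values and loading schedule are assumptions made for this sketch, not the constitutive choices of the thesis.

```python
# Minimal sketch (illustrative assumptions throughout): cells n(t), solid
# substrate s(t), and an accumulated loading history h(t) that stands in for
# the dependence of matrix deposition on past mechanical stimulation.
import numpy as np
from scipy.integrate import solve_ivp

def pressure(t_hours):
    # assumed loading schedule: 2 hours of hydrostatic pressure per day
    return 1.0 if (t_hours % 24.0) < 2.0 else 0.0

def rhs(t, y):
    n, s, h = y
    growth = 0.05 * n * (1.0 - n - s)                     # logistic cell growth
    deposition = 0.02 * n * (1.0 + 2.0 * h / (1.0 + h))   # history-enhanced deposition
    return [growth, deposition, pressure(t)]              # dh/dt = applied pressure

sol = solve_ivp(rhs, (0.0, 24.0 * 21), [0.05, 0.10, 0.0], max_step=0.1)
print(f"substrate volume fraction after 21 days: {sol.y[1, -1]:.3f}")
```

Re-running the sketch with `pressure` returning zero throughout shows how a history-dependent deposition term changes the final substrate fraction, which is the qualitative point made above.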
122. Mathematical modelling of metabolism and acidity in cancer. McGillen, Jessica Buono. January 2014.
Human cancers exhibit the common phenotype of elevated glycolytic metabolism, which causes acidification of the tissue microenvironment and may facilitate tumour invasion. In this thesis, we use mathematical models to address a series of open problems underlying the glycolytic tumour phenotype and its attendant acidity. We first explore tissue-scale consequences of metabolically derived acid. Incorporating more biological detail into a canonical model of acidity at the tumour-host interface, we extend the range of tumour behaviours captured by the modelling framework. We then carry out an asymptotic travelling wave analysis to express invasive tumour properties in terms of fundamental parameters, and find that interstitial gaps between an advancing tumour and retreating healthy tissue, characteristic of aggressive invasion and a controversial feature of the original model, are less significant under our generalised formulation. Subsequently, we evaluate a potential role of lactate, historically assumed to be a passive byproduct of glycolytic metabolism, in a perfusion-dependent metabolic symbiosis that was recently proposed as a beneficial tumour behaviour. Upon developing a minimal model of dual glucose-lactate consumption in vivo and employing a multidimensional sensitivity analysis, we find that symbiosis may not be straightforwardly beneficial for our model tumour. Moreover, new in vitro experiments, carried out by an experimental collaborator, place U87 glioblastoma tumours in a weakly symbiotic parameter regime despite their clinical malignancy. These results suggest that supporting intratumoural metabolic cooperation is unlikely to be an important role for lactate. Finally, we examine the complex pH regulation system that governs expulsion of metabolically derived acid loads across tumour cell membranes. This system differs from the healthy system by the expression of only a few key proteins, yet its dynamics are non-intuitive in the crowded and poorly perfused in vivo environment. We systematically develop a model of tumour pH regulation, beginning with a single-cell scenario and progressing to a spheroid, within a Bayesian framework that incorporates information from in vitro data contributed by a second experimental collaborator. We predict that the net effect of pH regulation is a straightforward transmembrane pH gradient, but also that existing treatments are unable to disrupt the system strongly enough to cause tumour cell death. Taken together, our models help to elucidate previously unresolved features of glycolytic tumour metabolism, and illustrate the utility of a combined mathematical, statistical, and experimental approach for testing biological hypotheses. Opportunities for further investigation are discussed.
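The "canonical model of acidity at the tumour-host interface" referred to above is usually written, following Gatenby and Gawlinski, as a reaction-diffusion system for the healthy tissue density N1, the tumour density N2 and the excess acid concentration L. The sketch below is that standard textbook form, not the generalised formulation developed in the thesis; parameter names are generic.

```latex
\begin{align}
\frac{\partial N_1}{\partial t} &= r_1 N_1\left(1-\frac{N_1}{K_1}\right) - d_1 L N_1,\\[4pt]
\frac{\partial N_2}{\partial t} &= r_2 N_2\left(1-\frac{N_2}{K_2}\right)
   + \nabla\cdot\!\left[D_2\left(1-\frac{N_1}{K_1}\right)\nabla N_2\right],\\[4pt]
\frac{\partial L}{\partial t} &= r_3 N_2 - d_3 L + D_3\,\nabla^2 L .
\end{align}
```

Acid is produced by the tumour and degrades the healthy tissue, while tumour motility is suppressed wherever healthy tissue remains; travelling-wave analysis of this system is what produces, and in the generalised model above shrinks, the interstitial gap between the advancing tumour front and the retreating healthy tissue.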
123. Analysis of non-steady state physiological and pathological processes. Hill, Nathan R. January 2008.
The analysis of non-steady state physiological and pathological processes concerns the abstraction, extraction, formalisation and analysis of information from physiological systems that is obscured, hidden or unable to be assessed using traditional methods. First, Time Series Analysis (TSA) techniques were developed and built into a software program, Easy TSA, with the aim of examining the oscillations of hormonal concentrations with respect to their temporal aspects: periodicity, phase and pulsatility. The Easy TSA program was validated using constructed data sets and used in a clinical study to examine the relationship between insulin and obesity in people without diabetes. In this study fifty-six non-diabetic subjects (28 male, 28 female) were examined using data from a number of protocols. Fourier transform and autocorrelation techniques determined that the level of BMI had a critical effect on the frequency, amplitude and regularity of insulin oscillations. Second, an algorithm to examine glycaemic variability was developed, resulting in a new methodology termed the Glycaemic Risk in Diabetes Equation (GRADE). The aim was to report an integrated glycaemic risk score from glucose profiles that would complement summary measures of glycaemia, such as HbA1c. GRADE was applied retrospectively to blood glucose data sets to determine whether it was clinically relevant. Subjects with type 1 and type 2 diabetes had higher GRADE scores than the non-diabetic population, and the contribution of hypo- and hyperglycaemic episodes to risk was demonstrated. A prospective study was then designed with the aim of applying GRADE in a clinical context and measuring its statistical reproducibility. Fifty-three subjects (26 male, 27 female) measured their blood glucose four times daily for twenty-one days. Lower HbA1c values correlated with an increased risk of hypoglycaemia and higher HbA1c values with an increased risk of hyperglycaemia. Some subjects had an HbA1c of 7.0 but median GRADE values ranging from 2.2 to 10.5. The GRADE score summarised diverse glycaemic profiles into a single assessment of risk: well-controlled glucose profiles yielded GRADE scores of 5 or less, and higher GRADE scores represented increased clinical risk from hypo- or hyperglycaemia. Third, an information system was developed to analyse data-rich multi-variable retinal images using the concept of assessment of change rather than specific lesion recognition. A fully Automated Retinal Image Differencing (ARID) computer system was developed to highlight change between retinal images over time. ARID was validated in an initial study, and a retrospective study then sought to determine whether the ARID software was an aid to the retinal screener. One hundred and sixty images (80 image pairs) were obtained from the Gloucestershire Diabetic Eye Screening Programme. Image pairs were graded manually and categorised according to whether each type of lesion had progressed, regressed, or not changed between image A and image B. After a 30-day washout period the image pairs were graded using ARID and the results compared. The comparison of manual grading with grading using ARID (Table 4.3) demonstrated increased sensitivity and specificity: the mean sensitivity of ARID (87.9%) was significantly higher than that of manual grading (84.1%) (p<0.05), and the specificity of the automated analysis (87.5%) was significantly higher than that achieved by manual grading (56.3%) (p<0.05). The conclusion was that automatic display of an ARID differenced image, where sequential photographs are available, would allow rapid assessment and appropriate triage. Fourth, non-linear dynamic systems analysis methods were used to build a system to assess the extent of chaotic characteristics within the insulin-glucose feedback domain. Biological systems exist that are deterministic yet neither predictable nor repeatable; instead they exhibit chaos, where a small change in the initial conditions produces a wholly different outcome. The glucose regulatory system is a dynamic system that maintains glucose homeostasis through the feedback mechanism of glucose, insulin and contributory hormones, and is therefore well suited to chaos analysis. To investigate this system a new algorithm was created to assess the Normalised Area of Attraction (NAA), calculated by defining an oval using the 95% confidence intervals of glucose and insulin (the limit cycle) on a phasic plot. Thirty non-diabetic subjects and four subjects with type 2 diabetes were analysed. The NAA indicated a smaller range of glucose and insulin excursions for the non-diabetic subjects (p<0.05). The conclusion was that evaluating glucose metabolism in terms of homeostatic integrity, rather than in terms of cut-off values, may enable a more realistic approach to the effective treatment and prevention of diabetes and its complications.
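As an illustration of how a GRADE-style score can be computed from a glucose profile, the sketch below applies a per-reading risk transformation and reports the median. The constants are assumed to match the published GRADE formulation and should not be taken as the thesis's exact values; the glucose readings are hypothetical.

```python
# Sketch of a GRADE-style glycaemic risk score (glucose in mmol/L).
# Constants are assumed from the published GRADE formula, not taken from the thesis.
import numpy as np

def grade(glucose_mmol):
    g = np.asarray(glucose_mmol, dtype=float)
    score = 425.0 * (np.log10(np.log10(g)) + 0.16) ** 2
    return np.minimum(score, 50.0)               # individual readings are capped

profile = [4.2, 5.1, 7.8, 11.3, 3.1, 6.4, 9.9, 5.6]   # hypothetical readings, 4x daily
scores = grade(profile)
print(f"median GRADE for this profile: {np.median(scores):.1f}")

# Contribution of hypo- and hyperglycaemia to the total risk (thresholds assumed)
g = np.asarray(profile)
total = scores.sum()
print(f"hypo share:  {scores[g < 3.9].sum() / total:.0%}")
print(f"hyper share: {scores[g > 10.0].sum() / total:.0%}")
```

The risk curve has its minimum near normoglycaemia and rises steeply towards both hypo- and hyperglycaemic readings, which is why a single median GRADE value can separate profiles that share the same HbA1c.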
124. Models for adaptive feeding and population dynamics in plankton. Piltz, Sofia Helena. January 2014.
Traditionally, differential-equation models for population dynamics have considered organisms as "fixed" entities in terms of their behaviour and characteristics. However, there have been many observations of adaptivity in organisms, both at the level of behaviour and as an evolutionary change of traits, in response to environmental conditions. Taking such adaptivity into account alters the qualitative dynamics of traditional models and is an important factor to include, for example, when developing reliable model predictions under changing environmental conditions. In this thesis, we consider piecewise-smooth and smooth dynamical systems to represent adaptive change in a 1 predator-2 prey system. First, we derive a novel piecewise-smooth dynamical system for a predator switching between its preferred and alternative prey type in response to prey abundance. We consider a linear ecological trade-off and discover a novel bifurcation as we change the slope of the trade-off. Second, we reformulate the piecewise-smooth system as two novel 1 predator-2 prey smooth dynamical systems. Whereas the piecewise-smooth system includes a discontinuity in the vector fields and assumes that the predator switches its feeding strategy instantaneously, we relax this assumption in these systems and consider continuous change in a predator trait. We use plankton as our reference organisms because they serve as an important model system. We compare the model simulations with data from Lake Constance on the German-Swiss-Austrian border and suggest possible mechanistic explanations for cycles in plankton concentrations in spring.
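A minimal sketch of a piecewise-smooth 1 predator-2 prey system of the kind described above is given below: the predator feeds only on whichever prey type is currently more abundant relative to a preference parameter, so the vector field switches across the surface p1 = q p2. The functional forms and parameter values are illustrative assumptions and are not the model fitted to the Lake Constance data.

```python
# Illustrative piecewise-smooth 1 predator-2 prey system with instantaneous
# prey switching across the surface p1 = q * p2. Parameters are arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

r1, r2 = 1.0, 0.8        # prey growth rates
a, e, m = 2.0, 0.4, 0.3  # attack rate, conversion efficiency, predator mortality
q = 1.0                  # preference / trade-off parameter defining the switching surface

def rhs(t, y):
    p1, p2, z = y
    feed_on_p1 = p1 > q * p2            # discontinuous switching rule
    f1 = a * p1 * z if feed_on_p1 else 0.0
    f2 = 0.0 if feed_on_p1 else a * p2 * z
    return [r1 * p1 * (1.0 - p1) - f1,
            r2 * p2 * (1.0 - p2) - f2,
            e * (f1 + f2) - m * z]

sol = solve_ivp(rhs, (0.0, 200.0), [0.4, 0.3, 0.1], max_step=0.01)
print("final state (p1, p2, z):", np.round(sol.y[:, -1], 3))
```

The smooth reformulations mentioned above would replace the hard `feed_on_p1` rule with a continuously varying predator trait, removing the discontinuity in the vector field.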
125. Mathematical models of the retina in health and disease. Roberts, Paul Allen. January 2015.
The retina is the ocular tissue responsible for the detection of light. Its extensive demand for oxygen, coupled with a concomitant elevated supply, renders this tissue prone to both hypoxia and hyperoxia. In this thesis, we construct mathematical models of the retina, formulated as systems of reaction-diffusion equations, to investigate its oxygen-related dynamics in healthy and diseased states. In the healthy state, we model the oxygen distribution across the human retina, examining the efficacy of the protein neuroglobin in the prevention of hypoxia. It has been suggested that neuroglobin could prevent hypoxia, either by transporting oxygen from regions where it is rich to those where it is poor, or by storing oxygen during periods of diminished supply or increased uptake. Numerical solutions demonstrate that neuroglobin may be effective in preventing or alleviating hypoxia via oxygen transport, but that its capacity for oxygen storage is essentially negligible, whilst asymptotic analysis reveals that, contrary to the prevailing assumption, neuroglobin's oxygen affinity is near optimal for oxygen transport. A further asymptotic analysis justifies the common approximation of a piecewise-constant oxygen uptake across the retina, placing existing models upon a stronger theoretical foundation. In the diseased state, we explore the effect of hyperoxia upon the progression of the inherited retinal diseases known collectively as retinitis pigmentosa. Both numerical solutions and asymptotic analyses show that this mechanism may replicate many of the patterns of retinal degeneration seen in vivo, but that others are inaccessible to it, demonstrating both the strengths and weaknesses of the oxygen toxicity hypothesis. It is shown that the wave speed of hyperoxic degeneration is negatively correlated with the local photoreceptor density, high-density regions acting as a barrier to the spread of photoreceptor loss. The effects of capillary degeneration and treatment with antioxidants or trophic factors are also investigated, demonstrating that each has the potential to delay, halt or partially reverse photoreceptor loss. In addition to answering questions that are not accessible to experimental investigation, these models generate a number of experimentally testable predictions, forming the first loop in what has the potential to be a fruitful experimental/modelling cycle.
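The piecewise-constant uptake approximation mentioned above can be illustrated with a short finite-difference sketch: non-dimensional steady-state oxygen diffusion across the retinal depth, with uptake concentrated in an assumed photoreceptor band. The geometry, uptake profile and boundary values are placeholders, not the parameters of the thesis models.

```python
# Non-dimensional steady-state oxygen profile c''(x) = q(x) across retinal depth,
# with piecewise-constant uptake q(x) concentrated in an assumed photoreceptor band.
# All values are illustrative placeholders.
import numpy as np

N = 401
x = np.linspace(0.0, 1.0, N)          # x = 0 choroidal side, x = 1 inner retina
h = x[1] - x[0]
q = np.where((x > 0.2) & (x < 0.4), 8.0, 0.5)   # strong uptake only in the band

A = np.zeros((N, N))
b = np.zeros(N)
A[0, 0] = A[-1, -1] = 1.0             # Dirichlet boundary conditions
b[0], b[-1] = 1.0, 0.3                # assumed oxygen levels at the two boundaries
for i in range(1, N - 1):
    A[i, i - 1] = A[i, i + 1] = 1.0 / h**2
    A[i, i] = -2.0 / h**2
    b[i] = q[i]
c = np.linalg.solve(A, b)
print(f"minimum oxygen level c = {c.min():.2f} at depth x = {x[c.argmin()]:.2f}")
```

Lowering the boundary values or widening the uptake band drives the minimum towards zero, which is the hypoxic regime that the proposed neuroglobin transport mechanism would relieve.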
126. Colouring, centrality and core-periphery structure in graphs. Rombach, Michaela Puck. January 2013.
Krivelevich and Patkós conjectured in 2009 that χ(G(n, p)) ∼ χ_=(G(n, p)) ∼ χ^*_=(G(n, p)) for C/n < p < 1 − ε, where ε > 0. We prove this conjecture for n^(−1+ε₁) < p < 1 − ε₂, where ε₁, ε₂ > 0. We investigate several measures that have been proposed to indicate centrality of nodes in networks, and find examples of networks where they fail to distinguish any of the nodes from one another. We develop a new method to investigate core-periphery structure, which entails identifying densely connected core nodes and sparsely connected periphery nodes. Finally, we present an experiment and an analysis of empirical networks, namely functional human brain networks. We find that reconfiguration patterns of dynamic communities can be used to classify nodes into a stiff core, a flexible periphery, and a bulk. The separation between this stiff core and flexible periphery changes as a person learns a simple motor skill and, importantly, it is a good predictor of how successful the person is at learning the skill. This temporally defined core-periphery organisation corresponds well with the core-periphery structure detected, by the method proposed earlier in the thesis, in the static networks created by averaging over the subjects' dynamic functional brain networks.
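A common starting point for core-periphery detection, which the method described above generalises with a parametrised transition function, is a Borgatti-Everett-style core-quality score: a vector of core scores is compared against the adjacency matrix so that edges between high-score nodes count the most. The sketch below is only this classical baseline with a crude degree-based heuristic, not the thesis's method.

```python
# Borgatti-Everett-style core-quality score on a stand-in network.
# The core assignment heuristic (top-degree nodes) is deliberately crude.
import numpy as np
import networkx as nx

G = nx.karate_club_graph()                # example network, not from the thesis
A = nx.to_numpy_array(G)
n = A.shape[0]

def core_quality(A, C):
    # C[i] in [0, 1]; quality = sum_ij A_ij * C_i * C_j
    return float(C @ A @ C)

ranks = np.argsort(-A.sum(axis=1))        # rank nodes by degree
for core_frac in (0.1, 0.25, 0.5):
    C = np.zeros(n)
    C[ranks[: int(core_frac * n)]] = 1.0
    print(f"core fraction {core_frac:.2f}: core quality {core_quality(A, C):.0f}")
```

In practice the core scores are optimised rather than assigned by degree, and a smooth transition between core and periphery, rather than the 0/1 vector used here, allows the sharpness of the boundary to be treated as a parameter.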
127. Accurate telemonitoring of Parkinson's disease symptom severity using nonlinear speech signal processing and statistical machine learning. Tsanas, Athanasios. January 2012.
This study focuses on the development of an objective, automated method to extract clinically useful information from sustained vowel phonations in the context of Parkinson's disease (PD). The aim is twofold: (a) to differentiate PD subjects from healthy controls, and (b) to replicate the Unified Parkinson's Disease Rating Scale (UPDRS) metric, which provides a clinical impression of PD symptom severity. This metric spans the range 0 to 176, where 0 denotes a healthy person and 176 total disability. Currently, UPDRS assessment requires the physical presence of the subject in the clinic, is subjective, relying on the clinical rater's expertise, and is logistically costly for national health systems. Hence, the practical frequency of symptom tracking is typically confined to once every several months, hindering recruitment for large-scale clinical trials and under-representing the true time scale of PD fluctuations. We develop a comprehensive framework to analyze speech signals by: (1) extracting novel, distinctive signal features, (2) using robust feature selection techniques to obtain a parsimonious subset of those features, and (3a) differentiating PD subjects from healthy controls, or (3b) determining UPDRS using powerful statistical machine learning tools. Towards this aim, we also investigate 10 existing fundamental frequency (F_0) estimation algorithms to determine the most useful algorithm for this application, and propose a novel ensemble F_0 estimation algorithm which leads to a 10% improvement in accuracy over the best individual approach. Moreover, we propose novel feature selection schemes which are shown to be very competitive against widely used, more complex schemes. We demonstrate that we can successfully differentiate PD subjects from healthy controls with 98.5% overall accuracy, and also provide rapid, objective, and remote replication of UPDRS assessment with clinically useful accuracy (approximately 2 UPDRS points from the clinicians' estimates), using only simple, self-administered, and non-invasive speech tests. The findings of this study strongly support the use of speech signal analysis as an objective basis for practical clinical decision support tools in the context of PD assessment.
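The statistical machine learning stage described above can be illustrated with a short pipeline sketch: given a matrix of precomputed dysphonia features per phonation, select a parsimonious feature subset and map it to UPDRS, scoring the result in UPDRS points of mean absolute error. The synthetic data, the feature selector and the regressor below are placeholder choices, not the novel features or selection schemes proposed in the thesis.

```python
# Placeholder feature-selection + regression pipeline for UPDRS prediction.
# The data are synthetic; real inputs would be dysphonia measures per phonation.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 300))                       # 200 phonations x 300 features
updrs = np.clip(50.0 + 10.0 * (X[:, :5] @ rng.normal(size=5))
                + rng.normal(scale=5.0, size=200), 0.0, 176.0)   # synthetic target in [0, 176]

model = make_pipeline(SelectKBest(f_regression, k=10),
                      RandomForestRegressor(n_estimators=200, random_state=0))
mae = -cross_val_score(model, X, updrs, cv=5,
                       scoring="neg_mean_absolute_error").mean()
print(f"cross-validated mean absolute error: {mae:.1f} UPDRS points")
```

Keeping the feature selection inside the cross-validated pipeline, as here, avoids the optimistic bias that arises when features are selected on the full data set before validation.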
128. Evolvability: a formal approach. Gallagher, Alexis. January 2009.
This dissertation clarifies the concept of evolvability, the increased capacity of some organisms or systems to support evolution, especially the evolution of life-like complexity. I survey the literature, which is spread over the fields of population genetics, developmental biology, artificial life, and microbial and molecular evolution. Finding that researchers have often used the term vaguely and incompatibly, I identify five distinct kinds or senses of evolvability. I also identify five key constituent ideas, which I discuss in the context of organismic evolvability, a sense of evolvability with deep roots in the traditional fields of animal development and macroevolution. In these fields, research into evolvability has historically been hampered by an insufficiently detailed knowledge of development. Research in molecular evolution has produced a thorough knowledge of the folding of RNA into secondary structure, which can be regarded as a model of development. This has motivated new approaches to evolvability based on representing development via a single genotype-phenotype mapping function. I build on these approaches to invent new mathematical methods to formalise the traditional ideas. I create an exact model illustrating a classic example of evolvability, the capacity for repeated segmentation and simple modularity. I analyse this with two new formal approaches. The first is the genospace algebra, a propositional calculus based on graph theory. It is a formal language for describing genotype-phenotype maps. It provides a system for making calculations, proofs, and diagrams about mutational structures in genotype space, and it is flexible enough to allow description at arbitrary degrees of resolution. The second is a pair of concepts, the genetic leverage and the genetic fulcrum. The leverage provides a crude numerical measure of evolvability, and the fulcrum provides a heuristic for identifying the genomic and developmental causes of evolvability. Besides its specific relevance to diversification and development, evolvability is also crucial to the fundamental question of how evolution produces ordinary biological life. Simulation systems that implement only a conventional textbook model of evolution (systems possessing only variation, inheritance, and selection) fail to evolve anything resembling the complexity of the biological world. Research into evolvability is our best bet to illuminate the "missing ingredient" for life-like evolution.
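The idea of studying evolvability through an explicit genotype-phenotype mapping function can be illustrated with a toy sketch: genotypes are bit strings divided into repeated "segments", the phenotype is the tuple of segment sums, and a crude evolvability score counts how many distinct new phenotypes are reachable by a single point mutation. This is a generic illustration of the mapping-function viewpoint, not the genospace algebra or the genetic leverage and fulcrum developed in the thesis.

```python
# Toy genotype-phenotype map with repeated segments. Evolvability here is
# crudely measured as the number of distinct new phenotypes one mutation away.
SEG = 3                      # segment length; a genotype is 3 segments of 3 bits

def phenotype(g):
    return tuple(sum(g[i:i + SEG]) for i in range(0, len(g), SEG))

def one_mutation_phenotypes(g):
    reachable = set()
    for i in range(len(g)):
        mutant = list(g)
        mutant[i] ^= 1       # flip one bit
        reachable.add(phenotype(tuple(mutant)))
    return reachable - {phenotype(g)}

for g in [(0,) * 9, (1, 0, 0) * 3, (1, 1, 1, 0, 0, 0, 1, 0, 1)]:
    print(g, "->", phenotype(g),
          "| distinct new phenotypes one mutation away:",
          len(one_mutation_phenotypes(g)))
```

Because many genotypes share a phenotype under such a map, the structure of the map, rather than selection alone, determines how much phenotypic novelty mutation can reach, which is the sense in which a mapping function carries information about evolvability.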
129. Ventricular function under LVAD support. McCormick, Matthew. January 2012.
This thesis presents a finite element methodology for simulating fluid–solid interactions in the left ventricle (LV) under LVAD support. The developed model was utilised to study the passive and active characteristics of ventricular function in anatomically accurate LV geometries constructed from normal and patient image data. A non-conforming ALE Navier–Stokes/finite-elasticity fluid–solid coupling system formed the core of the numerical scheme, onto which several novel numerical additions were made. These included a fictitious domain (FD) Lagrange multiplier method to capture the interactions between immersed rigid bodies and encasing elastic solids (required for the LVAD cannula), as well as modifications to the Newton–Raphson/line search algorithm (which provided a 2- to 10-fold reduction in simulation time). Additional developments involved methods for extending the model to ventricular simulations. This required the creation of coupling methods, for both fluid and solid problems, to enable the integration of a lumped parameter representation of the systemic and pulmonary circulatory networks; the implementation and tuning of models of passive and active myocardial behaviour; and the testing of appropriate element types for coupling non-conforming fluid–solid finite element models under high interface tractions (finding that curvilinear spatial interpolations of the fluid geometry perform best). The behaviour of the resulting numerical scheme was investigated in a series of canonical test problems and found to be convergent and stable. The FD convergence studies also found that discontinuous pressure elements were better at capturing pressure gradients across FD boundaries. The ventricular simulations focused firstly on the passive diastolic behaviour of the LV both with and without LVAD support. Substantially different vortical flow features were observed when LVAD outflow was included. Additionally, a study of LVAD cannula lengths, using a particle tracking algorithm to determine recirculation rates of blood within the LV, found that shorter cannulas improved the recirculation of blood from the LV apex. Incorporating myocardial contraction, the model was extended to simulate the full cardiac cycle, converging on a repeating pressure–volume loop within 2 heart beats. Studies on the normal LV geometry found that LVAD implementation restricts the recirculation of early diastolic inflow, and that fluid–solid coupled models introduce greater heterogeneity of myocardial work than was observed in equivalent solid-only models. A patient study was undertaken using a myocardial geometry constructed from image data from an LVAD implant recipient. A series of different LVAD flow regimes were tested. It was found that the opening of the aortic valve had a homogenising effect on the spatial variation of work, indicating that the synchronisation of LVAD outflow with the cardiac cycle is more important if the valve remains shut. Additionally, increasing LVAD outflow during systole and decreasing it during diastole led to improved mixing of blood in the ventricular cavity, compared with either the inverse regime or holding outflow constant. Validation of these findings has the potential to impact the treatment protocols of LVAD patients.
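The lumped parameter circulatory coupling mentioned above can be illustrated with a three-element Windkessel sketch of the systemic circulation, driven by the sum of a pulsatile aortic-valve flow and a constant LVAD outflow. The waveform, parameter values and LVAD flow rate are illustrative assumptions, not the tuned values used in the thesis.

```python
# Three-element Windkessel driven by pulsatile valve flow plus constant LVAD outflow.
# All parameter values and the ejection waveform are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

Rp, Rc, C = 1.0, 0.05, 1.6   # peripheral resistance, characteristic resistance (mmHg*s/mL), compliance (mL/mmHg)
T = 0.8                      # cardiac period (s)
lvad_flow = 50.0             # assumed constant LVAD outflow (mL/s)

def q_valve(t):
    # crude systolic ejection: half-sine over the first 0.3 s of each beat
    tau = t % T
    return 200.0 * np.sin(np.pi * tau / 0.3) if tau < 0.3 else 0.0

def rhs(t, y):
    p = y[0]                                  # pressure at the compliance node
    q_in = q_valve(t) + lvad_flow             # total inflow into the arterial tree
    return [(q_in - p / Rp) / C]

sol = solve_ivp(rhs, (0.0, 10 * T), [80.0], max_step=1e-3)
q_in = np.array([q_valve(t) + lvad_flow for t in sol.t])
p_aortic = sol.y[0] + Rc * q_in               # proximal (aortic) pressure
last = sol.t > 9 * T
print(f"aortic pressure over the last beat: {p_aortic[last].min():.0f} to {p_aortic[last].max():.0f} mmHg")
```

In the thesis the inflow is not prescribed but supplied by the fluid–solid ventricular model, and the lumped network spans both the systemic and pulmonary circulations; the sketch only shows the direction of the coupling.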
130. Efficient numerical methods for ultrasound elastography. Squires, Timothy Richard. January 2012.
In this thesis, two algorithms are introduced for use in ultrasound elastography. Ultrasound elastography is a technique developed in the last 20 years by which anomalous regions in soft tissue are located and diagnosed without the need for biopsy. Because of this, together with the relatively low cost of ultrasound imaging and the high level of accuracy of the methods, ultrasound elastography has shown great potential for the diagnosis of cancer in soft tissues. The algorithms introduced in this thesis represent an advance in this field. The first algorithm is a two-step iteration procedure consisting of two minimization problems, displacement estimation and elastic parameter calculation, that allow for diagnosis of any anomalous regions within soft tissue. The algorithm represents an improvement on existing methods in several ways. A weighting factor is introduced for each point in the tissue, dependent on the confidence in the accuracy of the data at that point; an exponential substitution is made for the elasticity modulus; an adjoint method is used for efficient calculation of the gradient vector; and a total variation regularization technique is used. Most importantly, an adaptive mesh refinement strategy is introduced that allows highly efficient calculation of the elasticity distribution of the tissue using a number of degrees of freedom several orders of magnitude lower than methods that use a uniform mesh refinement strategy. Results are presented that show the algorithm is robust even in the presence of significant noise and that it can locate a tumour of 4 mm in diameter within a 5 cm square region of tissue. The algorithm is also extended into three dimensions, and results are presented that show it can calculate a three-dimensional elasticity distribution efficiently; this extension into 3D is a significant advance in the field. The second algorithm is a one-step algorithm that seeks to combine the two problems of elasticity calculation and displacement estimation into one. As in the two-step algorithm, a weighting factor, an exponential substitution for the elasticity parameter, an adjoint method for calculation of the gradient vector, total variation regularization and an adaptive mesh refinement strategy are incorporated. Results are presented that show that this original approach can locate tumours of varying sizes and shapes in the presence of varying levels of added artificial noise, and that it can determine the presence of a tumour in images taken from breast tissue in vivo.
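A hedged sketch of the kind of weighted, total-variation-regularized objective described above, with the exponential substitution for the elasticity modulus, is given below; the exact functional, weighting and constraints used in the thesis may differ.

```latex
\mu(\mathbf{x}) = e^{m(\mathbf{x})}, \qquad
\min_{m}\; \mathcal{J}(m) =
\tfrac{1}{2}\int_{\Omega} w(\mathbf{x})\,
   \bigl|\mathbf{u}(m) - \mathbf{u}^{\mathrm{meas}}\bigr|^{2}\,\mathrm{d}\Omega
\;+\; \alpha \int_{\Omega} \bigl|\nabla m\bigr|\,\mathrm{d}\Omega ,
```

where u(m) satisfies the elasticity equilibrium equations, w is the pointwise confidence weighting, and α controls the total variation penalty. The exponential substitution enforces positivity of the modulus, the gradient of J with respect to m is assembled from an adjoint solve at the cost of roughly one extra forward problem, and adaptive mesh refinement typically concentrates degrees of freedom where the reconstructed |∇m|, i.e. the suspected inclusion boundary, is large.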