51 |
Trunk Stability during Postural Control: Tool Development and Analysis
Vette, Albert H. 06 December 2012 (has links)
Trunk instability is a major problem for people with spinal cord injury (SCI); it not only limits their independence, but also leads to secondary health complications such as kyphosis, pressure sores, and respiratory dysfunction. In exploring mechanisms that may facilitate or compromise postural stability, dynamic models are very useful because the spine dynamics are difficult to study in vivo compared to other structures of the body. Therefore, one objective of this work was to develop a detailed three-dimensional dynamic model of the human trunk as a tool for investigating the neural-mechanical control strategy that healthy people apply to maintain trunk stability during various tasks. Since trunk control is fairly complex, however, another objective of this work was to provide insights into the balance control strategy of a simpler neuro-musculo-skeletal system that may facilitate future studies on trunk control. For this purpose, the control of the ankle joint complex during quiet standing (anterior-posterior degree of freedom) was studied in place of the trunk. The obtained results reveal that a neural-mechanical control scheme using a proportional-derivative controller as the neural control strategy can overcome a large sensory-motor (feedback) time delay and stabilize the ankle joint during quiet standing. Moreover, a detailed dynamic model of the trunk has been developed that is: (1) based on highly accurate geometric models; and (2) universally applicable. Thus, this work also responds to the postulation that structurally more complex models are needed to better characterize the biomechanics of multifaceted systems. 
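The delayed proportional-derivative (PD) control scheme described above can be illustrated with a minimal simulation. The sketch below is not the thesis's actual model: it treats the body as a linearized single-link inverted pendulum rotating about the ankle, and all parameter values (mass, centre-of-mass height, gains, delay) are illustrative assumptions.

```python
from collections import deque

# Sketch of delayed PD control of quiet standing: the body is a linearized
# single-link inverted pendulum rotating about the ankle joint.
M, H, G = 76.0, 0.9, 9.81             # body mass (kg), COM height (m), gravity
I = M * H ** 2                        # moment of inertia about the ankle (kg m^2)
KP, KD = 1000.0, 300.0                # PD gains; KP must exceed M*G*H (~671 N m/rad)
DELAY, DT, T_END = 0.15, 0.001, 10.0  # sensory-motor delay (s) and integration settings

theta, omega = 0.02, 0.0              # initial lean angle (rad) and angular velocity
buf = deque([(theta, omega)] * int(DELAY / DT))  # state history implementing the delay

for _ in range(int(T_END / DT)):
    th_d, om_d = buf.popleft()                # delayed state seen by the controller
    torque = -(KP * th_d + KD * om_d)         # corrective ankle torque
    alpha = (M * G * H * theta + torque) / I  # gravity destabilizes, PD corrects
    theta += omega * DT
    omega += alpha * DT
    buf.append((theta, omega))

print(abs(theta))  # residual lean angle after 10 s; small despite the 150 ms delay
```

Despite the feedback delay, the lean angle decays back toward upright, consistent with the finding that a PD controller can overcome a large sensory-motor time delay during quiet standing.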
Combining the developed biomechanical tools for the trunk with the postural control insights for the ankle joint during standing will be beneficial for: (1) understanding the neural-mechanical control strategy that facilitates trunk stability in healthy people; and for (2) developing neuroprostheses for trunk stability after SCI and other neurological disorders.
|
52 |
Modeling of Brain Tumors: Effects of Microenvironment and Associated Therapeutic Strategies
Powathil, Gibin George January 2009 (has links)
Gliomas are the most common and aggressive primary brain tumors. The most common treatment protocols for these tumors combine surgery, chemotherapy, and radiotherapy. However, even with the most aggressive combination of surgery and radiotherapy and/or chemotherapy schedules, gliomas almost always recur, resulting in a median patient survival time of no more than 12 months. The highly diffusive and invasive nature of brain tumors makes it very important to study the effects of these combined therapeutic strategies in an effort to improve patient survival. It is also important to study the tumor microenvironment, since the complex nature of the cerebral vasculature, including the blood-brain barrier, and several tumor-induced conditions such as hypoxia, high interstitial pressure, and cerebral edema affect drug delivery as well as the effectiveness of radiotherapy. Recently, a novel strategy using antiangiogenic therapy has been studied for the treatment of brain tumors. Antiangiogenic therapy interferes with the development of the tumor vasculature and thereby indirectly helps control tumor growth. Recent clinical trials suggest that antiangiogenic therapy is usually more effective when given in combination with other therapeutic strategies.
To study the effects of these therapeutic strategies, a spatio-temporal model is considered here that incorporates tumor cell growth and the effects of radiotherapy and chemotherapy. The effects of different radiation therapy schedules are then studied using a generalized linear-quadratic model and compared against published clinical data. The model is then extended to include the interactions of tumor vasculature and oxygen concentration, to explain tumor hypoxia, and to study various methods of hypoxia characterization, including biomarker estimates and needle electrode measurements. The model-predicted hypoxia is also used to analyze the effects of tumor oxygenation status on radiation response, as tumor hypoxia is known to negatively influence radiotherapy outcomes. This thesis also presents a detailed analysis of the effects of heterogeneous tumor vasculature on tumor interstitial fluid pressure and interstitial fluid velocity. A mathematical modeling approach is then used to analyze the changes in interstitial fluid pressure with or without antiangiogenic therapy.
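The radiotherapy component rests on linear-quadratic (LQ) cell-kill kinetics. As a hedged illustration of how fractionation schedules are compared, the sketch below uses the standard LQ model rather than the thesis's generalized variant, with generic assumed alpha/beta values instead of fitted glioma parameters.

```python
import math

# Hedged illustration of fractionation comparison with the standard
# linear-quadratic (LQ) model; alpha and beta are generic assumed values.
ALPHA, BETA = 0.3, 0.03   # Gy^-1, Gy^-2 (alpha/beta ratio = 10 Gy)

def surviving_fraction(n, d):
    """Cell survival after n fractions of d Gy, assuming full repair between fractions."""
    return math.exp(-n * (ALPHA * d + BETA * d * d))

def bed(n, d):
    """Biologically effective dose: n*d*(1 + d/(alpha/beta))."""
    return n * d * (1 + d / (ALPHA / BETA))

# Compare a conventional schedule (30 x 2 Gy) with two hypofractionated ones.
for n, d in [(30, 2.0), (10, 3.0), (5, 5.0)]:
    print(f"{n} x {d} Gy: BED = {bed(n, d):.1f} Gy, survival = {surviving_fraction(n, d):.2e}")
```

Schedules with equal total physical dose can have very different BED and cell kill, which is why schedule comparisons against clinical data need an LQ-type model rather than total dose alone.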
|
54 |
Experimental and computational investigations of therapeutic drug release from biodegradable poly(lactide-co-glycolide) (PLG) microspheres
Berchane, Nader Samir 15 May 2009 (has links)
The need to tailor release-rate profiles from polymeric microspheres remains one of the leading challenges in controlled drug delivery. Microsphere size, which has a significant effect on drug release rate, can potentially be varied to design a controlled drug delivery system with a desired release profile. In addition, the drug release rate from polymeric microspheres depends on material properties such as polymer molecular weight. Mathematical modeling provides insight into the fundamental processes that govern release and, once validated against experimental results, can be used to tailor a desired controlled drug delivery system.
To these ends, PLG microspheres were fabricated using the oil-in-water emulsion technique. A quantitative study describing the size distribution of the resulting microspheres is presented, and a fluid-mechanics-based correlation that predicts the mean microsphere diameter is formulated from the theory of emulsification in turbulent flow. The effects of mean microsphere diameter, polydispersity, and polymer molecular weight on therapeutic drug release rate from PLG microspheres were investigated experimentally. Based on the experimental results, a mathematical theory was developed that incorporates the effects of microsphere size distribution and polymer degradation on drug release. In addition, a numerical optimization technique, based on the least-squares method, was developed to achieve desired therapeutic drug release profiles by combining individual microsphere populations.
The fluid-mechanics-based correlation for mean microsphere diameter provided a close fit to the experimental results. In vitro release experiments show that microsphere size has a significant effect on drug release rate: the initial release rate decreased with increasing microsphere size, and the release profile changed from first-order to concave-upward (sigmoidal) as the microsphere size increased. The mathematical model gave a good fit to the experimental release data. Using the numerical optimization technique, it was possible to achieve desired release profiles, in particular zero-order and pulsatile release, by combining individual microsphere populations in the appropriate proportions.
Overall, this work shows that engineering polymeric microsphere populations with predetermined characteristics is an effective means to obtain desired therapeutic drug release patterns relevant for controlled drug delivery.
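The population-blending idea can be sketched with a small least-squares example. The release curves below are synthetic illustrations (first-order for small spheres, sigmoidal for larger ones), not the thesis's experimental data, and the clip-and-renormalize step is a crude stand-in for a properly constrained optimization; the point is only how mixing fractions of individual populations can approximate a zero-order target profile.

```python
import numpy as np

# Blend microsphere populations by least squares to approach zero-order release.
t = np.linspace(0, 30, 61)                      # time (days)

small = 1 - np.exp(-0.25 * t)                   # small spheres: first-order release
medium = 1 / (1 + np.exp(-(t - 12) / 2.5))      # medium spheres: sigmoidal release
large = 1 / (1 + np.exp(-(t - 22) / 2.5))       # large spheres: delayed sigmoid
R = np.column_stack([small, medium, large])     # one column per population

target = t / t[-1]                              # desired zero-order (linear) profile

w, *_ = np.linalg.lstsq(R, target, rcond=None)  # unconstrained least squares
w = np.clip(w, 0, None)
w /= w.sum()                                    # valid mixing fractions (sum to 1)

rms = np.sqrt(np.mean((R @ w - target) ** 2))
print(w, rms)  # mixing fractions and RMS deviation from the zero-order target
```

The blend tracks the linear target far better than any single population, mirroring the abstract's result that combining populations in appropriate proportions yields zero-order release.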
|
55 |
Mathematical Modeling of FBCs Co-fired with Lignite and Biomass
Morali, Ekrem Mehmet 01 July 2007 (has links) (PDF)
Increasingly strict environmental legislation on pollutant emissions from fossil fuel combustion, together with the desire to extend the life of existing fossil fuel reserves, is driving the use of renewable sources. Biomass, with its renewable nature and lower pollutant emission levels, thus becomes an attractive energy resource. However, the only seasonal availability of biomass and the operational problems caused by the high alkaline content of biomass ash restrict its combustion on its own. These problems can be overcome by co-combustion of biomass with lignite. With its high fuel flexibility and high combustion efficiency, fluidized bed combustion is the most promising technology for co-firing. Improving and optimizing the operation of co-firing systems requires a detailed understanding of the co-combustion of coal and biomass, which can be achieved through both experiments and modeling studies. For this purpose, a comprehensive system model of a fluidized bed combustor, previously developed and tested for predicting the combustion behaviour of fluidized bed combustors fired with lignite, was extended to co-firing lignite with biomass by incorporating volatile release, char combustion, and a population balance for biomass.
The model predictions were validated against experimental measurements taken on METU 0.3 MWt AFBC fired with lignite only, lignite with limestone addition and about 50/50 lignite/olive residue mixture with limestone addition. Predicted and measured temperatures and concentrations of gaseous species along the combustor were found to be in good agreement. Introduction of biomass to lignite was found to decrease SO2 emissions but did not affect NO emissions significantly.
|
56 |
Evaluation and Comparison of Helicopter Simulation Models with Different Fidelities
Yilmaz, Deniz 01 July 2008 (has links) (PDF)
This thesis concerns the development, evaluation, comparison, and testing of a UH-1H helicopter simulation model with various fidelity levels. In particular, the well-known minimum-complexity simulation model is updated with various higher-fidelity simulation components, such as the Peters-He inflow model, horizontal tail contribution, an improved tail rotor model, control mapping, ground effect, fuselage interactions, and ground reactions. Results are compared with available flight test data. The dynamic model is integrated into the open-source simulation environment FlightGear. Finally, the model is cross-checked through evaluations by test pilots.
|
57 |
Dynamics of p53 tetramers in live single cells
Gaglia, Giorgio 06 June 2014 (has links)
Protein homo-oligomerization is the process through which identical peptides bind together to form higher-order complexes. In many cases these self-interactions are constitutive and stable, serving as building blocks for biological structures such as rings, filaments, and membranes. Homo-oligomerization can also be a regulatory process that influences a protein's function, such as changing the transcriptional activity of transcription factors. Innovative methods to measure oligomerization in live cells are needed in order to understand the regulation and function of homo-oligomerization in the native cellular context. This thesis examines the case of the tumor suppressor p53, whose homo-tetramerization greatly influences its activity as a transcription factor. We develop methods to quantify p53 self-interaction in individual living cells and follow it over time after DNA damage. The two methods we developed have complementary strengths and different applications. We first use fluorescence correlation spectroscopy to study the molecular events occurring in the first three hours of the p53 response to double-strand breaks. We find that in the absence of stress p53 is present as a mixture of monomers, dimers, and tetramers. When damage is sensed, oligomerization is rapidly induced and nearly all p53 is bound in tetramers. We combine our data with a mathematical framework to propose the existence of a dedicated mechanism that triggers p53 oligomerization independently of protein stabilization. Next, we use bimolecular fluorescence complementation to probe tetramerization on the longer timescales of the p53 response to ultraviolet radiation. In this context we find that even though the rate of p53 accumulation increases with the radiation dose, p53 tetramers are formed at a steady rate. We therefore propose the existence of an inhibitory mechanism that prevents the oligomerization reaction from following a linear input-output relation.
We identify ARC, a known cofactor of p53, as part of this inhibitory mechanism: downregulation of ARC restores the linear relation between total and tetrameric p53. Finally, in both experimental setups higher oligomerization led to an increase in p53 activity, underscoring the connection between the regulation of oligomerization and the transcriptional activity of p53 in cancer cells. Collectively, this work emphasizes the importance of precise measurements for investigating the regulation and function of higher-order complexes and provides generally applicable methods to quantify homo-oligomerization in live single cells.
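The oligomerization kinetics underlying these observations can be illustrated with a minimal mass-action model of the monomer-dimer-tetramer ladder (2M <-> D, 2D <-> T). The rate constants below are hypothetical, not fitted to the thesis's data; the sketch only shows how raising the association rate, as proposed after DNA damage, shifts nearly all p53 into tetramers without changing total protein.

```python
# Minimal mass-action sketch of the monomer-dimer-tetramer ladder
# (2M <-> D, 2D <-> T), integrated with forward Euler. Rate constants are
# hypothetical placeholders in arbitrary units.

def tetramer_fraction(k_on, k_off=1.0, dt=1e-3, steps=200_000):
    m, d, tet = 1.0, 0.0, 0.0              # start with all protein monomeric
    for _ in range(steps):
        v1 = k_on * m * m - k_off * d      # net dimerization flux
        v2 = k_on * d * d - k_off * tet    # net tetramerization flux
        m += -2 * v1 * dt                  # two monomers consumed per dimer
        d += (v1 - 2 * v2) * dt            # two dimers consumed per tetramer
        tet += v2 * dt
    return 4 * tet                         # fraction of protomers held in tetramers

print(tetramer_fraction(k_on=1.0))   # unstressed: a mixture of all three species
print(tetramer_fraction(k_on=50.0))  # faster association: predominantly tetrameric
```

The update rules conserve total protomers (m + 2d + 4t), so the shift toward tetramers comes purely from the association rate, echoing the proposed damage-triggered oligomerization mechanism that acts independently of protein stabilization.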
|
58 |
Multilevel Methodology for Simulation of Spatio-Temporal Systems with Heterogeneous Activity: Application to Spread of Valley Fever Fungus
Jammalamadaka, Rajanikanth January 2008 (has links)
Spatio-temporal systems with heterogeneity in their structure and behavior pose two major problems. The first is that such systems extend over such large spatial and temporal domains, and consume so many resources to simulate, that they are infeasible to study on current platforms. The second is that the data available for understanding such systems is limited, which also makes it difficult to obtain data for validating their constituent processes while simultaneously considering their global behavior. For example, the valley fever fungus considered in this dissertation is spread over a large spatial grid in the arid Southwest and typically needs to be simulated over several decades to obtain useful information. It is also hard to obtain temperature and moisture data at every grid point of the spatial domain over the region of study. To address the first problem, we develop a method based on the discrete event system specification that exploits the heterogeneity in the activity of the spatio-temporal system and that has been shown to be effective in solving relatively simple partial differential equation systems. The benefit of addressing the first problem is that it then becomes feasible to address the second. We address the second problem with a multilevel methodology based on modeling and simulation and systems theory. This methodology supports the construction of models at different resolutions (base and lumped models), allowing us to refine an initially constructed lumped model with detailed physics-based process models and assess whether they improve on the original lumped models. For that assessment, we use the concept of an experimental frame to delimit where improvement is needed. This allows us to work with the available data, improve the component models in their own experimental frames, and then move them to the overall frame.
In this dissertation, we develop a multilevel methodology and apply it to a valley fever model. Moreover, we study the model's behavior in a particular experimental frame of interest, namely the formation of new sporing sites.
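The activity-based idea (spend computation only where the system is active) can be illustrated with a toy discrete-event model of spread on a one-dimensional grid. The spread rule and delay below are hypothetical placeholders, not the dissertation's valley fever model; they only show that quiescent cells cost nothing.

```python
import heapq

# Toy discrete-event sketch of activity-based simulation on a 1-D grid:
# only active cells generate events, so quiescent regions consume no compute.

def simulate(n=100, spread_delay=1.0, t_end=30.0):
    infected_time = {n // 2: 0.0}          # one colonized cell in the middle
    events = [(spread_delay, n // 2)]      # (time, cell) priority queue
    processed = 0
    while events and events[0][0] <= t_end:
        t, cell = heapq.heappop(events)
        processed += 1
        for nb in (cell - 1, cell + 1):    # spread to immediate neighbors
            if 0 <= nb < n and nb not in infected_time:
                infected_time[nb] = t
                heapq.heappush(events, (t + spread_delay, nb))
    return len(infected_time), processed

colonized, events_processed = simulate()
print(colonized, events_processed)
# A time-stepped scan would touch all n cells at every step; here the event
# count stays proportional to the active front, not to the whole grid.
```

The event count is far below the n * t_end updates a uniform time-stepped scheme would perform, which is the efficiency argument for exploiting activity heterogeneity on large domains.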
|
59 |
Demand fulfillment flexibility in capacitated production planning
Charnsirisakskul, Kasarin 08 1900 (has links)
No description available.
|
60 |
Analysing concerted criteria for local dynamic properties of metabolic systems
Girbig, Dorothee January 2014 (has links)
Metabolic systems tend to exhibit steady states that can be measured in terms of their concentrations and fluxes. These measurements can be regarded as a phenotypic representation of all the complex interactions and regulatory mechanisms taking place in the underlying metabolic network. Such interactions determine the system's response to external perturbations and are responsible, for example, for its asymptotic stability or for oscillatory trajectories around the steady state. However, determining these perturbation responses in the absence of fully specified kinetic models remains an important challenge of computational systems biology.
Structural kinetic modeling (SKM) is a framework to analyse whether a metabolic steady state remains stable under perturbation, without requiring detailed knowledge about individual rate equations. It provides a parameterised representation of the system's Jacobian matrix in which the model parameters encode information about the enzyme-metabolite interactions. Stability criteria can be derived by generating a large number of structural kinetic models (SK-models) with randomly sampled parameter sets and evaluating the resulting Jacobian matrices. The parameter space can be analysed statistically in order to detect network positions that contribute significantly to the perturbation response. Because the sampled parameters are equivalent to the elasticities used in metabolic control analysis (MCA), the results are easy to interpret biologically.
In this project, the SKM framework was extended with several novel methodological improvements. These improvements were evaluated in a simulation study using a set of small example pathways with simple Michaelis-Menten rate laws. Afterwards, a detailed analysis of the dynamic properties of the neuronal TCA cycle was performed in order to demonstrate how the new insights obtained in this work could be used for the study of complex metabolic systems.
The first improvement was achieved by examining the biological feasibility of the elasticity combinations created during Monte Carlo sampling. Using a set of small example systems, the findings showed that the majority of sampled SK-models would yield negative kinetic parameters if they were translated back into kinetic models. To overcome this problem, a simple criterion was formulated that excludes such infeasible models, and applying this criterion changed the conclusions of the SKM experiment.
The second improvement of this work was the application of supervised machine-learning approaches to the analysis of SKM experiments. So far, SKM experiments have focused on the detection of individual enzymes in order to identify single reactions important for maintaining stability or oscillatory trajectories. In this work, this approach was extended by demonstrating how SKM enables the detection of ensembles of enzymes or metabolites that act together in an orchestrated manner to coordinate the pathway's response to perturbations. In doing so, stable and unstable states served as class labels, and classifiers were trained to detect elasticity regions associated with stability and instability. Classification was performed using decision trees and relevance vector machines (RVMs). The decision trees produced good classification accuracy in terms of model bias and generalizability. RVMs outperformed decision trees when applied to small models, but encountered severe problems when applied to larger systems because of their high runtime requirements. The decision tree rulesets were analysed statistically and individually in order to explore the role of individual enzymes and metabolites in controlling the system's trajectories around steady states.
The third improvement of this work was the establishment of a relationship between the SKM framework and the related field of MCA. In particular, it was shown how the sampled elasticities could be converted to flux control coefficients, which were then investigated for their predictive information content in classifier training.
After evaluation on the small example pathways, the methodology was used to study two steady states of the neuronal TCA cycle with respect to their intrinsic mechanisms responsible for stability or instability. The findings showed that several elasticities were jointly coordinated to control stability and that the main source of potential instability was mutations in the enzyme alpha-ketoglutarate dehydrogenase.
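The core SKM procedure (parameterise the Jacobian by elasticities, sample parameter sets, and test the eigenvalues of each resulting Jacobian) can be sketched in a few lines. The example below uses a hypothetical 3-metabolite chain with end-product inhibition, with all steady-state fluxes and concentrations normalized to one so that the Jacobian is simply the stoichiometry applied to the sampled elasticities. The sampling ranges are illustrative assumptions (inhibition down to -10 mimics strongly cooperative feedback), not the thesis's TCA-cycle parameterization.

```python
import numpy as np

# Minimal SKM sketch for a hypothetical chain S1 -> S2 -> S3 with
# end-product inhibition of the input reaction v1 by S3.
rng = np.random.default_rng(0)
n_models, n_stable = 10_000, 0

for _ in range(n_models):
    e21, e32, e43 = rng.uniform(0, 1, 3)   # substrate elasticities of v2, v3, v4
    ef = rng.uniform(-10, 0)               # feedback elasticity of v1 w.r.t. S3
    J = np.array([[-e21, 0.0,  ef ],       # dS1/dt = v1(S3) - v2(S1)
                  [ e21, -e32, 0.0],       # dS2/dt = v2(S1) - v3(S2)
                  [ 0.0,  e32, -e43]])     # dS3/dt = v3(S2) - v4(S3)
    if np.linalg.eigvals(J).real.max() < 0:
        n_stable += 1

frac_stable = n_stable / n_models
print(frac_stable)  # share of sampled SK-models with a stable steady state
```

As in the Monte Carlo experiments described above, the sampled elasticity sets could then be analysed statistically, or used as training data for classifiers with stability as the class label.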
|