
Data-Driven Variational Multiscale Reduced Order Modeling of Turbulent Flows

Mou, Changhong 16 June 2021 (has links)
In this dissertation, we consider two different strategies for improving the accuracy of projection-based reduced order models (ROMs): (I) adding closure terms to the standard ROM; (II) using Lagrangian data to improve the ROM basis. Following strategy (I), we propose a new data-driven ROM framework that centers around the hierarchical structure of the variational multiscale (VMS) methodology and utilizes data to increase the ROM accuracy at a modest computational cost. The VMS methodology is a natural fit for the hierarchical structure of the ROM basis: In the first step, we use the ROM projection to separate the scales into three categories: (i) resolved large scales, (ii) resolved small scales, and (iii) unresolved scales. In the second step, we explicitly identify the VMS-ROM closure terms, i.e., the terms representing the interactions among the three types of scales. In the third step, we use available data to model the VMS-ROM closure terms. Thus, instead of the phenomenological models used in VMS for standard numerical discretizations (e.g., eddy viscosity models), we utilize available data to construct new structural VMS-ROM closure models. Specifically, we build ROM operators (vectors, matrices, and tensors) that are closest to the true ROM closure terms evaluated with the available data. We test the new data-driven VMS-ROM in the numerical simulation of four test cases: (i) the 1D Burgers equation with viscosity coefficient $\nu = 10^{-3}$; (ii) a 2D flow past a circular cylinder at Reynolds numbers $Re=100$, $Re=500$, and $Re=1000$; (iii) the quasi-geostrophic equations at Reynolds number $Re=450$ and Rossby number $Ro=0.0036$; and (iv) a 2D flow over a backward-facing step at Reynolds number $Re=1000$. The numerical results show that the data-driven VMS-ROM is significantly more accurate than standard ROMs. Furthermore, we propose a new hybrid ROM framework for the numerical simulation of fluid flows. 
This hybrid framework incorporates two closure modeling strategies: (i) a structural closure modeling component that involves the recently proposed data-driven variational multiscale ROM approach, and (ii) a functional closure modeling component that introduces an artificial viscosity term. We also utilize physical constraints on the structural ROM operators in order to add robustness to the hybrid ROM. We perform a numerical investigation of the hybrid ROM for the three-dimensional turbulent channel flow at a Reynolds number $Re = 13,750$. In addition, we focus on the mathematical foundations of ROM closures. First, we extend the verifiability concept from large eddy simulation to the ROM setting. Specifically, we call a ROM closure model verifiable if a small ROM closure model error (i.e., a small difference between the true ROM closure and the modeled ROM closure) implies a small ROM error. Second, we prove that a data-driven ROM closure (i.e., the data-driven variational multiscale ROM) is verifiable. For strategy (II), we propose new Lagrangian inner products that we use, together with Eulerian and Lagrangian data, to construct new Lagrangian ROMs. We show that the new Lagrangian ROMs are orders of magnitude more accurate than the standard Eulerian ROMs, i.e., ROMs that use the standard Eulerian inner product and data to construct the ROM basis. Specifically, for the quasi-geostrophic equations, we show that the new Lagrangian ROMs are more accurate than the standard Eulerian ROMs in approximating not only Lagrangian fields (e.g., the finite time Lyapunov exponent (FTLE)), but also Eulerian fields (e.g., the streamfunction). We emphasize that the new Lagrangian ROMs do not employ any closure modeling to model the effect of the discarded modes (which is standard procedure for low-dimensional ROMs of complex nonlinear systems). 
Thus, the dramatic increase in the new Lagrangian ROMs' accuracy is entirely due to the novel Lagrangian inner products used to build the Lagrangian ROM basis. / Doctor of Philosophy / Reduced order models (ROMs) are popular in physical and engineering applications: for example, ROMs are widely used in aircraft design, as they can greatly reduce the computational cost of an aircraft's aeroelastic predictions while retaining good accuracy. However, for high Reynolds number turbulent flows, such as blood flow in arteries, oil transport in pipelines, and ocean currents, standard ROMs may yield inaccurate results. In this dissertation, to improve ROM accuracy for turbulent flows, we investigate three different types of ROMs. Both numerical and theoretical results show that the proposed new ROMs yield more accurate results than the standard ROM and can thus be more useful.
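The operator-fitting step the abstract describes (building ROM operators closest to the true closure terms evaluated with data) can be sketched as a least-squares regression. The following is a minimal, self-contained illustration on synthetic data; the linear-plus-quadratic closure ansatz and all variable names are assumptions for illustration, not the dissertation's actual implementation:

```python
import numpy as np

# Sketch: fit closure operators by least squares, i.e., find the operators
# closest to sampled "true" closure terms. Synthetic data throughout.

rng = np.random.default_rng(0)
r, m = 4, 200                        # ROM dimension, number of snapshots
a = rng.standard_normal((m, r))      # ROM coefficients a(t_j) at sample times

# Manufacture "true" closure data tau from a linear + quadratic model
A_true = rng.standard_normal((r, r))
B_true = rng.standard_normal((r, r, r))
tau = a @ A_true.T + np.einsum('ikl,jk,jl->ji', B_true, a, a)

# Regression matrix with linear and quadratic features, one row per sample
quad = np.einsum('jk,jl->jkl', a, a).reshape(m, r * r)
D = np.hstack([a, quad])                       # shape (m, r + r^2)
X = np.linalg.lstsq(D, tau, rcond=None)[0]     # stacked [A^T; B flattened]

A_fit = X[:r].T
tau_fit = D @ X
print(np.allclose(A_fit, A_true))            # linear operator recovered
print(np.max(np.abs(tau - tau_fit)) < 1e-8)  # closure terms reproduced
```

Since the manufactured closure lies exactly in the span of the features, the least-squares fit reproduces it to machine precision; with real snapshot data the residual measures how well the chosen ansatz can represent the true closure.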

Data-driven modeling and simulation of spatiotemporal processes with a view toward applications in biology

Maddu Kondaiah, Suryanarayana 11 January 2022 (has links)
Mathematical modeling and simulation have emerged as a fundamental means to understand the physical processes around us, with countless real-world applications in applied science and engineering. However, a heavy reliance on first principles, symmetry relations, and conservation laws has limited their applicability to a few scientific domains and even fewer real-world scenarios. Especially in disciplines like biology, the underlying living constituents exhibit a myriad of complexities, such as non-linearities, non-equilibrium physics, self-organization, and plasticity, that routinely escape mathematical treatment based on governing laws. Meanwhile, recent decades have witnessed rapid advancements in computing hardware, sensing technologies, and algorithmic innovations in machine learning. This progress has helped propel data-driven paradigms to unprecedented practical success in fields such as image processing and computer vision, natural language processing, and autonomous transport. In this thesis, we explore, apply, and advance statistical and machine learning strategies that help bridge the gap between data and mathematical models, with a view toward the modeling and simulation of spatiotemporal processes in biology. First, we address the problem of learning interpretable mathematical models of biological processes from limited and noisy data. For this, we propose a statistical learning framework called PDE-STRIDE based on the theory of stability selection and ℓ0-based sparse regularization for parsimonious model selection. The PDE-STRIDE framework enables model learning with relaxed dependencies on tuning parameters, sample size, and noise levels. We demonstrate the practical applicability of our method on real-world data by considering a purely data-driven re-evaluation of the advective triggering hypothesis explaining the embryonic patterning event in the C. elegans zygote. 
As a next natural step, we extend our PDE-STRIDE framework to leverage prior knowledge from physical principles to learn biologically plausible and physically consistent models, rather than models that simply fit the data best. For this, we modify the PDE-STRIDE framework to handle structured sparsity constraints for grouping features, which enables us to: 1) enforce conservation laws, 2) extract spatially varying non-observables, and 3) encode symmetry relations associated with the underlying biological process. We show several applications from systems biology demonstrating that enforcing priors dramatically enhances the robustness and consistency of data-driven approaches. In the following part, we apply our statistical learning framework to learning mean-field deterministic equations of active matter systems directly from stochastic simulations of self-propelled active particles. We investigate two examples of particle models, which differ in the microscopic interaction rules being used. First, we consider a self-propelled particle model endowed with density-dependent motility. For the chosen hydrodynamic variables, our data-driven framework learns continuum partial differential equations that are in excellent agreement with analytically derived coarse-grained equations from the Boltzmann approach. In addition, our structured sparsity framework is able to decode the hidden dependency between particle speed and local density intrinsic to the self-propelled particle model. As a second example, the learning framework is applied to coarse-graining a popular stochastic particle model employed for studying collective cell motion in epithelial sheets. The PDE-STRIDE framework is able to infer a novel PDE model that quantitatively captures the flow statistics of the particle model in the regime of low density fluctuations. 
Modern microscopy techniques produce gigabytes (GB) and terabytes (TB) of data while imaging the spatiotemporal developmental dynamics of living organisms. However, classical statistical learning based on penalized linear regression models struggles with the accurate computation of derivatives in the candidate library and with computational scalability for application to “big” and noisy data sets. For this reason, we exploit the rich parameterization of neural networks, which can efficiently learn from large data sets. Specifically, we explore the framework of Physics-Informed Neural Networks (PINNs), which allow for the seamless integration of physics priors with measurement data. We propose novel strategies for multi-objective optimization that allow for adapting the PINN architecture to multi-scale modeling problems arising in biology. We showcase application examples for both forward and inverse modeling of the mesoscale active turbulence phenomenon observed in dense bacterial suspensions. Employing our strategies, we demonstrate orders of magnitude gains in accuracy and convergence in comparison with the conventional formulation for solving multi-objective optimization in PINNs. In the concluding chapter of the thesis, we set aside model interpretability and focus on learning computable models directly from noisy data for the purpose of pure dynamics forecasting. We propose STENCIL-NET, an artificial neural network architecture that learns a solution-adaptive spatial discretization of an unknown PDE model that can be stably integrated in time with negligible loss in accuracy. To support this claim, we present numerical experiments on long-term forecasting of chaotic PDE solutions on coarse spatiotemporal grids, and also showcase a de-noising application that helps decompose spatiotemporal dynamics from noise in an equation-free manner.
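The sparse-regression idea underlying this kind of model discovery can be illustrated compactly. PDE-STRIDE itself combines stability selection with ℓ0-based regularization; the sketch below instead uses sequentially thresholded least squares (in the spirit of SINDy) on a synthetic candidate library, purely as a simplified stand-in:

```python
import numpy as np

# Simplified stand-in for sparse PDE-model selection: regress du/dt onto a
# library of candidate terms and prune small coefficients. Synthetic data;
# the library here is random, not built from actual derivatives.

def stlsq(Theta, ut, lam=0.1, iters=10):
    """Sequentially thresholded least squares: fit, zero out small
    coefficients, and refit on the surviving candidate terms."""
    xi = np.linalg.lstsq(Theta, ut, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(xi) < lam
        xi[small] = 0.0
        big = ~small
        if big.any():
            xi[big] = np.linalg.lstsq(Theta[:, big], ut, rcond=None)[0]
    return xi

rng = np.random.default_rng(1)
n, p = 500, 8
Theta = rng.standard_normal((n, p))          # candidate library columns
xi_true = np.zeros(p)
xi_true[[1, 3]] = [1.0, -0.5]                # only two active terms
ut = Theta @ xi_true + 1e-3 * rng.standard_normal(n)  # noisy "du/dt"

xi = stlsq(Theta, ut, lam=0.2)
print(np.nonzero(xi)[0])                     # [1 3]
```

The thresholding loop is what enforces parsimony: terms whose coefficients fall below `lam` are discarded, and the remaining ones are refit without them.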

Three essays of healthcare data-driven predictive modeling

Zhouyang Lou (15343159) 26 April 2023 (has links)
<p>Predictive modeling in healthcare involves the development of data-driven and computational models that can predict what will happen, be it for a single individual or for an entire system. The adoption of predictive models can guide various stakeholders’ decision-making in the healthcare sector and consequently improve individual outcomes and the cost-effectiveness of care. With the rapid development of big data and Internet of Things technologies in healthcare, research in healthcare decision-making has grown in both importance and complexity. One of the complexities facing those who would build predictive models is the heterogeneity of patient populations, clinical practices, and intervention outcomes, as well as of diverse health systems. There are many sub-domains in healthcare for which predictive modeling is useful, such as disease risk modeling, clinical intelligence, pharmacovigilance, precision medicine, hospitalization process optimization, digital health, and preventive care. In my dissertation, I focus on predictive modeling for applications that fit into three broad and important domains of healthcare, namely clinical practice, public health, and healthcare systems. I present three papers comprising a collection of predictive modeling studies that address the challenge of modeling heterogeneity in healthcare. The first paper presents a decision-tree model to address clinicians’ need to decide among various liver cirrhosis diagnosis strategies. The second paper, a public-health application, presents a micro-simulation model that assesses the impact on cardiovascular disease (CVD) to help decision makers at government agencies develop cost-effective food policies for preventing cardiovascular diseases. 
The third paper compares a set of data-driven prediction models, the best-performing of which is paired with interpretable machine learning to facilitate coordinated optimization for hospital-discharged patients choosing skilled nursing facilities. This collection of studies addresses important modeling challenges in specific healthcare domains and also contributes broadly to research in medical decision-making, public health policy, and healthcare systems.</p>

Data-driven Dynamic Baseline Calibration Method for Gas Sensors / Datadriven Dynamisk Baslinjekalibreringsmetod för Gassensorer

Yang, Cheng January 2021 (has links)
Automatic Baseline Correction is the state-of-the-art calibration method of non-dispersive infrared CO2 sensing, which is the standard CO2 gas monitoring method. In this thesis, we improve it by introducing the dynamic baseline based on environmental data. The 96 data sets from 48 atmospheric stations verify the characteristics of the annual growth trend and seasonality of the baseline model. In order to improve the accuracy of the calibration, the k-means clustering method is used to identify different types of baselines. Then the localized dynamic baseline model is predicted by using the location information of the stations only, which provides an executable calibration implementation for dynamic baseline calibration without relying on historical CO2 data. / Automatisk baslinjekorrigering är den senaste kalibreringsmetoden för icke-dispersiv infraröd CO2 avkänning, vilket är standard CO2 gasövervakningsmetod. I denna avhandling förbättrar vi den genom att introducera den dynamiska baslinjen baserat på miljödata. De 96 datamängderna från 48 atmosfärstationer bekräftar egenskaperna för den årliga tillväxttrenden och säsongsmässigheten hos basmodellen. För att förbättra kalibreringens noggrannhet används k-medelklusteringsmetoden för att identifiera olika typer av baslinjer. Därefter förutses den lokaliserade dynamiska baslinjemodellen med endast platsinformationen för stationerna, som ger en körbar kalibreringsimplementering för dynamisk baslinjekalibrering utan att förlita sig på historisk CO2 data.
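The clustering step can be illustrated with a small synthetic example. The station curves, trend and amplitude values, and the choice of k below are invented for illustration; only the general idea (k-means separating baseline types by their trend-plus-seasonality shape) mirrors the thesis pipeline:

```python
import numpy as np

# Illustrative sketch: k-means grouping two synthetic "types" of seasonal
# CO2 baselines. All numbers are invented; real station data would replace
# the simulated curves.

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 52)            # one year, weekly samples

def baseline(trend, amp):
    """Annual growth trend plus a seasonal cycle around a 400 ppm level."""
    return 400.0 + trend * t + amp * np.sin(2 * np.pi * t)

# Two synthetic station types: strong vs. weak seasonal amplitude
X = np.vstack([baseline(2.0, 6.0) + rng.normal(0, 0.3, 52) for _ in range(20)]
              + [baseline(2.0, 1.0) + rng.normal(0, 0.3, 52) for _ in range(20)])

def kmeans(X, k=2, iters=50):
    """Plain k-means with deterministic seeding for reproducibility."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)                         # nearest center
        centers = np.vstack([X[labels == j].mean(0) for j in range(k)])
    return labels

labels = kmeans(X)
# Each synthetic station type should land in a single, distinct cluster
print(labels[:20].std() == 0 and labels[20:].std() == 0 and labels[0] != labels[-1])
```

In the thesis setting, the cluster assignment would then be predicted from station location alone, so that a new sensor can pick an appropriate localized baseline without historical CO2 records.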

Deriving a mathematical framework for data-driven analyses of immune cell dynamics

Burt, Philipp 06 January 2023 (has links)
Zelluläre Entscheidungen, wie z. B. die Differenzierung von T-Helferzellen (Th-Zellen) in spezialisierte Effektorlinien, haben großen Einfluss auf die Spezifität von Immunreaktionen. Solche Reaktionen sind das Ergebnis eines komplexen Zusammenspiels einzelner Zellen, die über kleine Signalmoleküle, so genannte Zytokine, kommunizieren. Die hohe Anzahl der Komponenten, sowie deren komplizierte und oft nichtlineare Interaktionen erschweren dabei die Vorhersage, wie bestimmte zelluläre Reaktionen erzeugt werden. Aus diesem Grund sind die globalen Auswirkungen der gezielten Beeinflussung einzelner Zellen oder spezifischer Signalwege nur unzureichend verstanden. So wirken beispielsweise etablierte Behandlungen von Autoimmunkrankheiten oft nur bei einem Teil der Patienten. Durch Einzelzellmethoden wie Live-Cell-Imaging, Massenzytometrie und Einzelzellsequenzierung, können Immunzellen heutzutage quantitativ auf mehreren Ebenen charakterisiert werden. Diese Ansammlung quantitativer Daten erlaubt die Formulierung datengetriebener Modelle zur Vorhersage von zellulären Entscheidungen, allerdings fehlen in vielen Fällen Methoden, um die verschiedenen Daten auf geeignete Weise zu integrieren und zu annotieren. Die vorliegende Arbeit befasst sich mit quantitativen Modellformulierungen für die Entscheidungsfindung von Zellen im Immunsystem mit dem Schwerpunkt auf Lymphozytenproliferation, -differenzierung und -tod. / Cellular decisions, such as the differentiation of T helper (Th) cells into specialized effector lineages, largely impact the direction of immune responses. Such population-level responses are the result of a complex interplay of individual cells which communicate via small signaling molecules called cytokines. The system's complexity, stemming not only from the number of components but also from their intricate and oftentimes non-linear interactions, makes it difficult to develop intuition for how cellular responses are actually generated. 
Not surprisingly, the global effects of targeting individual cells or specific signaling pathways through perturbations are poorly understood. For instance, common treatments of autoimmune diseases often work for some patients but not for others. Recently developed methods such as live-cell imaging, mass cytometry, and single-cell sequencing now enable the quantitative characterization of individual immune cells. This accumulating wealth of quantitative data has laid the basis for deriving predictive, data-driven models of immune cell behavior, but in many cases, methods to integrate and annotate the data in a way suitable for model formulation are missing. In this thesis, quantitative workflows and methods are introduced that make it possible to formulate data-driven models of immune cell decision-making, with a particular focus on lymphocyte proliferation, differentiation, and death.
