  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
1

Truth discovery under resource constraints

Etuk, Anthony Anietie January 2015 (has links)
Social computing initiatives that mark a shift from personal computing towards computations involving collective action are driving a dramatic evolution in modern decision-making. Decision-makers or stakeholders can now tap into the power of tremendous numbers and varieties of information sources (crowds), capable of providing information for decisions that could impact individual or collective well-being. More information sources do not necessarily translate to better information quality, however. Social influence in online environments, for example, may bias collective opinions. In addition, querying information sources may be costly in real-world applications, in terms of energy, bandwidth, delay overheads, etc. In this research, we propose a general approach for truth discovery in resource-constrained environments, where there is uncertainty regarding the trustworthiness of sources. First, we present a model of diversity, which allows a decision-maker to form groups made up of sources likely to provide similar reports. We demonstrate that this mechanism is able to identify different forms of dependencies among information sources, and hence has the potential to mitigate the risk of double-counting evidence due to correlated biases among information sources. Second, we present a sampling decision-making model, which combines source diversification and reinforcement learning to drive sampling strategy. We demonstrate that this mechanism is effective in guiding sampling decisions given different task constraints or information needs. We evaluate our model by comparing it with algorithms representing classes of existing approaches reported in the literature.
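The abstract above describes combining source diversification with reinforcement learning to decide which sources to sample under cost constraints. As a generic illustration of that idea (not the author's actual model — the function names and parameters here are assumptions), an epsilon-greedy bandit over sources of unknown reliability can be sketched as:

```python
import random

def select_source(estimates, eps, rng):
    """Epsilon-greedy choice over sources: explore a random source with
    probability eps, otherwise exploit the source whose estimated
    reliability is currently highest."""
    if rng.random() < eps:
        return rng.randrange(len(estimates))
    return max(range(len(estimates)), key=lambda k: estimates[k])

def run(true_reliability, steps=5000, eps=0.1, seed=1):
    """Sample sources for a number of steps, updating a running mean
    of each source's observed reliability (1 = truthful report)."""
    rng = random.Random(seed)
    n = len(true_reliability)
    estimates, counts = [0.0] * n, [0] * n
    for _ in range(steps):
        k = select_source(estimates, eps, rng)
        reward = 1.0 if rng.random() < true_reliability[k] else 0.0
        counts[k] += 1
        estimates[k] += (reward - estimates[k]) / counts[k]  # incremental mean
    return estimates, counts
```

Under a fixed query budget, a sampler like this concentrates queries on the most reliable sources while still occasionally probing the rest; the thesis's contribution is in combining such sampling with source diversification to avoid correlated sources, which this sketch does not model.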
2

Statistical Error in Particle Simulations of Low Mach Number Flows

Hadjiconstantinou, Nicolas G., Garcia, Alejandro L. 01 1900 (has links)
We present predictions for the statistical error due to finite sampling in the presence of thermal fluctuations in molecular simulation algorithms. Expressions for the fluid velocity, density and temperature are derived using equilibrium statistical mechanics. The results show that the number of samples needed to adequately resolve the flow-field scales as the inverse square of the Mach number. The theoretical results are verified for a dilute gas using direct Monte Carlo simulations. The agreement between theory and simulation verifies that the use of equilibrium theory is justified. / Singapore-MIT Alliance (SMA)
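The inverse-square dependence of the required sample count on Mach number reported above can be illustrated with a back-of-envelope estimate. This is a hedged sketch, not the paper's derivation: thermal velocity fluctuations are of order the sound speed, so the fractional error in the mean flow velocity after N independent samples scales as 1/(Ma·√N); the unit prefactor `c_ratio` is an assumption.

```python
import math

def samples_needed(mach, rel_error, c_ratio=1.0):
    """Solve rel_error = c_ratio / (mach * sqrt(N)) for N: the number of
    independent samples needed to resolve the mean flow velocity to a
    given fractional error at a given Mach number."""
    return math.ceil((c_ratio / (mach * rel_error)) ** 2)

# Halving the Mach number quadruples the required samples (inverse-square scaling).
```

For example, resolving a Ma = 0.1 flow to 5% fractional error requires on the order of 40,000 statistically independent samples under this assumed prefactor, which is why low-Mach-number particle simulations are so expensive.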
3

Computer system modeling at the hardware platform level /

Saghir, Amir, January 1900 (has links)
Thesis (M.App.Sc.)--Carleton University, 2002. / Includes bibliographical references (p. 80-81). Also available in electronic format on the Internet.
4

Modélisation multi-échelle et hybride des maladies contagieuses : vers le développement de nouveaux outils de simulation pour contrôler les épidémies / Multi-scale socio-environmental modeling of epidemiological process: a way for organizing human environments and rhythms to control and prevent the spread of contagious diseases

Hessami, Mohammad Hessam 23 June 2016 (has links)
Theoretical studies in epidemiology mainly use differential equations, often under unrealistic assumptions (e.g. spatially homogeneous populations), to study the development and spreading of contagious diseases. Such models are not, however, well adapted to understanding epidemiological processes at different scales, nor are they efficient for correctly predicting epidemics. Such models should instead be closely related to the social and spatial structure of populations. In the present thesis, we propose a series of new models in which different levels of spatiality (e.g. the local structure of the population, in particular group dynamics, the spatial distribution of individuals in the environment, the role of resistant people, etc.) are taken into account to explain and predict how communicable diseases develop and spread at different scales, even at the scale of large populations. Furthermore, the manner in which our models are parametrised allows them to be connected together so as to describe the epidemiological process at a large scale (population of a big town, a country, ...) and, at the same time, with accuracy in limited areas (office buildings, schools). We first succeed in including the notion of groups in SIR (Susceptible, Infected, Recovered) differential equation systems by rewriting the SIR dynamics in the form of an enzymatic reaction with non-competitive inhibition: group complexes of different composition in S, I and R individuals form, and R people behave as non-competitive inhibitors. Then, global group dynamics simulated by stochastic algorithms in a homogeneous space, as well as emergent group dynamics obtained in multi-agent systems, are coupled to such SIR epidemic models. As our group-based models provide fine-grained information (i.e. microscopic resolution in time, space and population), we propose an analysis of the criticality of epidemiological processes. We think that diseases in a given social and spatial environment present characteristic signatures, and that such measurements could allow the identification of the factors that modify their dynamics. We aim here to extract the essence of real epidemiological systems by using various methods based on different computer-oriented approaches. As our models can take into account individual behaviours and group dynamics, they are able to use big-data information yielded from smart-phone technologies and social networks. As a long-term objective derived from the present work, one can expect good predictions of the development of epidemics, but also a tool to reduce epidemics by guiding new environmental architectures and by changing human health-related behaviours.
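The classic SIR dynamics that this thesis extends with group structure can be sketched with a minimal explicit-Euler integrator. This is a generic illustration of the baseline compartmental model, not the thesis's group-based or enzymatic-reaction formulation; the parameter values are assumptions.

```python
def sir_step(s, i, r, beta, gamma, dt):
    """One explicit-Euler step of the classic SIR equations:
    dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I."""
    new_inf = beta * s * i * dt
    rec = gamma * i * dt
    return s - new_inf, i + new_inf - rec, r + rec

def simulate(s0=0.99, i0=0.01, beta=0.3, gamma=0.1, dt=0.1, steps=2000):
    """Integrate SIR forward; fractions S + I + R stay constant at 1."""
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
    return s, i, r
```

With a basic reproduction number R0 = beta/gamma = 3, the epidemic infects most of the population before dying out; the thesis's point is that such homogeneous-mixing equations miss the group-level and spatial structure that its coupled models add.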
5

Foundations and Applications of Entanglement Renormalization

Glen Evenbly Unknown Date (has links)
Understanding the collective behavior of a quantum many-body system, a system composed of a large number of interacting microscopic degrees of freedom, is a key aspect in many areas of contemporary physics. However, as a direct consequence of the difficulty of the so-called many-body problem, many exotic quantum phenomena involving extended systems, such as high-temperature superconductivity, remain not well understood on a theoretical level. Entanglement renormalization (ER) is a recently proposed numerical method for the simulation of many-body systems which draws together ideas from the renormalization group and from the field of quantum information. By taking due care of the quantum entanglement of a system, entanglement renormalization has the potential to go beyond the limitations of previous numerical methods and to provide new insight into quantum collective phenomena. This thesis comprises a significant portion of the research development of ER following its initial proposal. This includes exploratory studies with ER in simple systems of free particles, the development of the optimisation algorithms associated with ER, and the early applications of ER in the study of quantum critical phenomena and frustrated spin systems.
6

Transient simulation of thermal networks using multi-dimensional model reduction /

Brajtman, Michal, January 1900 (has links)
Thesis (M. App. Sc.)--Carleton University, 2004. / Includes bibliographical references (p. 146-155). Also available in electronic format on the Internet.
7

Development of systems for 3D target/organ definitions and contouring /

Han, Bo, January 1900 (has links)
Thesis (M.C.S.)--Carleton University, 2004. / Includes bibliographical references (p. 92-97). Also available in electronic format on the Internet.
8

Modeling Strategies for Computational Systems Biology

Simoni, Giulia 20 March 2020 (has links)
Mathematical models and their associated computer simulations are nowadays widely used in several research fields, such as the natural sciences, engineering, and the social sciences. In the context of systems biology, they provide a rigorous way to investigate how complex regulatory pathways are connected and how the disruption of these processes may contribute to the development of a disease, ultimately investigating the suitability of specific molecules as novel therapeutic targets. In the last decade, the launch of the precision medicine initiative has motivated the need to define innovative computational techniques that could be used for customizing therapies. In this context, the combination of mathematical models and computational strategies is an essential tool for biologists, who can analyze complex system pathways, as well as for the pharmaceutical industry, which is involved in promoting programs for drug discovery. In this dissertation, we explore different modeling techniques that are used for the simulation and analysis of complex biological systems. We analyze the state of the art for simulation algorithms in both the stochastic and the deterministic frameworks. The same dichotomy has been studied in the context of sensitivity analysis, identifying the main pros and cons of the two approaches. Moreover, we studied the quantitative systems pharmacology (QSP) modeling approach that elucidates the mechanism of action of a drug on the biological processes underlying a disease. Specifically, we present the definition, calibration and validation of a QSP model describing Gaucher disease type 1 (GD1), one of the most common rare lysosomal storage disorders. All of these techniques are finally combined to define a novel computational pipeline for patient stratification.
Our approach uses modeling techniques, such as model simulations, sensitivity analysis and QSP modeling, in combination with experimental data to identify the key mechanisms responsible for the stratification. The pipeline has been applied to three test cases in different biological contexts: a whole-body model of dyslipidemia, the QSP model of GD1 and a QSP model of cardiac electrophysiology. In these test cases, the pipeline proved to be accurate and robust, allowing the interpretation of the mechanistic differences underlying the phenotype classification.
9

Coherent Quantum Control and Quantum Simulation of Chemical Reactions

Sumit Suresh Kale (17743605) 18 March 2024 (has links)
This thesis explores the intersection of quantum interference, entanglement, and quantum algorithms in the context of chemical reactions. The initial exploration delves into constructive quantum interference in the photoassociation reaction of a 87Rb Bose-Einstein condensate (BEC), where a coherent superposition of multiple bare spin states is achieved and its impact on photoassociation (PA) is studied. Employing a quantum processor, the study illustrates that interferences can function as a resource for coherent control in photochemical reactions, presenting a universally applicable framework relevant to a spectrum of ultracold chemical reactions. The subsequent inquiry scrutinizes the entanglement dynamics between the spin and momentum degrees of freedom in an optically confined BEC of 87Rb atoms, induced by Raman and RF fields. Significantly, this study unveils substantial spin-momentum entanglement under specific experimental conditions, indicating potential applications in the realm of quantum information processing. Finally, the third study advances a quantum algorithm for the computation of scattering matrix elements in chemical reactions, adeptly navigating the complexities of quantum interactions. This algorithm, rooted in the time-dependent method and Möller operator formulation, is applied to scenarios such as 1D semi-infinite square well potentials and collinear hydrogen exchange reactions, showcasing its potential to enhance our comprehension of intricate quantum interactions within chemical systems.
10

A Comparison of Computational Efficiencies of Stochastic Algorithms in Terms of Two Infection Models

Banks, H. Thomas, Hu, Shuhua, Joyner, Michele, Broido, Anna, Canter, Brandi, Gayvert, Kaitlyn, Link, Kathryn 01 July 2012 (has links)
In this paper, we investigate three particular algorithms: a stochastic simulation algorithm (SSA), and explicit and implicit tau-leaping algorithms. To compare these methods, we used them to analyze two infection models: a Vancomycin-resistant enterococcus (VRE) infection model at the population level, and a Human Immunodeficiency Virus (HIV) within-host infection model. While the first has a low species count and few transitions, the second is more complex with a comparable number of species involved. The relative efficiency of each algorithm is determined based on computational time and degree of precision required. The numerical results suggest that all three algorithms have similar computational efficiency for the simpler VRE model, and the SSA is the best choice due to its simplicity and accuracy. In addition, we have found that with the larger and more complex HIV model, implementation and modification of tau-leaping methods are preferred.
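The direct-method SSA compared in this paper can be sketched for a minimal infection model. This is a generic illustration with assumed rates; the VRE and HIV models analyzed in the paper are more elaborate.

```python
import random

def gillespie_sir(s, i, r, beta, gamma, t_max, rng=random.Random(0)):
    """Direct-method SSA on a simple SIR infection model: at each step,
    draw an exponential waiting time from the total propensity, then
    choose a reaction with probability proportional to its propensity."""
    t = 0.0
    n = s + i + r
    while t < t_max and i > 0:
        a_inf = beta * s * i / n   # infection propensity S + I -> 2I
        a_rec = gamma * i          # recovery propensity  I -> R
        a_tot = a_inf + a_rec
        t += rng.expovariate(a_tot)
        if rng.random() * a_tot < a_inf:
            s, i = s - 1, i + 1
        else:
            i, r = i - 1, r + 1
    return s, i, r
```

Every reaction event is simulated individually, which is exact but slow when propensities are large; tau-leaping trades that exactness for speed by firing many reactions per (larger) time step, which is why the paper finds it preferable for the bigger HIV model.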
