31

Modelo de simulação estocástica da demanda de água em edifícios residenciais. / Stochastic simulation model of water demand in residential buildings.

Tiago de Vasconcelos Gonçalves Ferreira 19 January 2018 (has links)
Ao longo dos anos, pesquisadores têm liderado estudos com o objetivo de investigar o perfil de consumo de água em edifícios, os quais contribuem para o conhecimento no que tange ao correto dimensionamento dos sistemas prediais. No contexto dos métodos para a caracterização das solicitações, as rotinas comumente empregadas para a obtenção das vazões de projeto foram, em sua maioria, propostas na metade do século XX. Estes modelos precisam ser revisados e readequados para a realidade de conservação existente atualmente. Nos últimos anos, alguns estudos propuseram modelos de simulação com foco de aplicação em sistemas prediais de distribuição de água, devido ao comportamento aleatório e temporal das solicitações neste tipo de sistema. Neste trabalho foi proposto um modelo de simulação estocástica da demanda de água em edifícios residenciais, que contemplou a modelagem comportamental dos usuários e a interação destes com o sistema, a fim de aperfeiçoar o processo de dimensionamento dos sistemas prediais de distribuição de água. Para isto, foram revisadas as bases teóricas de modelos propostos anteriormente com interesse de identificar aspectos significativos e construir um novo modelo, que mesclou a modelagem comportamental dos usuários e do sistema hidráulico. Para a obtenção dos valores das variáveis intervenientes, foi feita uma consulta em trabalhos dentro do contexto nacional e uma coleta de dados em campo. Os resultados da pesquisa em campo mostraram a correlação entre a rotina dos usuários e o volume de água consumida e um aumento médio de 192% do valor da vazão de projeto obtida pelo Método dos Pesos Relativos quando comparada com as vazões obtidas no medidor dos apartamentos monitorados. Em posse de todos os dados de entrada, foram feitas diferentes simulações que variaram o tipo do chuveiro instalado nos apartamentos. Quando comparadas as vazões obtidas pela simulação e pelo Método dos Pesos Relativos, em todos os componentes do sistema, a redução da vazão de projeto variou entre 4% e 61%. Em termos de consumo de material, a redução ficou entre 25% e 63%. /
Over the years, researchers have conducted studies investigating the water consumption profile of buildings, which contribute to knowledge about the correct sizing of building hydraulic systems. Among the methods for characterizing demand, the routines commonly used to obtain design flow rates were mostly proposed in the mid-20th century; these models need to be revised and adapted to today's water-conservation reality. In recent years, some studies have proposed simulation models aimed at building water distribution systems, motivated by the random, time-dependent behavior of demand in this type of system. In this study, a stochastic simulation model of water demand in residential buildings was proposed, combining behavioral modeling of the users with their interaction with the system, in order to improve the design process of building water distribution systems. To this end, the theoretical bases of previously proposed models were reviewed to identify significant aspects for the construction of a new model that merges behavioral modeling of the users and of the hydraulic system. The values of the intervening variables were obtained from a review of studies in the Brazilian context and from field data collection. The field results show a correlation between the users' routines and the volume of water consumed, as well as an average increase of 192% in the design flow rate obtained by the Brazilian Standard Method (Método dos Pesos Relativos) compared with the flow rates measured at the meters of the monitored apartments. With all input data in hand, different simulations were run, varying the type of shower installed in the apartments. When the flow rates obtained by simulation were compared with those of the Brazilian Standard Method, the reduction in design flow across all components of the system ranged from 4% to 61%; in terms of material consumption, the reduction ranged from 25% to 63%.
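The core mechanism behind a stochastic demand model of this kind (fixture uses arriving at random times and overlapping to form an aggregate flow) can be illustrated with a minimal Monte Carlo sketch. This is not the model proposed in the thesis; the fixtures, usage rates, and flow values below are assumed purely for illustration.

```python
# Illustrative sketch only: a generic Monte Carlo fixture-use model with assumed
# rates and flows, not the model developed in the thesis.
import random

FIXTURES = {                 # name: (mean uses per hour, flow in L/s, duration in s)
    "shower": (0.25, 0.10, 480),
    "kitchen_tap": (1.5, 0.08, 60),
    "toilet": (0.8, 0.12, 90),
}

def simulate_day(seconds=86400, seed=None):
    """Return per-second aggregate demand (L/s) for one simulated day."""
    rng = random.Random(seed)
    demand = [0.0] * seconds
    for uses_per_hour, flow, duration in FIXTURES.values():
        t, rate = 0.0, uses_per_hour / 3600.0      # events per second
        while True:
            t += rng.expovariate(rate)             # exponential inter-arrival times
            if t >= seconds:
                break
            start = int(t)
            for s in range(start, min(start + duration, seconds)):
                demand[s] += flow                  # simultaneous uses add up
    return demand

demand = simulate_day(seed=42)
design_flow = sorted(demand)[int(0.999 * len(demand))]    # 99.9th-percentile demand
print(f"peak: {max(demand):.2f} L/s, 99.9th percentile: {design_flow:.2f} L/s")
```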
32

Estimativa de propriedades petrofísicas através da reconstrução 3D do meio poroso a partir da análise de imagens / Prediction of petrophysical properties by 3D reconstruction of porous media from image analysis

De Gasperi, Patricia Martins Silva 12 October 1999 (has links)
Orientadores: Euclides Jose Bonet, Marco Antonio Schreiner Moraes / Dissertação (mestrado) - Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica
Resumo: Este trabalho tem como objetivos o estudo e a aplicação do processo de estimativa de propriedades petrofísicas a partir de informações obtidas em imagens petrográficas bidimensionais. O método assume a hipótese da homogeneidade estatística, e utiliza a simulação estocástica para a reconstrução do modelo tridimensional do meio poroso. A caracterização geométrica do meio simulado permite a elaboração de um modelo de rede para a simulação do fluxo e a estimativa da permeabilidade, fator de formação, pressão capilar por injeção de mercúrio e relação índice de resistividade versus saturação de água. Esta metodologia é aplicada a quatro sistemas porosos com diferentes níveis de heterogeneidade. Os resultados demonstram que estimativas confiáveis dependem da utilização de uma resolução apropriada de aquisição das imagens, que permita a identificação de poros e gargantas que efetivamente controlem as propriedades de fluxo do sistema. As curvas de pressão capilar simuladas sugerem a necessidade da composição de escalas. As propriedades elétricas são afetadas pela porosidade das amostras e sua confiabilidade é restrita a sistemas preferencialmente molháveis pela água /
Abstract: The aim of this work is to investigate and apply a method for predicting petrophysical properties from two-dimensional petrographic image data. Based on the assumption of statistical homogeneity, the method uses stochastic simulation to reconstruct the three-dimensional structure of the porous medium. The geometrical characterization of the simulated medium allows the construction of a network model to simulate fluid flow and estimate permeability, formation factor, mercury-injection capillary pressure curves, and the resistivity index as a function of water saturation. The method is applied to four porous systems with different levels of heterogeneity. The results demonstrate that good predictions depend on an appropriate image acquisition resolution, one that identifies the pores and throats that effectively control the flow properties of the system. The simulated capillary pressure curves suggest the need for scale composition. The electrical properties are affected by the samples' porosity, and reliable estimates are restricted to water-wet systems / Mestrado / Mestre em Engenharia de Petróleo
33

Optimizing stochastic simulation of a neuron with parallelization

Liss, Anders January 2017 (has links)
In order to speed up stochastic simulations of neuron channels, an attempt was made to parallelize the solver. The implementation was unsuccessful. Parallelizing the solver is not impossible, however, and it remains a field of research with considerable potential for improving the performance of stochastic simulations.
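The abstract does not detail the attempted parallelization, but the simplest form of parallelism for stochastic channel models is running independent realizations on separate processes. The sketch below is a generic, assumed example (a two-state channel with made-up rate constants), not the solver described in the thesis.

```python
# Illustrative sketch: independent stochastic realizations run in parallel processes.
import random
from multiprocessing import Pool

def simulate_channel(args):
    """Gillespie-style simulation of a single two-state (closed/open) channel."""
    t_end, seed = args
    rng = random.Random(seed)
    k_open, k_close = 5.0, 20.0          # assumed rate constants (1/s)
    t, state, open_time = 0.0, 0, 0.0
    while t < t_end:
        rate = k_open if state == 0 else k_close
        dt = rng.expovariate(rate)       # exponential dwell time in the current state
        if state == 1:
            open_time += min(dt, t_end - t)
        t += dt
        state = 1 - state
    return open_time / t_end             # fraction of time spent open

if __name__ == "__main__":
    with Pool() as pool:
        fractions = pool.map(simulate_channel, [(10.0, s) for s in range(8)])
    print("mean open probability:", sum(fractions) / len(fractions))
```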
34

[en] D-ENGINE: FRAMEWORK FOR THE RANDOM EXECUTION OF PLANS IN AGENT-BASED MODELS / [pt] D-ENGINE: FRAMEWORK PARA A EXECUÇÃO ALEATÓRIA DE PLANOS EM MODELOS BASEADOS EM AGENTES

WALDECIR VICENTE FARIA 24 May 2016 (has links)
[pt] Uma questão importante em sistemas baseados em agentes é como executar uma ação planejada de uma maneira aleatória. Saber responder esta questão é fundamental para manter o interesse do usuário em um determinado produto, não apenas porque torna a experiência menos repetitiva, mas também porque a torna mais realista. Este tipo de execução de ações pode ser aplicado principalmente em simuladores, jogos sérios ou de entretenimento que se baseiam em modelos de agentes. Algumas vezes, a aleatoriedade pode ser obtida pela simples geração de números aleatórios. Porém, quando estamos criando um produto mais complexo, é recomendável usar algum conhecimento estatístico ou estocástico para não arruinar a experiência de consumo deste produto. Neste trabalho, nós damos suporte à criação de animações e histórias dinâmicas e interativas usando um modelo arbitrário baseado em agentes. Para isto, inspirado em métodos estocásticos, nós propomos um novo framework, chamado D-Engine, que é capaz de criar um conjunto de timestamps aleatórios, mas com um comportamento esperado bem conhecido, que descrevem a execução de ações em regime de tempo discreto e a uma determinada taxa. Ao mesmo tempo em que estes timestamps nos permitem animar uma história, uma ação ou uma cena, os resultados gerados com o nosso framework podem ser usados para auxiliar outras aplicações, tais como previsões de resultado, planejamento não determinístico, mídia interativa e criação de estórias. Nesta dissertação também mostramos como criar dois aplicativos diferentes usando o framework proposto: um cenário de duelo em um jogo e um site de leilões interativo. /
[en] An important question in agent-based systems is how to execute a planned action in a random way. Answering this question is fundamental to keeping the user's interest in a product, not just because it makes the experience less repetitive, but also because it makes the product more realistic. This kind of action execution can be applied mainly in simulators and in serious or entertainment games based on agent models. Sometimes the randomness can be achieved simply by generating random numbers. However, when creating a more complex product, it is advisable to use some statistical or stochastic knowledge so as not to ruin the product's consumption experience. In this work we support the creation of dynamic and interactive animations and stories using an arbitrary agent-based model. Inspired by stochastic methods, we propose a new framework, called D-Engine, which is able to create a set of random timestamps with a well-known expected behavior, describing the execution of an action in discrete time at a specific rate. While these timestamps allow us to animate a story, an action, or a scene, the mathematical results generated with our framework can be used to aid other applications such as result forecasting, nondeterministic planning, interactive media, and storytelling. In this work we also show how to implement two different applications using our framework: a duel scenario in a game and an interactive online auction website.
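The central idea of producing random but statistically predictable execution times can be illustrated with a Poisson process: the timestamps are randomly spaced, yet on average `rate` actions occur per unit time. This is a generic sketch, not the D-Engine implementation; the duel-scene rate in the example is an assumed value.

```python
# Generic sketch: random action timestamps with a well-known expected behaviour.
import random

def action_timestamps(rate, duration, seed=None):
    """Random execution times in [0, duration) with `rate` expected events per unit time."""
    rng = random.Random(seed)
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)     # exponential inter-arrival times (Poisson process)
        if t >= duration:
            return times
        times.append(round(t, 3))

# e.g. an agent firing an attack ~2 times per second during a 5-second duel scene
print(action_timestamps(rate=2.0, duration=5.0, seed=7))
```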
35

WATER-DRIVEN EROSION PREDICTION TECHNOLOGY FOR A MORE COMPLICATED REALITY

Josept David Revuelta Acosta Sr. (8735910) 21 April 2020 (has links)
Hydrological modeling has been a valuable tool for understanding the processes governing the distribution, quantity, and quality of water on Earth. Through models, one has been able to grasp processes such as runoff, soil moisture, soil erosion, subsurface drainage, plant growth, evapotranspiration, and the effects of land use change on hydrology at field and watershed scales. The number and diversity of water-related challenges are vast and expected to increase. As a result, current models need continuous modification to extend their application to more complex processes. Several models have been extensively developed in recent years, including the Soil and Water Assessment Tool (SWAT), the Variable Infiltration Capacity (VIC) model, MIKE-SHE, and the Water Erosion Prediction Project (WEPP) model. The latter, although well validated at the field scale, has been limited to small catchments in its watershed version, and almost no research has addressed water quality issues (only one study).

In this research, three objectives were proposed to improve the WEPP model in three areas where either the model has not been applied or modifications can be made to improve the algorithms for processes within the model (e.g. erosion, runoff, drainage). The enhancements improve the model's current stochastic weather generation, extend its applicability to subsurface drainage estimation, and formulate a new routing model that allows future incorporation of the transport of reactive solutes.

The first contribution was the development of a stochastic storm generator based on a 5-min time resolution and correlated, non-normal Monte Carlo-based numerical simulation. The model considers correlated, non-normal rainstorm characteristics such as time between storms, duration, and amount of precipitation, as well as the storm intensity structure. The model was tested using precipitation data from a randomly selected 5-min weather station in North Carolina. Results showed that the proposed storm generator captured the essential statistical features of rainstorms and their intensity patterns, preserving the first four moments of monthly storm events, good annual extreme event correspondence, and the correlation structure within each storm. Since the proposed model depends on statistical properties at a site, it may allow the use of synthetic storms at ungauged locations provided relevant information from a regional analysis is available.

A second development included the testing, improvement, and validation of the WEPP model to simulate subsurface flow discharges. The proposed model modifies the current subsurface drainage algorithm (a Hooghoudt-based expression) and the WEPP percolation routine. The modified WEPP model was tested and validated on an extensive dataset collected at four experimental sites managed by USDA-ARS within the Lake Erie watershed. Predicted subsurface discharges show Nash-Sutcliffe Efficiency (NSE) values ranging from 0.50 to 0.70 and percent bias ranging from -30% to +15% at daily and monthly resolutions. Evidence suggests the WEPP model can be used to produce reliable estimates of subsurface flow with minimal calibration.

The last objective presented the theoretical framework for a new hillslope and channel-routing model for WEPP. The routing model (WEPP-CMT) is based on catchment geomorphology and mass transport theory for the flow and transport of reactive solutes. WEPP-CMT combines the unique functionality of WEPP for simulating hillslope responses under diverse land use and management conditions with a Lagrangian description of the carrier hydrologic runoff in the hillslope and channel domains. The model's functionality was tested in a sub-catchment of the Upper Cedar River Watershed in the U.S. Pacific Northwest. Results showed that the proposed model provides an acceptable representation of flow at the outlet of the study catchment. Model efficiencies and percent bias for the calibration and validation periods were NSE = 0.55 and 0.65, and PBIAS = -2.8% and 2.1%, respectively. WEPP-CMT provides a suitable foundation for modeling the transport of reactive solutes (e.g. nitrates) at basin scales.
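The goodness-of-fit statistics reported above (NSE and percent bias) are standard quantities; a minimal sketch of how they are typically computed follows, using a common sign convention for PBIAS in which positive values indicate model underestimation. The discharge values are made-up examples.

```python
def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def percent_bias(obs, sim):
    """PBIAS = 100 * sum(obs - sim) / sum(obs)."""
    return 100.0 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

obs = [1.2, 3.4, 2.8, 0.9, 4.1]   # example daily discharges (assumed values)
sim = [1.0, 3.1, 3.0, 1.1, 3.8]
print(nash_sutcliffe(obs, sim), percent_bias(obs, sim))
```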
36

Simulation Algorithms for Continuous Time Markov Chain Models

Banks, H. T., Broido, Anna, Canter, Brandi, Gayvert, Kaitlyn, Hu, Shuhua, Joyner, Michele, Link, Kathryn 01 December 2012 (has links)
Continuous time Markov chains are often used in the literature to model the dynamics of systems with low species counts and uncertainty in transitions. In this paper, we investigate three particular algorithms that can be used to numerically simulate continuous time Markov chain models (a stochastic simulation algorithm and explicit and implicit tau-leaping algorithms). To compare these methods, we used them to analyze two stochastic infection models of differing complexity. One of these models describes the dynamics of Vancomycin-Resistant Enterococcus (VRE) infection in a hospital, and the other describes the early infection of Human Immunodeficiency Virus (HIV) within a host. The relative efficiency of each algorithm is determined based on computational time and the degree of precision required. The numerical results suggest that all three algorithms have similar computational efficiency for the VRE model due to the low number of species and small number of transitions. However, we found that with the larger and more complex HIV model, implementation and modification of tau-leaping methods are preferred.
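For readers unfamiliar with the first of these algorithms, a minimal sketch of Gillespie's direct-method stochastic simulation algorithm is given below for a toy birth-death process; the reaction network and rate constants are assumed for illustration and are not taken from the VRE or HIV models in the paper.

```python
import random

def gillespie_birth_death(x0=10, birth=1.0, death=0.1, t_end=50.0, seed=1):
    """Direct-method SSA for a birth-death process: constant birth rate, per-capita death."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    trajectory = [(t, x)]
    while t < t_end:
        a_birth, a_death = birth, death * x        # reaction propensities
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        t += rng.expovariate(a_total)              # exponential waiting time to next reaction
        if rng.random() * a_total < a_birth:       # choose which reaction fires
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))
    return trajectory

print(gillespie_birth_death()[-1])   # final (time, count); long-run mean count is birth/death = 10
```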
37

Complexity penalized methods for structured and unstructured data

Goeva, Aleksandrina 08 November 2017 (has links)
A fundamental goal of statisticians is to make inferences from a sample about characteristics of the underlying population. This is an inverse problem, since we are trying to recover a feature of the input from observations of an output. Towards this end, we consider complexity-penalized methods, because they balance goodness of fit and generalizability of the solution. The data from the underlying population may come in diverse formats - structured or unstructured - such as probability distributions, text tokens, or graph characteristics. Depending on the defining features of the problem, we can choose the appropriate complexity-penalized approach and assess the quality of the estimate it produces. Favorable characteristics are strong theoretical guarantees of closeness to the true value and interpretability. Our work fits within this framework and spans the areas of simulation optimization, text mining, and network inference. The first problem we consider is model calibration under the assumption that, given a hypothesized input model, we can use stochastic simulation to obtain its corresponding output observations. We formulate it as a stochastic program by maximizing the entropy of the input distribution subject to moment matching. We then propose an iterative scheme, via simulation, to approximately solve it. We prove convergence of the proposed algorithm under appropriate conditions and demonstrate its performance via numerical studies. The second problem we consider is summarizing text documents through an inferred set of topics. We propose a frequentist reformulation of a Bayesian regularization scheme. Through our complexity-penalized perspective we lend further insight into the nature of the loss function and the regularization achieved through the priors in the Bayesian formulation. The third problem is concerned with the impact of sampling on the degree distribution of a network. Under many sampling designs, we have a linear inverse problem characterized by an ill-conditioned matrix. We investigate the theoretical properties of an approximate solution for the degree distribution, found by regularizing the solution of the ill-conditioned least squares objective. In particular, we study the rate at which the penalized solution tends to the true value as a function of network size and sampling rate.
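For the third problem, the basic computation behind regularizing an ill-conditioned linear inverse problem can be sketched with a ridge (Tikhonov) penalty. The matrix and data below are toy values, since the actual sampling operator depends on the sampling design studied in the dissertation.

```python
# Generic sketch: ridge-regularized least squares for an ill-conditioned system.
import numpy as np

def ridge_solve(A, y, lam=1e-2):
    """Minimize ||A x - y||^2 + lam * ||x||^2 (closed-form ridge solution)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

rng = np.random.default_rng(0)
A = np.vander(np.linspace(0, 1, 20), 10)      # a deliberately ill-conditioned matrix
x_true = rng.standard_normal(10)
y = A @ x_true + 0.01 * rng.standard_normal(20)
print("condition number:", np.linalg.cond(A))
print("ridge estimation error:", np.linalg.norm(ridge_solve(A, y) - x_true))
```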
38

Stochastic simulation of near-surface atmospheric forcings for distributed hydrology / Simulation stochastique des forçages atmosphériques utiles aux modèles hydrologiques spatialisés

Chen, Sheng 01 February 2018 (has links)
Ce travail de thèse propose de nouveaux concepts et outils pour des activités de simulation stochastique du temps ciblant les besoins spécifiques de l'hydrologie. Nous avons utilisé une zone climatique contrastée dans le sud-est de la France, les Cévennes-Vivarais, qui est très exposée aux aléas hydrologiques et aux changements climatiques. Notre point de vue est que les caractéristiques physiques (humidité du sol, débit) liées aux préoccupations quotidiennes sont directement liées à la variabilité atmosphérique à l'échelle des bassins. Pour la modélisation multi-variable, la covariabilité avec les précipitations est d'abord considérée. La première étape de la thèse est dédiée à la prise en compte de l'hétérogénéité de la précipitation au sein du simulateur de pluie SAMPO [Leblois et Creutin, 2013]. Nous regroupons les pas de temps dans des types de pluie qui sont organisés dans le temps. Deux approches sont testées pour la simulation : un modèle semi-markovien et un modèle de ré-échantillonnage pour la séquence des types de pluie historiques. Grâce au regroupement, toutes sortes de précipitations sont desservies par un type de pluie spécifique. Dans une zone plus vaste, où l'hypothèse d'homogénéité climatique n'est plus valide, une coordination doit être introduite entre les séquences de types de pluie sur les sous-zones délimitées, en formant des motifs pluvieux à plus grande échelle. Nous avons d'abord étudié une coordination de modèles de Markov, en appliquant des durées de séjour observées par un algorithme glouton. Cette approche respecte les accumulations de longue durée et la variabilité interannuelle, mais les valeurs extrêmes de précipitation sont trop faibles. En revanche, le ré-échantillonnage est plus facile à mettre en œuvre et donne un comportement satisfaisant pour la variabilité à court terme. Cependant, il manque une variabilité inter-annuelle. Les deux approches souffrent de la délimitation stricte des zones homogènes et des types de précipitations homogènes. Pour ces raisons, une approche complètement différente est également envisagée, où les pluies totales sont modélisées conjointement en utilisant la copule, puis désagrégées sur la petite échelle en utilisant une simulation conditionnelle géostatistique. Enfin, la technique de la copule est utilisée pour relier les autres variables météorologiques (température, rayonnement solaire, humidité, vitesse du vent) aux précipitations. Puisque la modélisation multivariée vise à être pilotée par la simulation des précipitations, la copule doit être exécutée en mode conditionnel. La boîte à outils réalisée a déjà été utilisée dans des explorations scientifiques ; elle est maintenant disponible pour être testée dans des applications réelles. En tant qu'approche pilotée par les données, elle est également adaptable à d'autres conditions climatiques. /
This PhD work proposes new concepts and tools for stochastic weather simulation activities targeting the specific needs of hydrology. As a demonstration, we used a climatically contrasted area in the South-East of France, Cévennes-Vivarais, which is highly exposed to hydrological hazards and climate change. Our perspective is that the physical features (soil moisture, discharge) relevant to everyday concerns (water resources assessment and/or hydrological hazard) are directly linked to the atmospheric variability at the basin scale, meaning firstly that the relevant ranges of time and space scales must be respected in the rainfall simulation technique. Since hydrological purposes are the target, other near-surface variates must also be considered. They may exhibit a less striking variability, but it does exist. To build the multi-variable modeling, co-variability with rainfall is considered first. The first step of the PhD work is dedicated to taking into account the heterogeneity of precipitation within the rainfall simulator SAMPO [Leblois and Creutin, 2013]. We cluster time steps into rainfall types organized in time. Two approaches are tested for simulation: a semi-Markov simulation and a resampling of the historical sequence of rainfall types. Thanks to clustering, every kind of rainfall is served by some specific rainfall type. In a larger area, where the assumption of climatic homogeneity is no longer valid, a coordination must be introduced between the rainfall type sequences over delineated sub-areas, forming rainy patterns at the larger scale. We first investigated a coordination of Markov models, enforcing observed lengths of stay by a greedy algorithm. This approach respects long-duration aggregates and inter-annual variability, but the high values of rainfall are too low. By contrast, the joint resampling of historically observed sequences is easier to implement and gives a satisfactory behavior for short-term variability; however, it lacks inter-annual variability. Both approaches suffer from the strict delineation of homogeneous zones and homogeneous rainfall types. For these reasons, a completely different approach is also considered, in which the areal rainfall totals are jointly modeled using a spatio-temporal copula approach and then disaggregated to the user grid using a non-deterministic, geostatistically based conditional simulation technique. In the copula approach, the well-known problem of rainfall having an atom at zero is handled by replacing historical rainfall with an appropriate atmosphere-based rainfall index having a continuous distribution. Simulated values of this index can be turned into rainfall by quantile-quantile mapping. Finally, the copula technique is used to link other meteorological variables (i.e. temperature, solar radiation, humidity, wind speed) to rainfall. Since the multivariate simulation is meant to be driven by the rainfall simulation, the copula needs to be run in conditional mode. The resulting toolbox has already been used in scientific explorations and is now available for testing in real-size applications. As a data-driven approach, it is also adaptable to other climatic conditions. The presence of large-scale atmospheric precursor values in some key steps may enable the simulation tools to be converted into a disaggregation tool for climate simulations.
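The quantile-quantile mapping step mentioned in the abstract can be sketched as empirical quantile matching between a continuous rainfall index and observed rainfall. The data below are synthetic placeholders, not the Cévennes-Vivarais observations.

```python
# Generic sketch of empirical quantile-quantile mapping from an index to rainfall.
import numpy as np

def quantile_map(simulated_index, index_ref, rain_ref):
    """Map simulated index values to rainfall via empirical quantile matching."""
    ranks = np.searchsorted(np.sort(index_ref), simulated_index) / len(index_ref)
    ranks = np.clip(ranks, 0.0, 1.0)
    return np.quantile(rain_ref, ranks)        # rainfall value at the same quantile

rng = np.random.default_rng(3)
index_ref = rng.normal(size=1000)              # historical index (continuous, no atom at zero)
rain_ref = np.maximum(rng.gamma(0.4, 8.0, 1000) - 2.0, 0.0)   # historical rainfall with zeros
print(quantile_map(rng.normal(size=5), index_ref, rain_ref))
```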
39

Stochastic Simulation of the Phage Lambda System and the Bioluminescence System Using the Next Reaction Method

Ananthanpillai, Balaji January 2009 (has links)
No description available.
40

A computational framework for analyzing chemical modification and limited proteolysis experimental data used for high confidence protein structure prediction

Anderson, Paul E. 08 December 2006 (has links)
No description available.
