71

Modelos de distribuição espacial de precipitações intensas / Spatial distribution models of intense precipitation

Diniz, Érika Cristina. January 2003 (has links)
Abstract: Precipitation-generation models are extremely important today: with knowledge of the precipitation pattern over a given area, engineering works can be planned so as to minimize the effects of high-intensity rainfall. In this work, the Neyman-Scott model and, as a particular case, the Poisson model are applied to the generation of high-intensity precipitation in the Upper Tietê Basin region, in the State of São Paulo, Brazil. This region suffers annually from floods caused by heavy rainfall combined with the area's high population density. For the application of the Neyman-Scott and Poisson spatial precipitation-distribution models, data collected from 1980 to 1997 by a rain-gauge network of thirteen gauges were considered. / Advisor: Roberto Naves Domingos / Co-advisor: José Silvio Govone / Committee member: José Manoel Balthazar / Committee member: Marco Aurélio Sicchiroli Lavrador / Master's
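For readers who want to experiment with this class of model, the following is a minimal sketch of a Neyman-Scott rectangular-pulses rainfall generator; the plain Poisson case is recovered by using a single cell per storm. All parameter values (storm rate, mean cells per storm, cell displacement, duration, and intensity) are illustrative assumptions, not values from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

def neyman_scott_rainfall(t_end_h, lam_storm, mean_cells, beta_disp, eta_dur, mean_int):
    """Hourly rainfall depths (mm) on [0, t_end_h) from a Neyman-Scott
    rectangular-pulses process:
      - storm origins: Poisson process with rate lam_storm (storms/hour);
      - cells per storm: Poisson(mean_cells), displaced Exp(beta_disp) hours after the origin;
      - each cell: exponential duration with mean 1/eta_dur hours and
        exponential intensity with mean mean_int mm/hour."""
    hourly = np.zeros(int(t_end_h))
    n_storms = rng.poisson(lam_storm * t_end_h)
    origins = rng.uniform(0.0, t_end_h, n_storms)
    for t0 in origins:
        n_cells = max(1, rng.poisson(mean_cells))        # at least one cell per storm
        starts = t0 + rng.exponential(1.0 / beta_disp, n_cells)
        durations = rng.exponential(1.0 / eta_dur, n_cells)
        intensities = rng.exponential(mean_int, n_cells)
        for s, d, i in zip(starts, durations, intensities):
            # spread the rectangular pulse over the hourly bins it overlaps
            first, last = int(np.floor(s)), int(np.ceil(min(s + d, t_end_h)))
            for h in range(max(first, 0), min(last, int(t_end_h))):
                overlap = min(h + 1, s + d) - max(h, s)
                if overlap > 0:
                    hourly[h] += i * overlap
    return hourly

# One year of hourly rainfall; the Poisson model is the special case of a
# single cell per storm with negligible displacement.
rain = neyman_scott_rainfall(t_end_h=24 * 365, lam_storm=0.01, mean_cells=5,
                             beta_disp=2.0, eta_dur=1.5, mean_int=4.0)
print(f"annual total: {rain.sum():.0f} mm, maximum hourly depth: {rain.max():.1f} mm")
```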
72

Aplicação de modelos para séries temporais e pluviométricas no estado da Paraíba. / Application of models for temporal and pluviometric series in the state of Paraíba.

DANTAS, Leydson Galvíncio. 08 August 2018 (has links)
Previous issue date: 2016-02-19 / CNPq / Abstract: The spatial and temporal variability of rainfall in the state of Paraíba produces an irregular distribution, frequently observed as prolonged droughts and as excess rainfall over short time intervals, which in general terms affects the population's quality of life. In this work, gaps found in the time series of several municipalities of the state are filled, yielding series robust enough to support the analysis of results when treated with the Box-Jenkins methodology for time series. The aim is to produce rainfall forecasts for selected cities of the state using stochastic models; the SARIMA(1,0,1)×(1,1,1) form is found to fit best according to the selection criteria and residual analysis, and for several municipalities the forecasts matched some of the values observed at the rain-gauge stations. The municipalities located near the Borborema Plateau showed the best accuracy, in terms of predictability, compared with the others analysed in this study.
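The following is a minimal sketch of the Box-Jenkins step described above, fitting the SARIMA(1,0,1)×(1,1,1) form to a monthly rainfall series with statsmodels. The synthetic series, the 12-month seasonal period, and the forecast horizon are illustrative assumptions standing in for the gap-filled station data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(1)

# Placeholder for a gap-filled monthly rainfall series (mm) from one station.
months = pd.date_range("1994-01", periods=22 * 12, freq="MS")
seasonal = 80 + 60 * np.sin(2 * np.pi * (months.month - 3) / 12)
y = pd.Series(np.clip(seasonal + rng.normal(0, 25, len(months)), 0, None), index=months)

train, test = y[:-12], y[-12:]

# SARIMA(1,0,1)x(1,1,1) with an assumed seasonal period of 12 months.
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
res = model.fit(disp=False)
print(f"AIC: {res.aic:.1f}")

# 12-month-ahead forecast compared against the held-out final year.
pred = res.get_forecast(steps=12).predicted_mean
mae = (pred - test).abs().mean()
print(f"mean absolute error of the 12-month forecast: {mae:.1f} mm")
```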
73

Optimization of multi-channel and multi-skill call centers / Optimisation dans les centres d'appels multi-compétences et multi-canaux

Legros, Benjamin 13 December 2013 (has links)
Call centers have been introduced with great success by many service-oriented companies. They have become the main point of contact with the customer and an integral part of the majority of corporations. The large-scale emergence of call centers has created a fertile source of management issues.
In this PhD thesis, we focus on various operations management issues of multi-skill and multi-channel call centers. The objective of our work is to derive both qualitative and quantitative results for practical management. In the first part, we focus on architectures with limited flexibility for multi-skill call centers. The context is that of call centers with asymmetric parameters: unbalanced workload, different service requirements, a predominant customer type, unbalanced abandonments and high costs of cross-training. The best-known limited-flexibility architectures, such as chaining, fail against such asymmetry. We propose a new architecture, referred to as single pooling, with only two skills per agent, and we demonstrate its efficiency under various situations of asymmetry. In the second part, we focus on routing problems in multi-channel call centers. In the first study, we consider a blended call center with calls arriving over time and an infinitely backlogged queue of emails. The call service is characterized by three successive stages, the second of which is a break. We focus on the optimization of the email routing to agents. The objective is to maximize the throughput of emails subject to a constraint on the call waiting time. Various guidelines for call center managers are provided. In particular, we prove that under the optimal routing at least one of the two email routing parameters always takes an extreme value. In the second study, we examine a threshold policy on the reservation of agents for inbound calls. We study a general non-stationary model where the call arrivals follow a non-homogeneous Poisson process. The optimization problem consists of maximizing the throughput of outbound tasks under a constraint on the waiting time of inbound calls. We propose an efficient adaptive threshold policy that is easy to implement. This scheduling policy is evaluated through a comparison with the optimal performance measures found in the case of a constant stationary arrival rate, and with other intuitive adaptive threshold policies in the general non-stationary case. In the third study, we consider a call center model with a call-back option, which allows an inbound call to be transformed into an outbound one. The optimization problem consists of minimizing the expected waiting time of the outbound calls while respecting a service-level constraint on the inbound ones. We propose a routing policy with two thresholds, one on the reservation of agents for inbound calls and one on the number of waiting outbound calls, and determine an optimal curve relating the two thresholds.
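A minimal discrete-event sketch of the agent-reservation threshold policy studied in the second and third parts: with s agents and threshold c, an outbound task from the infinite backlog is started only if at least c agents would still remain free for future inbound calls. The arrival and service rates, the number of agents, and the stationary setting are illustrative assumptions, not the thesis's non-stationary model.

```python
import heapq
import random

def simulate_threshold_policy(s=10, c=2, lam=7.0, mu_in=1.0, mu_out=0.8,
                              horizon=10_000.0, seed=0):
    """Blended call center: Poisson(lam) inbound calls, Exp(mu_in) / Exp(mu_out)
    service times, and an infinite backlog of outbound tasks. An outbound task is
    started only while more than c agents are idle, keeping c reserved for inbound."""
    rng = random.Random(seed)
    t, busy = 0.0, 0
    queue = []                                   # arrival times of waiting inbound calls
    events = [(rng.expovariate(lam), "arrival")]
    waits, outbound_started = [], 0

    def start_outbound_if_possible():
        nonlocal busy, outbound_started
        while not queue and s - busy > c:        # keep c agents free for inbound calls
            busy += 1
            outbound_started += 1
            heapq.heappush(events, (t + rng.expovariate(mu_out), "departure"))

    start_outbound_if_possible()
    while events and t < horizon:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            heapq.heappush(events, (t + rng.expovariate(lam), "arrival"))
            if busy < s:
                busy += 1
                waits.append(0.0)
                heapq.heappush(events, (t + rng.expovariate(mu_in), "departure"))
            else:
                queue.append(t)
        else:                                    # a service of either type ends
            busy -= 1
            if queue:
                waits.append(t - queue.pop(0))
                busy += 1
                heapq.heappush(events, (t + rng.expovariate(mu_in), "departure"))
            start_outbound_if_possible()

    return sum(waits) / len(waits), outbound_started / t

for c in (0, 1, 2, 3):
    wait, throughput = simulate_threshold_policy(c=c)
    print(f"c={c}: mean inbound wait {wait:.3f}, outbound throughput {throughput:.2f}")
```

Raising c trades outbound throughput for shorter inbound waits, which is exactly the constraint structure that the optimization problems described above formalize.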
74

Champs de densité d'énergie pour la vibroacoustique basse et moyenne fréquence des structures complexes utilisant un modèle numérique stochastique : application à la partition structurale des automobiles / Energy-density field for low- and medium- frequency vibroacoustics of complex structures using a stochastic numerical model : application to the structural partitioning of automotive vehicles

Kassem, Morad 10 December 2009 (has links)
This research lies in the domain of the vibroacoustic analysis of complex structures. It proposes a new energy approach, based on the energy-density field, in order to simplify such an analysis. The approach relies on a stochastic computational model constructed with the nonparametric probabilistic approach of model and parameter uncertainties. The stochastic energy approach developed corresponds to a new representation of the vibroacoustic system in terms of random energy quantities. A mean vibroacoustic energy model is then constructed by taking the statistical average of these random energy quantities. This mean energy model provides a tool for the vibroacoustic analysis of complex structures in the low- and medium-frequency range, a band to which SEA is not adapted. The analysis of the properties of the mean energy quantities used to construct the vibroacoustic energy model allows a simplified energy model to be derived, for which a structural partitioning methodology by zone is then established. An application of the energy approach and of the structural partitioning methodology is presented for an automotive vehicle structure coupled with its internal acoustic cavity.
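A minimal sketch of the statistical-averaging idea described above: random realizations of a structural model yield random energy quantities per degree of freedom, and the mean energy model is their average over Monte Carlo realizations. The toy two-DOF system and the simple parametric perturbation of the stiffness (standing in for the nonparametric probabilistic approach used in the thesis) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 2-DOF mass-spring-damper "structure" excited on DOF 0.
m = np.diag([1.0, 1.5])
k_nom = np.array([[3.0e4, -1.0e4], [-1.0e4, 2.0e4]])
damp = 1e-4 * k_nom                     # light stiffness-proportional damping
freqs = np.linspace(1.0, 60.0, 400)     # Hz
omegas = 2 * np.pi * freqs
force = np.array([1.0, 0.0])

n_mc = 500
energy = np.zeros((n_mc, len(freqs), 2))    # realizations x frequency x DOF

for j in range(n_mc):
    # Parametric stand-in for model uncertainty: ~10% lognormal scatter on stiffness.
    k = k_nom * rng.lognormal(mean=0.0, sigma=0.1)
    for i, w in enumerate(omegas):
        z = -w**2 * m + 1j * w * damp + k            # dynamic stiffness matrix
        u = np.linalg.solve(z, force)                # complex displacement response
        v = 1j * w * u                               # complex velocity response
        kinetic = 0.25 * np.real(np.conj(v) * (m @ v))   # time-averaged kinetic energy per DOF
        elastic = 0.25 * np.real(np.conj(u) * (k @ u))   # time-averaged strain energy per DOF
        energy[j, i] = kinetic + elastic

mean_energy = energy.mean(axis=0)       # the "mean energy model": statistical average
peak_dof = mean_energy.sum(axis=0).argmax()
print(f"DOF carrying the most mean energy over the band: {peak_dof}")
```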
75

Stochastic parametrisation and model uncertainty

Arnold, Hannah Mary January 2013 (has links)
Representing model uncertainty in atmospheric simulators is essential for the production of reliable probabilistic forecasts, and stochastic parametrisation schemes have been proposed for this purpose. Such schemes have been shown to improve the skill of ensemble forecasts, resulting in a growing use of stochastic parametrisation schemes in numerical weather prediction. However, little research has explicitly tested the ability of stochastic parametrisations to represent model uncertainty, since the presence of other sources of forecast uncertainty has complicated the results. This study seeks to provide firm foundations for the use of stochastic parametrisation schemes as a representation of model uncertainty in numerical weather prediction models. Idealised experiments are carried out in the Lorenz `96 (L96) simplified model of the atmosphere, in which all sources of uncertainty apart from model uncertainty can be removed. Stochastic parametrisations are found to be a skilful way of representing model uncertainty in weather forecasts in this system. Stochastic schemes which have a realistic representation of model error produce reliable forecasts, improving on the deterministic and the more "traditional" perturbed parameter schemes tested. The potential of using stochastic parametrisations for simulating the climate is considered, an area in which there has been little research. A significant improvement is observed when stochastic parametrisation schemes are used to represent model uncertainty in climate simulations in the L96 system. This improvement is particularly pronounced when considering the regime behaviour of the L96 system - the stochastic forecast models are significantly more skilful than using a deterministic perturbed parameter ensemble to represent model uncertainty. The reliability of a model at forecasting the weather is found to be linked to that model's ability to simulate the climate, providing some support for the seamless prediction paradigm. The lessons learned in the L96 system are then used to test and develop stochastic and perturbed parameter representations of model uncertainty for use in an operational numerical weather prediction model, the Integrated Forecasting System (IFS). A particular focus is on improving the representation of model uncertainty in the convection parametrisation scheme. Perturbed parameter schemes are tested, which improve on the operational stochastic scheme in some regards, but are not as skilful as a new generalised version of the stochastic scheme. The proposed stochastic scheme has a potentially more realistic representation of model error than the operational scheme, and improves the reliability of the forecasts. While studying the L96 system, it was found that there is a need for a proper score which is particularly sensitive to forecast reliability. A suitable score is proposed and tested, before being used for verification of the forecasts made in the IFS. This study demonstrates the power of using stochastic over perturbed parameter representations of model uncertainty in weather and climate simulations. It is hoped that these results motivate further research into physically-based stochastic parametrisation schemes, as well as triggering the development of stochastic Earth-system models for probabilistic climate prediction.
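The idealised experiments described above can be reproduced in miniature: the sketch below integrates a single-scale Lorenz '96 forecast model in which the unresolved tendency is represented by a deterministic polynomial plus an additive AR(1) stochastic term. The polynomial coefficients, noise parameters, forcing, and integration settings are illustrative assumptions, not those used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)
K, F = 8, 20.0            # number of resolved variables and forcing

def l96_tendency(x):
    """Deterministic Lorenz '96 tendency dX_k/dt = -X_{k-1}(X_{k-2} - X_{k+1}) - X_k + F."""
    return -np.roll(x, 1) * (np.roll(x, 2) - np.roll(x, -1)) - x + F

def run_forecast(x0, n_steps, dt=0.005, stochastic=True, phi=0.95, sigma_e=1.0,
                 poly=(-0.2, -0.005)):
    """RK4 integration of the resolved variables; the sub-grid tendency is
    parametrised as a cubic polynomial in x plus, optionally, an additive AR(1)
    model-error term (the stochastic parametrisation)."""
    b0, b1 = poly

    def rhs(y):
        return l96_tendency(y) + b0 * y + b1 * y**3   # resolved + deterministic parametrisation

    x, e = x0.copy(), np.zeros(K)
    for _ in range(n_steps):
        k1 = rhs(x)
        k2 = rhs(x + 0.5 * dt * k1)
        k3 = rhs(x + 0.5 * dt * k2)
        k4 = rhs(x + dt * k3)
        x = x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        if stochastic:
            e = phi * e + sigma_e * np.sqrt(1 - phi**2) * rng.standard_normal(K)
            x = x + dt * e                            # additive AR(1) stochastic term
    return x

x0 = F + rng.standard_normal(K)
ensemble = np.array([run_forecast(x0, n_steps=400) for _ in range(20)])
print("ensemble spread per variable:", np.round(ensemble.std(axis=0), 2))
```

With `stochastic=False` every member collapses onto the same deterministic trajectory; the AR(1) term is what generates the ensemble spread that, in the experiments described above, is tuned to represent model uncertainty.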
76

Análise estocástica linear de estruturas complexas usando meta-modelo modal / Stochastic linear analysis of complex structures via modal meta-model

Nascimento, Fábio Fialho do, 1983- 28 August 2018 (has links)
Advisor: José Maria Campos dos Santos / Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica / Previous issue date: 2015 / Abstract: This work investigates approaches for uncertainty analysis in structural dynamics problems in a computationally efficient manner, in an industrial context. A metamodel based on the response surface method is used to simplify the calculation of modes and natural frequencies in the frequency response analysis of a structure. To make the process feasible for large models, the finite element solution was performed with Nastran®; MatLab® was used to manipulate the eigenvalues and eigenvectors and to compute the FRFs; and Isight® handled the variable sampling, the construction of the response surface, and the integration of the other applications. The approach was first assessed on a simple model of a car windshield, with thickness, Young's modulus and material density as uncertain parameters. The method was then applied to a vehicle structure model with thousands of degrees of freedom, where the random variables were the thicknesses of twenty stamped parts. A Gaussian distribution was assumed for all variables. To quantify the uncertainty in the dynamic response, Monte Carlo simulation was performed on the metamodel, and the variability of the natural frequencies and of the FRF is compared against direct Monte Carlo results. / Master's in Mechanical Engineering (Solid Mechanics and Mechanical Design)
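A minimal sketch of the metamodel workflow outlined above: sample the uncertain inputs, fit a quadratic response surface to the output of an "expensive" model (here a one-line analytical stand-in for the Nastran® eigensolution), and then run the Monte Carlo simulation on the cheap surrogate. The frequency expression, input statistics, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def expensive_model(h, E, rho):
    """Stand-in for the FE eigensolution: a first natural frequency (Hz) that
    scales like h * sqrt(E / rho); the constant is arbitrary here."""
    return 1.9 * h * np.sqrt(E / rho)

# Uncertain inputs: thickness (mm), Young's modulus (MPa), density (kg/m^3), all Gaussian.
means = np.array([3.0, 70e3, 2500.0])
stds = 0.05 * means

def quad_features(x):
    h, E, rho = x.T
    return np.column_stack([np.ones(len(x)), h, E, rho,
                            h**2, E**2, rho**2, h * E, h * rho, E * rho])

# 1) Small design of experiments on the "expensive" model.
x_train = means + stds * rng.standard_normal((30, 3))
y_train = expensive_model(*x_train.T)

# 2) Fit the quadratic response surface by least squares.
coef, *_ = np.linalg.lstsq(quad_features(x_train), y_train, rcond=None)

# 3) Monte Carlo simulation on the metamodel only.
x_mc = means + stds * rng.standard_normal((100_000, 3))
f_meta = quad_features(x_mc) @ coef

f_direct = expensive_model(*x_mc.T)     # affordable here only because the model is analytic
print(f"metamodel MC: mean {f_meta.mean():.2f} Hz, std {f_meta.std():.2f} Hz")
print(f"direct MC:    mean {f_direct.mean():.2f} Hz, std {f_direct.std():.2f} Hz")
```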
77

Utilizing A Real Life Data Warehouse To Develop Freeway Travel Time Reliability Stochastic Models

Emam, Emam 01 January 2006 (has links)
During the 20th century, transportation programs were focused on the development of the basic infrastructure for transportation networks. In the 21st century, the focus has shifted to the management and operations of these networks. The transportation network reliability measure plays an important role in judging the performance of the transportation system and in evaluating the impact of new Intelligent Transportation Systems (ITS) deployments. The measurement of transportation network travel time reliability is imperative for providing travelers with accurate route guidance information. It can be applied to generate the shortest path (or alternative paths) connecting origins and destinations, especially under conditions of varying demands and limited capacities. The measurement of transportation network reliability is a complex issue because it involves both the infrastructure and the behavioral responses of the users. The subject is also challenging because there is no single agreed-upon reliability measure. This dissertation developed a new method for estimating the effect of travel demand variation and link capacity degradation on the reliability of a roadway network. The method is applied to a hypothetical roadway network, and the results show that both travel time reliability and capacity reliability are consistent measures for the reliability of the road network, but each may have a different use. The capacity reliability measure is of special interest to transportation network planners and engineers because it addresses the issue of whether the available network capacity relative to the present or forecast demand is sufficient, whereas travel time reliability is especially interesting for network users. The new travel time reliability method is sensitive to the users' perspective since it reflects that an increase in segment travel time should always result in lower travel time reliability, and it is an indicator of the operational consistency of a facility over an extended period of time. This initial theoretical effort and basic research were followed by applying the new method to the I-4 corridor in Orlando, Florida. This dissertation utilized a real-life transportation data warehouse to estimate travel time reliability of the I-4 corridor. Four travel time stochastic models (Weibull, Exponential, Lognormal, and Normal) were tested, and the Lognormal was the best-fit model. Unlike mechanical equipment, it is unrealistic for any freeway segment to be traversed in zero seconds, no matter how fast the vehicles are; therefore, an adjustment of the location parameter of the best-fit statistical model (Lognormal) was needed to accurately estimate travel time reliability. The adjusted model can be used to compute and predict travel time reliability of freeway corridors and report this information in real time to the public through traffic management centers. Compared to the existing Florida Method and the California Buffer Time Method, the new reliability method showed higher sensitivity to geographical locations, which reflects the level of congestion and bottlenecks. The major advantages of this new method to practitioners and researchers over the existing methods are its ability to estimate travel time reliability as a function of departure time, and that it treats travel time as a continuous variable that captures the variability experienced by individual travelers over an extended period of time.
As such, the new method developed in this dissertation could be utilized in transportation planning and freeway operations for estimating the important travel time reliability measure of performance. The impact of segment length on travel time reliability calculations was then investigated using the wealth of data available in the I-4 data warehouse. The developed travel time reliability models showed significant evidence of a relationship between segment length and the accuracy of the results: the longer the segment, the less accurate the travel time reliability estimates. Accordingly, long segments (e.g., 25 miles) are more appropriate for planning purposes as a macroscopic performance measure of the freeway corridor, while short segments (e.g., 5 miles) are more appropriate for the evaluation of freeway operations as a microscopic performance measure. Further, this dissertation explored the impact of relaxing an important assumption in reliability analysis: link independence. In real life, assuming that link failures on a road network are statistically independent is dubious; the failure of a link in one particular area does not necessarily result in the complete failure of the neighboring link, but may lead to deterioration of its performance. The "Cause-Based Multimode Model" (CBMM) has been used to address link dependency in communication networks, but the transferability of this model to transportation networks had not been tested, and this approach had not previously been considered in the calculation of transportation network reliability. This dissertation presented the CBMM and applied it to predict the travel time reliability with which an origin demand can reach a specified destination under dependent, multimode link failure conditions. The new model addressed the multi-state system reliability analysis of transportation networks, for which an "all or nothing" type of failure criterion cannot be formulated and in which dependent link failures are considered. The results demonstrated that the newly developed method has true potential and can be easily extended to large-scale networks as long as the data are available. More specifically, the analysis of a hypothetical network showed that the dependency assumption is very important for obtaining more reasonable travel time reliability estimates of links, paths, and the entire network. The results showed a large discrepancy between the dependency and independency analysis scenarios. Realistic scenarios that considered the dependency assumption were on the safe side, which is important for transportation network decision makers and could also aid travelers in making better choices. In contrast, deceptive information caused by the independency assumption could add to travelers' anxiety associated with the unknown length of delay, which normally reflects negatively on highway agencies and on the management of taxpayers' resources.
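A minimal sketch of the estimation step described above: fit a three-parameter (shifted) lognormal to segment travel times and report travel time reliability as the probability of traversing the segment within a threshold. The synthetic travel times, the free-flow bound, and the "median plus 20%" threshold are illustrative assumptions, not values from the I-4 data warehouse.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic travel times (seconds) for one freeway segment during one departure window;
# a shifted lognormal stands in for the data-warehouse observations.
free_flow = 300.0                                   # physical lower bound on travel time
observed = free_flow + rng.lognormal(mean=4.0, sigma=0.5, size=2000)

# Fit the 3-parameter lognormal; the location parameter plays the role of the
# adjustment discussed above (a segment cannot be traversed in near-zero time).
shape, loc, scale = stats.lognorm.fit(observed)
print(f"fitted location parameter: {loc:.0f} s (free-flow travel time ~ {free_flow:.0f} s)")

# Travel time reliability = P(T <= threshold), here threshold = median + 20%.
threshold = 1.2 * np.median(observed)
reliability = stats.lognorm.cdf(threshold, shape, loc=loc, scale=scale)
print(f"P(travel time <= {threshold:.0f} s) = {reliability:.3f}")
```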
78

Quantification of Model-Form, Predictive, and Parametric Uncertainties in Simulation-Based Design

Riley, Matthew E. 07 September 2011 (has links)
No description available.
79

Spatial-Temporal Statistical Modeling of Treated Drinking Water Usage

Arandia, Ernesto 16 September 2013 (has links)
No description available.
80

An analytic framework for the War of Ideas

Schramm, Harrison C. 09 1900 (has links)
We develop models for the spread of two opposing ideologies in a closed population, building on epidemic models. For different interaction rules, we study deterministic and stochastic versions of the problem. The goal of our work is to provide a tractable analytical framework for each situation, and to analyze the effect of different initial conditions on the proportion of the population affiliated with each ideology after a large time interval.
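A minimal sketch of a deterministic model in this spirit: two ideologies recruit from a shared pool of undecided individuals by contact, in direct analogy with competing epidemics. The contact rates, defection rate, and initial conditions are illustrative assumptions rather than the interaction rules analysed in the thesis.

```python
import numpy as np
from scipy.integrate import odeint

def war_of_ideas(y, t, beta_a, beta_b, delta):
    """Closed population split into undecided U and adherents A, B of two ideologies.
    Each ideology recruits undecided individuals by contact; adherents may also
    revert to undecided at rate delta (a simple defection mechanism)."""
    u, a, b = y
    n = u + a + b
    du = -(beta_a * a + beta_b * b) * u / n + delta * (a + b)
    da = beta_a * a * u / n - delta * a
    db = beta_b * b * u / n - delta * b
    return [du, da, db]

y0 = [0.98, 0.01, 0.01]                 # initial fractions: undecided, ideology A, ideology B
t = np.linspace(0.0, 200.0, 2001)
sol = odeint(war_of_ideas, y0, t, args=(0.30, 0.25, 0.02))

u_inf, a_inf, b_inf = sol[-1]
print(f"long-run shares -> undecided: {u_inf:.2f}, A: {a_inf:.2f}, B: {b_inf:.2f}")
```

Varying the initial conditions and the contact rates shows how starting shares and recruitment effectiveness drive the long-run affiliation proportions, which is the question the deterministic and stochastic analyses address.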
