  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
641

Conditional Streamflow Probabilities

Roefs, T. G., Clainos, D. M. 23 April 1971 (has links)
From the Proceedings of the 1971 Meetings of the Arizona Section - American Water Resources Assn. and the Hydrology Section - Arizona Academy of Science - April 22-23, 1971, Tempe, Arizona / Streamflows of monthly or shorter time periods are, in most parts of the world, conditionally dependent. In studies of planning, commitment and operation decisions concerning reservoirs, it is probably most computationally efficient to use simulation routines for decisions of low dimension, such as planning and commitment, and optimization routines for the highly dimensional operation-rule decisions. This presents the major problem of combining the two routines, since streamflow dependencies in simulation routines are continuous while direct stochastic optimization routines are discrete. A stochastic streamflow synthesis routine is described consisting of two parts: a streamflow probability distribution and dependency analysis, and a streamflow generation using the relationships developed. A discrete dependency matrix between streamflow amounts was then sought. Setting as the limits of interest the class 400-500 thousand acre-ft in January and 500-600 thousand acre-ft in February, and using the transforms specified, the appropriate normal deviates were determined. The next serious problem was calculating the conditional dependency based on the bivariate normal distribution. Calculating the joint probability exactly would require double integrations, and these use too much computer time. For the problem addressed, therefore, the use of one-dimensional conditional probabilities based on the flow-interval midpoint is an adequate and effective procedure.
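The midpoint shortcut described above can be sketched numerically. This is a minimal illustration, assuming standardized (normal-deviate) flows; the correlation of 0.6, the midpoint deviate 0.1 and the class bounds 0.4 and 0.9 are all invented stand-ins, not the transformed values from the study:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def conditional_class_prob(x_mid, lo, hi, rho):
    """P(lo <= Y < hi | X = x_mid) for standard bivariate normal deviates.

    Given X = x, Y | X is normal with mean rho*x and std sqrt(1 - rho^2);
    evaluating at the flow-interval midpoint avoids the costly double
    integration over the joint density.
    """
    mu = rho * x_mid
    sd = math.sqrt(1.0 - rho * rho)
    return norm_cdf((hi - mu) / sd) - norm_cdf((lo - mu) / sd)

# Hypothetical normal deviates: January class midpoint 0.1, February
# class bounds [0.4, 0.9], assumed month-to-month correlation 0.6.
p = conditional_class_prob(x_mid=0.1, lo=0.4, hi=0.9, rho=0.6)
```

The same function applied with very wide bounds recovers a probability of one, which is a quick sanity check on the conditional distribution.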
642

Improvement of the efficiency of vehicle inspection and maintenance programs through incorporation of vehicle remote sensing data and vehicle characteristics

Samoylov, Alexander V. 13 January 2014 (has links)
Emissions from light-duty passenger vehicles represent a significant portion of total criteria pollutant emissions in the United States. Since the 1970s, emissions testing of these vehicles has been required in many major metropolitan areas, including Atlanta, GA, that were designated to be in non-attainment for one or more of the National Ambient Air Quality Standards. While emissions inspections have successfully reduced emissions by identifying and repairing high-emitting vehicles, they have become increasingly inefficient as emissions control systems have grown more durable and fewer vehicles need repair. Currently, only about 9% of Atlanta-area vehicles fail emissions inspection, but every vehicle is inspected annually. This research explores ways to create a more efficient emissions testing program while continuing to use the existing testing infrastructure. To achieve this objective, on-road vehicle emissions data were collected as part of the Continuous Atlanta Fleet Evaluation program sponsored by the Georgia Department of Natural Resources. These remote sensing data were combined with in-program vehicle inspection data from the Atlanta Vehicle Inspection and Maintenance (I/M) program to establish the degree to which on-road vehicle remote sensing could be used to enhance program efficiency. Based on this analysis, a multi-parameter model was developed to predict the probability of a particular vehicle failing an emissions inspection. The parameters found to influence the probability of failure include vehicle characteristics, ownership history, vehicle usage, previous emission test results, and remote sensing emissions readings. This model was the foundation for a proposed emissions testing program with variable retest timing: high failure-probability vehicles would be retested more frequently, and low failure-probability vehicles less frequently, than the current annual cycle.
Implementation of this program is estimated to reduce fleet emissions in Atlanta by 17% for carbon monoxide, 11% for hydrocarbons, and 5% for nitrogen oxides. These reductions would be achieved very cost-effectively, at an estimated marginal cost of $149, $7,576 and $2,436 per ton per year for carbon monoxide, hydrocarbon, and nitrogen oxide emissions reductions, respectively.
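The proposed variable-timing scheme can be sketched as follows. The logistic form, the coefficients and the interval cut-offs below are all invented for illustration; the study's fitted multi-parameter model is not reproduced here:

```python
import math

def failure_probability(vehicle):
    """Hypothetical logistic model of the probability that a vehicle fails
    its next emissions inspection. The coefficients are illustrative,
    not the fitted values from the study."""
    z = (-4.0
         + 0.25 * vehicle["age_years"]        # older vehicles fail more often
         + 0.8  * vehicle["prior_failures"]   # past failures are predictive
         + 1.5  * vehicle["rsd_co_high"])     # remote-sensing high-CO flag (0/1)
    return 1.0 / (1.0 + math.exp(-z))

def retest_interval_months(p_fail):
    # Variable retest timing: high-risk vehicles return sooner than the
    # current fixed 12-month cycle, low-risk vehicles later.
    if p_fail >= 0.20:
        return 6
    if p_fail >= 0.05:
        return 12
    return 24

old_flagged = {"age_years": 18, "prior_failures": 1, "rsd_co_high": 1}
new_clean = {"age_years": 2, "prior_failures": 0, "rsd_co_high": 0}
```

Under these made-up coefficients an old, previously failed, remote-sensing-flagged vehicle is scheduled every 6 months, while a nearly new clean vehicle moves to a 24-month cycle.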
643

Comparison Of Regression Techniques Via Monte Carlo Simulation

Can Mutan, Oya 01 June 2004 (has links) (PDF)
Ordinary least squares (OLS) is one of the most widely used methods for modelling the functional relationship between variables. However, this estimation procedure relies on several assumptions, and the violation of these assumptions may lead to non-robust estimates. In this study, the simple linear regression model is investigated for conditions in which the distribution of the error terms is Generalised Logistic. Some robust and nonparametric methods, such as modified maximum likelihood (MML), least absolute deviations (LAD), Winsorized least squares, least trimmed squares (LTS), Theil and weighted Theil, are compared via computer simulation. To evaluate estimator performance, the mean, variance, bias, mean square error (MSE) and relative mean square error (RMSE) are computed.
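A minimal Monte Carlo comparison in the spirit of this study can be sketched with two of the listed estimators, OLS and the Theil (median of pairwise slopes) estimator, under Type I generalized logistic errors. The sample size, shape parameter and replication count below are arbitrary choices, not those of the thesis:

```python
import random, math, statistics

def gen_logistic(b, rng):
    # Inverse-CDF sample from the Type I generalized logistic distribution,
    # F(x) = (1 + exp(-x))**(-b); b != 1 gives skewed errors.
    u = min(max(rng.random(), 1e-12), 1.0 - 1e-12)
    return -math.log(u ** (-1.0 / b) - 1.0)

def ols_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def theil_slope(xs, ys):
    # Median of all pairwise slopes: robust to outlying errors.
    slopes = [(ys[j] - ys[i]) / (xs[j] - xs[i])
              for i in range(len(xs)) for j in range(i + 1, len(xs))]
    return statistics.median(slopes)

def simulate(estimator, true_slope=2.0, n=30, reps=200, b=0.5, seed=1):
    # Monte Carlo loop: generate data, estimate, accumulate bias and MSE.
    rng = random.Random(seed)
    xs = [i / n for i in range(n)]
    est = []
    for _ in range(reps):
        ys = [true_slope * x + gen_logistic(b, rng) for x in xs]
        est.append(estimator(xs, ys))
    bias = sum(est) / reps - true_slope
    mse = sum((e - true_slope) ** 2 for e in est) / reps
    return bias, mse
```

Running `simulate(ols_slope)` and `simulate(theil_slope)` gives the bias and MSE summaries used to rank the estimators.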
644

A hybrid multi-objective bayesian estimation of distribution algorithm / Um algoritmo de estimação de distribuição híbrido multiobjetivo com modelo probabilístico bayesiano

Martins, Marcella Scoczynski Ribeiro 11 December 2017 (has links)
Nowadays, a number of metaheuristics have been developed for dealing with multi-objective optimization problems. Estimation of distribution algorithms (EDAs) are a special class of metaheuristics that explore the decision-variable space to construct probabilistic models from promising solutions. The probabilistic model used in an EDA captures statistics of the decision variables and their interdependencies with the optimization problem. Moreover, the incorporation of local search methods can notably improve the results of multi-objective evolutionary algorithms, so these two techniques have been jointly applied to multi-objective problems. In this work, a Hybrid Multi-objective Bayesian Estimation of Distribution Algorithm (HMOBEDA), based on a Bayesian network, is proposed for multi- and many-objective scenarios; it structures, in the same probabilistic model, the joint probability of decision variables, objectives, and the configuration parameters of an embedded local search (LS). We tested different versions of HMOBEDA using instances of the multi-objective knapsack problem with two to five and eight objectives. HMOBEDA is also compared with five cutting-edge evolutionary algorithms (including a modified version of NSGA-III adapted for combinatorial optimization) on the same knapsack instances, as well as on a set of MNK-landscape instances with two, three, five and eight objectives. An analysis of the resulting Bayesian network structures and parameters was also carried out to evaluate the approximated Pareto front from a probabilistic point of view, and to examine how the interactions among variables, objectives and local-search parameters are captured by the Bayesian networks. Results show that HMOBEDA outperforms the compared approaches.
It not only provides the best values for the hypervolume, capacity and inverted generational distance indicators in most of the experiments, but also presents a highly diverse solution set close to the estimated Pareto front.
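For contrast with the Bayesian-network model of HMOBEDA, a much simpler estimation-of-distribution algorithm, a univariate UMDA with independent Bernoulli marginals, can be sketched on a toy single-objective knapsack instance. Everything below (the instance, population sizes, clamping bounds) is invented for illustration and only shows the estimate-and-sample loop:

```python
import random

def umda_knapsack(values, weights, capacity, pop=60, elite=20, gens=40, seed=3):
    """Minimal univariate EDA (UMDA) for a single-objective 0/1 knapsack.
    HMOBEDA replaces the independent marginals below with a Bayesian
    network that also models objectives and local-search parameters."""
    rng = random.Random(seed)
    n = len(values)
    probs = [0.5] * n  # initial marginal probability of picking each item

    def fitness(x):
        w = sum(wi for wi, xi in zip(weights, x) if xi)
        v = sum(vi for vi, xi in zip(values, x) if xi)
        return v if w <= capacity else 0  # infeasible -> worthless

    best = None
    for _ in range(gens):
        population = [[1 if rng.random() < p else 0 for p in probs]
                      for _ in range(pop)]
        population.sort(key=fitness, reverse=True)
        if best is None or fitness(population[0]) > fitness(best):
            best = population[0]
        selected = population[:elite]
        # Re-estimate the marginals from the elite set (with clamping).
        for i in range(n):
            freq = sum(x[i] for x in selected) / elite
            probs[i] = min(0.95, max(0.05, freq))
    return best, fitness(best)
```

On a four-item instance the loop reliably recovers a feasible, high-value subset within a few generations.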
645

Local times of Brownian motion

Mukeru, Safari 09 1900 (has links)
After a review of the notions of Hausdorff and Fourier dimensions from fractal geometry and Fourier analysis and of the properties of local times of Brownian motion, we study the Fourier structure of Brownian level sets. We show that if δ_a(X) is the Dirac measure of one-dimensional Brownian motion X at the level a, that is, the measure defined by the Brownian local time L_a at level a, and μ is its restriction to the random interval [0, L_a^{-1}(1)], then the Fourier transform μ̂ of μ is such that, with positive probability, for all 0 ≤ β < 1/2, the function u ↦ |u|^β |μ̂(u)|^2, u ∈ R, is bounded. This growth rate is the best possible. Consequently, each Brownian level set, reduced to a compact interval, is with positive probability a Salem set of dimension 1/2. We also show that the zero set of X reduced to the interval [0, L_0^{-1}(1)] is, almost surely, a Salem set. Finally, we show that the restriction μ of δ_0(X) to the deterministic interval [0, 1] is such that its Fourier transform satisfies E(|μ̂(u)|^2) ≤ C |u|^{-1/2} for all u ≠ 0 and some constant C > 0. Key words: Hausdorff dimension, Fourier dimension, Salem sets, Brownian motion, local times, level sets, Fourier transform, inverse local times. / Decision Sciences / PhD. (Operations Research)
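The sparseness of the zero set (dimension 1/2) can be illustrated by a quick simulation, although this says nothing about the Fourier-dimension results above. A simple random walk stands in for Brownian motion; its returns to the origin, the discrete analogue of the local time at 0, grow like √n rather than linearly in n:

```python
import random, math

def zero_count(n, rng):
    # Number of returns to the origin of a simple random walk of n steps.
    # Because the zero set of Brownian motion has dimension 1/2, the
    # analogous count scales like sqrt(n), not like n.
    s, zeros = 0, 0
    for _ in range(n):
        s += 1 if rng.random() < 0.5 else -1
        if s == 0:
            zeros += 1
    return zeros

rng = random.Random(7)
n = 200_000
z = zero_count(n, rng)
ratio = z / math.sqrt(n)   # O(1); on average about sqrt(2/pi) ≈ 0.8
```

A walk of 200,000 steps typically returns to 0 only a few hundred times, in line with the √n scaling.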
646

Bewysreg in die Suid-Afrikaanse arbeidsreg

Van der Merwe, George Willem 04 1900 (has links)
Summaries in Afrikaans and English / Text in Afrikaans / In this thesis the focus is on the burden of proof in the industrial court, because the industrial court's approach to the burden of proof differs from case to case, depending on the nature of the legal relief for which the party or parties approach the industrial court. This is followed by a discussion of how, and by whom, evidence may be presented to the industrial court, whether by documents or by witnesses, and in addition a discussion of which sorts of evidence may be presented to the industrial court, with specific reference to, inter alia, tape recordings, video tapes and the results of lie-detector tests. / Private Law / LL.M. (Handelsreg)
647

Transformação de redes de Petri coloridas em processos de decisão markovianos com probabilidades imprecisas. / Conversion from colored Petri nets into Markov decision processes with imprecise probabilities.

Mônica Goes Eboli 01 July 2010 (has links)
The present work was motivated by the need to consider stochastic behavior when planning the production mix in a manufacturing system, that is, what to produce and in which order. These systems exhibit stochastic behavior that is usually not considered during production planning. The main goal of this work was to obtain a method to model manufacturing systems and to represent their stochastic behavior when planning production for these systems. Because the methods that were suitable for planning were not adequate for modeling the systems, and vice versa, two methods were combined to achieve this goal: the systems are modeled as Petri nets and converted into Markov decision processes, and the planning is done with the latter. In order to represent the probabilities involved in the process, a special type of Petri net, named the factored Petri net, is proposed. Using this kind of Petri net, a conversion method into Markov decision processes was developed. The conversion was successful: tests showed that plans can be produced within seconds using state-of-the-art algorithms for Markov decision processes.
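The conversion idea can be sketched in miniature. The net, markings and outcome probabilities below are invented for illustration; the thesis defines factored Petri nets formally and converts full manufacturing models:

```python
# Toy conversion from a Petri-net-like description to the transition
# function of a Markov decision process. Markings are dicts place -> tokens.

def fire(marking, transition):
    """Fire a Petri net transition if enabled: consume input tokens,
    produce output tokens; return None if the transition is not enabled."""
    if any(marking.get(p, 0) < n for p, n in transition["inputs"].items()):
        return None
    new = dict(marking)
    for p, n in transition["inputs"].items():
        new[p] -= n
    for p, n in transition["outputs"].items():
        new[p] = new.get(p, 0) + n
    return new

def to_mdp(marking, actions):
    """Build the MDP transitions P(s' | s, a) out of one state. Each action
    is a list of (probability, transition) outcomes, e.g. a machine that
    succeeds with 0.9 and jams with 0.1."""
    mdp = {}
    for name, outcomes in actions.items():
        dist = {}
        for prob, tr in outcomes:
            nxt = fire(marking, tr)
            if nxt is not None:
                key = tuple(sorted(nxt.items()))
                dist[key] = dist.get(key, 0.0) + prob
        if dist:
            mdp[name] = dist
    return mdp

machine = {"inputs": {"raw": 1, "machine_free": 1},
           "outputs": {"done": 1, "machine_free": 1}}
jam = {"inputs": {"raw": 1, "machine_free": 1},
       "outputs": {"raw": 1, "jammed": 1}}
state = {"raw": 2, "machine_free": 1}
mdp = to_mdp(state, {"process": [(0.9, machine), (0.1, jam)]})
```

The resulting `mdp` maps the action "process" to a distribution over successor markings, which is exactly the object a standard MDP planner consumes.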
648

Avaliação da confiabilidade em tubos de revestimento de poços de petróleo / Reliability assessment in casing tubes of oil Wells

Gouveia, Lucas Pereira de 08 August 2014 (has links)
This work aims to evaluate the reliability levels associated with a probabilistic treatment of the mechanical strength models of casing tubes in oil and gas wells. A comparative study between different commonly applied reliability evaluation methods is also carried out. In oil and gas well design, casing tubes must bear the mechanical loadings in the subsurface, such as those from the formations, from drilling and completion fluids, from the produced fluids over the well lifetime, and from the self-weight of the casing column and of other components. A reliability-based analysis applied to a structural design allows the assessment of the probability of violating a given limit state of the structure, so that it can be predicted, with an adequate value, from the design stage onward. This kind of analysis is useful for obtaining adequate safety levels in design and for discussing the level of quality control in the manufacturing process. In this work, the failure probability is evaluated by the following reliability methods: numerical integration over the failure domain, Monte Carlo simulation, and the transformation methods First Order Reliability Method (FORM) and Second Order Reliability Method (SORM). The limit states verified are established using casing strength models found in the literature, based on mechanics-of-materials theory and rupture test data. The statistical data are based on technical reports from casing manufacturers available in the open-access literature. The achieved results contribute to the structural assessment of well casings taking into account the influence of design uncertainties, motivating the adoption of reliability-based analysis in the decision-making process of OCTG design. / FUNDEPES - Fundação Universitária de Desenvolvimento de Extensão e Pesquisa
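For a linear limit state with Gaussian variables, FORM is exact and easy to check against crude Monte Carlo. The resistance and load parameters below are invented for illustration, not taken from the casing strength models of the thesis:

```python
import math, random

def phi_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Limit state g = R - S with independent normal resistance R and load S.
# In this linear Gaussian case FORM is exact: pf = Phi(-beta), with
# beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2).
mu_r, sd_r = 100.0, 10.0   # hypothetical collapse resistance
mu_s, sd_s = 60.0, 12.0    # hypothetical external load

beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
pf_form = phi_cdf(-beta)

# Crude Monte Carlo estimate of the same failure probability.
rng = random.Random(42)
n = 200_000
fails = sum(1 for _ in range(n)
            if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0)
pf_mc = fails / n
```

With these numbers the reliability index is about 2.56 and both routes agree on a failure probability near 0.5%; for the nonlinear casing strength models, FORM becomes an approximation and the comparison against Monte Carlo is the point of the study.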
649

Stochastic models for the estimation of the seismic hazard / Modèles stochastiques pour l'estimation du risque sismique

Pertsinidou, Christina Elisavet 03 March 2017 (has links)
In the first chapter the definition of seismic hazard assessment is provided, the seismotectonic features of the study areas are briefly presented, and the existing stochastic models applied in the field of seismology are thoroughly reviewed. In chapter 2, different semi-Markov models are developed for studying the seismicity of the central Ionian Islands and the North Aegean Sea (Greece). Quantities such as the semi-Markov kernel and the destination probabilities are evaluated, considering geometric, discrete-Weibull and Pareto distributed sojourn times. Useful results are obtained for forecasting purposes.
In the third chapter a new Viterbi algorithm for hidden semi-Markov models is developed, whose complexity is a linear function of the number of observations and a quadratic function of the number of hidden states, the lowest existing in the literature. Furthermore, an extension of this new algorithm is introduced for the case in which an observation depends on the corresponding hidden state but also on the previous observation (the SM1-M1 case). In chapter 4, different hidden semi-Markov models (HSMMs) are applied to the study of the North and South Aegean Sea. The earthquake magnitudes and locations comprise the observation sequence, and the new Viterbi algorithm is implemented in order to decode the hidden stress field associated with seismogenesis. Precursory phases (variations of the hidden stress field) were detected, warning of an anticipated earthquake occurrence, in 70 out of 88 cases (the optimal model's score). The sojourn times of the hidden process were assumed to follow Poisson, logarithmic or negative binomial distributions, whereas the hidden stress levels were classified into 2, 3 or 4 states. HMMs were also fitted, without yielding significant results regarding the precursory phases. In chapter 5 a generalized Viterbi algorithm for HSMMs is constructed, in the sense that transitions to the same hidden state are now allowed and can also be decoded. Furthermore, an extension of this generalized algorithm to the SM1-M1 context is given. In chapter 6 we suitably modify the Cramér-Lundberg model, considering negative and positive claims, in order to describe the evolution in time of the Coulomb failure function changes (ΔCFF values) computed at the locations of seven strong (M ≥ 6) earthquakes of the North Aegean Sea. Ruin probability formulas are derived and proved in a general form. Corollaries are also formulated for the exponential and the Pareto distributions.
The aim is to shed light on the following problem posed by seismologists: during a specific year, why did an earthquake occur at one location and not at another in seismotectonically homogeneous areas with positive ΔCFF values (stress-enhanced areas)? The results demonstrate that the new probability formulas can contribute to answering this question.
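The classical Cramér-Lundberg ruin probability has a closed form for exponentially distributed claims, which gives a feel for the formulas involved. The thesis's modified model, which admits both positive and negative "claims" (Coulomb stress increments), is not reproduced here; this is only the textbook case:

```python
import math

def ruin_probability(u, lam, mu, c):
    """Classical Cramér-Lundberg ruin probability for exponential claims
    with mean mu, Poisson claim rate lam and premium rate c (c > lam*mu):
        psi(u) = (lam*mu/c) * exp(-(c - lam*mu) * u / (c * mu)),
    where u is the initial reserve."""
    if c <= lam * mu:
        return 1.0  # net profit condition violated: ruin is certain
    rho = lam * mu / c
    return rho * math.exp(-(c - lam * mu) * u / (c * mu))

# Ruin probability decreases with the initial reserve u.
probs = [ruin_probability(u, lam=1.0, mu=1.0, c=1.5) for u in (0, 1, 5, 10)]
```

In the seismological reading, a larger "reserve" (accumulated stress deficit) at a location makes the ruin event, the analogue of an earthquake trigger there, less probable, which is the intuition behind comparing locations by their ΔCFF histories.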
650

Estimation simplifiée de la variance pour des plans complexes

Lefebvre, Isabelle 12 1900 (has links)
In a complex design framework, standard variance estimation methods entail substantial challenges. Conventional variance estimators involve second-order inclusion probabilities, which can be difficult to compute for some sampling designs. Also, confidentiality standards generally prevent second-order inclusion probabilities from being included in external microdata files (often in the form of bootstrap weights). Based on Ohlsson's sequential Poisson sampling method (1998), we suggest a simplified estimator that requires only first-order inclusion probabilities. The idea is to approximate the survey strategy (which consists of a sampling design and an estimator) by an equivalent strategy for which a Poisson sampling design is used. We discuss proportional-to-size sampling with and without clusters, and present the results of a simulation study.
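Under Poisson sampling, the variance of the Horvitz-Thompson total can be estimated from first-order inclusion probabilities alone, which is the property the proposed approximation exploits. The data below are invented for illustration:

```python
# Under Poisson sampling each unit i enters the sample independently with
# its first-order inclusion probability pi_i, so the variance estimator of
# the Horvitz-Thompson total needs no second-order probabilities:
#     V_hat = sum over sampled i of (1 - pi_i) * (y_i / pi_i)**2.
# Approximating a complex design by an "equivalent" Poisson strategy lets
# this simple formula stand in for the full double-sum estimator.

def ht_total(sample):
    # sample: list of (y_i, pi_i) pairs for the selected units
    return sum(y / p for y, p in sample)

def poisson_variance_estimate(sample):
    return sum((1.0 - p) * (y / p) ** 2 for y, p in sample)

sample = [(12.0, 0.30), (7.0, 0.15), (20.0, 0.50), (3.0, 0.10)]
total = ht_total(sample)
v_hat = poisson_variance_estimate(sample)
```

Note that units selected with certainty (pi_i = 1) contribute nothing to the variance, as the `(1 - pi_i)` factor makes explicit.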
