About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
701

Calibração linear assimétrica / Asymmetric Linear Calibration

Figueiredo, Cléber da Costa, 27 February 2009
This thesis focuses on theoretical and applied aspects of estimation in the linear calibration model with skew-normal (Azzalini, 1985) and skew-t-normal (Gómez, Venegas and Bolfarine, 2007) error distributions. Under an asymmetric error model, it is not necessary to transform the variables in order to obtain symmetric errors. Both frequentist and Bayesian solutions are presented: parameter estimation and the estimation of the variances of the estimators are studied using an EM-type algorithm and a Gibbs sampler, respectively. The main contribution on the frequentist side is a reparameterization that avoids the singularity of the Fisher information matrix under the skew-normal calibration model in a neighborhood of lambda = 0. Notably, this reparameterization, which makes the information matrix nonsingular when the skewness parameter is near zero, leaves the parameter of interest unchanged. On the Bayesian side, the main contribution is a pair of goodness-of-fit measures that take the asymmetry of the data set into account: the ADIC (Asymmetric Deviance Information Criterion) and the EDIC (Evident Deviance Information Criterion), natural extensions of the ordinary DIC of Spiegelhalter et al. (2002), which should only be used for symmetric models.
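For reference, a minimal statement of the model behind this abstract, in standard notation rather than the thesis's own: the Azzalini (1985) skew-normal density and the usual controlled linear calibration setup, in which x_0 is the parameter of interest.

```latex
% Skew-normal density (shape parameter \lambda; \lambda = 0 recovers N(0,1)),
% with \phi and \Phi the standard normal pdf and cdf:
f(z \mid \lambda) = 2\,\phi(z)\,\Phi(\lambda z), \qquad z \in \mathbb{R}.

% Linear calibration: n training pairs at known x_i, then k responses at an
% unknown x_0, the parameter of interest; the errors follow the law above:
y_i = \alpha + \beta x_i + \varepsilon_i, \quad i = 1,\dots,n, \qquad
y_{0j} = \alpha + \beta x_0 + \varepsilon_{0j}, \quad j = 1,\dots,k.
```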
702

Avalanching on dunes and its effects: size statistics, stratification, & seismic surveys

Arran, Matthew Iain, January 2018
Geophysical research has long been interdisciplinary, with many phenomena on the Earth's surface involving multiple, linked processes that are best understood using a combination of techniques. This is particularly true in the case of grain flows on sand dunes, in which the sedimentary stratification with which geologists are concerned arises from the granular processes investigated by physicists and engineers, and the water permeation that interests hydrologists and soil scientists determines the seismic velocities of concern to exploration geophysicists. In this dissertation, I describe four projects conducted for the degree of Doctor of Philosophy, using a combination of laboratory experimentation, fieldwork, numerical simulation, and mathematical modelling to link avalanching on dunes to its effects on stratification, on the permeation of water, and on seismic surveys. Firstly, I describe experiments on erodible, unbounded, grain piles in a channel, slowly supplied with additional grains, and I demonstrate that the behaviour of the consequent, discrete avalanches alternates between two regimes, typified by their size statistics. Reconciling the 'self-organised criticality' that several authors have predicted for such a system with the hysteretic behaviour that others have observed, the system exhibits quasi-periodic, system-spanning avalanches in one regime, while in the other avalanches pass at irregular intervals and have a power-law size distribution. Secondly, I link this power-law size distribution to the strata emplaced by avalanches on bounded grain piles. A low inflow rate of grains into an experimental channel develops a pile, composed of strata in which blue-dyed, coarser grains overlie finer grains. Associating stopped avalanche fronts with the 'trapped kinks' described by previous authors, I show that, in sufficiently large grain piles, mean stratum width increases linearly with distance downslope. This implies the possibility of interpreting paleodune height from the strata of aeolian sandstones, and makes predictions for the structure of avalanche-associated strata within active dunes. Thirdly, I discuss investigations of these strata within active, Qatari barchan dunes, using dye-infiltration to image strata in the field and extracting samples across individual strata with sub-centimetre resolution. Downslope increases in mean stratum width are evident, while measurements of particle size distributions demonstrate preferential permeation of water along substrata composed of finer particles, explaining the strata-associated, localised regions of high water content discovered by other work on the same dunes. Finally, I consider the effect of these within-dune variations in water content on seismic surveys for oil and gas. Having used high-performance computing to simulate elastic wave propagation in the vicinity of an isolated, barchan sand dune, I demonstrate that such a dune acts as a resonator, absorbing energy from Rayleigh waves and re-emitting it over an extensive period of time. I derive and validate a mathematical framework that uses bulk properties of the dune to predict quantitative properties of the emitted waves, and I demonstrate the importance of internal variations in seismic velocity, resulting from variations in water content.
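The power-law avalanche-size statistics mentioned above can be quantified with a standard tail-exponent estimator. The sketch below uses the continuous maximum-likelihood estimator of Clauset, Shalizi and Newman on synthetic sizes; it illustrates the technique, and is not necessarily the estimator used in the dissertation.

```python
import numpy as np

def powerlaw_mle(sizes, s_min):
    """MLE of the exponent alpha for sizes >= s_min, assuming a
    continuous power-law tail p(s) ~ s**(-alpha) (Clauset et al.)."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    n = s.size
    alpha = 1.0 + n / np.sum(np.log(s / s_min))
    stderr = (alpha - 1.0) / np.sqrt(n)   # asymptotic standard error
    return alpha, stderr

# Illustration on synthetic data (true alpha = 1.5, s_min = 1),
# drawn by inverse-CDF sampling: s = s_min * (1 - u)**(-1/(alpha - 1)).
rng = np.random.default_rng(0)
u = rng.random(10_000)
sizes = (1.0 - u) ** (-1.0 / 0.5)
print(powerlaw_mle(sizes, s_min=1.0))   # approx (1.5, 0.005)
```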
703

Structure des ondes de choc dans les gaz granulaires / Shock wave structure in granular gases

Vilquin, Alexandre, 17 December 2015
In media such as gases, plasmas and granular materials, an object moving at supersonic speed compresses and heats the fluid ahead of it, forming a shock wave. The shock front is the out-of-equilibrium zone where abrupt changes in temperature, pressure and density occur. It has a particular structure, with notably strongly non-Gaussian particle velocity distributions that are difficult to observe. In an important breakthrough in 1951, Mott-Smith described the shock front as a superposition of two states, the initial supersonic gas and the compressed, heated subsonic gas, implying the existence of bimodal velocity distributions. Experiments at high Mach numbers have confirmed this overall bimodal structure. This model does not, however, explain the surplus of particles at intermediate velocities, between the supersonic and the subsonic gas. This thesis studies shock waves in granular gases, in which particles interact only through inelastic binary collisions. In these dissipative gases, the granular temperature, which reflects the random motion of the particles, makes it possible to define the equivalent of a speed of sound by analogy with molecular gases. The low speeds of sound in granular gases make it easy to generate shock waves in which each particle can be tracked, unlike in molecular gases. The first part of this work concerns the effect of energy dissipation, due to inelastic collisions, on the shock-front structure in granular gases. The induced modifications of the measured temperature, density and mean velocity are captured by a model based on the bimodal hypothesis of Mott-Smith that incorporates energy dissipation. The second part is devoted to the interpretation of the velocity distributions in the shock front. Based on experiments in granular gases, a trimodal description, including an additional intermediate state, is proposed and successfully extended to velocity distributions in molecular gases.
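For context, a schematic statement of the Mott-Smith ansatz referred to above, with temperatures in velocity-squared units (k_B/m absorbed into T); the intermediate state (u_*, T_*) of the trimodal extension is indicated only schematically, not with the thesis's fitted parameters.

```latex
% Bimodal Mott-Smith ansatz: position-dependent mixture of the upstream
% (supersonic) and downstream (subsonic) Maxwellians:
f(x, v) = n_1(x)\,\phi_{u_1,T_1}(v) + n_2(x)\,\phi_{u_2,T_2}(v),
\qquad
\phi_{u,T}(v) = \frac{1}{\sqrt{2\pi T}}\,\exp\!\left(-\frac{(v-u)^2}{2T}\right).

% Trimodal extension (schematic): an additional intermediate state (u_*, T_*)
% accounts for the surplus of particles at intermediate velocities:
f(x, v) = n_1(x)\,\phi_{u_1,T_1}(v) + n_*(x)\,\phi_{u_*,T_*}(v)
        + n_2(x)\,\phi_{u_2,T_2}(v).
```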
705

Análise comparativa do efeito da distribuição espaço-tempo em eventos pluviométricos intensos na formação de vazões em bacias urbanas. / Comparative analysis of the effect of space-time distribution of heavy rainfall events in the formation of flows in urban catchments.

Lígia de Souza Girnius, 18 May 2016
This research discusses the impact of the spatial and temporal variability of heavy rainfall on river flows in urbanized catchments, through the analysis of historical rainfall data obtained during critical events. The study area is the most urbanized portion of the Tietê River catchment. After a review of the specific literature, design storms were developed from both observed patterns and the theoretical patterns often used to generate synthetic storms. The total precipitation depth was associated with a 100-year return period (RP), based on the statistical analysis of point rainfall and on the application of areal reduction factors (ARF), both those observed in the study area and those from other regions that have been used in drainage design without any validation study; the intention was to demonstrate the importance of defining site-specific ARFs in order to avoid oversizing and to optimize solutions. The design storms were applied to a properly calibrated rainfall-runoff model to obtain the resulting design flows at the downstream boundary of the catchment under the different patterns of hydrological loading, as sketched after this abstract. Model calibration was supported by data from the telemetric gauges of the Sistema de Alerta a Inundações de São Paulo (São Paulo Flood Alert System, SAISP) and by rating curves; images from the Ponte Nova radar complemented the surface network and improved the representation of the observed precipitation events. Comparison of the hydrological-model results showed that the effects of the variable parameters (depth, spatial and temporal distribution) on the design hydrographs are substantial. From the tests performed, the most and least critical situations for the catchment were identified in terms of the spatial and temporal distribution and duration of the design storm, and the differences in drainage-system sizing resulting from the adoption of a site-specific ARF were established. It was concluded that, with the proposed methodology, maximum design flows can be obtained by simulating synthetic storms alone, with differences of 10% to 20% relative to the maximized observed storms. Additional studies are nevertheless needed, both to define site-specific ARF values and to simulate a larger number of observed critical patterns, before the indications of this study can be applied in practice with greater reliability.
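To make the ARF step concrete, the sketch below shows how a point design depth is converted to an areal design depth. The exponential functional form and its constants are placeholders for illustration only, not the factors observed in the study area.

```python
import math

def areal_depth(point_depth_mm: float, area_km2: float,
                k: float = 0.01, n: float = 0.5) -> float:
    """Areal design depth = ARF(area) * point depth, with a placeholder
    ARF(A) = exp(-k * A**n), which decreases from 1 as the area grows."""
    arf = math.exp(-k * area_km2 ** n)
    return arf * point_depth_mm

# e.g. a 100 mm, 100-year point depth over a 1000 km^2 catchment:
print(areal_depth(100.0, 1000.0))   # ~72.9 mm with the placeholder constants
```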
706

Monte Carlo simulation studies in log-symmetric regressions / Estudos de simulação de Monte Carlo em regressões log-simétricas

Ventura, Marcelo dos Santos, 09 March 2018
This work presents two Monte Carlo simulation studies of log-symmetric regression models, which are particularly useful when the response variable is continuous, strictly positive and asymmetric, possibly with atypical observations. In log-symmetric regression models, the distribution of the multiplicative random errors belongs to the log-symmetric class, which encompasses the log-normal, log-Student-t, log-power-exponential, log-slash and log-hyperbolic distributions, among others. The first simulation study examines the performance of the maximum-likelihood estimators of the model parameters under various scenarios. The second investigates the accuracy of popular information criteria such as AIC, BIC and HQIC and their respective corrected versions. As an illustration, a movie data set obtained and assembled for this dissertation is analyzed to compare log-symmetric models with the normal linear model and to select the best model using the information criteria mentioned.
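For reference, the uncorrected criteria compared in the second study have the standard forms below, with \hat{\ell} the maximised log-likelihood, p the number of parameters and n the sample size; the corrected versions follow the AICc pattern shown.

```latex
\mathrm{AIC} = -2\hat{\ell} + 2p, \qquad
\mathrm{BIC} = -2\hat{\ell} + p\,\ln n, \qquad
\mathrm{HQIC} = -2\hat{\ell} + 2p\,\ln\ln n,
\qquad
\mathrm{AICc} = \mathrm{AIC} + \frac{2p(p+1)}{n - p - 1}.
```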
707

Uncertainty in Aquatic Toxicological Exposure-Effect Models: the Toxicity of 2,4-Dichlorophenoxyacetic Acid and 4-Chlorophenol to Daphnia carinata

Dixon, William J., bill.dixon@dse.vic.gov.au, January 2005
Uncertainty is pervasive in risk assessment. In ecotoxicological risk assessments, it arises from such sources as a lack of data, the simplification and abstraction of complex situations, and ambiguities in assessment endpoints (Burgman 2005; Suter 1993). When evaluating and managing risks, uncertainty needs to be explicitly considered in order to avoid erroneous decisions and to be able to make statements about the confidence that we can place in risk estimates. Although informative, previous approaches to dealing with uncertainty in ecotoxicological modelling have been found to be limited, inconsistent and often based on assumptions that may be false (Ferson & Ginzburg 1996; Suter 1998; Suter et al. 2002; van der Hoeven 2004; van Straalen 2002a; Verdonck et al. 2003a). In this thesis a Generalised Linear Modelling approach is proposed as an alternative, congruous framework for the analysis and prediction of a wide range of ecotoxicological effects. This approach was used to investigate the results of toxicity experiments on the effect of 2,4-Dichlorophenoxyacetic Acid (2,4-D) formulations and 4-Chlorophenol (4-CP, an associated breakdown product) on Daphnia carinata. Differences between frequentist Maximum Likelihood (ML) and Bayesian Markov-Chain Monte-Carlo (MCMC) approaches to statistical reasoning and model estimation were also investigated. These approaches are inferentially disparate and place different emphasis on aleatory and epistemic uncertainty (O'Hagan 2004). Bayesian MCMC and Probability Bounds Analysis methods for propagating uncertainty in risk models are also compared for the first time. For simple models, Bayesian and frequentist approaches to Generalised Linear Model (GLM) estimation were found to produce very similar results when non-informative prior distributions were used for the Bayesian models. Potency estimates and regression parameters were found to be similar for identical models, signifying that Bayesian MCMC techniques are at least a suitable and objective replacement for frequentist ML for the analysis of exposure-response data. Applications of these techniques demonstrated that Amicide formulations of 2,4-D are more toxic to Daphnia than their unformulated, Technical Acid parent. Different results were obtained from Bayesian MCMC and ML methods when more complex models and data structures were considered. In the analysis of 4-CP toxicity, the treatment of two different factors as fixed or random in standard and Mixed-Effect models was found to affect variance estimates to the degree that different conclusions would be drawn from the same model, fit to the same data. Associated discrepancies in the treatment of overdispersion between ML and Bayesian MCMC analyses were also found to affect results. Bayesian MCMC techniques were found to be superior to the ML ones employed for the analysis of complex models because they enabled the correct formulation of hierarchical (nested) data structures within a binomial logistic GLM. Application of these techniques to the analysis of results from 4-CP toxicity testing on two strains of Daphnia carinata found that between-experiment variability was greater than that within experiments or between strains. Perhaps surprisingly, this indicated that long-term laboratory culture had not significantly affected the sensitivity of one strain when compared to cultures of another strain that had recently been established from field populations.
The results from this analysis highlighted the need for repetition of experiments, proper model formulation in complex analyses and careful consideration of the effects of pooling data on characterising variability and uncertainty. The GLM framework was used to develop three-dimensional surface models of the effects of different-length pulse exposures, and subsequent delayed toxicity, of 4-CP on Daphnia. These models described the relationship between exposure duration and intensity (concentration) on toxicity, and were constructed for both pulse and delayed effects. Statistical analysis of these models found that significant delayed effects occurred following the full range of pulse exposure durations, and that both exposure duration and intensity interacted significantly and concurrently with the delayed effect. These results indicated that failure to consider delayed toxicity could lead to significant underestimation of the effects of pulse exposure, and therefore increase uncertainty in risk assessments. A number of new approaches to modelling ecotoxicological risk and to propagating uncertainty were also developed and applied in this thesis. In the first of these, a method for describing and propagating uncertainty in conventional Species Sensitivity Distribution (SSD) models was described. This utilised Probability Bounds Analysis to construct a nonparametric 'probability box' on an SSD based on EC05 estimates and their confidence intervals. Predictions from this uncertain SSD and the confidence-interval extrapolation methods described by Aldenberg and colleagues (2000; 2002a) were compared. It was found that the extrapolation techniques underestimated the width of uncertainty (confidence) intervals by 63% and the upper bound by 65% when compared to the Probability Bounds (P-Bounds) approach, which was based on actual confidence estimates derived from the original data. An alternative formulation of ecotoxicological risk modelling, based on a Binomial GLM, was also proposed. In this formulation, the model is first fit to the available data in order to derive mean and uncertainty estimates for the parameters. This 'uncertain' GLM model is then used to predict the risk of effect from possible or observed exposure distributions. This risk is described as a whole distribution, with a central tendency and uncertainty bounds derived from the original data and the exposure distribution (if this is also 'uncertain'). Bayesian and P-Bounds approaches to propagating uncertainty in this model were compared using an example of the risk of exposure to a hypothetical (uncertain) distribution of 4-CP for the two Daphnia strains studied. This comparison found that the Bayesian and P-Bounds approaches produced very similar mean and uncertainty estimates, with the P-Bounds intervals always being wider than the Bayesian ones. This difference is due to the different methods for dealing with dependencies between model parameters in the two approaches, and is confirmation that the P-Bounds approach is better suited to situations where data and knowledge are scarce. The advantages of the Bayesian risk assessment and uncertainty propagation method developed are that it allows calculation of the likelihood of any effect occurring, not just the (probability) bounds, and that the same software (WinBUGS) and model construction may be used to fit regression models and predict risks simultaneously.
The GLM risk modelling approaches developed here are able to explain a wide range of response shapes (including hormesis) and underlying (non-normal) distributions, and do not involve expression of the exposure-response as a probability distribution, hence solving a number of problems found with previous formulations of ecotoxicological risk. The approaches developed can also be easily extended to describe communities, include modifying factors, mixed-effects, population growth, carrying capacity and a range of other variables of interest in ecotoxicological risk assessments. While the lack of data on the toxicological effects of chemicals is the most significant source of uncertainty in ecotoxicological risk assessments today, methods such as those described here can assist by quantifying that uncertainty so that it can be communicated to stakeholders and decision makers. As new information becomes available, these techniques can be used to develop more complex models that will help to bridge the gap between the bioassay and the ecosystem.
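As a concrete illustration of the exposure-response GLM idea described above, here is a minimal frequentist sketch: a binomial logistic fit to invented mortality counts over log-concentration, with an EC50 read off the fitted coefficients. The data values are hypothetical and statsmodels is assumed to be available; the thesis's own ML and Bayesian MCMC (WinBUGS) analyses are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # mg/L, hypothetical
n = np.array([20, 20, 20, 20, 20])           # animals per treatment
dead = np.array([1, 3, 8, 15, 19])           # responders, hypothetical

# Binomial GLM (logit link) on log-concentration; endog is (successes, failures).
X = sm.add_constant(np.log(conc))
model = sm.GLM(np.column_stack([dead, n - dead]), X,
               family=sm.families.Binomial())
fit = model.fit()

# EC50: logit(0.5) = 0 = b0 + b1*log(c)  =>  c = exp(-b0/b1)
b0, b1 = fit.params
print(fit.summary())
print(f"EC50 estimate: {np.exp(-b0 / b1):.2f} mg/L")
```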
708

非常態間斷隨機變數的產生 / Generation of non-normal approximated discrete random variables

Lee, Yen (李晏), Unknown Date
When conducting robustness research on the normality assumption with the Monte Carlo method, a procedure for simulating non-normal data is needed. Several procedures for simulating non-normal continuous data have been proposed, but discrete data from ordered categorical variables (e.g., Likert-type scales) are what is mostly encountered in practice. Estimating a discrete probability distribution precisely, and choosing one among the infinitely many discrete distributions satisfying the same constraints, are two difficulties encountered when simulating discrete data. This research therefore proposes a procedure, the Maximum Entropy Procedure (MEP), which simulates the univariate discrete maximum-entropy distribution with specified parameters. Among all distributions with the specified parameters, the maximum-entropy distribution is the one that occurs most often; it is smooth and assigns zero probability only where necessary. These characteristics make the MEP a reasonable choice for simulating univariate discrete data with specified parameters. The MEP-4 (constraints on mean, variance, skewness and kurtosis), the MEP-2 (constraints on skewness and kurtosis) and the corresponding R packages, which estimate univariate discrete distributions with the specified parameters, are presented, evaluated and discussed. The results show that MEP-4 and MEP-2 can estimate discrete probability distributions precisely for feasible combinations of specified parameters, with all differences between specified and estimated parameters smaller than .001, and are thus useful for robustness research. The R packages make it easy to estimate a discrete distribution from the number of support points and the specified parameters, and to draw samples from that distribution with a specified number of samples and sample size, so both methods can readily be used in Monte Carlo robustness studies of discrete data.
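A sketch of the maximum-entropy idea behind MEP-4: among all probability vectors on a given support whose first four moments match the specified values, pick the one maximising Shannon entropy. The optimiser setup below is illustrative (the targets must lie in the feasible moment set of the support) and is not the authors' R implementation.

```python
import numpy as np
from scipy.optimize import minimize

x = np.arange(1, 8, dtype=float)   # 7-point Likert-type support
target = dict(mean=4.0, var=1.5, skew=0.0, kurt=2.7)

def moments(p):
    """Mean, variance, skewness and (raw) kurtosis of support x under p."""
    m = np.sum(p * x)
    v = np.sum(p * (x - m) ** 2)
    s = np.sum(p * (x - m) ** 3) / v ** 1.5
    k = np.sum(p * (x - m) ** 4) / v ** 2
    return m, v, s, k

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)     # avoid log(0)
    return np.sum(p * np.log(p))   # minimise negative entropy

cons = [{'type': 'eq', 'fun': lambda p: np.sum(p) - 1.0},
        {'type': 'eq', 'fun': lambda p: moments(p)[0] - target['mean']},
        {'type': 'eq', 'fun': lambda p: moments(p)[1] - target['var']},
        {'type': 'eq', 'fun': lambda p: moments(p)[2] - target['skew']},
        {'type': 'eq', 'fun': lambda p: moments(p)[3] - target['kurt']}]

res = minimize(neg_entropy, np.full(x.size, 1.0 / x.size), method='SLSQP',
               bounds=[(0.0, 1.0)] * x.size, constraints=cons)
print(res.x, moments(res.x))
```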
709

De nouveaux résultats sur la géométrie des mosaïques de Poisson-Voronoi et des mosaïques poissoniennes d'hyperplans. Etude du modèle de fissuration de Rényi-Widom / New results on the geometry of Poisson-Voronoi tessellations and Poisson hyperplane tessellations; a study of the Rényi-Widom cracking model

Calka, Pierre, 05 December 2002
This thesis deals with three models of stochastic geometry: Poisson-Voronoi tessellations, Poisson hyperplane tessellations and the unidirectional Rényi-Widom cracking model. We first show the equivalence of the two historical approaches to the statistical study of tessellations: the convergence of ergodic means and the definition, in the Palm sense, of the typical cell. In dimension two we then give the law of the number of vertices of the typical cell and, conditionally on this number, the laws of the positions of the boundaries, of the area and of the perimeter. Moreover, we make explicit the joint law of the radii of the discs centred at the origin that are inscribed in (respectively circumscribed about) the typical cell, and deduce the circular character of 'large cells'. In the Poisson-Voronoi case we relate, in any dimension, the spectral function of the typical cell to the Brownian bridge, which in particular allows an asymptotic estimate of the law of the first eigenvalue in dimension two. In the case of Poisson hyperplane tessellations, we exploit Palm techniques to obtain an explicit construction, in any dimension, of the typical cell from its inscribed ball and circumscribed simplex. A rigorous proof of a result of R. E. Miles on thickened hyperplanes is also given. Finally, we model a cracking phenomenon by a stationary one-dimensional process and compute the law of the typical inter-crack distance, showing moreover that the successive points are those of an explicit conditioned renewal process.
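A sketch of the ergodic-average route to typical-cell statistics whose equivalence with the Palm definition the thesis establishes: simulate a homogeneous Poisson process, build its Voronoi tessellation, and average a geometric functional (here the number of vertices, whose known mean for the 2-D typical cell is 6) over cells away from the window boundary.

```python
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(1)
intensity, side = 100.0, 10.0
n_pts = rng.poisson(intensity * side ** 2)
pts = rng.uniform(0.0, side, size=(n_pts, 2))

vor = Voronoi(pts)
counts = []
for i, region_idx in enumerate(vor.point_region):
    region = vor.regions[region_idx]
    if len(region) == 0 or -1 in region:
        continue   # skip unbounded cells (a vertex at infinity)
    # edge correction: keep cells whose nucleus is well inside the window
    if np.all((pts[i] > 1.0) & (pts[i] < side - 1.0)):
        counts.append(len(region))

print(np.mean(counts))   # should be close to 6, the known 2-D mean
```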
710

Cascades log-infiniment divisibles et analyse multirésolution. Application à l'étude des intermittences en turbulence. / Log-infinitely divisible cascades and multiresolution analysis; application to the study of intermittency in turbulence

Chainais, Pierre, 30 November 2001
Log-infinitely divisible cascades provide a general framework for the study of the scale-invariance property. We introduce these objects by describing the historical evolution of the various models proposed to describe the phenomenon of statistical intermittency in turbulence. We then give a formal definition of log-infinitely divisible cascades, and replace the increments usual in turbulence by the coefficients of a wavelet transform associated with a multiresolution analysis, a tool dedicated to time-scale analysis. A careful examination of the meaning of the formalism leads us to demonstrate its flexibility for modelling, as well as its richness in connection with multiplicative cascades, Markov processes, the Langevin equation and the Fokker-Planck equation. Through the study of compound log-Poisson cascades, we propose an original view of the phenomenon of statistical intermittency. Next, estimators of the exponents of (possibly relative) scaling laws are studied, with emphasis on bias correction and the determination of confidence intervals; we apply them to computer network traffic data. We explain why a usual procedure for estimating the multifractal spectrum, when applied to linear fractional stable motions, risks leading to a misinterpretation. Finally, the link between statistical intermittency and spatio-temporal intermittency (coherent structures) in turbulence is studied from velocity and pressure signals recorded jointly in space and time in a turbulent flow. Strong low-pressure events associated with filamentary vortices are detected, and a statistical analysis of the wavelet coefficients of the velocity conditioned on these events allows us to describe the influence of these coherent structures at different Reynolds numbers.
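A sketch of the kind of discrete multiplicative cascade that the log-infinitely divisible framework generalises: a log-normal Mandelbrot cascade on a dyadic grid, with the corresponding moment-scaling exponents. Parameter values are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_levels, sigma2 = 12, 0.05   # 2**12 leaf cells; variance of log W

measure = np.ones(1)          # unit mass on the unit interval
for _ in range(n_levels):
    # split every cell in two; multiply each child by an i.i.d. weight W with
    # E[W] = 1 (log W ~ N(-sigma2/2, sigma2)), sharing the mass equally
    w = np.exp(rng.normal(-sigma2 / 2.0, np.sqrt(sigma2),
                          size=2 * measure.size))
    measure = np.repeat(measure, 2) * w / 2.0

# moment scaling: E[mu(I)^q] ~ |I|^tau(q), and for this log-normal cascade
# tau(q) = q - log2 E[W^q] = q - sigma2 * q * (q - 1) / (2 * ln 2)
q = 2.0
tau_q = q - sigma2 * q * (q - 1.0) / (2.0 * np.log(2.0))
print(measure.sum(), tau_q)   # total mass fluctuates around 1
```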
