1 |
Statistical modelling of continuous distributions with a finite probability of zeros Ackerley, Elizabeth January 2000 (has links)
No description available.
|
2 |
Extension of spray flow modelling using the drop number size distribution moments approach Alqurashi, Faris January 2015 (has links)
This work extends the spray model of Watkins and Jones (2010), in which the spray is characterized by evaluating the three moments Q_2, Q_3 and Q_4 of a general gamma number size distribution from their transport equations, with the sub-models of drop drag, drop break-up and drop collisions expressed in terms of gamma distributions. Their model is non-vaporising, was compared only with cases of low ambient gas temperature, and is restricted to the particular drop drag and break-up sub-models that produce integrable functions. In this work the model is adjusted so that a variety of sub-models can be implemented. Three drop break-up models (TAB, ETAB, DDB), originally introduced for use with the Discrete Droplet Method (DDM) approach, are considered. To implement these models within the Watkins and Jones framework, the break-up source terms are calculated by grouping the droplets in each cell into parcels containing a certain number of droplets with similar physical properties (size, velocity, temperature, ...). The source terms of each parcel are calculated and multiplied by the number of droplets in the parcel, and numerical integration is then used to obtain the net effect of drop break-up in each cell. The number of drops in each cell is determined from the gamma size distribution. Three hybrid break-up models (KH-RT, Turb-KH-RT, Turb-TAB), each comprising two distinct steps of primary and secondary break-up, are also implemented. The Kelvin-Helmholtz (KH) and turbulence-induced break-up (Turb) models are used to predict the primary break-up of the intact liquid core of a liquid jet, while secondary break-up is modelled using the TAB model or competition between the KH and Rayleigh-Taylor (RT) models; both are allowed to act simultaneously, except that if disintegration occurs through the RT mechanism, KH break-up is assumed not to occur. For the drag sub-model, a dynamic drag model is introduced that accounts for the effects of drop distortion and oscillation caused by high relative velocity between the liquid and the surrounding gas; the drag coefficient is empirically related to the magnitude of drop deformation, which is calculated using the TAB model. The effects of mass and heat transfer on the spray are also modelled: an additional equation for the energy of the liquid is solved, the mass transfer rate is evaluated using the models of Godsave (1953) and Spalding (1953), and heat transfer between the two phases is modelled with the Faeth (1983) correlation. For all interphase heat and mass transfer equations, the drop Nusselt and Sherwood numbers are calculated using the correlation of Ranz and Marshall. The liquid surface-average temperature T_l2 of Watkins (2007), derived by assuming a parabolic temperature profile within individual drops, is used instead of the liquid volume-average temperature to determine the heat and mass transfer between phases. All equations are treated in an Eulerian framework using the finite volume method. The model has been applied to a wide range of sprays and compared with a number of experiments under different operating conditions, including high liquid injection pressure and high ambient gas density and temperature.
Reasonable agreement with most of the data is found for the ETAB model, while the TAB and DDB models consistently underestimate the penetration and drop sizes of the spray. The hybrid break-up models perform well and show better agreement with the available experimental data than the single break-up models. For the high-temperature cases, the model correctly captures the effect of evaporation on the various spray properties, especially with the hybrid break-up models.
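To make the moments approach concrete, here is a minimal sketch (not the authors' solver) of how the moments Q_j of a gamma number size distribution relate to its shape k and scale theta, via the closed form E[D^j] = theta^j * Gamma(k+j) / Gamma(k); the values of N, k and theta below are illustrative assumptions.

```python
import numpy as np
from scipy.special import gamma as G
from scipy.stats import gamma as gamma_dist

# Hypothetical parameters: shape k and scale theta of the drop-size
# distribution, and total drop number density N (all illustrative).
k, theta, N = 3.0, 10e-6, 1e12   # theta in metres, N in drops/m^3

def Q(j, k, theta, N):
    """j-th moment Q_j = N * E[D^j] of a gamma number size distribution."""
    return N * theta**j * G(k + j) / G(k)

Q2, Q3, Q4 = (Q(j, k, theta, N) for j in (2, 3, 4))

# Mean diameters follow as moment ratios, e.g. the Sauter mean D32 = Q3/Q2.
D32 = Q3 / Q2
D43 = Q4 / Q3
print(f"Q2={Q2:.3e}, Q3={Q3:.3e}, Q4={Q4:.3e}, D32={D32:.3e} m, D43={D43:.3e} m")

# Cross-check E[D^3] numerically against scipy's gamma distribution.
d = gamma_dist(a=k, scale=theta).rvs(200_000, random_state=0)
print(np.mean(d**3) * N, "=~", Q3)
```

That mean diameters drop out as simple moment ratios is what makes the moments formulation attractive compared with tracking individual droplet parcels.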
|
3 |
Bayesian Inference of a Finite Population under Selection Bias Xu, Zhiqing 01 May 2014 (has links)
Length-biased sampling gives samples from a weighted distribution. Given the underlying distribution of the population, one can estimate attributes of the population by converting the weighted samples. In this thesis, the generalized gamma distribution is taken as the underlying distribution of the population and inference is made for the weighted distribution. Models with both known and unknown finite population size are considered. In the models with known finite population size, maximum likelihood estimation and bootstrapping methods are used to derive the distributions of the parameters and the population mean. For comparison, models both with and without the selection bias are built. Computer simulation results show that the model with selection bias gives better predictions of the population mean. In the model with unknown finite population size, the distributions of the population size and of the sample complement are derived. Bayesian analysis is performed using numerical methods: both the Gibbs sampler and a random sampling method are employed to generate the parameters from their joint posterior distribution. The fit of the size-biased samples is checked using the conditional predictive ordinate.
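As a concrete illustration of length-biased sampling (a minimal sketch using an ordinary gamma rather than the thesis's generalized gamma): if X ~ Gamma(k, theta), the length-biased density x f(x)/E[X] is again gamma with shape k + 1, and the unweighted population mean can be recovered from weighted samples with a harmonic-type estimator.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k, theta = 2.0, 3.0          # illustrative shape and scale

# Known closed form: the length-biased version of Gamma(k, theta),
# with density x f(x) / E[X], is Gamma(k + 1, theta).
biased = stats.gamma(a=k + 1, scale=theta)

# Rejection-sampling check: accept x from Gamma(k, theta) with
# probability proportional to x.
x = stats.gamma(a=k, scale=theta).rvs(400_000, random_state=rng)
u = rng.uniform(0, x.max(), size=x.size)
sample = x[u < x]

print("weighted-sample mean:", sample.mean())
print("theoretical Gamma(k+1, theta) mean:", biased.mean())

# De-biasing: under length bias, E_w[1/X] = 1/E[X], so the population
# mean is estimated by n / sum(1/x_i) from the weighted sample.
print("recovered population mean:", sample.size / np.sum(1.0 / sample),
      "vs true:", k * theta)
```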
|
4 |
Causal analysis using two-part models : a general framework for specification, estimation and inference Hao, Zhuang 22 June 2018 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The two-part model (2PM) is the most widely applied modeling and estimation framework in empirical health economics. By design, the two-part model allows the process governing observation at zero to systematically differ from that which determines non-zero observations. The former is commonly referred to as the extensive margin (EM) and the latter is called the intensive margin (IM). The analytic focus of my dissertation is on the development of a general framework for specifying, estimating and drawing inference regarding causally interpretable (CI) effect parameters in the 2PM context. Our proposed fully parametric 2PM (FP2PM) framework comprises very flexible versions of the EM and IM for both continuous and count-valued outcome models and encompasses all implementations of the 2PM found in the literature. Because our modeling approach is potential outcomes (PO) based, it provides a context for clear definition of targeted counterfactual CI parameters of interest. This PO basis also provides a context for identifying the conditions under which such parameters can be consistently estimated using the observable data (via the appropriately specified data generating process). These conditions also ensure that the estimation results are CI. There is substantial literature on statistical testing for model selection in the 2PM context, yet there has been virtually no attention paid to testing the “one-part” null hypothesis. Within our general modeling and estimation framework, we devise a relatively simple test of that null for both continuous and count-valued outcomes. We illustrate our proposed model, method and testing protocol in the context of estimating price effects on the demand for alcohol.
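For readers unfamiliar with the 2PM, the following is a minimal sketch of a generic two-part model (a logit extensive margin plus a gamma intensive margin with log link), not the author's FP2PM framework; the data-generating parameters are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
price = rng.uniform(1, 10, n)                      # illustrative covariate
X = sm.add_constant(price)

# Simulate a two-part process: a logit extensive margin (any consumption)
# and a gamma intensive margin (amount, conditional on consuming).
p_any = 1 / (1 + np.exp(-(1.5 - 0.2 * price)))
any_use = rng.binomial(1, p_any)
mu_pos = np.exp(3.0 - 0.1 * price)
y = np.where(any_use == 1, rng.gamma(shape=2.0, scale=mu_pos / 2.0), 0.0)

# Part 1 (EM): binary model for P(y > 0).
em = sm.GLM(any_use, X, family=sm.families.Binomial()).fit()

# Part 2 (IM): gamma GLM with log link on the positive observations only.
pos = y > 0
im = sm.GLM(y[pos], X[pos],
            family=sm.families.Gamma(sm.families.links.Log())).fit()

# Unconditional mean combines the margins: E[y|x] = P(y>0|x) * E[y|y>0, x].
Ey = em.predict(X) * im.predict(X)
print(em.params, im.params, Ey[:5])
```

The split is exactly the point of the 2PM: the coefficients governing whether y is zero are free to differ from those governing how large y is when positive.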
|
5 |
Bayesian and classical inference for the generalized gamma distribution and related models / Análise clássica e Bayesiana para a distribuição gama generalizada e modelos relacionados Ramos, Pedro Luiz 22 February 2018 (has links)
The generalized gamma (GG) distribution is an important model that has proven very flexible in practice for modeling data from several areas. It has important sub-models, such as the Weibull, gamma, lognormal and Nakagami-m distributions, among others. In this work, our main objective is to develop different estimation procedures for the unknown parameters of the generalized gamma distribution and related models (Nakagami-m and gamma), considering both classical and Bayesian approaches. Under the Bayesian approach, we provide, in a simple way, necessary and sufficient conditions to check whether objective priors lead to proper posterior distributions for the Nakagami-m, gamma and GG distributions. As a result, one can easily check whether the obtained posterior is proper or improper directly from the behavior of the improper prior. These theorems are applied to different objective priors, such as Jeffreys's rule, the Jeffreys prior, the maximal data information prior and reference priors. Simulation studies were conducted to investigate the performance of the Bayes estimators. Moreover, maximum a posteriori (MAP) estimators with simple closed-form expressions are proposed for the Nakagami-m and gamma distributions. Numerical results demonstrate that the MAP estimators outperform the existing estimation procedures and produce almost unbiased estimates of the fading parameter even for small sample sizes. Finally, a new lifetime distribution, expressed as a two-component mixture of GG distributions, is presented.
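As a small illustration of classical estimation for the GG model (a sketch, not the thesis's Bayesian machinery), scipy's gengamma parameterization can be fitted by maximum likelihood and compared against its gamma sub-model; all parameter values below are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulate from a generalized gamma (scipy's gengamma, with shape
# parameters a and c) and recover the parameters by maximum likelihood.
a_true, c_true, scale_true = 2.0, 1.5, 3.0
data = stats.gengamma(a=a_true, c=c_true,
                      scale=scale_true).rvs(5_000, random_state=rng)

# Fix loc = 0 so the model matches the three-parameter GG used for
# lifetime data; fit() maximises the likelihood numerically.
a_hat, c_hat, loc_, scale_hat = stats.gengamma.fit(data, floc=0)
print(f"a={a_hat:.3f} (true {a_true}), c={c_hat:.3f} (true {c_true}), "
      f"scale={scale_hat:.3f} (true {scale_true})")

# Sub-models fall out as special cases: c = 1 is the gamma distribution,
# a = 1 is the Weibull, illustrating the flexibility the abstract notes.
ll_gg = np.sum(stats.gengamma.logpdf(data, a_hat, c_hat, 0, scale_hat))
g_hat = stats.gamma.fit(data, floc=0)
ll_g = np.sum(stats.gamma.logpdf(data, *g_hat))
print("log-lik GG:", ll_gg, " gamma sub-model:", ll_g)
```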
|
6 |
A Framework to Model Bond Liquidity / Ett ramverk för att modellera obligationslikviditet Issa, Alan January 2023 (has links)
The liquidity of financial assets can be studied in various ways. In this thesis, liquidity is defined as the cost and time required to liquidate a position. While the liquidity of heavily traded financial instruments such as stocks is typically determined by analyzing the order book, the lack of an order book for over-the-counter bond trading makes estimating bond liquidity challenging. The objective of this thesis is to develop a framework for estimating the cost and time required to liquidate a bond position. To achieve this, we propose a theoretical order book model, based on the order books of more actively traded instruments, and estimate the model parameters using bond transaction data. The volume available to trade in the theoretical order book is modelled as a gamma-distributed stochastic process. The distribution of the liquidation cost can then be derived, with the parameters estimated by maximum likelihood. The liquidation time, or liquidity horizon, is then determined by solving an optimization problem. The proposed framework produced promising results: the estimated parameters of the gamma-distributed stochastic process accurately captured the behavior of bond trading volumes, allowing a reliable estimate of the distribution of liquidation costs, and the optimization problem used to determine the liquidity horizon produced reasonable estimates.
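The following is a minimal sketch of the two estimation steps described above, with an intentionally toy liquidation-cost objective; the participation, penalty and risk parameters are assumptions for illustration, not the thesis's calibrated model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Illustrative daily tradable-volume data; the thesis fits the gamma
# parameters to bond transaction data by maximum likelihood.
volumes = rng.gamma(shape=1.8, scale=250_000.0, size=2_000)
k_hat, _, theta_hat = stats.gamma.fit(volumes, floc=0)

# Toy liquidity-horizon objective (assumed form): liquidating faster
# risks demanding more volume than the gamma book offers; liquidating
# slower accrues market risk with time.
position = 2_000_000.0
participation = 0.2      # assumed max share of a day's volume we may take
penalty_bps = 25.0       # assumed cost when the book cannot absorb a day's sale
risk_bps_per_day = 2.0   # assumed market-risk cost per day of exposure

def expected_cost(horizon_days: int) -> float:
    daily = position / horizon_days
    # P(required volume exceeds our participation share of the book)
    p_short = stats.gamma.sf(daily / participation, k_hat, scale=theta_hat)
    impact = p_short * penalty_bps * 1e-4 * position * horizon_days
    risk = risk_bps_per_day * 1e-4 * position * horizon_days
    return impact + risk

best = min(range(1, 21), key=expected_cost)
print("estimated gamma params:", k_hat, theta_hat)
print("cost-minimising liquidity horizon (days):", best)
```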
|
7 |
New regression methods for measures of central tendency Aristodemou, Katerina January 2014 (has links)
Measures of central tendency have been widely used for summarising statistical data, with the mean being the most popular summary statistic. However, in real-life applications it is not always the most representative measure of central location, especially when dealing with data which are skewed or contain outliers. Alternative statistics with less bias are the median and the mode. Median and quantile regression have been used in different fields to examine the effect of factors at different points of the distribution. Mode estimation, on the other hand, has found many applications in cases where the analysis focuses on obtaining information about the most typical value or pattern. This thesis demonstrates that the mode also plays an important role in the analysis of big data, which is becoming increasingly important in many sectors of the global economy. However, mode regression has not been widely applied, despite a clear conceptual benefit, because of the computational and theoretical limitations of the existing estimators. Similarly, despite the popularity of the binary quantile regression model, computationally straightforward estimation techniques do not exist. Driven by the demand for simple, well-founded and easy-to-implement inference tools, this thesis develops a series of new regression methods for mode and binary quantile regression. Chapter 2 deals with mode regression methods from the Bayesian perspective and presents one parametric and two non-parametric methods of inference. Chapter 3 demonstrates a mode-based, fast pattern-identification method for big data and proposes the first fully parametric mode regression method, which effectively uncovers the dependency of typical patterns on a number of covariates; the proposed approach is demonstrated through the analysis of a decade-long dataset on Body Mass Index and associated factors, taken from the Health Survey for England. Finally, Chapter 4 presents an alternative binary quantile regression approach, based on nonlinear least asymmetric weighted squares, which can be implemented using standard statistical packages and guarantees a unique solution.
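As a baseline for what mode regression generalizes, here is a minimal sketch of unconditional mode estimation via a kernel density estimate (not one of the thesis's proposed estimators); the gamma-distributed data are illustrative.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)

# Skewed data where mean, median and mode disagree (a BMI-like shape).
x = rng.gamma(shape=3.0, scale=5.0, size=2_000)

# Mode estimate = argmax of a kernel density estimate: a coarse grid
# search followed by a bounded one-dimensional refinement.
kde = stats.gaussian_kde(x)
grid = np.linspace(x.min(), x.max(), 1_000)
m0 = grid[np.argmax(kde(grid))]
res = optimize.minimize_scalar(lambda t: -kde(t)[0],
                               bounds=(m0 - 2.0, m0 + 2.0), method="bounded")

print(f"mean={x.mean():.2f}  median={np.median(x):.2f}  mode={res.x:.2f}")
# For Gamma(shape=3, scale=5) the population mode is (3-1)*5 = 10 while
# the mean is 15; a mode regression lets this 'most typical value'
# depend on covariates.
```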
|
8 |
廣義Gamma分配在競爭風險上的分析 / An analysis on generalized Gamma distribution's application on competing risk 陳嬿婷 Unknown Date (has links)
Survival analysis studies the time to occurrence of events; traditional survival analysis does not allow for the existence of cured (or immune) subjects. If the event of interest is failure, and failure can arise from more than one possible cause but the causes cannot occur simultaneously, then these causes are competing risks for the failure event. Competing risks can be analysed parametrically or non-parametrically. This thesis considers a mixture generalized gamma distribution that allows for both cure and parametric competing risks, links the location parameter to be estimated and the parameters governing the failure probabilities to covariates, and applies the large-sample properties of the maximum likelihood estimator given by Choi and Zhou (2002). Allowing for cure, computer simulation is used to estimate the means and standard errors of the parameters under Type-I censoring and non-informative random censoring, for the cases of one and two causes of failure.
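A minimal sketch of the data-generating setup described above (all parameter values assumed; latent cause-specific times are generalized gamma, with a cured fraction and Type-I censoring):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
n = 10_000

# Assumed setup: a cured fraction p_cure never fails; the rest face two
# competing causes with generalized gamma latent times; censoring at tau.
p_cure, tau = 0.3, 20.0
t1 = stats.gengamma(a=1.5, c=1.2, scale=8.0).rvs(n, random_state=rng)   # cause 1
t2 = stats.gengamma(a=2.0, c=0.9, scale=12.0).rvs(n, random_state=rng)  # cause 2

cured = rng.uniform(size=n) < p_cure
t = np.minimum(t1, t2)                 # the first cause to occur wins
cause = np.where(t1 <= t2, 1, 2)

# Cured subjects and late failures are censored at tau (cause code 0).
observed = np.where(cured | (t > tau), tau, t)
cause = np.where(cured | (t > tau), 0, cause)

print("mean observed time:", observed.mean())
for c in (0, 1, 2):
    print(f"cause {c}: {np.mean(cause == c):.3f} of sample")
# Data of this form are what the maximum likelihood machinery
# (Choi and Zhou, 2002) referenced in the abstract is applied to.
```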
|
9 |
APPLICATIONS OF THE BIVARIATE GAMMA DISTRIBUTION IN NUTRITIONAL EPIDEMIOLOGY AND MEDICAL PHYSICS Barker, Jolene 26 September 2008 (has links)
In this thesis the utility of a bivariate gamma distribution is explored. In the field of nutritional epidemiology, a nutrient density transformation is used to reduce collinearity; this phenomenon is shown to arise because the independent variables follow a bivariate gamma model. In the field of radiation oncology, paired comparisons of variances are often performed, and the bivariate gamma model is also appropriate for fitting correlated variances. A method for simulating bivariate gamma random variables is presented. This method is used to generate data from several bivariate gamma models, and the asymptotic properties of a test statistic suggested for the radiation oncology application are studied.
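One standard way to simulate bivariate gamma variables, shown here as a hedged sketch (the thesis's exact construction may differ), is trivariate reduction: add a shared gamma component to two independent ones, which induces a known positive correlation.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Cherian-style trivariate reduction: with independent G0 ~ Gamma(k0),
# G1 ~ Gamma(k1), G2 ~ Gamma(k2) on a common scale theta,
#   X = G0 + G1 ~ Gamma(k0 + k1),  Y = G0 + G2 ~ Gamma(k0 + k2),
# and corr(X, Y) = k0 / sqrt((k0 + k1) * (k0 + k2)).
k0, k1, k2, theta = 2.0, 1.0, 3.0, 1.5
g0 = rng.gamma(k0, theta, n)
x = g0 + rng.gamma(k1, theta, n)
y = g0 + rng.gamma(k2, theta, n)

print("sample corr:", np.corrcoef(x, y)[0, 1])
print("theory corr:", k0 / np.sqrt((k0 + k1) * (k0 + k2)))
```

Because Cov(X, Y) = Var(G0) = k0 * theta^2, the shared shape k0 directly controls the strength of the dependence while the margins stay gamma.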
|
10 |
最大利潤下規格上限與EWMA管制圖之設計 / Design of upper specification and EWMA control chart with maximal profit 蔡佳宏, Tsai, Chia Hung Unknown Date (has links)
The determination of economic control charts and the determination of specification limits with minimum cost are two different research topics. In this study, we first combine the design of economic control charts with the determination of specification limits to maximize the expected profit per unit time for a smaller-the-better quality variable following the gamma distribution. Because the distribution is asymmetric, we design the EWMA control chart with asymmetric control limits. We simultaneously determine the economic EWMA control chart and the upper specification limit that maximize the expected profit per unit time. The approach is then extended to determine the economic variable sampling interval (VSI) EWMA control chart and upper specification limit with maximum expected profit per unit time.
In all our numerical examples of the two profit models, the optimal expected profit per unit time under inspection is higher than that without inspection. The detection ability of the EWMA chart with an appropriate weight is always better than that of the X-bar probability chart, and the detection ability of the VSI EWMA chart is superior to that of the fixed-sampling-interval EWMA chart. Sensitivity analyses are provided to identify the parameters that significantly affect the optimal design parameters and the optimal expected profit per unit time.
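To illustrate the asymmetric-limit EWMA idea on gamma data (a simulation sketch with assumed parameters; the thesis chooses the chart and specification parameters by profit optimization, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(9)

# Gamma quality variable (smaller-the-better), EWMA weight lam.
k, theta, lam, n = 4.0, 2.0, 0.2, 500
mu0 = k * theta                      # in-control process mean

def ewma_path(x, lam, z0):
    """EWMA statistic z_t = lam * x_t + (1 - lam) * z_{t-1}."""
    z = np.empty(x.size)
    prev = z0
    for i, xi in enumerate(x):
        prev = lam * xi + (1 - lam) * prev
        z[i] = prev
    return z

# Asymmetric probability limits from simulated in-control runs
# (fixed-probability stand-ins for the economically designed limits).
sims = np.array([ewma_path(rng.gamma(k, theta, n), lam, mu0)
                 for _ in range(400)])
lcl, ucl = np.quantile(sims[:, 50:], [0.00135, 0.99865])
print(f"mu0={mu0:.2f}  LCL={lcl:.2f}  UCL={ucl:.2f}  "
      f"(UCL-mu0={ucl - mu0:.2f} != mu0-LCL={mu0 - lcl:.2f})")

# A 30% upward mean shift (bad for smaller-the-better) should signal.
z = ewma_path(rng.gamma(1.3 * k, theta, n), lam, mu0)
hits = np.nonzero(z > ucl)[0]
print("first out-of-control signal at sample:",
      hits[0] if hits.size else "none")
```

The printed limits make the skewness visible: the upper limit sits farther from mu0 than the lower one, which is exactly why symmetric limits are a poor fit for gamma data.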
|