Chiang, An- Jen
30 July 2004
Appropriate utilization of the operating room (OR) requires balancing many conflicting resources. This cannot be done without an understanding of the role of the OR in the finances of the institution, the missions of the institution, and the actual data concerning utilization and costs. The economics of the OR environment have changed dramatically in the past 10 years. For example, technological advances have led to the introduction and advancement of minimally invasive surgical procedures, which are purported to decrease morbidity, reduce hospital length of stay, and improve outcomes. However, many of these procedures actually increase OR cost, time, and supplies. The increased costs of minimally invasive surgery would not have been a problem in the past, because the additional costs would have been easily absorbed by the large profit margin associated with surgical procedures. Under the implementation of the NHI, DRGs, capitated payment, and global budgets, it is not surprising that this area is earmarked by many hospitals as a place to reduce expenses. Therefore, all of us working in the OR must be cost efficient and maximize productivity for long-term success. Accurate estimation of operating times is a prerequisite for the efficient scheduling of the operating suite. In this study, the authors sought to compare surgeons' time estimates for elective cases and to ascertain whether improvements could be made by statistical modeling. The study was conducted in the GYN department at the VGHKS from January 2000 to June 2003. The authors fit a lognormal distribution to operation times, estimate its variance, compute the probability of finishing within the scheduled time and the associated costs, and compare operating time differences between surgeons.
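The lognormal finishing-probability calculation described in this abstract can be sketched in a few lines. The sample of operation times, the 120-minute booking length, and every number below are hypothetical illustrations, not data from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical operation times (minutes) for one surgeon/procedure pair.
times = rng.lognormal(mean=np.log(90), sigma=0.35, size=60)

# Fit a lognormal by taking logs: estimate (mu, sigma) on the log scale.
logs = np.log(times)
mu, sigma = logs.mean(), logs.std(ddof=1)

# Probability that the case finishes within a 120-minute booking.
p_finish = stats.norm.cdf((np.log(120) - mu) / sigma)
print(f"estimated median time: {np.exp(mu):.1f} min")
print(f"P(finish within 120 min): {p_finish:.2f}")
```

Comparing such model-based finishing probabilities across surgeons is one way to quantify the differences in time estimates that the study examines.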
Thesis (Ph.D.)--York University, 2006. Graduate Programme in Economics. / Typescript. Includes bibliographical references (leaves 137-141). Also available on the Internet. MODE OF ACCESS via web browser by entering the following URL: http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&res_dat=xri:pqdiss&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&rft_dat=xri:pqdiss:NR19821
Pratt, Allan D.
Thesis (Ph. D.)--University of Pittsburgh, 1974. / Includes bibliographical references.
Aristizabal, Rodrigo J.
30 March 2012
The three-parameter lognormal distribution is widely used in many areas of science. Some modifications have been proposed to improve the maximum likelihood estimator. In some cases, however, the modified maximum likelihood estimates do not exist or the procedure yields multiple estimates. This research focuses on estimating the threshold (location) parameter, because once it is known, the other two parameter estimates are obtained from the first two MLE equations. A method for constructing confidence intervals, confidence limits, and a point estimator for the threshold parameter is proposed. Monte-Carlo simulation, the bisection method, and SAS/IML were used to accomplish this objective. The bias of the point estimator and the mean square error (MSE) criterion were used throughout extensive simulations to evaluate the performance of the proposed method. The results show that the proposed method can provide quite accurate estimates.
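A minimal sketch of the threshold-estimation idea, assuming simulated data and a profile-likelihood grid search in place of the author's exact Monte-Carlo/bisection/SAS-IML procedure; gamma_true and all other values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
gamma_true = 5.0  # hypothetical threshold (location) parameter
x = gamma_true + rng.lognormal(mean=1.0, sigma=0.5, size=200)

def profile_loglik(gamma, x):
    """Log-likelihood with (mu, sigma) profiled out, up to constants.

    For a fixed threshold gamma, the MLEs of mu and sigma come from the
    first two MLE equations applied to log(x - gamma).
    """
    z = np.log(x - gamma)
    return -len(x) * np.log(z.std()) - z.sum()

# Search on a grid strictly below min(x); the likelihood is unbounded
# as gamma -> min(x), which is why the global MLE can fail to exist and
# local/modified estimators are studied instead.
grid = np.linspace(x.min() - 5.0, x.min() - 1e-3, 2000)
gamma_hat = grid[np.argmax([profile_loglik(g, x) for g in grid])]
print(f"estimated threshold: {gamma_hat:.2f}")
```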
Weiyan Ge. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2003. / Includes bibliographical references (leaves 64-68). / Abstracts in English and Chinese. / Abstract --- p.ii / Acknowledgement --- p.iv / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Introduction of Power Control Problem --- p.2 / Chapter 1.1.1 --- Classification of Power Control Problem --- p.2 / Chapter 1.1.2 --- Previous Works --- p.7 / Chapter 1.2 --- Scope and Contribution of the Thesis --- p.11 / Chapter 1.3 --- Organization of the Thesis --- p.12 / Chapter 2 --- Background --- p.14 / Chapter 2.1 --- Stochastic Approximation --- p.14 / Chapter 2.2 --- Lognormal Distribution --- p.17 / Chapter 2.2.1 --- Definition and Properties --- p.17 / Chapter 2.2.2 --- Application on Radio Propagation --- p.18 / Chapter 3 --- System Model and Centralized Algorithm --- p.21 / Chapter 3.1 --- System Model --- p.21 / Chapter 3.2 --- Problem Statement and the Centralized Algorithm --- p.25 / Chapter 4 --- Proposed Stochastic Power Control Algorithm --- p.30 / Chapter 4.1 --- Proposed Power Control Algorithm --- p.30 / Chapter 4.2 --- Basic Properties of the Algorithm --- p.33 / Chapter 4.3 --- Convergence Property --- p.38 / Chapter 5 --- Numerical Results --- p.44 / Chapter 5.1 --- Simulation Model --- p.44 / Chapter 5.2 --- Numerical Results --- p.47 / Chapter 6 --- Conclusions And Future Works --- p.58 / Chapter 6.1 --- Conclusions --- p.58 / Chapter 6.2 --- Future Works --- p.60 / Chapter A --- Basic Properties of LOG-Distribution --- p.62 / Bibliography --- p.64
Rutikanga, Justin Ushize
Magister Scientiae - MSc / There exists an extensive statistics literature on non-parametric deconvolution: the estimation of the underlying population probability density when sample values are subject to measurement errors. In parametric deconvolution, on the other hand, the data are known to come from a specific distribution; in this case the parameters of the distribution can be estimated by, e.g., maximum likelihood. In realistic cases the measurement errors may be heteroscedastic and there may be unknown parameters associated with the distribution. The specific realistic case is investigated in which the measurement error standard deviation is proportional to the true sample value. In this case it is shown that method of moments estimation is particularly simple. Estimation by maximum likelihood is computationally very expensive, since numerical integration must be performed for each data point at each evaluation of the likelihood function. Method of moments estimation sometimes fails to give physically meaningful estimates; the origin of this problem lies in the large sampling variation of the third moment, and possible remedies are considered. Because a convolution integral must be calculated for each data point, and this has to be repeated at each iteration towards the solution, the computing cost of maximum likelihood is very high. Preliminary work suggests that saddlepoint approximations could sometimes be used for the convolution integrals, allowing much larger datasets to be handled. Application of the theory is illustrated with simulated and real data.
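The "particularly simple" method-of-moments route can be sketched for one concrete parametric choice: a lognormal true distribution with normal errors whose standard deviation is proportional to the true value. The model and every parameter value here are illustrative assumptions, and solving the moment equation happens to use a bisection-style root finder:

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(2)
mu, sigma, c = 1.0, 0.4, 0.5      # hypothetical true values
x = rng.lognormal(mu, sigma, 20000)
y = x + rng.normal(0.0, c * x)    # error sd proportional to the true value

# First three raw moments of the observed data.
M1, M2, M3 = y.mean(), (y**2).mean(), (y**3).mean()
a, b = M2 / M1**2, M3 / M1**3

# With t = c^2: b/a^3 = (1+3t)/(1+t)^3, strictly decreasing in t.
# If the sample ratio is >= 1 no root exists -- the "physically
# meaningless estimate" failure mode driven by the noisy third moment.
ratio = b / a**3
t_hat = brentq(lambda t: (1 + 3*t) / (1 + t)**3 - ratio, 0.0, 10.0)
sigma2_hat = np.log(a / (1 + t_hat))
mu_hat = np.log(M1) - sigma2_hat / 2
print(mu_hat, np.sqrt(sigma2_hat), np.sqrt(t_hat))
```

Only sample moments are needed here, versus one numerical convolution integral per data point per likelihood evaluation for maximum likelihood.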
Ginos, Brenda Faith
13 November 2009
The lognormal distribution is useful in modeling continuous random variables which are greater than zero. Example scenarios in which the lognormal distribution is used include, among many others: in medicine, latent periods of infectious diseases; in environmental science, the distribution of particles, chemicals, and organisms in the environment; in linguistics, the number of letters per word and the number of words per sentence; and in economics, age of marriage, farm size, and income. The lognormal distribution is also useful in modeling data which would be considered normally distributed except for the fact that it may be more or less skewed (Limpert, Stahel, and Abbt 2001). Appropriately estimating the parameters of the lognormal distribution is vital for the study of these and other subjects. Depending on the values of its parameters, the lognormal distribution takes on various shapes, including a bell curve similar to the normal distribution. This paper contains a simulation study concerning the effectiveness of various estimators for the parameters of the lognormal distribution. A comparison is made between such parameter estimators as Maximum Likelihood estimators, Method of Moments estimators, estimators by Serfling (2002), and estimators by Finney (1941). A simulation is conducted to determine which parameter estimators work better in various parameter combinations and sample sizes of the lognormal distribution. We find that the Maximum Likelihood and Finney estimators perform the best overall, with a preference given to Maximum Likelihood over the Finney estimators because of its simplicity. The Method of Moments estimators seem to perform best when σ is less than or equal to one, and the Serfling estimators are quite accurate in estimating μ but not σ in all regions studied.
Finally, these parameter estimators are applied to a data set counting the number of words in each sentence for various documents, following which a review of each estimator's performance is conducted. Again, we find that the Maximum Likelihood estimators perform best for the given application, but that Serfling's estimators are preferred when outliers are present.
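Two of the estimators compared in this study can be sketched directly; the parameter values and simulated sample below are illustrative, not the paper's simulation design:

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma = 2.0, 0.8              # hypothetical true parameters
x = rng.lognormal(mu, sigma, size=5000)

# Maximum likelihood: simply the sample moments of log(x).
z = np.log(x)
mu_mle, sigma_mle = z.mean(), z.std()

# Method of moments: match E[X] = exp(mu + sigma^2/2) and
# E[X^2] = exp(2*mu + 2*sigma^2) to the sample raw moments.
m1, m2 = x.mean(), (x**2).mean()
sigma2_mom = np.log(m2 / m1**2)
mu_mom = np.log(m1) - sigma2_mom / 2

print(f"MLE: mu={mu_mle:.3f}, sigma={sigma_mle:.3f}")
print(f"MoM: mu={mu_mom:.3f}, sigma={np.sqrt(sigma2_mom):.3f}")
```

The closed log-scale form of the Maximum Likelihood estimators is the simplicity that favors them in the study's comparison.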
Využití kvantilových funkcí při konstrukci pravděpodobnostních modelů mzdových rozdělení / An Application of Quantile Functions in Probability Model Constructions of Wage Distributions
Pavelka, Roman
January 2004
From 1995 to 2008, the Average Earnings Information System, under the professional supervision of the Czech Ministry of Labour and Social Affairs, collected wage and personal data on individual employees. Because this statistical survey gathers wage and personal data for specific employed persons, a wage distribution can be obtained, i.e. how wages are spread among individual employees. The values a wage can take over the whole wage interval are not deterministic but result from the interaction of many random influences; because of this randomness, the wage must be treated as a random quantity with a probability density function. The spread of wages across all labor market segments is described by a wage distribution. Even though the high-income employee category is evidently small, its incomes markedly affect the statistically reported average wage level and, in particular, the variability of the whole wage file. Wage data sets are thus characterized by an average wage that exceeds the wages of the majority of employees and by high variability due to great wage heterogeneity. Under such heterogeneity, the usual approach of fitting a single chosen distribution function or probability density function fails. This leads to the idea of a quantile approach to statistical modeling, i.e. modeling the earnings distribution with an appropriate inverse distribution function. Probability modeling with generalized or compound forms of quantile functions better characterizes a wage distribution marked by high asymmetry and wage heterogeneity. The inverse distribution function used as a probability model of a wage distribution can be expressed as a distributional mixture over partial employee groups. Each component distribution of this mixture model corresponds to an employee group with greater earnings homogeneity; the partial employee subfiles differ in the parameters of their component densities and in the shares of those densities in the total wage distribution.
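The mixture idea can be sketched with two hypothetical lognormal wage components; the group shares, scales, and wage levels below are invented for illustration. The mixture's quantile function (inverse distribution function) is obtained by numerically inverting the mixture CDF:

```python
import numpy as np
from scipy import stats, optimize

# Two hypothetical employee groups with lognormal wages.
w1, w2 = 0.8, 0.2                               # group shares
majority = stats.lognorm(s=0.35, scale=25000)   # homogeneous majority
high_inc = stats.lognorm(s=0.60, scale=70000)   # small high-income group

def mix_cdf(x):
    # The mixture CDF is the share-weighted sum of component CDFs.
    return w1 * majority.cdf(x) + w2 * high_inc.cdf(x)

def mix_quantile(p):
    # Quantile function: invert the mixture CDF numerically.
    return optimize.brentq(lambda x: mix_cdf(x) - p, 1.0, 1e7)

median, p90 = mix_quantile(0.5), mix_quantile(0.9)
print(f"median wage: {median:,.0f}, 90th percentile: {p90:,.0f}")
```

The small high-income component pulls the average above the median and stretches the upper tail, reproducing the heterogeneity pattern described above.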
Double hitting time distribution of mean-reverting lognormal process and its application in finance / 均值回復正態過程的雙撞擊時間分佈以及其在金融上的應用. January 2009
Chung, Tsz Kin 鍾子健. / Thesis submitted in: December 2008. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2009. / Includes bibliographical references (leaves 101-105). / Abstracts in English and Chinese. / Chapter 1 --- Introduction --- p.1 / Chapter 1.1 --- Overview --- p.1 / Chapter 1.2 --- Mean-reverting lognormal (MRL) process --- p.3 / Chapter 1.3 --- MRL-process and AR(1)-process --- p.5 / Chapter 2 --- Double Hitting Time Distribution of a Mean-Reverting Lognormal Process --- p.8 / Chapter 2.1 --- Introduction --- p.8 / Chapter 2.2 --- Probability density function --- p.9 / Chapter 2.3 --- Interpolation scheme - estimates and bounds --- p.12 / Chapter 2.4 --- Multi-stage approximation scheme --- p.17 / Chapter 2.5 --- Hitting time distribution and density --- p.19 / Chapter 2.6 --- Numerical analysis --- p.20 / Chapter 2.7 --- Appendix --- p.24 / Chapter 2.7.1 --- Solving the Fokker-Planck equation --- p.24 / Chapter 2.7.2 --- Probability density function associated with two piecewise-continuous boundaries --- p.27 / Chapter 3 --- Pricing Exotic Options with Mean Reversion --- p.29 / Chapter 3.1 --- Introduction --- p.29 / Chapter 3.2 --- Barrier options --- p.30 / Chapter 3.2.1 --- Double barrier options --- p.32 / Chapter 3.2.2 --- Rebates --- p.33 / Chapter 3.2.3 --- Numerical examples --- p.34 / Chapter 3.3 --- Lookback options --- p.36 / Chapter 3.3.1 --- Expected minimum and maximum --- p.37 / Chapter 3.3.2 --- Standard lookback options --- p.41 / Chapter 3.3.3 --- Fixed strike lookback options --- p.42 / Chapter 3.3.4 --- Lookback spread option --- p.43 / Chapter 3.3.5 --- Numerical examples --- p.43 / Chapter 3.4 --- Sensitivity analysis --- p.46 / Chapter 3.4.1 --- Analysis - double knock-out call option --- p.47 / Chapter 3.4.2 --- Analysis - floating strike lookback put option --- p.52 / Chapter 3.4.3 --- Analysis - lookback spread option --- p.56 / Chapter 3.4.4 --- Summary --- p.60 / Chapter 3.5 --- Appendix --- p.61 / Chapter 3.5.1 --- Closed-form price formula of the double knock-out call option --- p.61 / Chapter 3.5.2 --- Derivations of lookback options --- p.63 / Chapter 4 --- Using First-Passage-Time Density to Assess Realignment Risk of a Target Zone --- p.66 / Chapter 4.1 --- Realignment risk of a target zone --- p.66 / Chapter 4.1.1 --- Currency option market and target zone --- p.66 / Chapter 4.1.2 --- First-Passage-Time approach --- p.67 / Chapter 4.1.3 --- Option price and implied volatility --- p.69 / Chapter 4.1.4 --- FPT density and realignment risk --- p.73 / Chapter 4.2 --- The ERM crisis of 1992 --- p.74 / Chapter 4.2.1 --- British pound (GBP) target zone --- p.74 / Chapter 4.2.2 --- Italian lira (ITL) target zone --- p.81 / Chapter 4.2.3 --- Summary --- p.85 / Chapter 5 --- Market Expectation of Appreciation of the Renminbi --- p.87 / Chapter 5.1 --- The Chinese Renminbi exchange rate system --- p.87 / Chapter 5.2 --- First-Passage-Time approach --- p.90 / Chapter 5.3 --- Estimations of expected maximum appreciation of Renminbi --- p.92 / Chapter 5.4 --- Appendix --- p.99 / Chapter 5.4.1 --- Derivations of the expected minimum and maximum --- p.99 / Bibliography --- p.101
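A quick Monte-Carlo sketch of the double-hitting probability for a mean-reverting lognormal process, taking the exponential Ornstein-Uhlenbeck form d ln S = kappa*(theta - ln S)dt + sigma dW commonly used for such models; the barriers, rates, and all numbers are illustrative, and grid-based crossing detection slightly underestimates the continuous-time hitting probability:

```python
import numpy as np

rng = np.random.default_rng(4)
kappa, theta, sigma = 2.0, np.log(100.0), 0.2  # hypothetical MRL parameters
L, U, T = 90.0, 110.0, 1.0                     # double barriers and horizon
n_paths, n_steps = 20000, 500
dt = T / n_steps

# Euler scheme on x = ln S; a path "dies" when it first touches a barrier.
x = np.full(n_paths, np.log(100.0))
alive = np.ones(n_paths, dtype=bool)
for _ in range(n_steps):
    x[alive] += kappa * (theta - x[alive]) * dt \
        + sigma * np.sqrt(dt) * rng.standard_normal(alive.sum())
    alive &= (x > np.log(L)) & (x < np.log(U))

p_hit = 1.0 - alive.mean()
print(f"P(hit either barrier before T): {p_hit:.3f}")
```

Crossing probabilities of this kind are the building blocks for the double-barrier and lookback option prices treated in Chapter 3.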
Szyszkowicz, Sebastian S.
Thesis (M.App.Sc.) - Carleton University, 2007. / Includes bibliographical references (p. 117-125). Also available in electronic format on the Internet.