About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Comparison of Different Methods for Estimating Log-normal Means

Tang, Qi 01 May 2014 (has links)
The log-normal distribution is a popular model in many areas, especially in biostatistics and survival analysis, where the data tend to be right skewed. In our research, a total of ten different estimators of log-normal means are compared theoretically. Simulations are done using different values of the parameters and sample sizes. As a result of the comparison, the "degree-of-freedom adjusted" maximum likelihood estimator and the Bayesian estimator under quadratic loss perform best when the mean square error (MSE) is used as the criterion. The ten estimators are applied to a real dataset, an environmental study from the Naval Construction Battalion Center (NCBC) Superfund site in Rhode Island.
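A minimal simulation sketch of such an MSE comparison, using only two of the ten estimators (the plug-in MLE and the naive sample mean) with illustrative parameter values, not the thesis's settings:

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, n, reps = 1.0, 0.5, 30, 2000      # illustrative parameter choices
true_mean = np.exp(mu + sigma ** 2 / 2)      # mean of a log-normal(mu, sigma)

def mle(x):
    """Plug-in (unadjusted) maximum likelihood estimator of the mean."""
    logs = np.log(x)
    return np.exp(logs.mean() + logs.var() / 2)   # ddof=0: the ML variance

est_mle, est_sm = [], []
for _ in range(reps):
    x = rng.lognormal(mu, sigma, n)
    est_mle.append(mle(x))
    est_sm.append(x.mean())                  # naive sample mean

mse = {"mle": np.mean((np.array(est_mle) - true_mean) ** 2),
       "sample_mean": np.mean((np.array(est_sm) - true_mean) ** 2)}
print(mse)
```

The same loop extends naturally to the adjusted-MLE and Bayesian estimators the thesis compares.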
2

An Analysis on the Coverage Distance of LDPC-Coded Free-Space Optical Links

Luna, Ricardo, Tapse, Hrishikesh 10 1900 (has links)
ITC/USA 2008 Conference Proceedings / The Forty-Fourth Annual International Telemetering Conference and Technical Exhibition / October 27-30, 2008 / Town and Country Resort & Convention Center, San Diego, California / We design irregular Low-Density Parity-Check (LDPC) codes for free-space optical (FSO) channels for different transmitter-receiver link distances and analyze the error performance for different atmospheric conditions. The design considers atmospheric absorption, laser beam divergence, and random intensity fluctuations due to atmospheric turbulence. It is found that, for the same transmit power, a system using the designed codes works over much longer link distances than a system that employs regular LDPC codes. Our analysis is particularly useful for portable optical transceivers and mobile links.
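The coverage-distance question turns on an optical link budget; as a hedged, first-order sketch (Beer-Lambert attenuation plus geometric divergence loss, with illustrative numbers only, not the paper's turbulence model):

```python
import math

def received_power_dbm(p_tx_dbm, link_km, atten_db_per_km,
                       divergence_mrad, rx_aperture_m, tx_aperture_m=0.05):
    """First-order FSO link budget: geometric (beam-divergence) loss plus
    atmospheric attenuation. All parameter values are illustrative."""
    beam_m = tx_aperture_m + divergence_mrad * 1e-3 * link_km * 1e3  # beam width at receiver
    geometric_loss_db = 20 * math.log10(beam_m / rx_aperture_m)      # spreading loss
    atmospheric_loss_db = atten_db_per_km * link_km                  # Beer-Lambert term
    return p_tx_dbm - geometric_loss_db - atmospheric_loss_db

# Clear air (~0.43 dB/km) vs moderate fog (~20 dB/km) at 2 km
clear = received_power_dbm(20, 2.0, 0.43, 2.0, 0.1)
fog = received_power_dbm(20, 2.0, 20.0, 2.0, 0.1)
print(clear, fog)   # fog costs ~39 dB extra at this distance
```

The LDPC code's decoding threshold then determines the lowest received power, and hence the longest link, at which the target error rate is still met.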
3

Spatial regression-based model specifications for exogenous and endogenous spatial interaction

LeSage, James P., Fischer, Manfred M. 03 September 2014 (has links) (PDF)
Spatial interaction models represent a class of models used for modeling origin-destination flow data. The interest in such models is motivated by the need to understand and explain the flows of tangible entities, such as persons or commodities, or intangible ones, such as capital, information or knowledge, between regions. The focus here is on the log-normal version of the model. In this context, we consider spatial econometric specifications that can be used to accommodate two types of dependence scenarios, one involving endogenous interaction and the other exogenous interaction. These model specifications replace the conventional assumption of independence between origin-destination flows with formal approaches that allow for two different types of spatial dependence in flow magnitudes. (authors' abstract) / Series: Working Papers in Regional Science
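The log-normal (gravity) baseline that these spatial specifications extend can be sketched as an ordinary least-squares fit on logged flows; the data, coefficients and variable names below are synthetic assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50                                    # regions -> n*n origin-destination flows
size = rng.lognormal(3, 1, n)             # synthetic regional sizes
coords = rng.uniform(0, 10, (n, 2))
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=2) + 0.1

o = np.repeat(np.log(size), n)            # origin attribute, repeated over destinations
dvar = np.tile(np.log(size), n)           # destination attribute
g = np.log(dist.ravel())                  # log distance

# Log flows: illustrative "true" coefficients plus noise
y = 1.0 + 0.8 * o + 0.7 * dvar - 1.5 * g + rng.normal(0, 0.3, n * n)

X = np.column_stack([np.ones(n * n), o, dvar, g])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)   # recovers roughly [1.0, 0.8, 0.7, -1.5]
```

The paper's contribution is to replace the independence assumption behind this OLS fit with spatial-lag terms in the flows; that extension is not shown here.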
4

Statistical distributions for service times

Adedigba, Adebolanle Iyabo 20 September 2005
Queueing models have been used extensively in the design of call centres. In particular, a queueing model will be used to describe a help desk, which is a form of call centre. The design of the queueing model involves modelling the arrival and service processes of the system.

Conventionally, the arrival process is assumed to be Poisson and service times are assumed to be exponentially distributed, but in practice this is seldom the case. Past research reveals that the log-normal distribution can be used to model the service times in call centres. Also, services may involve stages/tasks before completion. This motivates the use of a phase-type distribution to model the underlying stages of service.

This research work focuses on developing statistical models for the overall service times and the service times by job type in a particular help desk. The assumption of exponential service times was investigated and a log-normal distribution was fitted to the service times of this help desk. Each stage of the service in this help desk was modelled as a phase in the phase-type distribution.

Results from the analysis carried out in this work confirmed that the assumption of exponential service times was inappropriate for this help desk, and it was apparent that log-normal distributions provided a reasonable fit to the service times. A phase-type distribution with three phases fitted the overall service times and the service times of administrative and miscellaneous jobs very well. For the service times of e-mail and network jobs, a phase-type distribution with two phases served as a good model.

Finally, log-normal models of service times in this help desk were approximated using an order-three phase-type distribution.
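The exponential-vs-log-normal comparison above can be sketched as two maximum-likelihood fits compared by log-likelihood; the data here are synthetic, not the help desk's records:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
times = rng.lognormal(mean=1.5, sigma=0.8, size=500)   # synthetic service times

# Maximum-likelihood fits of the two candidate models
shape, _, scale = stats.lognorm.fit(times, floc=0)     # log-normal, location fixed at 0
exp_scale = times.mean()                               # MLE of the exponential scale

ll_lognorm = stats.lognorm.logpdf(times, shape, 0, scale).sum()
ll_expon = stats.expon.logpdf(times, 0, exp_scale).sum()
print(ll_lognorm - ll_expon)   # positive: the log-normal fits these data better
```

A formal version of this comparison would use an information criterion such as AIC, since the two models are not nested.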
6

Non-inferiority hypothesis testing in two-arm trials with log-normal data

Wickramasinghe, Lahiru 07 April 2015 (has links)
In health-related studies, non-inferiority tests are used to demonstrate that a new treatment is not worse than a currently existing treatment by more than a pre-specified margin. In this thesis, we discuss three approaches for testing non-inferiority hypotheses in two-arm trials for the ratio of log-normal means: a Z-score approach, a generalized p-value approach and a Bayesian approach. The log-normal distribution is widely used to describe positive, right-skewed random variables, which makes it appealing for data arising from studies with small sample sizes. We demonstrate the approaches using data arising from an experimental aging study on cognitive penetrability of posture control. We also examine the suitability of the three methods under various sample sizes via simulations. The results from the simulation studies indicate that the generalized p-value and Bayesian approaches agree approximately, and that the degree of agreement increases with sample size. However, the Z-score approach can produce unsatisfactory results even for large sample sizes.
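The Z-score approach can be sketched on the log scale, where the log of a log-normal mean is mu + sigma^2/2; the delta-method variance below is a common textbook form and an assumption here, not necessarily the thesis's exact formula:

```python
import numpy as np
from scipy import stats

def noninferiority_z(new, ref, margin):
    """Z-score test of H0: mean(new)/mean(ref) <= margin, for log-normal arms.
    Works on the log scale, with a delta-method variance for the estimated
    log-mean difference."""
    ln, lr = np.log(np.asarray(new)), np.log(np.asarray(ref))
    n1, n2 = len(ln), len(lr)
    v1, v2 = ln.var(ddof=1), lr.var(ddof=1)
    est = (ln.mean() + v1 / 2) - (lr.mean() + v2 / 2)   # log of the mean ratio
    var = v1 / n1 + v1 ** 2 / (2 * (n1 - 1)) + v2 / n2 + v2 ** 2 / (2 * (n2 - 1))
    z = (est - np.log(margin)) / np.sqrt(var)
    return z, stats.norm.sf(z)   # one-sided p-value

# Equal-mean arms with a margin of 0.7: non-inferiority should be declared
rng = np.random.default_rng(11)
a = rng.lognormal(2.0, 0.5, 200)
b = rng.lognormal(2.0, 0.5, 200)
z, p = noninferiority_z(a, b, margin=0.7)
print(z, p)
```

Rejecting H0 (small p) supports the claim that the new treatment's mean is within the margin of the reference.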
7

Numerical techniques for optimal investment consumption models

Mvondo, Bernardin Gael January 2014 (has links)
Magister Scientiae - MSc / The problem of optimal investment has been extensively studied by numerous researchers seeking to generalize the original framework. Those generalizations have been made in different directions and using different techniques. For example, Perera [Optimal consumption, investment and insurance with insurable risk for an investor in a Levy market, Insurance: Mathematics and Economics, 46 (3) (2010) 479-484] applied the martingale approach to obtain a closed-form solution for the optimal investment, consumption and insurance strategies of an individual in the presence of an insurable risk, when the insurable risk and risky asset returns are described by Levy processes and the utility is of constant absolute risk aversion. In another work, Sattinger [The Markov consumption problem, Journal of Mathematical Economics, 47 (4-5) (2011) 409-416] gave a model of consumption behavior under uncertainty as the solution to a continuous-time dynamic control problem in which an individual moves between employment and unemployment according to a Markov process. In this thesis, we review the consumption models in the above framework and simulate some of them using an infinite series expansion method, a key focus of this research. Several numerical results obtained using MATLAB are presented with detailed explanations.
8

Técnicas não-paramétricas e paramétricas usadas na análise de sobrevivência de Chrysoperla externa (Neuroptera: Chrysopidae) / Non-Parametric and Parametric Techniques used in the survival analysis of Chrysoperla externa (Neuroptera: Chrysopidae)

Miranda, Marconi Silva 13 March 2012 (has links)
In survival analysis, the response variable is the time to occurrence of an event of interest, called the failure time. Another characteristic of survival analysis is that it incorporates incomplete sample data, for which, for some reason, the occurrence of the event was not observed; such data are said to be censored. The objective of this work was to compare the use of parametric and non-parametric techniques to estimate the survival time of C. externa (Neuroptera: Chrysopidae), a predatory insect that feeds on other insects and mites, under the effect of three commercial neem-based products: Neempro (10 g of azadirachtin L-1), Organic neem (3.3 g of azadirachtin L-1) and Natuneem (1.5 g of azadirachtin L-1). To this end, survival functions were estimated for the different concentrations of each product using the non-parametric Kaplan-Meier method and compared by the logrank test, as well as by parametric techniques using the exponential, Weibull and log-normal models. A further study was carried out to select the most parsimonious model, using the likelihood ratio test (LRT) and the Akaike information criterion (AIC). The estimates of the selected parametric model were used to determine the survival functions at the concentrations of the three products, for comparison with the non-parametric Kaplan-Meier estimator. Once the best model was defined, the median survival time of C. externa was calculated at the tested concentrations of the products. Under the conditions described in this experiment, one can conclude that the concentrations of the neem-based products influence the survival of C. externa. The higher the concentration of the products used, the shorter the survival time; among the evaluated products, Neempro proved the least lethal to the natural predator.
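The Kaplan-Meier product-limit estimator used in this comparison can be sketched in a few lines of NumPy (toy data, not the C. externa bioassay):

```python
import numpy as np

def kaplan_meier(time, event):
    """Kaplan-Meier product-limit survival estimate.
    event = 1 for an observed failure, 0 for a censored observation."""
    t = np.asarray(time, dtype=float)
    e = np.asarray(event, dtype=int)
    death_times = np.unique(t[e == 1])                           # distinct failure times
    at_risk = np.array([(t >= u).sum() for u in death_times])    # still under observation
    deaths = np.array([((t == u) & (e == 1)).sum() for u in death_times])
    surv = np.cumprod(1.0 - deaths / at_risk)                    # product over failure times
    return death_times, surv

# Tiny worked example: censored subjects at t=2 and t=4
times, surv = kaplan_meier([1, 2, 2, 3, 4], [1, 1, 0, 1, 0])
print(times, surv)
```

For this toy sample the estimate steps down to 4/5, then 3/5, then 3/10, since the censored observations leave the risk set without counting as failures.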
9

Development of a portable aerosol collector and spectrometer (PACS)

Cai, Changjie 01 May 2018 (has links)
The overall goal of this doctoral dissertation is to develop a prototype instrument, a Portable Aerosol Collector and Spectrometer (PACS), that can continuously measure aerosol size distributions by number, surface area and mass concentrations over a wide size range (from 10 nm to 10 µm) while also collecting particles with impactor and diffusion stages for post-sampling chemical analyses. To achieve the goal, in the first study, we designed, built and tested the PACS hardware. The PACS consists of a six-stage particle size selector, a valve system, a water condensation particle counter to measure number concentrations and a photometer to measure mass concentrations. The valve system diverts airflow to pass sequentially through upstream stages of the selector to the detectors. The stages of the selector include three impactor and two diffusion stages, which resolve particles by size and collect particles for chemical analysis. Particle penetration by size was measured through each stage to determine actual performance and account for particle losses. The measured d50 of each stage (aerodynamic diameter for impactor stages and geometric diameter for diffusion stages) was similar to the design. The pressure drop of each stage was sufficiently low to permit its operation with portable air pumps. In the second study, we developed a multi-modal log-normal (MMLN) fitting algorithm to leverage the multi-metric, low-resolution data from one sequence of PACS measurements to estimate aerosol size distributions of number, surface area, and mass concentration in near-real-time. The algorithm uses a grid-search process and a constrained linear least-square (CLLS) solver to find a tri-mode (ultrafine, fine, and coarse), log-normal distribution that best fits the input data. We refined the algorithm to obtain accurate and precise size distributions for four aerosols typical of diverse environments: clean background, urban and freeway, coal power plant, and marine surface. 
Sensitivity studies were conducted to explore the influence of unknown particle density and shape factor on algorithm output. An adaptive process that refined the ranges and step sizes of the grid search reduced the computation time to fit a single size distribution in near-real-time. Assuming standard-density spheres, the aerosol size distributions fit well, with a normalized mean bias (NMB) of -4.9% to 3.5%, a normalized mean error (NME) of 3.3% to 27.6%, and R2 values of 0.90 to 1.00. The fitted number and mass concentration biases were within ± 10% regardless of uncertainties in density and shape. With this algorithm, the PACS is able to estimate aerosol size distributions by number, surface area, and mass concentration from 10 nm to 10 µm in near-real-time. In the third study, we developed a new algorithm, the mass distribution by composition and size (MDCS) algorithm, to estimate the mass size distribution of various particle compositions. We then compared the PACS for measuring multi-mode aerosols to three reference instruments: a scanning mobility particle sizer (SMPS), an aerodynamic particle sizer (APS) and a nano micro-orifice uniform deposit impactor (nanoMOUDI). We used inductively coupled plasma mass spectrometry to measure the mass of particles collected on the PACS and nanoMOUDI stages by element. For the three-mode aerosol, the aerosol size distributions in three metrics measured with the PACS agreed well with those measured with the SMPS/APS: number concentration, bias = 9.4% and R2 = 0.96; surface area, bias = 17.8%, R2 = 0.77; mass, bias = -2.2%, R2 = 0.94. Agreement was considerably poorer for the two-mode aerosol, especially for surface area and mass concentrations. Compared to the nanoMOUDI, for the three-mode aerosol, the PACS estimated the mass median diameter (MMD) of the coarse mode well, but overestimated the MMDs of the ultrafine and fine modes.
The PACS overestimated the mass concentrations of ultrafine and fine mode, but underestimated the coarse mode. This work provides insight into a novel way to simultaneously assess airborne aerosol size, composition, and concentration by number, surface area and mass using cost-effective handheld technologies.
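The tri-modal log-normal representation at the heart of the MMLN algorithm can be sketched as follows; the mode parameters are illustrative urban-like values, not PACS fit output:

```python
import numpy as np

def dn_dlogd(d, n_total, cmd, gsd):
    """One log-normal number mode, dN/dlogD (the usual aerosol convention)."""
    ln_gsd = np.log(gsd)
    return (n_total / (np.sqrt(2 * np.pi) * ln_gsd)
            * np.exp(-np.log(d / cmd) ** 2 / (2 * ln_gsd ** 2)))

d = np.logspace(-2, 1, 400)   # 0.01-10 um, the PACS size range
# Illustrative tri-modal parameters (NOT the PACS fit values):
# (total number per cm^3, count median diameter in um, geometric std dev)
modes = [(8000, 0.02, 1.7), (1500, 0.15, 2.0), (2, 2.5, 1.8)]

number = sum(dn_dlogd(d, *m) for m in modes)
surface = np.pi * d ** 2 * number            # dS/dlogD
mass = (np.pi / 6) * d ** 3 * number         # dM/dlogD, unit-density spheres
print(d[np.argmax(number)], d[np.argmax(mass)])   # mass mode sits at larger sizes
```

The d^2 and d^3 weightings show why low-resolution number and mass measurements jointly constrain the fit: each metric emphasizes a different part of the size range.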
10

Mixture models for estimating operation time distributions.

Chen, Yi-Ling 12 July 2005 (has links)
Surgical operation times are useful and important information for hospital management, which involves operation time estimation for patients under different diagnoses, operating room scheduling, operating room utilization improvements and so on. In this work, we focus on the operation time distributions of thirteen operations performed in the gynecology (GYN) department of one major teaching hospital in southern Taiwan. We first investigate empirically what types of distributions are suitable for describing these operation times; log-normal and mixture log-normal distributions are identified as statistically acceptable. We then compare and characterize the operations into different categories based on the estimated operation time distributions. Next, we illustrate a possible reason why the distributions for some operations with large data sets turn out to be mixtures of certain log-normal distributions. We end with a discussion of possible future work.
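Fitting a mixture of log-normals reduces to fitting a Gaussian mixture to the logged times; a minimal EM sketch on synthetic data (not the hospital's records):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic operation times (minutes): two log-normal sub-populations
t = np.concatenate([rng.lognormal(3.0, 0.30, 300),
                    rng.lognormal(4.2, 0.25, 200)])
x = np.log(t)   # a log-normal mixture is a Gaussian mixture on the log scale

# Plain EM for a two-component univariate Gaussian mixture
w = np.array([0.5, 0.5])
mu = np.array([x.min(), x.max()])
sd = np.array([1.0, 1.0])
for _ in range(200):
    dens = (w / (sd * np.sqrt(2 * np.pi))
            * np.exp(-(x[:, None] - mu) ** 2 / (2 * sd ** 2)))
    r = dens / dens.sum(axis=1, keepdims=True)       # E-step: responsibilities
    nk = r.sum(axis=0)                               # M-step: weighted updates
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(np.sort(mu))   # close to the generating log-means 3.0 and 4.2
```

A two-component mixture like this would capture, for example, a procedure that is sometimes routine and sometimes complicated, which is one plausible source of the mixture shapes the thesis observes.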
