
Alternative methods of investigating the time dependent M/G/k queue

Kivestu, Peeter Andrus January 1976 (has links)
Thesis. 1976. M.S.--Massachusetts Institute of Technology. Dept. of Aeronautics and Astronautics. / Microfiche copy available in Archives and Aero. / Bibliography: leaf 154. / by Peeter A. Kivestu. / M.S.

Stochastic systems : models and polices [sic]

Bataineh, Mohammad Saleh, University of Western Sydney, College of Science, Technology and Environment, School of Science, Food and Horticulture January 2001 (has links)
In a multi-server system, probability distributions and loss probabilities are studied for customers arriving in different priority categories. Customers arrive in independent Poisson streams and their service times are exponentially distributed, with different rates for different priorities. Customers who cannot queue are lost if the capacity is fully occupied. In these systems, particularly for higher-priority customers, reducing the loss probabilities is essential to guarantee the quality of the service. Four different policies for high and low priorities were introduced, utilizing the fixed capacity of the system and producing different loss probabilities. The same policies were then examined for the case in which a low-priority customer is placed in the queue when the system is fully occupied. An application to the Intensive Care and Coronary Care Unit at Campbelltown Public Hospital in Sydney is presented. This application analyses admission and discharge, using queueing theory to develop a model that predicts the proportion of patients from each category that would be prematurely transferred as a function of the size of the unit, number of categories, mean arrival rates, and length of stay. / Master of Science (Hons)
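The baseline case of the loss systems studied above can be sketched in a few lines: in an M/M/k/k loss system (no queue), the blocking probability is given by the Erlang B formula, computable with a numerically stable recursion. The server counts and offered loads below are illustrative assumptions, not values from the thesis, and the reserved-capacity policy is only one plausible reading of a priority policy:

```python
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking (loss) probability of an M/M/k/k loss system via the
    Erlang B recursion: B(0, a) = 1; B(k, a) = a*B(k-1) / (k + a*B(k-1))."""
    b = 1.0
    for k in range(1, servers + 1):
        b = offered_load * b / (k + offered_load * b)
    return b

# Hypothetical illustration: two priority streams sharing 10 servers.
# Offered load a = lambda/mu per class; the numbers are made up.
a_high, a_low = 3.0, 5.0
loss_shared = erlang_b(10, a_high + a_low)  # complete sharing: both classes blocked alike
loss_reserved = erlang_b(8, a_low)          # low priority confined to 8 servers (reservation)
print(loss_shared, loss_reserved)
```

Comparing such quantities across policies is how the different loss probabilities in the abstract would be contrasted.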

Admissibility of estimators for the parameter of the Poisson distribution

Παναγιωτόπουλος, Λεωνίδας Ν. 11 September 2008 (has links)

Non market valuation of alcohol consumption for off-highway vehicle parks in North Carolina

González-Sepúlveda, Juan Marcos. January 2005 (has links)
Thesis (M.S.)--University of Nevada, Reno, 2005. / "August 2005." Includes bibliographical references (leaves 48-54). Online version available on the World Wide Web.

Distribution and abundance of rainbow smelt (Osmerus mordax) larvae in Lac Saint-Jean /

Gagnon, Karine, January 2005 (has links)
Thesis (M.Ress.Renouv.)--Université du Québec à Chicoutimi, 2005. / Bibliography: leaves 79-85. Electronic document also available in PDF format. CaQCU

Statistical inference concerning some compound and generalized discrete distributions

Bhalerao, Narayan Rangnath. January 1976 (has links)
Thesis (Ph. D.)--University of Wisconsin--Madison, 1976. / Typescript. Vita. eContent provider-neutral record in process. Description based on print version record. Includes bibliographical references (leaves 166-172).

Mean reversion models for weather derivatives /

Petschel, Ben. January 2005 (has links) (PDF)
Thesis (Ph.D.) - University of Queensland, 2005. / Includes bibliography.

Redução no vício da distribuição da deviance para dados de contagem. / Bias reduction in the distribution of the deviance for count data.

Denise Nunes Viola 26 October 2001 (has links)
Count data can generally be regarded as arising from a Poisson distribution. Analysing such data presents certain difficulties, since the data do not satisfy some of the basic assumptions required to fit a standard model. Transformations are often suggested, but they do not always give good results. In the framework of generalized linear models, the statistic that measures the goodness of fit of the model to the data is called the deviance, whose distribution is in general unknown. For data following a Poisson distribution, however, the distribution of the deviance can be shown to approach a χ² distribution, although the approximation is poor for small sample sizes. Correction factors for the data have been suggested to improve this approximation, but the results are still unsatisfactory. The aim of this work is therefore to propose a new correction factor for data following a Poisson distribution, so as to improve the approximation to the distribution of the deviance for any sample size. A constant is added to the response variable and, through the expected value of the deviance, that constant is calculated so as to reduce the approximation error. To verify the improvement in the chi-square approximation to the distribution of the deviance, data from a Poisson distribution are simulated, the deviance is calculated with and without the correction, and QQ-plots are constructed for comparison with the chi-square distribution.
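The simulation scheme described above (simulate Poisson data, compute the deviance, compare with a chi-square reference) can be sketched as follows. This is an uncorrected baseline only; the thesis's proposed correction constant is not reproduced here, and the sample size, mean, and replication count are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(42)

def poisson_deviance(y):
    """Deviance of an intercept-only Poisson fit, mu_hat = mean(y).
    Observations with y_i = 0 contribute nothing to the log term
    (the limit of y*log(y/mu) as y -> 0 is 0)."""
    mu = y.mean()
    with np.errstate(divide="ignore", invalid="ignore"):
        term = np.where(y > 0, y * np.log(y / mu), 0.0)
    return 2.0 * np.sum(term - (y - mu))

# Asymptotically D ~ chi-square with n-1 degrees of freedom; the approximation
# degrades for small means and samples, which is what motivates a correction.
n, mean_count, reps = 20, 4.0, 5000
devs = np.array([poisson_deviance(rng.poisson(mean_count, n)) for _ in range(reps)])
print(devs.mean())  # compare against n - 1 = 19; QQ-plots would refine this check
```

A QQ-plot of `devs` against chi-square(n-1) quantiles is the graphical comparison the abstract refers to.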

Seismic and Volcanic Hazard Analysis for Mount Cameroon Volcano

Wetie Ngongang, Ariane January 2016 (has links)
Mount Cameroon is considered the only active volcano along a 1600 km long chain of volcanic complexes called the Cameroon Volcanic Line (CVL). It has erupted seven times during the last 100 years, most recently in May 2000. The approximately 500,000 inhabitants who live and work on its fertile flanks are exposed to impending threats from volcanic eruptions and earthquakes. In this thesis, a hazard assessment study involving both statistical modelling of seismic hazard parameters and the evaluation of future volcanic risk was undertaken for Mount Cameroon. The Gutenberg-Richter magnitude-frequency relations, the annual activity rate, the maximum magnitude, the rate of volcanic eruptions and the associated risk were examined. The seismic hazard parameters were estimated using the maximum likelihood method, based on a procedure that combines seismic data containing incomplete files of large historical events with complete files from short periods of observation. A homogeneous Poisson model was applied to previously recorded eruptions of Mount Cameroon to determine the frequency of eruption and assess the probability of a future eruption. Frequency-magnitude plots indicated that Gutenberg-Richter b-values depend partly on the maximum regional magnitude and on the method used to calculate them. The b-values showed temporal and spatial variation, with an average value of 1.53 ± 0.02. The intrusion of a magma body generating relatively small earthquakes, as observed in the instrumental catalogue, could be responsible for this anomalously high b-value. An epicentre map of locally recorded earthquakes revealed that the southeastern zone is the most seismically active part of the volcano. The annual mean activity rate of the seismicity depends strongly on the time span of the seismic catalogue, and the results showed that, on average, one earthquake occurs every 10 days.
The maximum regional magnitude values determined from the various approaches overlap when their standard deviations are taken into account. However, the magnitude distribution of the Mt. Cameroon earthquakes might not follow the form of the Gutenberg-Richter frequency-magnitude relationship. Dates of the most recent eruptive events on the Mt. Cameroon volcanic complex are presented. No specific pattern was observed in the frequency of eruptions, which means that a homogeneous Poisson distribution provides a suitable model for estimating the rate of occurrence of volcanic eruptions and evaluating the risk of a future eruption. Two different approaches were used to estimate the mean eruption rate (λ), and both yielded a value of 0.074. The results showed that eruptions take place on average once every 13 years; with the last eruption having occurred over 15 years ago, there is considered to be a high risk of an eruption at present. / Dissertation (MSc)--University of Pretoria, 2016. / Geology / MSc / Unrestricted
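The homogeneous Poisson eruption model above yields forward-looking probabilities directly: with rate λ, the chance of at least one eruption within t years is 1 − exp(−λt). The sketch below uses the thesis's estimated λ = 0.074 per year; the 10-year horizon is an illustrative choice:

```python
import math

# Homogeneous Poisson model for eruptions, using the estimated mean rate
# lambda = 0.074 eruptions/year (mean recurrence ~ 1/0.074 ≈ 13.5 years).
RATE = 0.074

def prob_at_least_one(t_years: float) -> float:
    """P(N(t) >= 1) = 1 - exp(-lambda * t) for a homogeneous Poisson process."""
    return 1.0 - math.exp(-RATE * t_years)

print(prob_at_least_one(10.0))  # chance of one or more eruptions in the next decade
# Memorylessness: under this model the 15+ years already elapsed since the last
# eruption do not change these forward-looking probabilities.
```

This memoryless property is exactly why the "no specific pattern" finding supports the Poisson model for risk evaluation.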

Causal analysis using two-part models : a general framework for specification, estimation and inference

Hao, Zhuang 22 June 2018 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The two-part model (2PM) is the most widely applied modeling and estimation framework in empirical health economics. By design, the two-part model allows the process governing observation at zero to systematically differ from that which determines non-zero observations. The former is commonly referred to as the extensive margin (EM) and the latter is called the intensive margin (IM). The analytic focus of my dissertation is on the development of a general framework for specifying, estimating and drawing inference regarding causally interpretable (CI) effect parameters in the 2PM context. Our proposed fully parametric 2PM (FP2PM) framework comprises very flexible versions of the EM and IM for both continuous and count-valued outcome models and encompasses all implementations of the 2PM found in the literature. Because our modeling approach is potential outcomes (PO) based, it provides a context for clear definition of targeted counterfactual CI parameters of interest. This PO basis also provides a context for identifying the conditions under which such parameters can be consistently estimated using the observable data (via the appropriately specified data generating process). These conditions also ensure that the estimation results are CI. There is substantial literature on statistical testing for model selection in the 2PM context, yet there has been virtually no attention paid to testing the “one-part” null hypothesis. Within our general modeling and estimation framework, we devise a relatively simple test of that null for both continuous and count-valued outcomes. We illustrate our proposed model, method and testing protocol in the context of estimating price effects on the demand for alcohol.
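The core two-part decomposition described above, E[y] = P(y > 0) · E[y | y > 0], can be illustrated on simulated data. This toy sketch covers only the unconditional decomposition into extensive and intensive margins; the dissertation's FP2PM framework (covariates, flexible link functions, potential-outcomes-based causal parameters, the one-part test) is far richer, and all numbers below are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "alcohol spending"-style outcome with a mass point at zero.
n = 10_000
consumes = rng.random(n) < 0.3                              # extensive margin: 30% have y > 0
spend = np.where(consumes, rng.lognormal(3.0, 0.5, n), 0.0) # intensive margin: positive amounts

p_any = (spend > 0).mean()          # estimate of the extensive margin P(y > 0)
mean_pos = spend[spend > 0].mean()  # estimate of the intensive margin E[y | y > 0]
print(p_any * mean_pos, spend.mean())  # the two parts recombine to the overall mean
```

In the full 2PM, each margin is a regression (e.g. a binary-outcome model for the EM and a positive-outcome model for the IM), and causal effect parameters are built from how both pieces shift with the treatment variable.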
