  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Local parametric poisson models for fisheries data

Yee, Irene Mei Ling January 1988 (has links)
The Poisson process is a common model for count data. However, a global Poisson model is inadequate for sparse data such as the marked salmon recovery data, which exhibit large extraneous variation and noise. An empirical Bayes model, which aggregates information to overcome the lack of data in individual cells, is therefore developed to handle these data. The method fits a local parametric Poisson model to describe the variation at each sampling period and combines it with a conventional local smoothing technique to remove noise. Finally, the overdispersion relative to the Poisson model is modelled by mixing these locally smoothed Poisson models in an appropriate way. The method is then applied to the marked salmon data to obtain overall patterns and the corresponding credibility intervals for the underlying trend in the data. / Science, Faculty of / Statistics, Department of / Graduate
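To make the cell-level shrinkage idea concrete, here is a minimal empirical Bayes sketch for sparse Poisson counts: a Gamma prior fitted by the method of moments pools information across cells, so each cell's posterior mean rate borrows strength from the rest. The exposures and toy counts are invented, and the thesis's local parametric fitting and smoothing steps are not reproduced.

```python
import numpy as np

def eb_gamma_poisson(counts, exposures):
    """Empirical Bayes shrinkage for Poisson counts (illustrative sketch).

    Assumes counts[i] ~ Poisson(rate_i * exposures[i]) with rate_i ~ Gamma(a, b).
    Hyperparameters (a, b) are set by the method of moments on the raw rates.
    """
    raw = counts / exposures                  # per-cell raw rate estimates
    m, v = raw.mean(), raw.var()
    v = max(v, 1e-12)                         # guard against zero variance
    b = m / v                                 # Gamma rate hyperparameter
    a = m * b                                 # Gamma shape hyperparameter
    # Posterior mean of rate_i given count_i: (a + count_i) / (b + exposure_i)
    return (a + counts) / (b + exposures)

# Toy example: sparse recovery-like counts over 8 cells with equal sampling effort
counts = np.array([0, 1, 0, 3, 0, 0, 2, 1], dtype=float)
exposures = np.ones_like(counts)
print(eb_gamma_poisson(counts, exposures))
```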
42

Statistical inferences for a pure birth process

Hsu, Jyh-Ping January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
43

A Bayesian approach for modeling stochastic deterioration

SILVA, Rodrigo Bernardo da 31 January 2010 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico / Deterioration modeling has been at the forefront of Bayesian reliability analysis. The best-known approaches in the literature for this purpose evaluate the behavior of the reliability measure over time in the light of empirical data alone. In the context of reliability engineering, these approaches have limited applicability, since one frequently deals with situations characterized by scarce empirical data. Inspired by Bayesian strategies that aggregate empirical data and expert opinion when modeling time-independent reliability measures, this work proposes a methodology for handling time-dependent reliability. The proposed methodology encapsulates well-known Bayesian approaches, such as Bayesian methods for combining empirical data with expert opinion and time-indexed Bayesian models, improving on them in order to arrive at a more realistic model for describing the deterioration process of a given component or system. The cases discussed are those typically encountered in reliability practice (via simulation): assessment of run-time data for failure rates and the amount of deterioration, demand-based data for failure probability, and expert opinion for the analysis of failure rate, amount of deterioration, and failure probability. These case studies show that the use of expert information can lead to a reduction of uncertainty in the distributions of reliability measures, especially in situations where few or no failures are observed.
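One ingredient the abstract mentions is combining expert opinion with run-time failure data; a minimal conjugate sketch of that step is below, assuming the expert's estimate is encoded as a Gamma prior with an "equivalent evidence" weight. The values and the weighting scheme are hypothetical and are not taken from the thesis.

```python
import numpy as np

def posterior_failure_rate(expert_rate, expert_weight_hours,
                           observed_failures, observed_hours):
    """Gamma-Poisson update of a failure rate (per hour), illustrative only.

    expert_rate:         expert's best estimate of the failure rate (hypothetical)
    expert_weight_hours: hours of "equivalent evidence" the opinion is worth
    """
    # Encode expert opinion as a Gamma(a0, b0) prior with mean a0 / b0 = expert_rate
    b0 = expert_weight_hours
    a0 = expert_rate * b0
    # Conjugate update with a Poisson failure count over observed_hours
    a_post = a0 + observed_failures
    b_post = b0 + observed_hours
    mean = a_post / b_post
    draws = np.random.gamma(a_post, 1.0 / b_post, 100_000)   # posterior samples
    return mean, np.percentile(draws, [2.5, 97.5])

mean, ci = posterior_failure_rate(expert_rate=1e-3, expert_weight_hours=2000,
                                  observed_failures=1, observed_hours=500)
print(f"posterior mean rate: {mean:.2e} /h, 95% interval: {ci}")
```

With few observed failures the posterior stays close to the expert prior, which is the uncertainty-reduction effect the abstract describes.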
44

Composing Deep Learning and Bayesian Nonparametric Methods

Zhang, Aonan January 2019 (has links)
Recent progress in Bayesian methods has largely focused on non-conjugate models that make extensive use of black-box functions: continuous functions implemented with neural networks. Using deep neural networks, Bayesian models can reasonably fit big data while at the same time capturing model uncertainty. This thesis targets a more challenging problem: how do we model general random objects, including discrete ones, using random functions? Our conclusion is that many (discrete) random objects are by nature a composition of Poisson processes and random functions. All discreteness is handled through the Poisson process, while random functions capture the remaining complexity of the object; thus the title: composing deep learning and Bayesian nonparametric methods. This conclusion is not a conjecture. In special cases such as latent feature models, we can prove this claim by working on infinite-dimensional spaces, which is where Bayesian nonparametrics comes in. Moreover, we assume regularity conditions on the random objects, such as exchangeability; the representations then follow from representation theorems. We will see this twice throughout this thesis. One may ask: when a random object is too simple, such as a non-negative random vector in the case of latent feature models, how can we exploit exchangeability? The answer is to aggregate infinitely many random objects, map them together onto an infinite-dimensional space, and then assume exchangeability on that space. We demonstrate two examples of latent feature models by (1) concatenating them as an infinite sequence (Sections 2, 3) and (2) stacking them as a 2d array (Section 4). We also show that Bayesian nonparametric methods are useful for modeling discrete patterns in time series data, with two examples: (1) using variance Gamma processes to model change points (Section 5), and (2) using Chinese restaurant processes to model speech with switching speakers (Section 6). Since inference can be non-trivial in popular Bayesian nonparametric models, Section 7 presents a novel online inference solution for the popular HDP-HMM model.
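As a concrete instance of the discrete machinery the abstract leans on, here is a small sketch of sampling cluster assignments from a Chinese restaurant process. This is the standard textbook construction, not code from the thesis, and the concentration parameter is arbitrary.

```python
import random

def sample_crp(n_items, alpha, seed=0):
    """Sample a random partition of n_items from a Chinese restaurant process.

    alpha is the concentration parameter: larger alpha means more clusters.
    Returns a list of cluster labels, one per item.
    """
    rng = random.Random(seed)
    labels, table_sizes = [], []
    for i in range(n_items):
        # Join existing table k with probability size_k / (i + alpha);
        # open a new table with probability alpha / (i + alpha).
        weights = table_sizes + [alpha]
        choice = rng.choices(range(len(weights)), weights=weights)[0]
        if choice == len(table_sizes):
            table_sizes.append(1)          # open a new table
        else:
            table_sizes[choice] += 1       # sit at an existing table
        labels.append(choice)
    return labels

print(sample_crp(20, alpha=1.5))
```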
45

Dynamic Switching Times For Season And Single Tickets In Sports And Entertainment With Time Dependent Demand Rates

Pakyardim, Yusuf Kenan 01 August 2011 (has links) (PDF)
The most important market segmentation in the sports and entertainment industry is the competition between customers who buy bundle tickets and those who buy single tickets. A common selling practice is to start the selling season with bundle ticket sales and switch to selling single tickets later on. The aim of this practice is to increase the number of customers who buy bundles, to create a fund before the season starts, and to increase the load factor of games with low demand. In this thesis, we investigate the effect of time-dependent demand on dynamic switching times and the potential revenue gain over the case where the demand rate is assumed to be constant over time.
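A rough simulation sketch of the trade-off studied here: bundle and single-ticket demand arrive as time-dependent Poisson streams, and expected revenue is compared across candidate switching times. The rates, prices, and season length are invented for illustration and do not come from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)

T = 30                                # selling-season length in days (assumed)
p_bundle, p_single = 200.0, 30.0      # ticket prices (assumed)

def bundle_rate(t):                   # bundle demand decays toward the season start
    return max(8.0 - 0.25 * t, 0.0)

def single_rate(t):                   # single-ticket demand grows toward the season start
    return 1.0 + 0.4 * t

def expected_revenue(switch_day, n_rep=500):
    """Monte Carlo estimate of revenue if single-ticket sales open on switch_day."""
    revenues = []
    for _ in range(n_rep):
        days = np.arange(T)
        bundles = rng.poisson([bundle_rate(t) if t < switch_day else 0.0 for t in days])
        singles = rng.poisson([single_rate(t) if t >= switch_day else 0.0 for t in days])
        revenues.append(p_bundle * bundles.sum() + p_single * singles.sum())
    return float(np.mean(revenues))

best = max(range(1, T), key=expected_revenue)
print("best switching day:", best, "expected revenue:", round(expected_revenue(best), 1))
```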
46

Bayesian multivariate poisson-lognormal regression for crash prediction on rural two-lane highways

Ma, Jianming, 1972- 12 August 2011 (has links)
Not available / text
47

Large deviations for re-sequencing buffer size /

Gao, Yanfei, January 1900 (has links)
Thesis (M. Sc.)--Carleton University, 2008. / Includes bibliographical references (p. 81-83). Also available in electronic format on the Internet.
48

Stochastic Dynamic Model of Urban Traffic and Optimum Management of Its Flow and Congestion

Wang, Shi'an January 2018 (has links)
With the advancement of modern society, more and more roads are being built in most countries, and city traffic management has played an increasingly essential role in improving overall traffic flow quality over the last few decades. In recent years, significantly increasing attention has been paid to the management of traffic flow in major cities all over the world. In this thesis, we develop a stochastic dynamic model for urban traffic along with the physical constraints characteristic of intersections equipped with traffic lights. We assume that the incoming traffic to each stream at an intersection follows a Poisson random process with variable intensity (mean). We introduce expressions for traffic throughput, congestion, and operator's waiting time for a typical intersection in a city, and define an appropriate objective functional accordingly. We then formulate an optimization problem and propose a sequential (recursive) algorithm based on Bellman's principle of optimality (dynamic programming). The solution, if implemented, is expected to improve throughput, reduce congestion, and increase driver satisfaction. Because the dynamic programming method is computationally intensive, we consider a scenario in which one unit of traffic stream stands for a specific number of vehicles, depending on the volume of traffic flow through the intersection. The system is simulated with inputs described by several distinct nonhomogeneous Poisson processes. For example, we apply a typical Canadian arrival-rate profile with a morning peak at around 7:30 AM and an afternoon peak at around 4:30 PM, as well as a profile with a morning rush hour at about 8:00 AM and an afternoon rush hour at about 6:00 PM, as in China. We also present numerical results for arrival rates with a shorter morning peak-hour period but a longer afternoon rush-hour period, as may occur when there are social activities or big events in the afternoon. In addition, another series of experiments illustrates the feasibility of the proposed dynamic model for arrival rates with only one peak hour throughout the whole day. The system is simulated through a series of experiments, and the optimization problem is solved by dynamic programming based on the proposed algorithm, which yields the optimal feedback control law. More specifically, the results give both the optimal traffic-light timing allocated to each stream and the congestion broadcast level (CBL) of each road segment during each time segment; the corresponding optimal cost can then be found for any given initial condition. It is reasonable to believe that this stochastic dynamic model is potentially applicable to real-time adaptive traffic control systems.
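To illustrate the kind of arrival input such a simulation needs, the sketch below generates a nonhomogeneous Poisson arrival stream by thinning (Lewis-Shedler), with a two-peak intensity standing in for morning and afternoon rush hours. The intensity function and its parameters are assumptions for illustration, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def intensity(t):
    """Assumed arrival rate (vehicles per hour) with peaks near 7:30 and 16:30."""
    return (100.0
            + 500.0 * np.exp(-((t - 7.5) ** 2) / 0.5)
            + 400.0 * np.exp(-((t - 16.5) ** 2) / 0.8))

def sample_nhpp(t_end, lam_max):
    """Thinning: candidates from a homogeneous Poisson process at rate lam_max
    are kept with probability intensity(t) / lam_max."""
    arrivals, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > t_end:
            return np.array(arrivals)
        if rng.uniform() < intensity(t) / lam_max:
            arrivals.append(t)

lam_max = 650.0                       # upper bound on the intensity over the day
arrivals = sample_nhpp(24.0, lam_max)
print(len(arrivals), "arrivals; first few times (h):", np.round(arrivals[:5], 2))
```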
49

Perturbation of renewal processes

Akin, Osman Caglar 05 1900 (has links)
Renewal theory began development in the early 1940s, as the need for it arose in operations research, a sub-discipline of industrial engineering. In time, the theory found applications in many stochastic processes. In this thesis I investigated the effect of seasonality on Poisson and non-Poisson renewal processes in the form of perturbations. It was determined that the statistical analysis methods developed at the UNT Center for Nonlinear Science can be used to detect the effects of seasonality in data obtained from Poisson and non-Poisson renewal systems. It is proved that a perturbed Poisson process can serve as a paradigmatic model for a case where seasonality is correlated with the noise, and that the diffusion entropy method can be utilized in revealing this relation. A renewal model connected to the stochastic resonance phenomenon is used to analyze a previous neurological experiment, and it is shown that under the effect of a nonlinear perturbation, a non-Poisson system's statistics may make a transition and end up in the Poisson basin of statistics. I determine that a nonlinear perturbation of the power index of a complex system leads to a change in the complexity characteristics of the system, i.e., the system reaches a new form of complexity.
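A bare-bones sketch of diffusion entropy analysis applied to a simulated Poisson event stream: events are binned into a signal, the signal is summed over sliding windows of increasing length, and the Shannon entropy of the resulting count distribution is regressed on the logarithm of the window size to estimate a scaling exponent (about 0.5 for ordinary Poisson statistics). The binning, window range, and fit are simplifications and not the procedure used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)

def diffusion_entropy(event_times, t_max, windows):
    """Shannon entropy of windowed event counts vs. window size (sketch)."""
    signal = np.zeros(int(np.ceil(t_max)))
    idx = np.floor(event_times).astype(int)
    np.add.at(signal, idx[idx < signal.size], 1.0)        # events per unit-time bin
    csum = np.concatenate(([0.0], np.cumsum(signal)))
    entropies = []
    for w in windows:
        disp = (csum[w:] - csum[:-w]).astype(int)         # counts in every window of length w
        p = np.bincount(disp) / disp.size
        p = p[p > 0]
        entropies.append(-(p * np.log(p)).sum())
    return np.array(entropies)

# Simulate a plain Poisson event stream (rate 2 events per unit time)
t_max = 20000.0
events = np.sort(rng.uniform(0.0, t_max, rng.poisson(2.0 * t_max)))

windows = np.unique(np.logspace(0.5, 3, 15).astype(int))
S = diffusion_entropy(events, t_max, windows)
delta = np.polyfit(np.log(windows), S, 1)[0]              # S(w) ~ const + delta * ln w
print("estimated scaling exponent delta:", round(delta, 3))   # roughly 0.5 for Poisson
```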
50

A Logistic System Simulation Model Encompassing Poisson Processes and Normal or Weibull Life

Hansen, Willard A. 01 May 1966 (has links)
This thesis describes a computer simulation model for determining effective spares stock levels for recoverable items at Air Force bases and depots. The simulation model is based on the following fundamental inventory theory: whenever a demand arises, it is satisfied from stock on hand and the quantity equal to that demand is recorded immediately; when a demand exceeds stock on hand, the excess demand is backordered immediately; and when item life expires, procurement action is initiated at depot level. The resulting output of the model can be used as a guide for the optimum distribution of available spares or as a computation of the spares needed to meet a desired percent fill rate. Outputs from the simulation model also enable evaluation of the effect on spares levels of changes in other logistic parameters. The purpose of this thesis is two-fold in that it presents: (a) a computer simulation model of an Air Force logistic system; and (b) a discussion of compound Monte Carlo demand generation involving various analytic failure distributions. The specific nature of the problem to which the simulation model is applied is described, and the model construction and output are discussed in detail.
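As an illustration of compound Monte Carlo demand generation, the sketch below simulates each installed unit as a renewal process with Weibull-distributed inter-failure times, draws the total demand over a resupply lead time, and reads off the probability that a candidate spares level covers it. Fleet size, lead time, and Weibull parameters are invented, and the thesis's full base/depot logic (backorders, procurement on life expiry) is not modeled.

```python
import numpy as np

rng = np.random.default_rng(7)

def lead_time_demand(n_installed, lead_time, weibull_shape, weibull_scale, n_rep):
    """Monte Carlo draws of total spares demand over a resupply lead time (sketch).

    Each installed unit fails according to a renewal process whose inter-failure
    times are Weibull-distributed; the unit is assumed replaced immediately.
    """
    demands = np.zeros(n_rep, dtype=int)
    for r in range(n_rep):
        total = 0
        for _ in range(n_installed):
            t = weibull_scale * rng.weibull(weibull_shape)
            while t <= lead_time:                 # count failures within the window
                total += 1
                t += weibull_scale * rng.weibull(weibull_shape)
        demands[r] = total
    return demands

demand = lead_time_demand(n_installed=20, lead_time=30.0,
                          weibull_shape=1.5, weibull_scale=300.0, n_rep=5000)
for s in range(8):
    print(f"stock {s}: probability of no shortage = {np.mean(demand <= s):.3f}")
```

The smallest stock level whose no-shortage probability exceeds a target fill criterion would be the candidate spares level under these assumed parameters.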
