11

Rate Estimators for Non-stationary Point Processes

Anna N Tatara (6629942) 11 June 2019 (has links)
Non-stationary point processes are often used to model systems whose rates vary over time. Estimating the underlying rate functions is important as input to discrete-event simulation and for various statistical analyses. We study nonparametric estimators for the marked point process, the infinite-server queueing model, and the transitory queueing model, and conduct statistical inference for these estimators by establishing a number of asymptotic results.

For the marked point process, we consider estimating the offered load to the system over time. With direct observations of the offered load sampled at fixed intervals, we establish asymptotic consistency, rates of convergence, and asymptotic covariance through a Functional Strong Law of Large Numbers, a Functional Central Limit Theorem, and a Law of the Iterated Logarithm. We also show that there exists an asymptotically optimal interval width as the sample size approaches infinity.

The infinite-server queueing model is central to many stochastic models. In particular, the mean number of busy servers can be used as an estimator for the total load faced by a multi-server system with time-varying arrivals, and in many other applications. Through an omniscient estimator based on observing both the arrival times and service requirements for n samples of an infinite-server queue, we show asymptotic consistency and the rate of convergence. We then establish the asymptotics for a nonparametric estimator based on observations of the busy servers at fixed intervals.

The transitory queueing model is crucial when studying a transitory system, which arises when the time horizon or population is finite. We assume we observe arrival counts at fixed intervals. We first consider a natural estimator based on an underlying nonhomogeneous Poisson process. Although the estimator is asymptotically unbiased, a correction term is required to recover an accurate asymptotic covariance. Next, we consider a nonparametric estimator that exploits the maximum likelihood estimator of a multinomial distribution and show that this estimator converges appropriately to a Brownian bridge.
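As a rough illustration of the fixed-interval setting described above, the sketch below builds a piecewise-constant rate estimate by averaging interval counts across independent sample paths of a nonhomogeneous Poisson process. It is not the estimator developed in the thesis; the rate function, horizon and interval width are invented for the example.

```python
import numpy as np

def simulate_nhpp(rate_fn, horizon, rate_max, rng):
    """Simulate one path of a nonhomogeneous Poisson process by thinning."""
    t, arrivals = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > horizon:
            break
        if rng.random() < rate_fn(t) / rate_max:
            arrivals.append(t)
    return np.array(arrivals)

def piecewise_rate_estimate(sample_paths, horizon, width):
    """Average interval counts across paths and divide by the interval width."""
    edges = np.arange(0.0, horizon + width, width)
    counts = np.zeros(len(edges) - 1)
    for path in sample_paths:
        counts += np.histogram(path, bins=edges)[0]
    return edges, counts / (len(sample_paths) * width)

rng = np.random.default_rng(0)
rate = lambda t: 5.0 + 4.0 * np.sin(2 * np.pi * t / 10.0)   # hypothetical rate function
paths = [simulate_nhpp(rate, horizon=20.0, rate_max=9.0, rng=rng) for _ in range(200)]
edges, lam_hat = piecewise_rate_estimate(paths, horizon=20.0, width=0.5)
print(lam_hat[:5])   # estimated rate on the first few intervals
```

Widening the intervals reduces the variance of each interval estimate but increases the bias against a time-varying rate, which is the trade-off behind the asymptotically optimal interval width mentioned above.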
12

Argmax over Continuous Indices of Random Variables - An Approach Using Random Fields

Malmberg, Hannes Unknown Date (has links)
optimization over a discrete number of random variables. In this paper we extend this theory from the discrete to the continuous case, and consider the limiting distribution of the location of the best offer as the number of offers tends to infinity. Given a subset of R^d of possible offers, we seek a distribution over this set, the argmax measure of the best offer. It depends on the sampling distribution of offer locations and on a measure index, which assigns to each point of the set a probability distribution of offers. This problem is closely related to the argmax theory of marked point processes, although we consider deterministic sequences of points in space to allow for greater generality. We first define a finite-sample argmax measure and then give conditions under which it converges as the number of offers tends to infinity. To this end, we introduce a max-field of best offers and use continuity properties of this field to calculate the argmax measure. We demonstrate the usefulness of the method by giving explicit formulas for the limiting argmax distribution for a large class of models, including exponential independent offers with a deterministic, additive disturbance term. Finally, we illustrate the theory by simulations.
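As a minimal Monte Carlo illustration of the finite-sample argmax measure (my own sketch, not the paper's construction; the uniform offer locations and exponential offers are assumptions), the snippet below draws n offer locations, attaches a location-dependent offer to each, records where the best offer falls, and treats the empirical distribution of that winning location as an approximation of the argmax measure:

```python
import numpy as np

rng = np.random.default_rng(1)

def argmax_location_sample(n, n_reps):
    """For each replication, draw n offer locations and offers, and return the
    location of the largest offer (the finite-sample argmax)."""
    winners = []
    for _ in range(n_reps):
        locs = rng.uniform(0.0, 1.0, size=(n, 2))            # offer locations in [0,1]^2
        # hypothetical measure index: exponential offers whose mean grows with the x-coordinate
        offers = rng.exponential(scale=1.0 + locs[:, 0])
        winners.append(locs[np.argmax(offers)])
    return np.array(winners)

winners = argmax_location_sample(n=500, n_reps=2000)
# crude density of the argmax measure along the first coordinate
hist, edges = np.histogram(winners[:, 0], bins=10, range=(0.0, 1.0), density=True)
print(np.round(hist, 2))   # mass concentrates near x = 1, where offers tend to be larger
```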
13

Spatial Service Systems Modelled as Stochastic Integrals of Marked Point Processes

Jones, Matthew O. 14 July 2005 (has links)
We characterize the equilibrium behavior of a class of stochastic particle systems, where particles (representing customers, jobs, animals, molecules, etc.) enter a space randomly through time, interact, and eventually leave. The results are useful for analyzing the dynamics of randomly evolving systems including spatial service systems, species populations, and chemical reactions. Such models with interactions arise in the study of species competitions and systems where customers compete for service (such as wireless networks). The models we develop are space-time measure-valued Markov processes. Specifically, particles enter a space according to a space-time Poisson process and are assigned independent and identically distributed attributes. The attributes may determine their movement in the space, and whenever a new particle arrives, it randomly deletes particles from the system according to their attributes. Our main result establishes that spatial Poisson processes are natural temporal limits for a large class of particle systems. Other results include the probability distributions of the sojourn times of particles in the systems, and probabilities of numbers of customers in spatial polling systems without Poisson limits.
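A stripped-down sketch of such a system, with no interactions and invented parameters (the thesis treats far more general dynamics): particles arrive according to a space-time Poisson process on [0,1]^2, hold independent exponential sojourn times, and then depart, so a snapshot taken once the system has been running for a while is approximately a spatial Poisson process, the flavour of temporal limit described above.

```python
import numpy as np

rng = np.random.default_rng(2)

def snapshot(arrival_rate, horizon, mean_sojourn, t_obs):
    """Simulate a spatial infinite-server system on [0,1]^2 with Poisson arrivals in
    time, uniform spatial marks and exponential sojourns; return the positions of
    particles still present at time t_obs."""
    n_arrivals = rng.poisson(arrival_rate * horizon)
    arrival_times = rng.uniform(0.0, horizon, n_arrivals)
    positions = rng.uniform(0.0, 1.0, size=(n_arrivals, 2))
    sojourns = rng.exponential(mean_sojourn, n_arrivals)
    present = (arrival_times <= t_obs) & (arrival_times + sojourns > t_obs)
    return positions[present]

pts = snapshot(arrival_rate=50.0, horizon=40.0, mean_sojourn=2.0, t_obs=30.0)
print(len(pts))   # roughly Poisson with mean arrival_rate * mean_sojourn = 100
```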
14

On the separation of preferences among marked point process wager alternatives

Park, Jee Hyuk 15 May 2009 (has links)
A wager is a one-time bet, staking money on one among a collection of alternatives having uncertain reward. Wagers represent a common class of engineering decision, where “bets” are placed on the design, deployment, and/or operation of technology. Often such wagers are characterized by alternatives whose value evolves according to some future cash flow. Here, the values of specific alternatives are derived from a cash flow modeled as a stochastic marked point process. A principal difficulty with these engineering wagers is that the probability laws governing the dynamics of the random cash flow typically are not (completely) available; hence, separating the gambler’s preference among wager alternatives is quite difficult. In this dissertation, we investigate a computational approach for separating preferences among alternatives of a wager where the alternatives have values that evolve according to a marked point process. We are particularly concerned with separating a gambler’s preferences when the probability laws on the available alternatives are not completely specified.
15

Modeling the Spread of Infectious Disease Using Genetic Information Within a Marked Branching Process

Leman, Scotland C., Levy, Foster, Walker, Elaine S. 20 December 2009 (has links)
Accurate assessment of disease dynamics requires a quantification of many unknown parameters governing disease transmission processes. While infection control strategies within hospital settings are stringent, some disease will be propagated through human interactions (patient-to-patient or patient-to-caregiver-to-patient). In order to understand infectious transmission rates within the hospital, it is necessary to isolate the amount of disease that is endemic to the outside environment. While discerning the origins of disease is difficult when using ordinary spatio-temporal data (locations and times of disease detection), genotypes that are common to pathogens with common sources aid in distinguishing nosocomial infections from independent arrivals of the disease. The purpose of this study was to demonstrate a Bayesian modeling procedure for identifying nosocomial infections and to quantify the rate of these transmissions. We demonstrate our method using a 10-year history of Moraxella catarrhalis. Results show the degree to which pathogen-specific genotypic information impacts inferences about the nosocomial rate of infection.
16

Formal Approaches to Globally Asynchronous and Locally Synchronous Design

Xue, Bin 30 September 2011 (has links)
The research reported in this dissertation is motivated by two trends in the system-on-chip (SoC) design industry. First, due to incessant technology scaling, interconnect delays are growing relative to gate delays, leading to multi-cycle delays in communication between functional blocks on the chip, which makes implementing a global synchronous clock difficult and power-consuming. As a result, globally asynchronous and locally synchronous (GALS) designs have been proposed for future SoCs. Second, due to time-to-market pressure and productivity gains, intellectual property (IP) block reuse is a rising trend in the SoC design industry. Predesigned IPs may already be optimized and verified for timing at a certain clock frequency, so when they are used in an SoC, GALS offers a good solution that avoids reoptimizing or redesigning the existing IPs. A special case of GALS, known as the Latency-Insensitive Protocol (LIP), lets designers adopt the well-understood and well-developed synchronous design flow while handling the multi-cycle latency of the interconnects. The communication fabrics for LIP are synchronous pipelines with handshaking. However, handshake-based protocols require complex control logic, and unnecessary handshakes reduce the system's throughput. Scheduling-based LIP was therefore proposed to avoid handshakes by using pre-calculated clock-gating sequences for each block; it is shown to have better throughput and to be easier to implement. Unfortunately, static schedules exist only for bounded systems, so designs of this type in the literature restrict their discussion to systems whose graph representation has a single strongly connected component (SCC), which by the theory is bounded. This dissertation provides an optimization design flow for LIP synthesis with respect to back pressure, throughput and buffer sizes. It is based on extending the scheduled LIP with minimal modifications to render it general enough to be applicable to most systems, especially those with multiple SCCs. In order to guarantee design correctness, a formal framework is required that can analyze concurrency and prevent erroneous behaviors such as overflow and deadlock. Among the many formal models of concurrency previously used in asynchronous system design, marked graphs, the periodic clock calculus and polychrony are chosen for modeling, analysis and verification in this work. Polychrony, originally developed for embedded software modeling and synthesis, is able to specify multi-rate interfaces; a synchronous composition can then be analyzed to avoid incompatibilities and combinational loops, which cause incorrect GALS distribution. The marked graph model is a good candidate for representing the interconnection network and is well suited to modeling the communication and synchronization in LIP. The periodic clock calculus is useful for analyzing clock-gating sequences because it easily captures data dependencies, throughput constraints and the buffer sizes required for synchronization. These formal methods help establish a formally based design flow for creating a synchronous design and then transforming it into a GALS implementation, either using LIP or using more general GALS mechanisms. / Ph. D.
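One standard marked-graph property that makes the model attractive for scheduled LIP is that the throughput of a strongly connected marked graph is bounded by the minimum, over its directed cycles, of the tokens in the cycle divided by the total delay around the cycle. The sketch below computes that bound for a toy interconnect; the topology, latencies and token counts are made up for illustration and are not taken from the dissertation.

```python
import networkx as nx

# Hypothetical LIP interconnect modelled as a marked graph: nodes are synchronous
# blocks, edges are channels annotated with pipeline delay and initial tokens.
G = nx.DiGraph()
G.add_edge("A", "B", delay=2, tokens=1)
G.add_edge("B", "C", delay=1, tokens=0)
G.add_edge("C", "A", delay=3, tokens=2)
G.add_edge("B", "A", delay=1, tokens=1)

def max_throughput(graph):
    """Throughput bound of a strongly connected marked graph:
    min over directed cycles of (tokens in cycle) / (total delay in cycle)."""
    best = float("inf")
    for cycle in nx.simple_cycles(graph):
        edges = list(zip(cycle, cycle[1:] + cycle[:1]))
        tokens = sum(graph[u][v]["tokens"] for u, v in edges)
        delay = sum(graph[u][v]["delay"] for u, v in edges)
        best = min(best, tokens / delay)
    return best

print(max_throughput(G))   # cycle A->B->A gives 2/3, A->B->C->A gives 3/6, so the bound is 0.5
```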
17

A Study of the Subjecthood of the Particle Kara: Corpus Searches and Sentence-Processing Experiments

TAMAOKA, Katsuo, MU, Xin, 玉岡, 賀津雄, 穆, 欣 05 December 2014 (has links)
No description available.
18

A Study of the Calibration Regression Model with Censored Lifetime Medical Cost

Lu, Min 03 August 2006 (has links)
Medical costs have received increasing interest recently in biostatistics and public health. Statistical analysis and inference for lifetime medical cost are made challenging by the fact that survival times are censored for some study subjects and their subsequent costs are unknown. Huang (2002) proposed the calibration regression model, a semiparametric regression tool for studying the medical cost associated with covariates. In this thesis, an inference procedure is investigated using the empirical likelihood ratio method. Unadjusted and adjusted empirical likelihood confidence regions are constructed for the regression parameters. We compare the proposed empirical likelihood methods with the normal-approximation-based method. Simulation results show that the proposed empirical likelihood ratio method outperforms the normal-approximation-based method in terms of coverage probability. In particular, the adjusted empirical likelihood performs best and overcomes the under-coverage problem.
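For intuition about the empirical likelihood construction, the sketch below computes Owen's empirical likelihood ratio for a plain, uncensored mean and compares the resulting 95% confidence interval with the normal-approximation interval. This is a much simpler setting than the censored-cost calibration regression studied in the thesis, and the data and sample size are invented for the example.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(x, mu):
    """-2 log empirical likelihood ratio for the mean of x at the value mu
    (Owen's construction; requires min(x) < mu < max(x))."""
    d = x - mu
    # the Lagrange multiplier must keep all weights positive: 1 + lam * d_i > 0
    lo = -1.0 / d.max() + 1e-10
    hi = -1.0 / d.min() - 1e-10
    g = lambda lam: np.sum(d / (1.0 + lam * d))   # monotone decreasing in lam
    lam = brentq(g, lo, hi)
    return 2.0 * np.sum(np.log1p(lam * d))

rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=80)

# 95% EL confidence interval for the mean: all mu with -2 log R(mu) <= chi2_{1,0.95}
grid = np.linspace(x.min() + 0.05, x.max() - 0.05, 400)
inside = [mu for mu in grid if el_log_ratio(x, mu) <= chi2.ppf(0.95, df=1)]
print(min(inside), max(inside))            # empirical likelihood interval
print(x.mean() - 1.96 * x.std(ddof=1) / np.sqrt(len(x)),
      x.mean() + 1.96 * x.std(ddof=1) / np.sqrt(len(x)))   # normal-approximation interval
```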
19

Remotely Sensed Data Segmentation under a Spatial Statistics Framework

Li, Yu 08 January 2010 (has links)
In remote sensing, segmentation is the procedure of partitioning the domain of a remotely sensed dataset into meaningful regions that correspond to different land use and land cover (LULC) classes or parts of them. Remotely sensed data segmentation remains one of the most challenging problems addressed by the remote sensing community, partly because of the availability of remotely sensed data from diverse sensors on various platforms with very high spatial resolution (VHSR). Thus, there is a strong motivation to propose a sophisticated data representation that can capture the significant amount of detail present in a VHSR dataset and to search for a more powerful scheme suitable for multiple remotely sensed data segmentations. This thesis focuses on the development of a segmentation framework for multiple VHSR remotely sensed data. The emphases are on the VHSR data model and the segmentation strategy. Starting with the domain partition of a given remotely sensed dataset, a hierarchical data model characterizing the structures hidden in the dataset locally, regionally and globally is built from three random fields: a Markov random field (MRF), a strictly stationary random field (RF) and a label field. After defining prior probability distributions that capture and characterize general and scene-specific knowledge about model parameters and the contextual structure of accurate segmentations, the Bayesian segmentation framework, which can lead to algorithmic implementations for multiple remotely sensed data, is developed by integrating both the data model and the prior knowledge. To verify the applicability and effectiveness of the proposed segmentation framework, segmentation algorithms for different types of remotely sensed data are designed within it. The first application relates to SAR intensity image processing, including segmentation and dark-spot detection by a marked point process. In the second application, algorithms for LiDAR point cloud segmentation and building detection are developed. Finally, texture and colour texture segmentation problems are tackled within the segmentation framework. All applications demonstrate that the proposed data model provides efficient representations for the hierarchical structures hidden in remotely sensed data and that the developed segmentation framework leads to successful data processing algorithms for multiple data types and tasks such as segmentation and object detection.
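As a toy illustration of how an MRF prior and a data term combine in a Bayesian segmentation scheme (a generic iterated-conditional-modes sketch on synthetic data, far simpler than the hierarchical model and applications described above), consider a two-class Potts-MRF segmentation of a noisy image:

```python
import numpy as np

def icm_segmentation(image, means, sigma=1.0, beta=1.5, n_iter=10):
    """Iterated conditional modes for a Potts-MRF segmentation: each pixel takes the
    label minimising a Gaussian data term plus beta times the number of
    disagreeing 4-neighbours."""
    labels = np.abs(image[..., None] - np.array(means)).argmin(axis=-1)
    h, w = image.shape
    for _ in range(n_iter):
        for i in range(h):
            for j in range(w):
                neigh = [labels[i + di, j + dj]
                         for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1))
                         if 0 <= i + di < h and 0 <= j + dj < w]
                best_k, best_e = labels[i, j], np.inf
                for k, m in enumerate(means):
                    data = (image[i, j] - m) ** 2 / (2 * sigma ** 2)
                    prior = beta * sum(1 for v in neigh if v != k)
                    if data + prior < best_e:
                        best_k, best_e = k, data + prior
                labels[i, j] = best_k
    return labels

rng = np.random.default_rng(4)
truth = np.zeros((40, 40), dtype=int)
truth[:, 20:] = 1                                    # two-region synthetic "scene"
noisy = np.where(truth == 1, 2.0, 0.0) + rng.normal(0, 0.8, truth.shape)
seg = icm_segmentation(noisy, means=[0.0, 2.0])
print((seg == truth).mean())                         # fraction of correctly labelled pixels
```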
