  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Jackknife Empirical Likelihood Inference For The Pietra Ratio

Su, Yueju 17 December 2014 (has links)
The Pietra ratio (Pietra index), also known as the Robin Hood index, the Schutz coefficient (Ricci-Schutz index), or half the relative mean deviation, is a good measure of statistical heterogeneity for positive-valued data sets. In this thesis, two novel methods, namely the "adjusted jackknife empirical likelihood" and the "extended jackknife empirical likelihood", are developed from the jackknife empirical likelihood method to obtain interval estimates of the Pietra ratio of a population. The performance of the two novel methods is compared with that of the jackknife empirical likelihood method, the normal approximation method, and two bootstrap methods (the percentile bootstrap and the bias-corrected and accelerated bootstrap). Simulation results indicate that under both symmetric and skewed distributions, especially when the sample is small, the extended jackknife empirical likelihood method performs best among the six methods in terms of the coverage probabilities and lengths of the confidence intervals for the Pietra ratio; when the sample size is over 20, the adjusted jackknife empirical likelihood method outperforms all the other methods except the extended jackknife empirical likelihood method. Furthermore, several real data sets are used to illustrate the proposed methods.
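The pseudo-value construction at the heart of jackknife empirical likelihood can be sketched as follows. This is an illustrative sketch, not the thesis's code: the function names and the lognormal test sample are our own, and the empirical-likelihood calibration step that would follow is omitted.

```python
import numpy as np

def pietra_ratio(x):
    """Pietra ratio: half the relative mean deviation of a positive sample."""
    x = np.asarray(x, dtype=float)
    return np.abs(x - x.mean()).sum() / (2.0 * x.size * x.mean())

def jackknife_pseudo_values(x, stat):
    """Leave-one-out pseudo-values that turn `stat` into a sample mean --
    the starting point of the jackknife empirical likelihood method."""
    x = np.asarray(x, dtype=float)
    n = x.size
    full = stat(x)
    loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    return n * full - (n - 1) * loo

rng = np.random.default_rng(0)
sample = rng.lognormal(size=50)          # a skewed, positive-valued sample
pv = jackknife_pseudo_values(sample, pietra_ratio)
# pv.mean() is the bias-corrected jackknife estimate of the Pietra ratio;
# empirical likelihood is then applied to pv as if it were an i.i.d. sample.
```

The adjusted and extended variants studied in the thesis modify how the empirical likelihood ratio is formed from these pseudo-values, not the pseudo-values themselves.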
12

Stochastic Geometry-based Analysis of LEO Satellite Communication Systems

Talgat, Anna 21 July 2020 (has links)
Wireless coverage has become one of the most significant needs of modern society because of its importance in applications such as health care, distance education, and industry. It is therefore essential to provide wireless coverage worldwide, including remote areas, rural areas, and poorly served locations. Recent advances in Low Earth Orbit (LEO) satellite communications provide a promising solution for such locations. This thesis studies the performance of a multi-level LEO satellite communication system. More precisely, we model the LEO satellites' locations as a Binomial Point Process (BPP) on spherical surfaces at n different altitudes, with Nk satellites at altitude ak for 1 ≤ k ≤ n, and study the resulting distance distributions. The distance distribution is characterized in two categories depending on the location of the observation point: the contact distance and the nearest-neighbor distance. For the proposed model, we study the user coverage probability using tools from stochastic geometry, in a scenario where satellite earth stations (ESs) with two antennas are deployed on the ground: one antenna communicates with the user while the other communicates with a LEO satellite. Additionally, we consider a practical use case in which satellite communication systems are deployed to increase coverage in remote and rural areas. For that purpose, we compare the coverage probability of the satellite-based communication system in such regions with the coverage probability obtained by relying on the nearest anchored base station (ABS), which is usually located far from rural and remote areas.
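The multi-level BPP model and the contact distance can be simulated directly. The sketch below is illustrative only: the satellite counts and altitudes are arbitrary example values, not the constellations analyzed in the thesis, and the analytical distance distributions derived there are not reproduced here.

```python
import numpy as np

R_EARTH = 6371.0  # km

def bpp_on_sphere(n_sat, altitude_km, rng):
    """Drop n_sat satellites uniformly at random on a sphere of radius
    R_EARTH + altitude_km (a binomial point process on the sphere)."""
    r = R_EARTH + altitude_km
    v = rng.normal(size=(n_sat, 3))
    v /= np.linalg.norm(v, axis=1, keepdims=True)   # uniform directions
    return r * v

def contact_distance(observer, satellites):
    """Euclidean distance from the observer to the nearest satellite."""
    return np.linalg.norm(satellites - observer, axis=1).min()

rng = np.random.default_rng(1)
observer = np.array([0.0, 0.0, R_EARTH])            # a point on the Earth's surface
# Two altitude levels with N_k satellites each, as in the multi-level model;
# (720, 550 km) and (360, 1110 km) are made-up example shells.
sats = np.vstack([bpp_on_sphere(nk, ak, rng)
                  for nk, ak in [(720, 550.0), (360, 1110.0)]])
d = contact_distance(observer, sats)
```

Averaging `d` over many realizations gives a Monte Carlo estimate of the contact distance distribution that the thesis characterizes analytically.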
13

Advancing Bechhofer's Ranking Procedures to High-dimensional Variable Selection

Gu, Chao 01 September 2021 (has links)
No description available.
14

Advances in Stochastic Geometry for Cellular Networks

Saha, Chiranjib 24 August 2020 (has links)
The mathematical modeling and performance analysis of cellular networks have seen a major paradigm shift with the application of stochastic geometry. The main purpose of stochastic geometry is to endow probability distributions on the locations of the base stations (BSs) and users in a network, which, in turn, provides an analytical handle on the performance evaluation of cellular networks. To preserve the tractability of analysis, the common practice is to assume complete spatial randomness of the network topology. In other words, the locations of users and BSs are modeled as independent homogeneous Poisson point processes (PPPs). Despite their usefulness, PPP-based network models fail to capture any spatial coupling between the users and BSs, which is dominant in a multi-tier cellular network (also known as a heterogeneous cellular network (HetNet)) consisting of macro and small cells. For instance, users tend to form hotspots or clusters at certain locations, and small cell BSs (SBSs) are deployed at higher densities at these hotspot locations in order to cater to the high data demand. Such user-centric deployments naturally couple the locations of the users and SBSs. Moreover, these spatial couplings are at the heart of the spatial models used in industry for system-level simulations and standardization purposes. This dissertation proposes fundamentally new spatial models based on stochastic geometry which closely emulate these spatial couplings and are conducive to a more realistic and fine-tuned performance analysis, optimization, and design of cellular networks. First, this dissertation proposes a new class of spatial models for HetNets in which the locations of the BSs and users are assumed to be distributed as Poisson cluster processes (PCPs).
From the modeling perspective, the proposed models can capture different spatial couplings in a network topology, such as user hotspots and the user-BS coupling occurring due to the user-centric deployment of the SBSs. The PCP-based model is a generalization of the state-of-the-art PPP-based HetNet model: it reduces to the PPP-based model once all spatial couplings in the network are ignored. From the stochastic geometry perspective, we have made contributions in deriving the fundamental distribution properties of the PCP, such as the distance distributions and sum-product functionals, which are instrumental for characterizing HetNet performance metrics such as coverage and rate. The focus on more refined spatial models for small cells and users brings us to the second direction of the dissertation: the modeling and analysis of HetNets with millimeter wave (mm-wave) integrated access and backhaul (IAB), an emerging design concept of the fifth generation (5G) cellular networks. While the concept of network densification with small cells emerged in the fourth generation (4G) era, small cells can be realistically deployed with IAB, since it solves the problem of high-capacity wired backhaul of SBSs by replacing the last-mile fibers with mm-wave links. We have proposed new stochastic geometry-based models for the performance analysis of IAB-enabled HetNets. Our analysis reveals some interesting system-design insights: (1) IAB HetNets can support a maximum number of users, beyond which the data rate drops below that of a single-tier macro-only network, and (2) there exists a saturation point of SBS density beyond which no rate gain is observed from adding more SBSs. The third and final direction of this dissertation is the combination of machine learning and stochastic geometry to construct a new class of data-driven network models that can be used in the performance optimization and design of a network.
As a concrete example, we investigate the classical problem of wireless link scheduling, where the objective is to choose an optimal subset of simultaneously active transmitters (Tx-s) from a ground set of Tx-s so as to maximize the network-wide sum-rate. Since the optimization problem is NP-hard, we replace the computationally expensive heuristic by inferring the point patterns of the active Tx-s in the optimal subset after training a determinantal point process (DPP). Our investigations demonstrate that the DPP is able to learn the spatial interactions of the Tx-s in the optimal subset and gives a reasonably accurate estimate of the optimal subset for any new ground set of Tx-s. / Doctor of Philosophy / The high-speed global cellular communication network is one of the most important technologies, and it continues to evolve rapidly with every new generation. This evolution greatly depends on observing performance trends of emerging technologies on network models through extensive system-level simulations. Since these simulation models are extremely time-consuming and error-prone, complementary analytical models of cellular networks have long been an area of active research. These analytical models are intended to provide crisp insights into network behavior, such as the dependence of network performance metrics (such as coverage or rate) on key system-level parameters (such as transmission powers and base station (BS) density), which serve as prior knowledge for more fine-tuned simulations. Over the last decade, the analytical modeling of cellular networks has been driven by stochastic geometry. The main purpose of stochastic geometry is to endow the locations of the BSs and users with probability distributions and then leverage the properties of these distributions to average out the spatial randomness.
This process of spatial averaging allows us to derive analytical expressions for the system-level performance metrics despite the presence of a large number of random variables (such as BS and user locations and channel gains), under some reasonable assumptions. The simplest stochastic geometry-based model of cellular networks, which is also the most tractable, is the so-called Poisson point process (PPP) based network model, in which users and BSs are assumed to be distributed as independent homogeneous PPPs. This is equivalent to saying that the users and BSs are scattered independently and uniformly at random over the plane. The PPP-based model turned out to be a reasonably accurate representation of yesteryear's cellular networks, which consisted of a single tier of macro BSs (MBSs) intended to provide a uniform coverage blanket over the region. However, as data-hungry devices like smartphones and tablets, and applications like online gaming, continue to flood the consumer market, the network configuration is rapidly deviating from this baseline setup, with different spatial interactions between BSs and users (also termed spatial coupling) becoming dominant. For instance, user locations are far from homogeneous, as they are concentrated in specific areas like residential and commercial zones (also known as hotspots). Further, the network, previously consisting of a single tier of MBSs, is becoming increasingly heterogeneous with the deployment of small cell BSs (SBSs) with small coverage footprints targeted to serve the user hotspots. It is not difficult to see that a network topology with these spatial couplings is quite far from the complete spatial randomness that underlies the PPP-based models. The key contribution of this dissertation is to enrich stochastic geometry-based mathematical models so that they can capture the fine-grained spatial couplings between the BSs and users.
More specifically, this dissertation contributes to the following three research directions. Direction-I: Modeling Spatial Clustering. We model the locations of users and SBSs forming hotspots as Poisson cluster processes (PCPs). A PCP is a collection of offspring points located around parent points that belong to a PPP. The coupling between the locations of users and SBSs (due to their user-centric deployment) can be introduced by assuming that the user and SBS PCPs share the same parent PPP. The key contribution in this direction is the construction of a general HetNet model with a mixture of PPP- and PCP-distributed BSs and users. Note that the baseline PPP-based HetNet model appears as one of the many configurations supported by this general model. For this general model, we derive analytical expressions for performance metrics like coverage probability, BS load, and rate as functions of the coupling parameters (e.g., BS and user cluster size). Direction-II: Modeling Coupling in Wireless Backhaul Networks. While the deployment of SBSs clearly enhances the network performance in terms of coverage, one might wonder: how long can network densification with tens of thousands of SBSs meet the ever-increasing data demand? It turns out that in the current network setting, where the backhaul links (i.e., the links between the BSs and the core network) are still wired, it is not feasible to densify the network beyond some limit. This backhaul bottleneck can be overcome if the backhaul links also become wireless and the backhaul and access links (the links between users and BSs) are jointly managed by an integrated access and backhaul (IAB) network. In this direction, we develop analytical models of IAB-enabled HetNets, where the key challenge is to tackle the new types of couplings that exist between the rates on the wireless access and backhaul links.
Such couplings exist due to the spatial correlation of the signal qualities of the two links and the numbers of users served by different BSs. Two fundamental insights obtained from this work are as follows: (1) IAB HetNets can support a maximum number of users, beyond which the network performance drops below that of a single-tier macro-only network, and (2) there exists a saturation point of SBS density beyond which no performance gain is observed from adding more SBSs. Direction-III: Modeling Repulsion. In this direction, we focus on another aspect of spatial coupling, imposed by intra-point repulsion. Consider a device-to-device (D2D) communication scenario in which some users transmit on-demand content locally cached in their devices over a common channel. Any reasonable multiple access scheme will ensure that two nearby users are never simultaneously active, as they would cause severe mutual interference and thereby reduce the network-wide sum rate. Thus the active users in the network exhibit spatial repulsion, and their locations can be modeled as determinantal point processes (DPPs). The key property of the DPP is that it forms a bridge between stochastic geometry and machine learning, two otherwise non-overlapping paradigms for wireless network modeling and design. The main focus in this direction is to explore the learning framework of DPPs and bring together the advantages of stochastic geometry and machine learning to construct a new class of data-driven analytical network models.
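The shared-parent PCP construction of Direction-I can be sketched with a Thomas cluster process, where users and SBSs scatter around the same hotspot centres. This is an illustrative sketch under assumed parameters (region size, densities, cluster spread are our own example values), not the dissertation's calibrated model.

```python
import numpy as np

def thomas_cluster_process(parents, mean_offspring, sigma, rng):
    """Scatter a Poisson number of offspring around each parent point with
    isotropic Gaussian displacements -- a Thomas (Poisson cluster) process."""
    points = []
    for p in parents:
        m = rng.poisson(mean_offspring)
        points.append(p + sigma * rng.normal(size=(m, 2)))
    return np.vstack(points) if points else np.empty((0, 2))

rng = np.random.default_rng(2)
# Parent PPP on a 10 km x 10 km region: the hotspot centres.
n_parents = rng.poisson(0.5 * 100.0)                 # density 0.5 per km^2
parents = rng.uniform(0.0, 10.0, size=(n_parents, 2))
# Users and SBSs share the SAME parent process -- this shared parenthood is
# exactly the user-SBS spatial coupling described in Direction-I.
users = thomas_cluster_process(parents, mean_offspring=20, sigma=0.1, rng=rng)
sbss  = thomas_cluster_process(parents, mean_offspring=3,  sigma=0.1, rng=rng)
```

Setting `mean_offspring` per tier and letting some tiers remain PPP-distributed recovers the mixed PPP/PCP HetNet configurations the general model supports.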
15

Analiza energetske efikasnosti isporuke multimedijalnih servisa u mobilnim ćelijskim sistemima četvrte generacije (LTE/LTE-A) / Analysis of Energy Efficient Delivery Multimedia Services in Mobile Cellular System Fourth Generation (LTE/LTE-A)

Rastovac Dragan 16 September 2016 (has links)
<p>U ovoj disertaciji razvijeni su analitički alati za izračunavanje protoka servisa, propusnog opsega i uštede energije zahtevanim u različitim eMBMS LTE/LTE-A servisnim strukturama. Takođe, mi smo analizirali protok podataka i optimalnu dodelu parametara za prenos na fizičkom sloju za eMBMS baziran video servis u 2-klasnoj heterogenoj mreži primenom stohastičke geometrije.</p> / <p>In this dissertation we develop simple analytical tools for evaluating the average service data rates, bandwidth, and energy consumption requirements of different eMBMS LTE/LTE-A service configurations. We also consider a simple approach to estimate achievable rates and optimally assign the physical-layer transmission parameters for an eMBMS-based video service in two-tier heterogeneous cellular systems.</p>
16

Jackknife Empirical Likelihood for the Variance in the Linear Regression Model

Lin, Hui-Ling 25 July 2013 (has links)
The variance measures spread about the center, so estimating it accurately is an important problem. In this paper, we consider the linear regression model, one of the most widely used models in practice, and use the jackknife empirical likelihood method to obtain an interval estimate of the variance in the regression model. The proposed jackknife empirical likelihood ratio converges to the standard chi-squared distribution. A simulation study is carried out to compare the jackknife empirical likelihood method with the standard method in terms of coverage probability and interval length for the confidence interval of the variance in linear regression models; the proposed jackknife empirical likelihood method has better performance. We also illustrate the proposed method using two real data sets.
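The jackknife step for the regression-variance estimator can be sketched by refitting the model with each observation left out. This is an illustrative sketch (simulated data, our own function names; the divisor n rather than n - p is used for simplicity), not the paper's exact procedure.

```python
import numpy as np

def sigma2_hat(X, y):
    """Residual variance estimate from an ordinary least squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid / len(y)

def jackknife_pseudo_values(X, y):
    """Leave-one-out pseudo-values that turn the variance estimator into a
    sample mean, to which empirical likelihood can then be applied."""
    n = len(y)
    full = sigma2_hat(X, y)
    loo = np.array([sigma2_hat(np.delete(X, i, axis=0), np.delete(y, i))
                    for i in range(n)])
    return n * full - (n - 1) * loo

rng = np.random.default_rng(3)
n = 60
X = np.column_stack([np.ones(n), rng.normal(size=n)])   # intercept + one covariate
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)  # true variance 0.25
pv = jackknife_pseudo_values(X, y)
# The empirical likelihood ratio built from pv is calibrated by chi-squared.
```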
17

Prediction of recurrent events

Fredette, Marc January 2004 (has links)
In this thesis, we study issues related to prediction problems, with an emphasis on those arising when recurrent events are involved. The first chapter defines the basic concepts of frequentist and Bayesian statistical prediction. In the second chapter, we study frequentist prediction intervals and their associated predictive distributions, and present an approach based on asymptotically uniform pivotals that is shown to dominate the plug-in approach under certain conditions. The following three chapters consider the prediction of recurrent events. The third chapter presents different prediction models for events that can be modeled using homogeneous Poisson processes; among these models, those using random effects are shown to possess interesting features. In the fourth chapter, the time-homogeneity assumption is relaxed and we present prediction models for non-homogeneous Poisson processes, whose behavior is studied for prediction problems with a finite horizon. In the fifth chapter, we apply these concepts to a warranty dataset from the automobile industry; since the number of processes in this dataset is very large, we focus on methods providing computationally rapid prediction intervals. Finally, we discuss possibilities for future research in the last chapter.
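The baseline setting of the third chapter, recurrent events from a homogeneous Poisson process with a plug-in prediction interval for a future window, can be sketched as follows. This is an illustrative sketch with made-up parameters, showing the plug-in approach that the thesis's pivotal-based methods are designed to improve on.

```python
import numpy as np

def simulate_hpp(rate, horizon, rng):
    """Event times of a homogeneous Poisson process on (0, horizon]."""
    n = rng.poisson(rate * horizon)
    return np.sort(rng.uniform(0.0, horizon, size=n))

def plug_in_prediction_interval(n_observed, t_observed, t_future, level=0.95):
    """Plug-in interval for the number of events in a future window:
    estimate the rate, then take Poisson quantiles at the estimated mean."""
    rate_hat = n_observed / t_observed
    mu = rate_hat * t_future
    # Build the Poisson CDF incrementally: probs[i] = P(N <= i).
    k, cdf, pmf, probs = 0, 0.0, np.exp(-mu), []
    while cdf < 1 - 1e-12 and k < 10000:
        cdf += pmf
        probs.append(cdf)
        pmf *= mu / (k + 1)
        k += 1
    probs = np.array(probs)
    lo = int(np.searchsorted(probs, (1 - level) / 2))
    hi = int(np.searchsorted(probs, 1 - (1 - level) / 2))
    return lo, hi

rng = np.random.default_rng(4)
events = simulate_hpp(rate=2.0, horizon=50.0, rng=rng)
lo, hi = plug_in_prediction_interval(len(events), 50.0, t_future=10.0)
```

The plug-in interval ignores the uncertainty in the estimated rate, which is precisely why it can be dominated by the asymptotically uniform pivotal approach discussed in the second chapter.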
19

Jackknife Empirical Likelihood for the Accelerated Failure Time Model with Censored Data

Bouadoumou, Maxime K 15 July 2011 (has links)
Kendall and Gehan estimating functions are used to estimate the regression parameter in the accelerated failure time (AFT) model with censored observations. The AFT model is an attractive survival analysis method because it maintains a direct relationship between the covariates and the survival time. The jackknife empirical likelihood method overcomes computational difficulty by circumventing the construction of the nonlinear constraint: it turns the statistic of interest into a sample mean based on jackknife pseudo-values, and a U-statistic approach is used to construct confidence intervals for the regression parameter. We conduct a simulation study to compare the Wald-type procedure, the empirical likelihood, and the jackknife empirical likelihood in terms of coverage probability and average length of confidence intervals. The jackknife empirical likelihood method has better performance and overcomes the under-coverage problem of the Wald-type method. A real data set is also used to illustrate the proposed methods.
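The Gehan estimating function, whose root estimates the AFT regression parameter, can be sketched for a single covariate. This is an illustrative sketch, not the thesis's formulation: the simulated data, censoring scheme, and crude grid search are our own, and the jackknife empirical likelihood layer built on top of this U-statistic is omitted.

```python
import numpy as np

def gehan_estimating_function(beta, x, log_y, delta):
    """Gehan rank estimating function for a one-covariate AFT model:
    U(b) = (1/n^2) * sum over uncensored i, all j of
           (x_i - x_j) * 1{e_i <= e_j},  where e_i = log y_i - x_i * b."""
    e = log_y - x * beta
    n = len(x)
    total = 0.0
    for i in range(n):
        if delta[i]:                     # only observed events drive the sum
            total += np.sum((x[i] - x) * (e[i] <= e))
    return total / n**2

# Simulated AFT data: log T = 2*x + noise, with independent right censoring.
rng = np.random.default_rng(5)
n = 200
x = rng.normal(size=n)
log_t = 2.0 * x + rng.normal(scale=0.5, size=n)
log_c = rng.normal(loc=2.0, scale=1.0, size=n)       # censoring times
delta = (log_t <= log_c).astype(int)                 # 1 = event observed
log_y = np.minimum(log_t, log_c)                     # observed (censored) times

# Crude grid search for the root of U(b); a real analysis would root-find.
grid = np.linspace(0.0, 4.0, 201)
est = grid[np.argmin([abs(gehan_estimating_function(b, x, log_y, delta))
                      for b in grid])]
```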
20

Constructing confidence regions for the locations of putative trait loci using data from affected sib-pair designs

Papachristou, Charalampos 24 August 2005 (has links)
No description available.
