41

Teste para avaliar a propriedade de incrementos independentes em um processo pontual / Test to evaluate the property of independent increments in a point process

Francys Andrews de Souza 26 June 2013 (has links)
In econometrics, one topic that has become central over the years is ultra-high-frequency analysis, that is, the analysis of trade-by-trade transactions. It has proved fundamental in modeling the intraday microstructure of the market. Even so, the theory on this subject remains scarce and has been growing slowly. We develop a hypothesis test to verify whether ultra-high-frequency data exhibit independent and stationary increments; establishing this is of great importance in this setting, since many works take that hypothesis as their starting point. In addition, Grimshaw et al. (2005) [6] showed that when a continuous probability distribution is used to model economic data, one generally estimates an increasing intensity function, due to biased results caused by rounding. In our work we therefore use discrete distributions to circumvent this problem entailed by the use of continuous distributions.
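As a minimal illustration of the kind of property being tested (not the thesis's actual test statistic), one can check whether counts of a point process in disjoint, equal-length intervals are uncorrelated: for a homogeneous Poisson process the increments are independent, so the lag-1 sample correlation of bin counts should be near zero. The sampler and the numeric choices below are illustrative assumptions.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplicative method; adequate for small lam.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def bin_counts(rate, n_bins, bin_len, rng):
    # Counts of a homogeneous Poisson process in disjoint bins are
    # i.i.d. Poisson(rate * bin_len) -- the independent-increments property.
    return [sample_poisson(rate * bin_len, rng) for _ in range(n_bins)]

def lag1_correlation(xs):
    # Sample correlation between counts of adjacent bins.
    mean = sum(xs) / len(xs)
    num = sum((xs[i] - mean) * (xs[i + 1] - mean) for i in range(len(xs) - 1))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

rng = random.Random(7)
counts = bin_counts(rate=2.0, n_bins=5000, bin_len=1.0, rng=rng)
rho = lag1_correlation(counts)
```

Under the independent-increments null, |rho| should stay within a few multiples of 1/sqrt(n_bins); a clearly positive rho would instead suggest clustering, e.g. self-excitation.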
42

Green heterogeneous cellular networks

Mugume, Edwin January 2016 (has links)
Data traffic demand has been increasing exponentially and this trend will continue over the foreseeable future. This has forced operators to upgrade and densify their mobile networks to enhance their capacity. Future networks will be characterized by a dense deployment of different kinds of base stations (BSs) in a hierarchical cellular structure. However, network densification requires extensive capital and operational investment, which limits operator revenues and raises ecological concerns over greenhouse gas emissions. Although networks are planned to support peak traffic, traffic demand is actually highly variable in both space and time, which makes it necessary to adapt network energy consumption to inevitable variations in traffic demand. In this thesis, stochastic geometry tools are used to perform simple and tractable analysis of the coverage, rate and energy performance of homogeneous networks and heterogeneous networks (HetNets). BSs in each tier are located according to independent Poisson point processes (PPPs) to generate irregular topologies that fairly resemble practical deployment topologies. The homogeneous network is optimized to determine the optimal BS density and transmit power configuration that minimizes its area power consumption (APC) subject to both coverage and average rate constraints. Results show that the optimal transmit power depends only on the BS power consumption parameters and can be predetermined. Furthermore, various sleep mode mechanisms are applied to the homogeneous network to adapt its APC to changes in user density. A centralized strategic scheme which prioritizes BSs with the least number of users enhances the energy efficiency (EE) of the network. Due to the complexity of such a centralized scheme, a distributed scheme which implements the strategic algorithm within clusters of BSs is proposed, and its performance closely matches that of its centralized counterpart.
It is more challenging to model the optimal deployment configuration per tier in a multi-tier HetNet. Appropriate assumptions are used to determine tight approximations of these deployment configurations that minimize the APC of biased and unbiased HetNets subject to coverage and rate constraints. The optimization is performed for three different user association schemes. Similar to the homogeneous network, the optimal transmit power per tier also depends only on BS power consumption parameters and can also be predetermined. Analysis of the effect of biasing on HetNet performance shows that appropriate biasing can further reduce the deployment configuration (and consequently the APC) compared to an unbiased HetNet. In addition, biasing can be used to offload traffic from congested, high-power macro BSs to low-power small BSs. If idle BSs are put into sleep mode, more energy is saved and HetNet EE improves. Moreover, appropriate biasing also enhances the EE of the HetNet.
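The PPP-based coverage analysis referred to above can be illustrated with a small Monte Carlo sketch (an illustrative stand-in, not the thesis's analytical derivation): base stations are dropped as a homogeneous PPP in a disc around a typical user at the origin, the user attaches to the nearest BS, and coverage is the probability that the SINR exceeds a threshold. The path-loss model, noise level, and the absence of fading are simplifying assumptions.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplicative method; adequate for moderate lam.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def coverage_probability(bs_density, alpha, sinr_threshold, noise,
                         n_trials=2000, window_radius=30.0, seed=1):
    # Nearest-BS association; interference from all other BSs in the disc.
    rng = random.Random(seed)
    covered = 0
    for _ in range(n_trials):
        n_bs = sample_poisson(bs_density * math.pi * window_radius ** 2, rng)
        # Distance of a uniform point in a disc of radius R is R * sqrt(U).
        dists = sorted(window_radius * math.sqrt(rng.random())
                       for _ in range(n_bs))
        if not dists:
            continue  # no BS in the window: count as not covered
        signal = dists[0] ** (-alpha)
        interference = sum(d ** (-alpha) for d in dists[1:])
        if signal / (noise + interference) > sinr_threshold:
            covered += 1
    return covered / n_trials
```

With the same seed, a lower SINR threshold can never yield a lower coverage estimate, which makes a handy sanity check on the simulation.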
43

Důsledky a aplikace věty o reprezentaci Fockova prostoru / Consequences and applications of the Fock space representation theorem

Novotná, Daniela January 2017 (has links)
Consequences and applications of the Fock space representation theorem. Daniela Novotná, Department of Probability and Mathematical Statistics, Faculty of Mathematics and Physics, Charles University. Abstract: In this thesis, we deal with selected applications of the Fock space representation theorem. One of the most important is the covariance identity, which can yield an estimate of the correlation function of a point process having a Papangelou conditional intensity. We used this result to generalise some asymptotic results for Gibbs particle processes. Namely, in combination with Stein's method, we derived bounds for the Wasserstein distance between the standard normal distribution and the distribution of an innovation of a Gibbs particle process. As an application, we present a central limit theorem for a functional of a Gibbs segment process with pair potential.
44

The role of heterogeneity in spatial plant population dynamics

van Waveren, Clara-Sophie 24 October 2016 (has links)
No description available.
45

Asymptotic Analysis of Interference in Cognitive Radio Networks

Yaobin, Wen January 2013 (has links)
The aggregate interference distribution in cognitive radio networks is studied in a rigorous and analytical way using the popular Poisson point process model. While a number of results are available for this model in non-cognitive radio networks, cognitive radio networks present extra levels of difficulty for the analysis, mainly due to the exclusion region around the primary receiver, which is typically addressed via various ad-hoc approximations (e.g., based on the interference cumulants) or via large-deviation analysis. Unlike previous studies, we do not use ad-hoc approximations here but rather obtain the asymptotic interference distribution in a systematic and rigorous way, with a guaranteed level of accuracy at the distribution tail. This is in contrast to the large-deviation analysis, which provides only the (exponential) order of scaling but not the outage probability itself, and to the cumulant-based analysis, which offers no such accuracy guarantee at the tail. Additionally, our analysis provides a number of novel insights. In particular, we demonstrate that there is a critical transition point below which the outage probability decays only polynomially but above which it decays super-exponentially. This provides a solid analytical foundation for earlier empirical observations in the literature and also reveals how outage events typically occur in different regimes. The analysis is further extended to include interference cancelation and fading (from a broad class of distributions). The outage probability is shown to scale down exponentially in the number of canceled nearest interferers in the below-critical region and does not change significantly in the above-critical one. The proposed asymptotic expressions are shown to be accurate in the non-asymptotic regimes as well.
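A toy Monte Carlo version of this setup may help fix ideas (illustrative only; the thesis's contribution is the rigorous asymptotic analysis, not simulation): interferers form a PPP outside an exclusion region of radius r_excl around the primary receiver, and the aggregate interference is the sum of their power-law path-loss contributions. All numeric parameters below are assumptions.

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's multiplicative method; adequate for moderate lam.
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def aggregate_interference(density, r_excl, r_max, alpha, rng):
    # Interferers are a homogeneous PPP restricted to the annulus
    # [r_excl, r_max]: the exclusion region around the receiver is empty.
    area = math.pi * (r_max ** 2 - r_excl ** 2)
    n = sample_poisson(density * area, rng)
    total = 0.0
    for _ in range(n):
        # Radius of a uniform point in the annulus.
        r = math.sqrt(r_excl ** 2 + rng.random() * (r_max ** 2 - r_excl ** 2))
        total += r ** (-alpha)  # unit transmit power, power-law path loss
    return total

def outage_probability(threshold, density, r_excl, r_max, alpha,
                       n_trials=2000, seed=3):
    # Empirical P(aggregate interference > threshold).
    rng = random.Random(seed)
    hits = sum(aggregate_interference(density, r_excl, r_max, alpha, rng)
               > threshold for _ in range(n_trials))
    return hits / n_trials
```

Enlarging the exclusion region strips out the dominant nearby interferers, which is exactly why the tail of the distribution changes character around a critical point in the rigorous analysis.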
46

Stochastic Geometry-based Analysis of LEO Satellite Communication Systems

Talgat, Anna 21 July 2020 (has links)
Wireless coverage has become one of the most significant needs of modern society because of its importance in various applications such as health, distance education, and industry. Therefore, it is essential to provide wireless coverage worldwide, including in remote areas, rural areas, and poorly served locations. Recent advances in Low Earth Orbit (LEO) satellite communications provide a promising solution to address these issues in poorly served locations. This thesis studies the performance of a multi-level LEO satellite communication system. More precisely, we model the LEO satellites' locations as a Binomial Point Process (BPP) on a spherical surface at n different altitudes, where the number of satellites at altitude a_k is N_k, 1 ≤ k ≤ n, and study the distance distribution. The distance distribution is characterized in two categories depending on the location of the observation point: the contact distance and the nearest-neighbor distance. For the proposed model, we study the user coverage probability using tools from stochastic geometry, for a scenario where satellite earth stations (ESs) with two antennas are deployed on the ground, one antenna communicating with the user while the other communicates with an LEO satellite. Additionally, we consider a practical use case where satellite communication systems are deployed to increase coverage in remote and rural areas. For that purpose, we compare the coverage probability of the satellite-based communication system in such regions with the coverage probability when relying on the nearest anchored base station (ABS), which is usually located far from rural and remote areas.
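The contact-distance part of such a model can be explored empirically (a hypothetical illustration, not the thesis's closed-form derivation): place N satellites uniformly on a unit sphere (a BPP with a fixed number of points) and measure the great-circle distance from a reference observation point to the nearest satellite.

```python
import math
import random

def uniform_sphere_point(rng):
    # Uniform point on the unit sphere: z ~ U[-1, 1], azimuth ~ U[0, 2*pi).
    z = 2.0 * rng.random() - 1.0
    phi = 2.0 * math.pi * rng.random()
    s = math.sqrt(max(0.0, 1.0 - z * z))
    return (s * math.cos(phi), s * math.sin(phi), z)

def contact_distance(n_sat, rng):
    # Great-circle distance from the north pole (0, 0, 1) to the nearest
    # of n_sat satellites; for the pole this is simply acos(z).
    best = math.pi
    for _ in range(n_sat):
        _, _, z = uniform_sphere_point(rng)
        best = min(best, math.acos(max(-1.0, min(1.0, z))))
    return best

def mean_contact_distance(n_sat, n_trials=500, seed=5):
    rng = random.Random(seed)
    return sum(contact_distance(n_sat, rng) for _ in range(n_trials)) / n_trials
```

For a BPP of N points, P(contact distance > theta) equals the N-th power of the fraction of the sphere outside the spherical cap of angle theta, so a simulation like this is a quick way to sanity-check the closed-form distance distributions.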
47

Adversarial Attacks and Defense Mechanisms to Improve Robustness of Deep Temporal Point Processes

Khorshidi, Samira 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Temporal point processes (TPPs) are mathematical approaches for modeling asynchronous event sequences by considering the temporal dependency of each event on past events and its instantaneous rate. Temporal point processes can model various problems, from earthquake aftershocks, trade orders, gang violence, and reported crime patterns, to network analysis, infectious disease transmission, and virus spread forecasting. In each of these cases, the entity's behavior, with the corresponding information, is recorded over time as an asynchronous event sequence, and the analysis is done using temporal point processes, which provide a means to define the generative mechanism of the sequence of events and ultimately to predict events and investigate causality. Among point processes, the Hawkes process, a stochastic point process, is able to model a wide range of contagious and self-exciting patterns. One of the Hawkes process's well-known applications is predicting the evolution of viral processes on networks, which is an important problem in biology, the social sciences, and the study of the Internet. In existing works, mean-field analysis based upon degree distribution is used to predict viral spreading across networks of different types. However, it has been shown that degree distribution alone fails to predict the behavior of viruses on some real-world networks. Recent attempts have been made to use assortativity to address this shortcoming. This thesis illustrates how the evolution of such a viral process is sensitive to the underlying network's structure. In Chapter 3, we show that adding assortativity does not fully explain the variance in the spread of viruses for a number of real-world networks. We propose using the graphlet frequency distribution combined with assortativity to explain variations in the evolution of viral processes across networks with identical degree distributions.
Using a data-driven approach, coupling predictive modeling with viral process simulation on real-world networks, we show that simple regression models based on graphlet frequency distribution can explain over 95% of the variance in virality on networks with the same degree distribution but different network topologies. Our results highlight the importance of graphlets and identify a small collection of graphlets that may have the most significant influence over the viral processes on a network. Due to the flexibility and expressiveness of deep learning techniques, several neural network-based approaches have recently shown promise for modeling point process intensities. However, there is a lack of research on possible adversarial attacks and on the robustness of such models with regard to adversarial attacks and natural shocks to systems. Furthermore, while neural point processes may outperform simpler parametric models on in-sample tests, how these models perform when encountering adversarial examples or sharp non-stationary trends remains unknown. In Chapter 4, we propose several white-box and black-box adversarial attacks against deep temporal point processes. Additionally, we investigate the transferability of white-box adversarial attacks against point processes modeled by deep neural networks, which are considered a more elevated risk. Extensive experiments confirm that neural point processes are vulnerable to adversarial attacks. This vulnerability is illustrated both in terms of predictive metrics and in the effect of attacks on the underlying point process's parameters. Notably, adversarial attacks successfully shift the temporal Hawkes process regime from sub-critical to super-critical and manipulate the modeled parameters, which is a particular risk for parametric modeling approaches.
Additionally, we evaluate the vulnerability and performance of these models in the presence of non-stationary abrupt changes, using crime and Covid-19 pandemic datasets as examples. Given the vulnerability of deep-learning models, including deep temporal point processes, to adversarial attacks, it is essential to ensure the robustness of the deployed algorithms, despite the success of deep learning techniques in modeling temporal point processes. In Chapter 5, we study the robustness of deep temporal point processes against several of the proposed adversarial attacks from the adversarial defense viewpoint. Specifically, we investigate the effectiveness of adversarial training using universal adversarial samples in improving the robustness of deep point processes. Additionally, we propose a general point process domain-adopted (GPDA) regularization, which is strictly applicable to temporal point processes, to reduce the effect of adversarial attacks and obtain an empirically robust model. In this approach, unlike other computationally expensive approaches, there is no need for additional back-propagation in the training step, and no further network is required. Ultimately, we propose an adversarial detection framework that is trained in the Generative Adversarial Network (GAN) manner and solely on clean training data. Finally, in Chapter 6, we discuss the implications of the research and future research directions.
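Since Hawkes processes with self-exciting intensities recur throughout this work, a minimal generic simulator by Ogata's thinning method may be useful context (an illustrative sketch, not code from the thesis). With an exponential kernel, the process is sub-critical when the branching ratio alpha/beta < 1; the attacks described above push a fitted model toward the super-critical regime alpha/beta ≥ 1.

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    # Ogata's thinning for a univariate Hawkes process with intensity
    # lambda(t) = mu + sum_i alpha * exp(-beta * (t - t_i)) over past events.
    rng = random.Random(seed)
    events = []
    t = 0.0
    while True:
        # The current intensity upper-bounds lambda until the next candidate,
        # because the exponential kernel only decays between events.
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= t_max:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:
            events.append(t)  # accept candidate with probability lam_t/lam_bar
    return events

events = simulate_hawkes(mu=1.0, alpha=0.5, beta=1.0, t_max=50.0)
```

In the sub-critical regime the long-run event rate is mu / (1 - alpha/beta), so the example above should produce on the order of a hundred events over the horizon; with alpha/beta ≥ 1 the event count would instead explode, which is what makes the regime shift induced by the attacks dangerous.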
48

Resource Allocation and Pricing in Virtual Wireless Networks

Chen, Xin 01 January 2014 (has links) (PDF)
The Internet architecture has proven its success by completely changing people's lives. However, making significant architectural improvements has become extremely difficult, since it requires competing Internet Service Providers to jointly agree. Recently, network virtualization has attracted the attention of many researchers as a solution to this ossification problem. A network virtualization environment allows multiple network architectures to coexist on a shared physical resource. However, most previous research has focused on network virtualization in a wired network environment. It is well known that wireless networks have become one of the main access technologies. Due to the probabilistic nature of the wireless environment, virtualization becomes more challenging. This thesis considers virtualization in wireless networks with a focus on the challenges due to randomness. First, I apply mathematical tools from stochastic geometry to the random system model, with transport capacity as the network performance metric. Then I design an algorithm which allows multiple virtual networks working in a distributed fashion to find a solution such that the aggregate satisfaction of the whole network is maximized. Finally, I propose a new method of charging new users fairly when they ask to enter the system. I measure the cost to the system when a new user with a virtual network request wants to share the resource and demonstrate a simple method for estimating this "price".
49

Hawkes Process Models for Unsupervised Learning on Uncertain Event Data

Haghdan, Maysam January 2017 (has links)
No description available.
50

Spatial Clutter Intensity Estimation for Multitarget Tracking

CHEN, XIN 10 1900 (has links)
In this thesis, the problem of estimating the clutter spatial intensity function for multitarget tracking algorithms is considered. In many scenarios, after the signal detection process, the measurement points provided by the sensor (e.g., sonar, infrared sensor, radar) are not distributed uniformly in the surveillance region, as assumed by most tracking algorithms. On the other hand, in order to obtain accurate results, a multitarget tracking algorithm requires information about the clutter's spatial intensity. Thus, the non-homogeneous clutter spatial intensity has to be estimated from the measurement set and the tracking filter's output. Also, in order to take advantage of existing tracking algorithms, it is desirable for the clutter estimation method to be integrated into the tracker itself. In this thesis, the clutter is modeled by a non-homogeneous Poisson point (NHPP) process with a spatial intensity function g(z). To calculate the clutter spatial intensity, all we need to do is estimate g(z). First, two new methods for joint spatial clutter intensity estimation and multitarget tracking using the Probability Hypothesis Density (PHD) filter are presented. Then, based on the NHPP process, multitarget multi-Bernoulli processes and set calculus, the approximate Bayesian method is extended to joint non-homogeneous clutter background estimation and multitarget tracking with standard multitarget tracking algorithms, such as the Multiple Hypothesis Tracking (MHT) and Joint Integrated Probabilistic Data Association (JIPDA) trackers. Finally, a kernel density method is proposed for the clutter spatial intensity estimation problem. Simulation results illustrate the performance of the above algorithms, both in terms of the number of false tracks and the speed of true track initialization.
All proposed algorithms show the ability to improve the performance of the multitarget tracker in the presence of a slowly time-varying non-homogeneous clutter background. / Doctor of Philosophy (PhD)
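The kernel density idea in the final contribution can be sketched generically (a hypothetical illustration, not the thesis's estimator): smooth the clutter measurements of a scan with a Gaussian kernel, but without the usual 1/n normalization, so that the result integrates over the plane to the observed number of clutter points, as an intensity function should.

```python
import math

def clutter_intensity_kde(points, h):
    # Gaussian-kernel intensity estimate from 2-D clutter measurements:
    # lambda_hat(x, y) = sum_i K_h((x, y) - z_i), with an isotropic
    # Gaussian kernel of bandwidth h. Integrates to len(points), not 1.
    norm = 1.0 / (2.0 * math.pi * h * h)
    def lam(x, y):
        return sum(norm * math.exp(-((x - px) ** 2 + (y - py) ** 2)
                                   / (2.0 * h * h))
                   for px, py in points)
    return lam
```

In a tracker, such an estimate would replace the uniform clutter density inside the association likelihoods of, e.g., a JIPDA-style filter; the bandwidth h trades off responsiveness to clutter structure against smoothness.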
