Large-scale blockages such as buildings affect the performance of urban cellular networks, especially in the millimeter-wave frequency band. Unfortunately, such blockage effects are either neglected or characterized by oversimplified models in the analysis of cellular networks. Leveraging concepts from random shape theory, this paper proposes a mathematical framework to model random blockages and quantifies their effects on the performance of cellular networks. Specifically, random buildings are modeled as a process of rectangles with random sizes and orientations whose centers form a Poisson point process on the plane, known as a Boolean scheme. The number of blockages on a link is proven to be Poisson distributed with a parameter that depends on the link length, which leads to the distribution of the penetration loss over a single link. A path loss model that incorporates blockage effects is proposed and matches experimental trends observed in prior work. The blockage model is then applied, assuming blockages are impenetrable, to analyze their effects on cellular networks in terms of connectivity, coverage probability, and average rate. Analytic results show that while buildings may block the desired signal, they can still have a positive impact on network performance because they also block more interference.
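The following Monte Carlo sketch (not from the thesis; all parameter values are illustrative assumptions) simulates the Boolean scheme described above: rectangle centers drawn from a Poisson point process, rectangles given random sizes and orientations, and the number of rectangles crossing a link of length R counted per realization. If that count is Poisson, its empirical mean and variance should roughly agree.

    import numpy as np
    from shapely.geometry import LineString, box
    from shapely.affinity import rotate, translate

    rng = np.random.default_rng(0)

    def sample_blockage_count(R, lam=1e-4, mean_len=15.0, mean_wid=10.0, win=500.0):
        """Count the rectangles of one Boolean-scheme realization that cross a link of length R."""
        link = LineString([(0.0, 0.0), (R, 0.0)])
        # Poisson number of rectangle centers in a square window around the link.
        area = (2 * win) * (2 * win)
        n = rng.poisson(lam * area)
        xs = rng.uniform(-win + R / 2, win + R / 2, n)
        ys = rng.uniform(-win, win, n)
        count = 0
        for x, y in zip(xs, ys):
            L = rng.exponential(mean_len)   # random length (assumed exponential here)
            W = rng.exponential(mean_wid)   # random width (assumed exponential here)
            theta = rng.uniform(0, 180)     # random orientation in degrees
            rect = translate(rotate(box(-L / 2, -W / 2, L / 2, W / 2), theta), x, y)
            if rect.intersects(link):
                count += 1
        return count

    counts = np.array([sample_blockage_count(R=200.0) for _ in range(500)])
    print("empirical mean:", counts.mean(), "empirical variance:", counts.var())

With the illustrative parameters above, the printed mean and variance should be close to each other, consistent with the Poisson blockage-count result stated in the abstract; longer links yield proportionally larger counts.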
Identifier | oai:union.ndltd.org:UTEXAS/oai:repositories.lib.utexas.edu:2152/21660
Date | 22 October 2013
Creators | Bai, Tianyang
Source Sets | University of Texas
Language | en_US
Detected Language | English
Format | application/pdf