  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Distributionally Robust Performance Analysis: Data, Dependence and Extremes

He, Fei January 2018 (has links)
This dissertation focuses on distributionally robust performance analysis, an area of applied probability whose aim is to quantify the impact of model errors. Stochastic models are built to describe phenomena of interest with the intent of gaining insights or making informed decisions. Typically, however, the fidelity of these models (i.e., how closely they describe the underlying reality) may be compromised by either a lack of available information or tractability considerations. The goal of distributionally robust performance analysis is then to quantify, and potentially mitigate, the impact of errors or model misspecifications. As such, distributionally robust performance analysis affects virtually any area in which stochastic modelling is used for analysis or decision making. This dissertation studies various aspects of distributionally robust performance analysis. For example, we are concerned with quantifying the impact of model error in tail estimation using extreme value theory. We are also concerned with the impact of the dependence structure in risk analysis when the marginal distributions of the risk factors are known. In addition, we are interested in recently found connections between distributionally robust optimization and machine learning and other statistical estimators. The first problem that we consider consists of studying the impact of model specification in the context of extreme quantiles and tail probabilities. There is a rich statistical theory that allows one to extrapolate tail behavior from limited information. This body of theory is known as extreme value theory, and it has been successfully applied to a wide range of settings, including building physical infrastructure to withstand extreme environmental events and guiding the capital requirements of insurance companies to ensure their financial solvency.
Not surprisingly, attempting to extrapolate into the tail of a distribution from limited observations requires imposing assumptions which are impossible to verify. The assumptions imposed in extreme value theory imply that a parametric family of models (known as generalized extreme value distributions) can be used to perform tail estimation. Because such assumptions are so difficult (or impossible) to verify, we use distributionally robust optimization to enhance extreme value statistical analysis. Our approach results in a procedure which can be easily applied in conjunction with standard extreme value analysis, and we show that our estimators enjoy correct coverage even in settings in which the assumptions imposed by extreme value theory fail to hold. In addition to extreme value estimation, which is associated with risk analysis via extreme events, another feature which often plays a role in risk analysis is the dependence structure among risk factors. In the second chapter we study the question of evaluating the worst-case expected cost involving two sources of uncertainty, each with a specified marginal probability distribution. The worst-case expectation is optimized over all joint probability distributions which are consistent with the marginal distributions specified for each source of uncertainty. Thus, our formulation captures the impact of the dependence structure of the risk factors. This formulation is equivalent to the so-called Monge-Kantorovich problem studied in optimal transport theory, whose theoretical properties have been studied extensively in the literature. However, rates of convergence of computational algorithms for this problem have been studied only recently.
We show that if one of the random variables takes finitely many values, a direct Monte Carlo approach allows one to evaluate such a worst-case expectation at an $O(n^{-1/2})$ convergence rate as the number of Monte Carlo samples, $n$, increases to infinity. Next, we continue our investigation of worst-case expectations in the context of multiple risk factors, not just two, assuming that their marginal probability distributions are fixed. This problem does not fit the mold of standard optimal transport (or Monge-Kantorovich) problems. We consider, however, cost functions which are separable in the sense of being a sum of functions which depend on adjacent pairs of risk factors (think of the factors as indexed by time). In this setting, we are able to reduce the problem to the study of several separate Monge-Kantorovich problems. Moreover, we explain how we can even include martingale constraints, which are often natural to consider in settings such as financial applications. While the previous chapters focus on the impact of tail modeling or dependence, the later parts of the dissertation take a broader view by studying decisions which are made based on empirical observations. So, we focus on so-called distributionally robust optimization formulations. We use optimal transport theory to model the degree of distributional uncertainty or model misspecification. Distributionally robust optimization based on optimal transport has been a very active research topic in recent years; our contribution consists of studying how to specify the optimal transport metric in a data-driven way. We explain our procedure in the context of classification, which is of substantial importance in machine learning applications.
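The role of dependence in such worst-case expectations can be illustrated with a toy computation (a classical fact, not the dissertation's method): for a supermodular cost such as c(x, y) = xy, the Hoeffding-Fréchet bounds imply that the worst case over all couplings of two fixed empirical marginals is attained by the comonotone (rank-matched) coupling. A minimal sketch, with function names invented for the example:

```python
def worst_case_product_expectation(xs, ys):
    """Worst-case E[X * Y] over all couplings of the empirical marginals
    of xs and ys. For the supermodular cost c(x, y) = x * y the maximum
    is attained by pairing the samples rank by rank (comonotone coupling),
    by the rearrangement inequality."""
    xs_sorted, ys_sorted = sorted(xs), sorted(ys)
    n = len(xs_sorted)
    return sum(x * y for x, y in zip(xs_sorted, ys_sorted)) / n

def independent_product_expectation(xs, ys):
    """E[X * Y] under the independent coupling, for comparison."""
    mean = lambda v: sum(v) / len(v)
    return mean(xs) * mean(ys)
```

The gap between the two quantities is exactly the contribution of the dependence structure, which is what the chapter's worst-case formulation quantifies in general.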
12

Efficient Modelling and Performance Analysis of Wideband Communication Receivers

Eriksson, Andreas January 2011 (has links)
This thesis deals with symbol error rate (SER) simulation of wireless communications and its application to throughput analysis of ultra-wideband (UWB) systems. The SERs are simulated in C++ using the Monte Carlo method; once some have been calculated, the rest are estimated using a novel extrapolation method. These SER values are very accurate and in this thesis go as low as 1.0e-14. Reaching such low values would otherwise be impossible using the traditional Monte Carlo method because of the very large computation time. The novel extrapolation method, however, can simulate an SER curve in less than 30 seconds. It is assumed that the noise belongs to the generalized Gaussian distribution family, and among these, noise from the normal distribution (Gaussian noise) gives the best result. It is to be noted that Gaussian noise is the most commonly used in digital communication simulations. Although the program is used for throughput analysis of UWB, it could easily be adapted to various signals. In this thesis, throughput analysis means a plot of symbol rate versus distance. For any given symbols, the user can, with a desired minimum SER, generate an extrapolated SER curve and see what symbol rate can be achieved by the system while obeying the power constraints on signals imposed by international laws. The developed program is tested against published theoretical results for the QAM and PSK cases, but can easily be extended to UWB systems.
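The Monte Carlo core of such a simulator is simple to sketch. The snippet below estimates the SER of BPSK over an AWGN (Gaussian-noise) channel; it is an illustrative stand-in for the thesis's C++ simulator, not its extrapolation method, and the function name and parameters are invented for the example.

```python
import math
import random

def bpsk_ser_monte_carlo(snr_db, n_symbols=200_000, seed=1):
    """Estimate the symbol error rate of BPSK over AWGN by Monte Carlo.

    snr_db: Es/N0 in decibels. The exact SER is Q(sqrt(2 * Es/N0)),
    so the estimate can be checked against theory.
    """
    rng = random.Random(seed)
    snr = 10.0 ** (snr_db / 10.0)
    sigma = math.sqrt(1.0 / (2.0 * snr))  # noise std per real dimension
    errors = 0
    for _ in range(n_symbols):
        symbol = rng.choice((-1.0, 1.0))
        received = symbol + rng.gauss(0.0, sigma)
        errors += (received >= 0.0) != (symbol >= 0.0)
    return errors / n_symbols
```

At 0 dB the theoretical SER is Q(sqrt(2)), roughly 0.079, and the estimate lands within Monte Carlo noise of that value. Estimating SERs near 1.0e-14 this way would require on the order of 1e16 symbols, which is exactly the motivation for the extrapolation approach.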
13

Design and Performance Analysis of Efficient Cooperative Wireless Communication Systems

Altubaishi, Essam Saleh 10 August 2012 (has links)
Cooperative communication has recently become a key technology for modern wireless networks such as 3GPP Long Term Evolution and WiMAX, because it can improve the transmission rate, communication reliability, and coverage in a cost-effective manner. It nevertheless faces many design challenges. First, cooperative transmission typically involves a relaying phase which requires extra resources, which may reduce the spectral efficiency. Second, extra control signaling increases the complexity of operation, which may limit practical implementation. In addition, a wireless channel is time-varying, mainly due to multipath propagation. As a result, careful design of efficient cooperative communication systems is required, not only to enhance the spectral efficiency and maintain the quality-of-service (QoS), but also to be practical. In this dissertation, we aim to address the challenges imposed by cooperative communication and wireless transmission, and to design efficient, distributed systems which can be practically implemented in existing wireless networks. The research work is divided into two main topics: 1) adaptive cooperative wireless systems with variable-rate transmission, and 2) cooperative wireless systems with a power consumption constraint. The first topic investigates how the spectral efficiency of cooperative wireless communication systems can be improved while maintaining the QoS in terms of bit error rate and outage probability. The spectral efficiency enhancement is achieved using three techniques: adaptivity over the relay node (i.e., whether the relay node is active), adaptivity over the modulation mode, and relay selection. Based on these, we propose several adaptive cooperative schemes for both the decode-and-forward (DF) and amplify-and-forward (AF) protocols.
To evaluate these schemes, we provide performance analysis in terms of average spectral efficiency, average bit error rate (ABER), and outage probability over Rayleigh fading channels. We start with the single-relay cooperative system using the DF protocol, for which two adaptive cooperative schemes with variable-rate transmission are proposed. The first scheme, called the minimum error rate scheme (MERS), aims to exploit transmit diversity to improve the bit error rate. Trading the multiplexing gain against the diversity gain, we propose the second scheme, called the maximum spectral efficiency scheme (MSES), in which cooperative transmission is avoided whenever it is not beneficial. The MERS improves the ABER significantly and achieves equal or better average spectral efficiency compared to the fixed (i.e., non-adaptive) relaying scheme. In contrast, the MSES provides the best average spectral efficiency due to its ability not only to adapt to the channel variation but also to switch between cooperative and non-cooperative transmission. To further increase the spectral efficiency, we then propose a third scheme, called the variable-rate based relay selection (VRRS) scheme, in which a relay node is selected from among the available relay nodes based on a predefined criterion. Furthermore, we propose two AF adaptive cooperative schemes, mainly to enhance the spectral efficiency. In the first scheme, we introduce a generalized switching policy (GSP) for a single-relay cooperative wireless system that exploits variable-rate transmission and useful cooperative regions. The second scheme, called the AF efficient relay selection (AFERS) scheme, extends the GSP to also employ the relay selection technique. Analytical and simulation results verify that the AFERS scheme outperforms, in terms of average spectral efficiency, not only conventional direct transmission but also AF fixed relaying and the outage-based AF adaptive cooperative scheme.
The second topic investigates fair power consumption among the relay nodes in AF cooperative wireless communication systems. Fairness is defined as achieving equal power consumption across the relay nodes. We focus on how the relay selection process can be controlled in a distributed manner so that the power consumption of the relay nodes can be factored into relay selection. We first introduce a simple closed-form expression for the weight coefficient used to achieve this fairness, which depends only on the local average channel conditions of the relay path. We then derive closed-form expressions for the weighted outage probability and ABER, and show that our proposed strategy not only has lower complexity than the conventional centralized one but also provides better accuracy in distributing the total consumed power equally among the relay nodes without affecting the performance.
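To illustrate the flavor of such analyses with a generic toy model (not one of the schemes proposed above): under i.i.d. Rayleigh fading each relay's end-to-end SNR in a two-hop DF link behaves like the minimum of two exponentials, and selecting the best of K relays gives outage probability (1 - exp(-2*t/mean))^K, which a short simulation can confirm. Function names and parameters are invented for the sketch.

```python
import math
import random

def selection_outage_mc(k_relays, mean_snr, snr_th, trials=100_000, seed=7):
    """Monte Carlo outage probability of best-relay selection over i.i.d.
    Rayleigh fading. Each relay's end-to-end SNR is the minimum of its
    source-relay and relay-destination SNRs, both exponential."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(trials):
        best = max(
            min(rng.expovariate(1.0 / mean_snr), rng.expovariate(1.0 / mean_snr))
            for _ in range(k_relays)
        )
        outages += best < snr_th
    return outages / trials

def selection_outage_exact(k_relays, mean_snr, snr_th):
    """Closed form: the min of two i.i.d. exponentials is exponential with
    twice the rate, and best-of-K selection raises the CDF to the K-th power."""
    return (1.0 - math.exp(-2.0 * snr_th / mean_snr)) ** k_relays
```

The exponent K in the closed form is the diversity order that relay selection buys, which is the kind of behavior the ABER and outage expressions in the dissertation make precise.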
14

A Study on the Development of Kaohsiung toward a Livable City

Tsai, Hsin-yi 03 July 2012 (has links)
This research intends to understand whether Kaohsiung is heading toward or away from being a livable city. Additionally, it intends to show whether developments in the city conform to the expectations of the residents. Therefore, the analysis in this research is based on both objective statistics and the residents' subjective perceptions. The study employed time-series analysis and questionnaires, with importance-performance analysis as the analytical method. The questionnaires targeted residents of Kaohsiung City over 20 years of age; of 330 samples collected, 254 were valid. The questionnaires surveyed the level of livability of Kaohsiung from five aspects: the eco-environment, culture & education, economic development, urban living & service, and medical & social welfare, reflecting the difference between the importance and performance of each aspect. Below are the suggestions concluded from the results of the research, which point out the improvements needed for Kaohsiung and the items that can use less attention: 1. According to the objective time-series statistics, Kaohsiung has shown mostly positive growth in culture and education, especially in holding cultural events and replenishing books in the public libraries; however, the economy has shown negative growth. 2. Based on the importance-performance analysis, out of 23 indicators, 4 (17.39%) fell in the keep-doing area, 4 (17.39%) in the excessive-supply area, 6 (26.09%) in the lower-priority area, and 9 (39.13%) in the improvement-focused area. 3. Combining the questionnaire data and the statistical analysis, the items requiring the most urgent improvement are raising residents' wages, lowering the unemployment rate, and resolving the problem of abuse of children and teenagers.
From both the subjective and objective analyses, the oversupplied items are the frequency of cultural events and the replenishment of books in the public libraries. Based on the results of the research, it is suggested that Kaohsiung direct resources toward the economy and medical & social welfare, while reducing over-investment in culture and education.
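The importance-performance analysis used above can be sketched in a few lines: each indicator is placed in one of four areas by comparing its importance and performance scores against the grand means (a common IPA convention; the area names follow the study's wording, while the indicator names in the test data are made up for illustration).

```python
def ipa_classify(items):
    """Assign each (name, importance, performance) triple to an IPA area:
    keep-doing, excessive-supply, lower-priority, or improvement-focused,
    using the means of all importance and performance scores as crosshairs."""
    imp_mean = sum(imp for _, imp, _ in items) / len(items)
    perf_mean = sum(perf for _, _, perf in items) / len(items)
    areas = {"keep-doing": [], "excessive-supply": [],
             "lower-priority": [], "improvement-focused": []}
    for name, imp, perf in items:
        if imp >= imp_mean:
            area = "keep-doing" if perf >= perf_mean else "improvement-focused"
        else:
            area = "excessive-supply" if perf >= perf_mean else "lower-priority"
        areas[area].append(name)
    return areas
```

An indicator that residents rate as important but poorly delivered (high importance, low performance) lands in the improvement-focused area, which is how items such as wages and unemployment were flagged in the study.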
15

Performance of Hard Handoff in 1xEV-DO Rev. A Systems

Al-Shoukairi, Maher 15 May 2009 (has links)
1x Evolution-Data Optimized Revision A (1xEV-DO Rev. A) is a cellular communications standard that introduces key enhancements to the high-data-rate, packet-switched 1xEV-DO Release 0 standard. The enhancements are driven by the increasing demand from applications that are delay sensitive and require symmetric data rates on the uplink and the downlink; examples of such applications are video telephony and voice over internet protocol (VoIP). The handoff operation is critical for delay-sensitive applications because the mobile station (MS) is not supposed to lose service for long periods of time. Therefore, seamless server selection is used in Rev. A systems. This research analyzes the performance of this handoff technique. A theoretical approach is presented to calculate the slot error probability (SEP). The approach enables evaluating the effects of filtering, hysteresis, and system-introduced delay on handoff execution. Unlike previous works, the model presented in this thesis considers multiple base stations (BSs) and accounts for the correlation of the shadow fading affecting the signal powers received from different BSs. The theoretical results are then verified over ranges of parameters of practical interest using simulations, which are also used to evaluate the packet error rate (PER) and the number of handoffs per second. Results show that the SEP gives a good indication of the PER. Results also show that when practical handoff delays are considered, moderately large filter constants are more efficient than smaller ones.
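The effect of hysteresis on handoff can be sketched with a toy two-BS decision rule (invented for illustration; the thesis's model additionally covers filtering, execution delay, and correlated shadowing): the mobile switches only when the other BS's signal exceeds the serving BS's by the hysteresis margin, which suppresses ping-ponging at the cost of delayed handoff.

```python
def count_handoffs(rss_a, rss_b, hysteresis_db):
    """Count hard handoffs between two base stations given their received
    signal strength traces (in dB), under a hysteresis rule: the mobile
    switches only when the other BS is stronger by at least the margin."""
    serving = 0  # start on BS A
    handoffs = 0
    for sample in zip(rss_a, rss_b):
        other = 1 - serving
        if sample[other] >= sample[serving] + hysteresis_db:
            serving = other
            handoffs += 1
    return handoffs
```

On a trace where the two signals cross briefly, a zero margin produces a switch and an immediate switch back, while a few dB of hysteresis removes the spurious second handoff.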
16

Performance Analysis of Concurrent Search in Mobile Networks

Chen, Hsin-chou 24 July 2004 (has links)
In mobile communications networks, a location management scheme is responsible for tracking mobile users. Typically, a location management scheme consists of a location update scheme and a paging scheme. Gau and Haas first proposed the concurrent search (CS) approach, which can simultaneously locate a number of mobile users in mobile communications networks. We propose to use the theory of discrete-time Markov chains to analyze the performance of the concurrent search approach. In particular, we concentrate on the worst case, in which each mobile user is equally likely to appear in any cell of the network. We analyze the average paging delay, the call blocking probability, and the system size, and we show that our analytical results are consistent with simulation results for concurrent search.
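Discrete-time Markov chain analyses of this kind ultimately reduce to computing distributions of the chain. As a generic illustration (not the paper's specific paging chain), the stationary distribution of an ergodic chain can be approximated by repeatedly applying the transition matrix:

```python
def stationary_distribution(transition, iters=1000):
    """Approximate the stationary distribution of an ergodic discrete-time
    Markov chain by power iteration. `transition` is a row-stochastic
    matrix given as a list of lists."""
    n = len(transition)
    pi = [1.0 / n] * n  # start from the uniform distribution
    for _ in range(iters):
        pi = [sum(pi[i] * transition[i][j] for i in range(n)) for j in range(n)]
    return pi
```

For the two-state chain [[0.9, 0.1], [0.5, 0.5]] the iteration converges to (5/6, 1/6), the unique solution of pi = pi P. Quantities such as average paging delay are then expectations taken under distributions of this kind.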
17

Performance analysis and network path characterization for scalable internet streaming

Kang, Seong-Ryong 10 October 2008 (has links)
Delivering high-quality video to end users over the best-effort Internet is a challenging task, since the quality of streaming video is highly subject to network conditions. A fundamental issue in this area is how real-time applications cope with network dynamics and adapt their operational behavior to offer a favorable streaming environment to end users. As an effort towards providing such a streaming environment, the first half of this work focuses on analyzing the performance of video streaming in best-effort networks and developing a new streaming framework that effectively utilizes the unequal importance of video packets in rate control and achieves near-optimal performance for a given network packet loss rate. In addition, we study error concealment methods such as forward error correction (FEC), which is often used to protect multimedia data over lossy network channels. We investigate the impact of FEC on the quality of video and develop models that provide insights into how the inclusion of FEC affects streaming performance and its optimality and resilience characteristics under dynamically changing network conditions. In the second part of this thesis, we focus on measuring the bandwidth of network paths, which plays an important role in characterizing Internet paths and can benefit many applications, including multimedia streaming. We conduct a stochastic analysis of an end-to-end path and develop novel bandwidth sampling techniques that can produce asymptotically accurate estimates of the capacity and available bandwidth of the path under non-trivial cross-traffic conditions. In addition, we conduct a comparative performance study of existing bandwidth estimation tools in non-simulated networks where various timing irregularities affect delay measurements. We find that when high-precision packet timing is not available due to hardware interrupt moderation, the majority of existing algorithms are not robust enough to measure end-to-end paths with high accuracy.
We overcome this problem by using signal de-noising techniques in bandwidth measurement. We also develop a new measurement tool called PRC-MT, based on theoretical models, that simultaneously measures the capacity and available bandwidth of the tight link with asymptotic accuracy.
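The FEC trade-off discussed above has a simple baseline under i.i.d. packet loss (the thesis studies richer, dynamically changing conditions): an (n, k) erasure code recovers a block whenever at least k of its n packets arrive, so the recovery probability is a binomial tail.

```python
from math import comb

def fec_recovery_prob(n, k, loss_rate):
    """Probability that an (n, k) erasure-coded block is recoverable under
    independent packet loss: at least k of the n packets must arrive."""
    p = 1.0 - loss_rate
    return sum(comb(n, m) * p ** m * (1.0 - p) ** (n - m)
               for m in range(k, n + 1))
```

At 10% loss, for example, sending 3 packets for 2 data packets lifts the block recovery probability from 0.81 (no FEC: both packets must arrive) to 0.972, at the cost of 50% redundancy overhead, which is the kind of quality/overhead trade-off the FEC models quantify.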
18

Performance Analysis of Decode-and-Forward Protocols in Unidirectional and Bidirectional Cooperative Diversity Networks

Liu, Peng 14 September 2009 (has links)
Cooperative communications have the ability to induce spatial diversity, increase channel capacity, and attain broader cell coverage with single-antenna terminals. This thesis focuses on the performance of both unidirectional and bidirectional cooperative diversity networks employing the decode-and-forward (DF) protocol. For the unidirectional cooperative diversity network, we study the average bit error rate (BER) performance of a DF protocol with maximum-likelihood (ML) detection. Closed-form approximate average BER expressions involving only elementary functions are presented for a cooperative diversity network with one or two relays. The proposed BER expressions are valid for both coherent and non-coherent binary signalling. Monte Carlo simulations verify that the proposed BER expressions are extremely accurate over the whole signal-to-noise ratio (SNR) range. For the bidirectional cooperative diversity network, we study and compare the performance of three typical bidirectional communication protocols based on decode-and-forward relaying: time division broadcast (TDBC), physical-layer network coding (PNC), and opportunistic source selection (OSS). Specifically, we derive an exact outage probability in single-integral form for the TDBC protocol, and exact closed-form outage probabilities for the PNC and OSS protocols. For the TDBC protocol, we also derive extremely tight closed-form upper and lower bounds on the outage probability. Moreover, the asymptotic outage probability performance of each protocol is studied. Finally, we study the diversity-multiplexing tradeoff (DMT) performance of each protocol in both the finite and infinite SNR regimes. The performance analysis presented in this thesis can serve as a useful tool to guide practical system designs for both unidirectional and bidirectional cooperative diversity networks. / Thesis (Master, Electrical & Computer Engineering) -- Queen's University, 2009-09-12
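The outage computations referenced above build on a standard Rayleigh-fading primitive, shown here for a direct link only (the TDBC/PNC/OSS expressions in the thesis are more involved): with exponentially distributed SNR, outage at a target spectral efficiency R has a simple closed form.

```python
import math

def rayleigh_outage(mean_snr, rate):
    """Outage probability of a Rayleigh-faded link at target spectral
    efficiency `rate` (bits/s/Hz): P[log2(1 + SNR) < rate], with SNR
    exponentially distributed with mean `mean_snr` (linear scale)."""
    threshold = 2.0 ** rate - 1.0
    return 1.0 - math.exp(-threshold / mean_snr)
```

For example, at mean SNR 100 (20 dB) and R = 1 bit/s/Hz this gives 1 - exp(-0.01), roughly 1e-2. For large mean SNR the expression decays like 1/mean_snr, i.e., diversity order one, which is precisely what relaying protocols such as TDBC and PNC are designed to improve.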
19

Performance of dual hop relay systems with imperfect CSI

Soysa, Madushanka Dinesh Unknown Date
No description available.
20

Performance Analysis of Distributed MAC Protocols for Wireless Networks

Ling, Xinhua 01 May 2007 (has links)
How to improve radio resource utilization and provide better quality-of-service (QoS) is an everlasting challenge for the designers of wireless networks. As an indispensable element of the solution to this task, medium access control (MAC) protocols coordinate the stations and resolve channel access contentions so that the scarce radio resources are shared fairly and efficiently among the participating users. For a given physical layer, a properly designed MAC protocol is the key to desired system performance and directly affects the QoS perceived by end users. Distributed random access protocols are widely used MAC protocols in both infrastructure-based and infrastructureless wireless networks. To understand the characteristics of these protocols, enormous effort has been devoted in the literature to studying their performance by means of analytical modeling. However, the existing approaches are too inflexible to adapt to different protocol variants and traffic situations, due to either many unrealistic assumptions or high complexity. In this thesis, we propose a simple and scalable generic performance analysis framework for a family of carrier sense multiple access with collision avoidance (CSMA/CA) based distributed MAC protocols, regardless of the detailed backoff and channel access policies, with fewer and more realistic assumptions. It provides a systematic approach to the performance study and comparison of diverse MAC protocols in various situations. Developed from the viewpoint of a tagged station, the proposed framework focuses on modeling the backoff and channel access behavior of an individual station. A set of fixed-point equations is obtained based on a novel three-level renewal process concept, which leads to the fundamental MAC performance metric, the average frame service time. From this result, the important network saturation throughput is then obtained straightforwardly.
This distinctive approach makes the proposed analytical framework unified for both saturated and unsaturated stations. The framework is successfully applied to study and compare the performance of three representative distributed MAC protocols in a network with homogeneous service: the legacy p-persistent CSMA/CA protocol, the IEEE 802.15.4 contention access period MAC protocol, and the IEEE 802.11 distributed coordination function. It also extends naturally to the study of the effects of three prevalent mechanisms for prioritized channel access in a network with service differentiation. In particular, the novel concepts of "virtual backoff event" and "pre-backoff waiting periods" greatly simplify the analysis of the arbitration interframe space mechanism, the most challenging of the three, as shown by previous works in the literature. Comparison with comprehensive simulations shows that the proposed analytical framework provides accurate performance predictions for a broad range of numbers of stations. The results obtained provide many helpful insights into how to improve the performance of current protocols and design better new ones.
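The fixed-point flavor of such analyses can be illustrated with the well-known Bianchi-style saturation model (a different, simpler model than the three-level renewal framework above): each station transmits in a slot with probability tau, suffers a collision with probability p = 1 - (1 - tau)^(n-1), and tau is itself a function of p through the backoff rules, so the two are solved jointly. A sketch, with default parameters chosen for illustration:

```python
def csma_fixed_point(n_stations, w_min=16, max_stage=5, tol=1e-12):
    """Solve the saturated CSMA/CA fixed point by bisection.

    tau(p) = 2 / (1 + W + p * W * sum_{k < m} (2p)^k) is decreasing in p,
    and p(tau) = 1 - (1 - tau)^(n-1) is increasing in tau, so
    g(tau) = tau(p(tau)) - tau has exactly one root in (0, 1).
    """
    def tau_of(p):
        s = sum((2.0 * p) ** k for k in range(max_stage))
        return 2.0 / (1.0 + w_min + p * w_min * s)

    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        tau = (lo + hi) / 2.0
        p = 1.0 - (1.0 - tau) ** (n_stations - 1)
        if tau_of(p) > tau:
            lo = tau
        else:
            hi = tau
    return tau, p
```

With a single station there are no collisions and tau reduces to 2/(W+1); as the number of stations grows, tau shrinks and the conditional collision probability p rises. Saturation throughput then follows from (tau, p) by accounting for slot, success, and collision durations.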
