11 |
Ανάλυση της απόδοσης του Forward Error Correction σε κινητά δίκτυα επικοινωνιών τεχνολογίας Long Term Evolution / Analysis of the performance of Forward Error Correction in Long Term Evolution mobile communication networks
Κανάκης, Νικόλαος, 06 October 2011
Long Term Evolution (LTE) is the name given to a project within the Third Generation Partnership Project (3GPP) to improve the UMTS 3G mobile system standard to cope with future requirements. Goals include improving efficiency, lowering cost, reducing complexity and improving services, making use of new spectrum opportunities. Expectations for LTE are high and are based on the premise of fault-free performance. Therefore, LTE's initial success will be determined by the ability of handset and radio infrastructure manufacturers to deliver products that conform to 3GPP standards and are robust enough to allow operators to introduce improved services.
A key aspect of the LTE specifications is the enhancement of Multimedia Broadcast/Multicast Services (MBMS), where the same content is transmitted to multiple users located in a specific service area. MBMS was first standardized as a new feature in 3GPP Release 6, in order to broadcast and multicast multimedia content to mobile terminals via MBMS radio bearers. MBMS provides two different delivery methods: the download delivery method and the streaming delivery method. To support efficient download and streaming delivery, 3GPP has included Application Layer Forward Error Correction (AL-FEC) in the MBMS standard.
FEC is a method of error control for data transmission that is used to augment or replace other reliability methods. In FEC, the sender introduces redundant information into the transmitted data. This information allows the receiver to reconstruct the source data. Such schemes inevitably add a constant overhead to the transmitted data and are computationally expensive. In multicast protocols, however, the use of FEC techniques has very strong motivations. The encoding eliminates the effect of independent losses at different receivers, which makes these schemes able to scale irrespective of the actual loss pattern at each receiver. Additionally, the dramatic reduction in the packet loss rate largely reduces the need for retransmission of lost data from the sender. FEC schemes are therefore well placed to meet a prime objective of mobile multicast services, which is scalability to applications with thousands of receivers. This is the reason why 3GPP recommends the use of AL-FEC for MBMS and, more specifically, adopts the use of the systematic Raptor code.
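To illustrate the erasure-coding argument above, the following sketch is a simplified, idealized model (not the actual systematic Raptor code specified by 3GPP): it computes the probability that a receiver fails to recover a source block protected by a (k source, k + overhead total) erasure code when packets are lost independently with probability p, assuming an ideal code that recovers the block from any k received packets.

```python
from math import comb

def block_failure_prob(k: int, overhead: int, p: float) -> float:
    """Probability that fewer than k of n = k + overhead packets arrive,
    assuming independent packet losses with probability p (idealized
    fountain-code behaviour; real Raptor codes need a few extra symbols)."""
    n = k + overhead
    # P(received < k) = sum over r = 0..k-1 of C(n, r) * (1-p)^r * p^(n-r)
    return sum(comb(n, r) * (1 - p) ** r * p ** (n - r) for r in range(k))

if __name__ == "__main__":
    k, p = 100, 0.05  # 100 source packets, 5 % packet loss (assumed values)
    for overhead in (0, 5, 10, 20):
        print(f"overhead {overhead:>2} packets -> "
              f"failure probability {block_failure_prob(k, overhead, p):.2e}")
```

Even a modest amount of redundancy drives the residual block-loss rate down sharply, which is the scalability argument made above for multicast delivery.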
Taking the above into account, the objective of this master's thesis is the study of AL-FEC performance in MBMS, focusing on the streaming delivery method.
In this direction, we study the application of AL-FEC for the streaming delivery method over LTE networks. We investigate how the amount of FEC overhead varies at multiple receivers using realistic simulation scenarios, and we examine the performance of AL-FEC considering different cell deployments, user mobility models and FEC encoding parameters.
|
12 |
High Slew-Rate Adaptive Biasing Hybrid Envelope Tracking Supply Modulator for LTE Applications
January 2017
As wireless communication enters the smartphone era, more complicated communication technologies are being used to transmit at higher data rates. The power amplifier (PA) has to work in the back-off region, which inevitably reduces battery life for cellphones. Various techniques have been reported to increase PA efficiency, such as envelope elimination and restoration (EER) and envelope tracking (ET). However, state-of-the-art ET supply modulators have failed to address high efficiency, high slew rate, and accurate tracking concurrently.
In this dissertation, a linear-switch-mode hybrid ET supply modulator is presented, utilizing an adaptive-biasing, gain-enhanced current-mirror operational transconductance amplifier (OTA) with a class-AB output stage in parallel with a switching regulator. In comparison to a conventional OTA design with similar quiescent current consumption, the proposed approach improves the positive and negative slew rates from 50 V/µs to 93.4 V/µs and from -87 V/µs to -152.5 V/µs respectively, and the dc gain from 45 dB to 67 dB, while consuming the same amount of quiescent current. The proposed hybrid supply modulator achieves 83% peak efficiency and a power-added efficiency (PAE) of 42.3% at 26.2 dBm for a 10 MHz, 7.24 dB peak-to-average power ratio (PAPR) LTE signal, and improves PAE by 8% at 6 dB back-off from 26.2 dBm PA output power with respect to a fixed supply. With a 10 MHz, 7.24 dB PAPR QPSK LTE signal, the ET PA system achieves an adjacent channel leakage ratio (ACLR) of -37.7 dBc and an error vector magnitude (EVM) of 4.5% at 26.2 dBm PA output power, while with a 10 MHz, 8.15 dB PAPR 64QAM LTE signal it achieves an ACLR of -35.6 dBc and an EVM of 6% at 26 dBm PA output power, without digital pre-distortion (DPD). The proposed supply modulator core circuit occupies 1.1 mm2 of die area and is fabricated in a 0.18 µm CMOS technology. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2017
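To make the PAPR and slew-rate figures above more concrete, the sketch below is an illustrative model (not the measurement setup of the dissertation): it generates a baseband OFDM-like signal of roughly 10 MHz bandwidth, computes its PAPR, and estimates the peak slew rate an envelope-tracking supply would have to follow. The subcarrier count, sampling rate and supply-voltage mapping are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 30.72e6          # sampling rate (assumed, typical for a 10 MHz LTE carrier)
n_sc = 600            # occupied subcarriers (assumed)
n_fft = 2048
n_symbols = 50

# Random QPSK on the occupied subcarriers, OFDM modulation via IFFT.
bits = rng.integers(0, 4, size=(n_symbols, n_sc))
qpsk = np.exp(1j * (np.pi / 4 + np.pi / 2 * bits))
grid = np.zeros((n_symbols, n_fft), complex)
grid[:, 1:n_sc // 2 + 1] = qpsk[:, :n_sc // 2]
grid[:, -n_sc // 2:] = qpsk[:, n_sc // 2:]
x = np.fft.ifft(grid, axis=1).ravel()

papr_db = 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

# Map the envelope to a supply voltage between 0.5 V and 4.5 V (assumed),
# then estimate the peak slew rate the modulator must deliver.
env = np.abs(x) / np.max(np.abs(x))
v_supply = 0.5 + 4.0 * env
slew = np.max(np.abs(np.diff(v_supply))) * fs   # V/s
print(f"PAPR ~ {papr_db:.1f} dB, peak envelope slew rate ~ {slew / 1e6:.0f} V/us")
```

The high PAPR of OFDM is what forces the supply to swing quickly between low and high voltages, which is why the slew-rate improvement reported above matters.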
|
13 |
Conception de systèmes multi-antennaires pour techniques de diversité et MIMO : application aux petits objets nomades communicants / Design of multi-antenna systems for diversity and MIMO techniques: application to small communicating nomadic devices
Dioum, Ibra, 12 December 2013
The demand for ever higher data-rate transmissions keeps growing with the rise of new services in wireless communication networks. One way to meet this demand is to increase the transmission capacity of the radio-frequency channel between the base station and the handheld terminal. This can be achieved by increasing the number of radiating elements involved in transmission and reception over this radio link: this is known as MIMO (Multiple Input, Multiple Output). This thesis focuses mainly on the design, optimization and characterization of multi-antenna systems for diversity and MIMO techniques in the LTE (Long Term Evolution) bands. Three multi-band prototypes are proposed, including two planar systems and one system of compact IFA antennas. New multi-band solutions and the influence of the antenna position on the ground plane are studied in order to achieve spatial, polarization and radiation-pattern diversity with low correlation between the signals received on each antenna and, above all, good total efficiency. A neutralization line is used to isolate the antennas, and multi-band operation is achieved. The input impedance of the antennas is studied with the Youla & Carlin method in order to improve the bandwidth of the compact IFA-type structure. The diversity and MIMO performance of these systems is evaluated in different propagation environments, showing that these systems can be used effectively for diversity and MIMO applications.
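The low correlation targeted above is usually quantified by the envelope correlation coefficient (ECC) between the two antennas. A commonly used approximation (valid for lossless antennas) computes it directly from the S-parameters of the two-port antenna system; the sketch below uses that well-known formula with hypothetical S-parameter values, not measurements from the thesis.

```python
import numpy as np

def envelope_correlation(s11: complex, s12: complex,
                         s21: complex, s22: complex) -> float:
    """Envelope correlation coefficient of a two-antenna system estimated
    from S-parameters (lossless-antenna approximation)."""
    num = abs(np.conj(s11) * s12 + np.conj(s21) * s22) ** 2
    den = ((1 - abs(s11) ** 2 - abs(s21) ** 2) *
           (1 - abs(s22) ** 2 - abs(s12) ** 2))
    return num / den

# Example values (hypothetical, e.g. as read from a measured Touchstone file).
ecc = envelope_correlation(0.2 * np.exp(1j * 0.5), 0.1 * np.exp(-1j * 1.0),
                           0.1 * np.exp(-1j * 1.0), 0.25 * np.exp(1j * 0.3))
print(f"ECC ~ {ecc:.3f}  (values below ~0.5 are usually considered adequate)")
```

A neutralization line, as used in the thesis, reduces |S21| and hence both the coupling losses and the numerator of this expression.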
|
14 |
Controle de potência e estratégias de eficiência energética para comunicações D2D subjacentes a redes celulares / Power control and energy efficiency strategies for D2D communications underlying cellular networks
Yuri Victor Lima de Melo, 14 July 2015
Ericsson Brasil / Conselho Nacional de Desenvolvimento Científico e Tecnológico
In a world where people count on their smartphone, smartwatch, tablet and other devices to keep them connected wherever they go, they expect their applications to run without problems such as dropped calls, slow downloads and choppy videos.
In this context, Device-to-Device (D2D) communication represents a promising technology, because it is a direct, low-power communication between nearby devices that allows traffic to be offloaded from the mobile network and the spectral and power efficiency to be increased. From the subscriber's point of view, D2D means using applications without problems and increasing the battery life of the handset. However, in order to realize the potential gains of D2D communications, some key issues must be tackled, because D2D communications may increase the co-channel interference and compromise the link quality of cellular communications.
This master's thesis focuses on Radio Resource Management (RRM) techniques, especially Power Control (PC) schemes, to mitigate the co-channel interference for D2D communications underlaying a Long Term Evolution (LTE) network, aiming at the reduction of intra- and inter-cell interference and at the improvement of energy efficiency. The main PC schemes (e.g. OLPC, CLPC and SDPC) and a hybrid scheme (CLSD) are calibrated and used in macro- or micro-multicell scenarios, using different loads and perfect or imperfect Channel State Information (CSI).
In addition, the impact of downtilt is analyzed, which is used to adjust the coverage radius of an Evolved Node B (eNB) and reduce co-channel interference by increasing cell isolation.
The numerical results indicate that PC schemes and downtilt, duly calibrated, can provide gains to cellular and D2D communications. In other words, D2D technology can be used to further increase the spectral and energy efficiency if RRM algorithms are used suitably.
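Among the PC schemes named above, open-loop power control (OLPC) in LTE follows a fractional path-loss-compensation rule of the form P = min(Pmax, P0 + 10·log10(M) + α·PL). The sketch below applies a simplified version of that rule to a D2D transmitter; the parameter values (P0, α, bandwidth) are illustrative assumptions, not those calibrated in the thesis.

```python
import math

def olpc_tx_power_dbm(path_loss_db: float, n_rb: int = 1,
                      p0_dbm: float = -80.0, alpha: float = 0.8,
                      p_max_dbm: float = 23.0) -> float:
    """Fractional open-loop power control (simplified LTE form): compensate a
    fraction alpha of the estimated path loss, capped at the maximum power."""
    return min(p_max_dbm, p0_dbm + 10 * math.log10(n_rb) + alpha * path_loss_db)

for pl_db in (80, 100, 120, 140):   # path loss between the D2D pair, in dB
    print(f"PL = {pl_db:3d} dB -> Tx power = {olpc_tx_power_dbm(pl_db):6.1f} dBm")
```

With α < 1 the transmitter only partially compensates the path loss, which limits the interference that a distant D2D pair injects into the cellular layer while still protecting nearby links.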
|
15 |
Análise de modelos de predição de perdas de propagação em redes de comunicações LTE e LTE-Advanced usando técnicas de inteligência artificial / Analysis of propagation loss prediction models in LTE and LTE-Advanced communication networks using artificial intelligence techniques
Cavalcanti, Bruno Jácome, 20 October 2017
The proper functioning of 3rd and 4th generation wireless systems requires, among other things, knowledge of the numerical values of the predicted propagation losses of signals propagating in urban, suburban and rural environments. Therefore, the study of propagation conditions in a given environment will always be a concern of design engineers. The analysis and development of robust propagation loss prediction models for Long Term Evolution (LTE) and Long Term Evolution Advanced (LTE-A) communication networks using Artificial Intelligence techniques is carried out in this work. The methodology was applied to improve the prediction of the empirical propagation loss models SUI, ECC-33, Ericsson 9999 and TR 36.942, and of the Free Space model, applied to LTE and LTE-A networks at the frequencies of 800 MHz, 1800 MHz and 2600 MHz, for suburban environments in mid-sized cities in northeastern Brazil. Thus, this thesis proposes two Artificial Neural Network (ANN) models: (i) a neural network model with error-based inputs (RNBE), which uses the error between measured and simulated data as the main input to the network, and (ii) a neural network model with terrain-based inputs (RNBT). The performance of these models was compared with the propagation models considered in the work and also with versions optimized using Genetic Algorithms (GA) and the Least Square Method (LMS).
Comparisons were also made with measured values, obtained from a measurement campaign carried out in the city of Natal, state of Rio Grande do Norte. The final results obtained through simulations and measurements showed good metric agreement, with emphasis on the performance of the RNBE model. The main contribution of this thesis is that, by using these techniques, which make more efficient use of empirical propagation models, realistic propagation signals can be estimated, avoiding errors in the planning and implementation of LTE and LTE-A wireless networks in suburban areas.
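A minimal sketch of the error-based idea behind the RNBE model described above, under stated assumptions: an empirical model (here a simple free-space form; the thesis uses SUI, ECC-33, Ericsson 9999, TR 36.942 and the Free Space model) predicts the path loss, a small neural network is trained on the residual error against measurements, and the corrected prediction is the empirical model plus the learned correction. The synthetic "measurements", input features and network sizing are illustrative only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

def free_space_loss_db(d_km, f_mhz):
    """Free-space path loss in dB (d in km, f in MHz)."""
    return 32.44 + 20 * np.log10(d_km) + 20 * np.log10(f_mhz)

# Synthetic drive-test data at 1800 MHz: the "true" environment adds extra,
# distance-dependent clutter loss plus shadowing that the empirical model misses.
d = rng.uniform(0.1, 5.0, 500)                       # distance in km
measured = (free_space_loss_db(d, 1800.0)
            + 12 * np.log10(d + 1) + 8               # unmodelled clutter loss (assumed)
            + rng.normal(0, 3, d.size))              # shadow fading (assumed)

model = free_space_loss_db(d, 1800.0)
error = measured - model                             # RNBE-style target: the residual

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(d.reshape(-1, 1), error)                     # input: distance only (assumption)

corrected = model + net.predict(d.reshape(-1, 1))
rmse_model = np.sqrt(np.mean((measured - model) ** 2))
rmse_corr = np.sqrt(np.mean((measured - corrected) ** 2))
print(f"RMSE empirical model: {rmse_model:.1f} dB, with ANN correction: {rmse_corr:.1f} dB")
```

The same pattern applies to any of the empirical models listed above: the network only has to learn the model's error, which is usually a much easier function than the path loss itself.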
|
16 |
The Combined Effect Of Reduced Feedback, Frequency-Domain Scheduling, And Multiple Antenna Techniques On The Performance Of LTE
Donthi, Sushruth N, 04 1900
Frequency-domain scheduling, multiple antenna techniques, and rate adaptation enable next generation orthogonal frequency division multiple access (OFDMA) cellular systems such as Long Term Evolution (LTE) to achieve significantly higher downlink spectral efficiencies. However, this comes at the expense of increased feedback overhead on the uplink. LTE uses a pragmatic combination of several techniques to reduce the channel state feedback required by a frequency-domain scheduler.
In the subband-level feedback scheme specified in LTE, the user reduces feedback by reporting only the channel quality indicator (CQI) computed over groups of resource blocks called subbands. LTE also specifies an alternative user-selected subband feedback scheme, in which the feedback overhead is reduced even further by making each user feed back the indices of the best M subbands and only one CQI value averaged over those M subbands. The coarse frequency granularity of the feedback in the above schemes leads to an occasional incorrect determination of the rate by the scheduler for some resource blocks. The overall throughput of LTE depends on the method used to generate the CQI and on the statistics of the channel, which depend on the multiple antenna technique used.
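A small sketch of the user-selected subband feedback described above; this is a simplified model of the LTE scheme, with assumed subband SNRs and an assumed SNR-to-CQI mapping: the user picks the best M subbands and reports their indices together with a single CQI averaged over those subbands.

```python
import numpy as np

def user_selected_feedback(subband_snr_db: np.ndarray, m: int):
    """Return the indices of the best M subbands and one CQI value averaged
    over them (toy mapping: one CQI step per roughly 2 dB of SNR)."""
    best = np.argsort(subband_snr_db)[-m:][::-1]            # best M subbands
    avg_snr = 10 * np.log10(np.mean(10 ** (subband_snr_db[best] / 10)))
    cqi = int(np.clip(round((avg_snr + 6) / 2), 1, 15))     # assumed mapping
    return best, cqi

rng = np.random.default_rng(0)
snr = rng.normal(12, 6, size=13)          # per-subband SNRs (assumed count and values)
indices, cqi = user_selected_feedback(snr, m=3)
print("reported subbands:", indices.tolist(), " single CQI:", cqi)
```

Because only one averaged CQI is reported for the selected subbands, the scheduler occasionally over- or under-estimates the rate a given resource block can support, which is exactly the effect the analysis quantifies.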
In this thesis, we develop closed-form expressions for the throughput achieved by the user-selected and subband-level CQI feedback schemes of LTE. The comprehensive analysis quantifies the joint effects of four critical components on the overall system throughput, namely the scheduler, the multiple antenna mode, the CQI feedback scheme, and the CQI generation method. The performance of a wide range of schedulers, namely round robin, greedy, and proportional fair schedulers, and of several multiple antenna diversity modes, such as receive antenna diversity and open- and closed-loop transmit diversity, is analyzed. The analysis clearly brings out the dependence of the overall system throughput on important parameters such as the number of resource blocks per subband and the rate adaptation thresholds. The effect of the coarse subband-level frequency granularity of feedback is explicitly captured. The analysis provides an independent theoretical reference and a quick system parameter optimization tool to an LTE system designer. It also helps us theoretically understand the behavior of OFDMA feedback reduction techniques when operated under practical system constraints.
Another contribution of this thesis is a new statistical model for the effective exponential SNR mapping (EESM), which is a highly non-linear mapping that is widely used in the design, analysis, and simulation of OFDMA systems. The statistical model is shown to be both accurate and analytically tractable, and plays a crucial role in facilitating the analysis of the throughput of LTE when EESM is used to generate the CQI.
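For reference, the exponential effective SNR mapping mentioned above compresses the per-resource-block SNRs of a frequency-selective channel into a single effective SNR, SNR_eff = -β ln((1/N) Σ exp(-SNR_i/β)), where β is calibrated per modulation and coding scheme. The sketch below implements this standard formula; the β value and SNR samples are illustrative assumptions.

```python
import numpy as np

def eesm_effective_snr_db(snr_db: np.ndarray, beta: float) -> float:
    """Exponential Effective SNR Mapping over a set of per-RB SNRs."""
    snr_lin = 10 ** (snr_db / 10)
    eff = -beta * np.log(np.mean(np.exp(-snr_lin / beta)))
    return 10 * np.log10(eff)

rng = np.random.default_rng(2)
snr_rb = 10 * np.log10(rng.exponential(10.0, size=50))   # Rayleigh-faded RBs, ~10 dB mean
print(f"mean SNR : {10 * np.log10(np.mean(10 ** (snr_rb / 10))):.1f} dB")
print(f"EESM SNR : {eesm_effective_snr_db(snr_rb, beta=4.0):.1f} dB  (beta assumed)")
```

The non-linear averaging penalizes weak resource blocks much more than a plain mean would, which is why a tractable statistical model for this mapping is needed when the CQI is generated from it.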
|
17 |
Development of an antenna system for a relay-based wireless network
Petropoulos, Ioannis, January 2012
The proliferation of modern wireless networks increases the demand for high capacity and throughput in order to provide faster, more robust, efficient and broadband services to end users. Mobile WiMAX and LTE are examples of such networks, which in some cases have exhibited limited connectivity due to harsh environments. Relay stations are preferred to overcome problems of weak or no access for such network devices; they are placed in specific positions to maintain high-quality data transfer at low cost and provide the required connectivity anywhere, anytime. These stations should be equipped with an antenna system capable of establishing communication between the base station (backhaul link) and end users (access link).

This thesis focuses on the design and development of a new antenna system that is suitable for a relay-based wireless network. Planar geometries of microstrip patch antennas are utilized. The antenna system comprises two antenna modules: a new design of a single antenna for the access link and a new design of an antenna array for backhaul link realization. Both antenna specifications are compatible with the IEEE 802.16j protocol standard. The relay station should be capable of pointing its radiation pattern towards the base station antenna; thus, to achieve the desired radiation pattern of the relay station, a new beam-forming module is proposed, designed and developed to generate the proper radiation pattern. The beam-forming module, incorporating digital phase shifters and attenuator chips, is fabricated and tested. An optimization process using the Least Mean Square (LMS) algorithm is considered in this study to assign the proper phase and amplitude to each radiating element's excitation current, so as to produce the desired steered radiation pattern.

A comprehensive study of the coupling effects for several relative positions between the two new backhaul and access link antenna elements is performed. Two new antenna configurations for coupling reduction are tested, and the simulated and measured results in terms of antenna radiation performance are compared and discussed.
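A minimal sketch of the LMS-based weight optimization mentioned above, under simplifying assumptions (a uniform linear array with half-wavelength spacing, a known training signal arriving from the base-station direction, and an interferer from another direction; none of these values come from the thesis): the LMS rule w ← w + µ·e*·x iteratively adjusts the complex excitation (amplitude and phase) of each element so that the array output tracks the desired signal.

```python
import numpy as np

rng = np.random.default_rng(0)
n_elem, n_snap, mu = 8, 2000, 0.01

def steering(theta_deg, n=n_elem):
    """Steering vector of a uniform linear array with lambda/2 spacing."""
    k = np.pi * np.sin(np.deg2rad(theta_deg))
    return np.exp(1j * k * np.arange(n))

a_des, a_int = steering(20.0), steering(-40.0)       # desired BS at 20 deg, interferer at -40 deg
s = np.sign(rng.standard_normal(n_snap)) + 0j        # known BPSK training sequence
i = (rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((n_elem, n_snap))
               + 1j * rng.standard_normal((n_elem, n_snap)))
x = np.outer(a_des, s) + np.outer(a_int, i) + noise  # array snapshots

w = np.zeros(n_elem, complex)
for t in range(n_snap):                              # LMS adaptation
    y = np.vdot(w, x[:, t])                          # array output w^H x
    e = s[t] - y                                     # error vs. training symbol
    w += mu * np.conj(e) * x[:, t]                   # w <- w + mu * e* * x

# The adapted weights give each element the phase/amplitude needed to steer
# the main beam towards 20 deg while attenuating the -40 deg interferer.
gain = lambda th: 20 * np.log10(abs(np.vdot(w, steering(th))) + 1e-12)
print(f"gain towards 20 deg: {gain(20.0):5.1f} dB, towards -40 deg: {gain(-40.0):5.1f} dB")
```

In hardware, the resulting complex weights would be quantized to the resolution of the digital phase shifters and attenuator chips described above.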
|
18 |
Perspectives of Jamming, Mitigation and Pattern Adaptation of OFDM Pilot Signals for the Evolution of Wireless Networks
Rao, Raghunandan M., 28 September 2016
Wireless communication networks have evolved continuously over the last four decades in order to meet the traffic and security requirements arising from the ever-increasing amount of traffic. However, this increase is projected to be massive for the fifth generation of wireless networks (5G), with a targeted capacity enhancement of 1000× with respect to 4G networks. This enhanced capacity is possible through a combination of major approaches: (a) an overhaul of some parts of the current 4G design and (b) the elimination of its overhead and redundancies. In this work we focus on OFDM reference signals, or pilot tones, which are used for channel estimation, link adaptation and other crucial functions in Long-Term Evolution (LTE). We investigate two aspects of pilot signals pertaining to their evolution: (a) the impact of targeted interference on pilots and its mitigation, and (b) the adaptation of pilot patterns to match the channel conditions of the user.
We develop theoretical models that accurately quantify the performance degradation at the user’s receiver in the presence of a multi-tone pilot jammer. We develop and evaluate mitigation algorithms to mitigate power constrained multi-tone pilot jammers in SISO- and full rank spatial multiplexing MIMO-OFDM systems. Our results show that the channel estimation performance can be restored even in the presence of a strong pilot jammer. We also show that full rank spatial multiplexing in the presence of a synchronized pilot jammer (transmitting on pilot locations only) is possible when the channel is flat between two pilot locations in either time or frequency.
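For context on why pilot jamming is damaging and why excluding corrupted pilots can restore estimation, the sketch below is a generic comb-pilot least-squares estimator with interpolation, not the specific mitigation algorithms of the thesis, and all parameter values are assumptions: it estimates an OFDM channel from pilot subcarriers and compares the estimation error when some pilots are jammed versus when the jammed pilots are detected and discarded.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sc, pilot_step = 120, 6                       # subcarriers, comb pilot spacing (assumed)
pilots = np.arange(0, n_sc, pilot_step)

# Frequency-selective channel: a few random taps -> smooth frequency response.
h_taps = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H = np.fft.fft(h_taps, n_sc)

tx = np.ones(n_sc, complex)                     # known pilot/training symbols
noise = 0.05 * (rng.standard_normal(n_sc) + 1j * rng.standard_normal(n_sc))
rx = H * tx + noise
jammed = pilots[::3]                            # every third pilot hit by the jammer
rx_j = rx.copy()
rx_j[jammed] += 2.0 * np.exp(1j * rng.uniform(0, 2 * np.pi, jammed.size))

def estimate(rx_sig, used_pilots):
    """LS estimate at pilot positions, linear interpolation elsewhere."""
    ls = rx_sig[used_pilots] / tx[used_pilots]
    re = np.interp(np.arange(n_sc), used_pilots, ls.real)
    im = np.interp(np.arange(n_sc), used_pilots, ls.imag)
    return re + 1j * im

for label, est in [("clean pilots      ", estimate(rx, pilots)),
                   ("jammed, all used  ", estimate(rx_j, pilots)),
                   ("jammed, discarded ", estimate(rx_j, np.setdiff1d(pilots, jammed)))]:
    mse = np.mean(np.abs(H - est) ** 2)
    print(f"{label}: channel-estimate MSE = {mse:.4f}")
```

Because a small number of pilot tones anchor the entire channel estimate, corrupting them is far cheaper for an attacker than jamming the whole band, which is consistent with the experimental observation reported next.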
We also present experimental results of multi-tone broadcast pilot jamming (jamming of the Cell-Specific Reference Signal) in the LTE downlink. Our results show that full-band jamming of pilots needs 5 dB less power than jamming the entire downlink signal in order to cause Denial of Service (DoS) to the users. In addition to this, we have identified and demonstrated a previously unreported issue with LTE termed 'Channel Quality Indicator (CQI) Spoofing'. In this scenario, the attacker tricks the user terminal into thinking that the channel quality is good by transmitting interference only on the data locations, while deliberately avoiding the pilots. This jamming strategy leverages the dependence of the adaptive modulation and coding (AMC) schemes on the CQI estimate in LTE.
Lastly, we investigate the idea of pilot pattern adaptation for SISO- and spatial multiplexing MIMO-OFDM systems. We present a generic heuristic algorithm to predict the optimal pilot spacing and power in a nonstationary doubly selective channel (channel fading in both time and frequency). The algorithm fits estimated channel statistics to stored codebook channel profiles and uses it to maximize the upper bound on the constrained capacity. We demonstrate up to a 30% improvement in ergodic capacity using our algorithm and describe ways to minimize feedback requirements while adapting pilot patterns in multi-band carrier aggregation systems. We conclude this work by identifying scenarios where pilot adaptation can be implemented in current wireless networks and provide some guidelines to adapt pilots for 5G. / Master of Science / Wireless communications have evolved continuously over the last four decades in order to meet the ever-increasing number of users. The next generation of wireless networks, named 5G, is expected to interconnect a massive number of devices called the Internet of Things (IoT). Compared to the current generation of wireless networks (termed 4G), 5G is expected to provide a thousandfold increase in data rates. In addition to this, the security of these connected devices is also a challenging issue that needs to be addressed. Hence in the event of an attack, even if a tiny fraction of the total number of users are affected, this will still result in a large number of users who are impacted.
The central theme of this thesis is the evolution of Orthogonal Frequency Division Multiplexing (OFDM) pilot signals on the road from 4G to 5G wireless networks. In OFDM, pilot signals are sent in parallel to data in order to aid the receiver in mitigating the impairments of the wireless channel. In this thesis, we look at two perspectives of the evolution of pilots: (a) targeted interference on pilot signals, termed 'multi-tone pilot jamming', and (b) adapting pilot patterns to optimize throughput.
In the first part of the thesis, we investigate (a) the impact of multi-tone pilot jamming and (b) strategies to counter it, which we propose and evaluate. In particular, we propose methods that (a) have the potential to be implemented in the Third Generation Partnership Project Long-Term Evolution (3GPP LTE) standard, and (b) have the ability to maintain high data rates with a multi-antenna receiver in the presence of a multi-tone pilot jammer. We also experimentally analyze the behavior of LTE in the presence of such targeted interference.
In the second half of the thesis, we explore the idea of adapting the density of pilots to optimize throughput. Increasing the pilot density improves the signal reception capabilities, but reduces the resources available for data and hence the data rate. We therefore propose and evaluate strategies to balance these two conflicting requirements in a wireless communication system.
In summary, this thesis provides and evaluates ideas to mitigate interference on pilot signals, and design data rate-maximizing pilot patterns for future OFDM-based wireless networks.
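A toy illustration of the pilot-density trade-off described above (entirely an assumed model, not the thesis algorithm): spending a fraction ρ of resource elements on pilots lowers the channel-estimation error but leaves fewer resources for data, so a throughput proxy of the form (1 − ρ)·log2(1 + SNR_eff(ρ)) has an interior optimum in ρ.

```python
import numpy as np

snr_db = 10.0
snr = 10 ** (snr_db / 10)

def throughput_proxy(pilot_fraction: float, n_re: int = 168) -> float:
    """(1 - rho) * log2(1 + effective SNR), where the channel-estimation error
    is modelled (assumption) as inversely proportional to the pilot count."""
    n_pilots = max(1, int(pilot_fraction * n_re))
    est_error_var = 1.0 / (snr * n_pilots)            # toy MSE model
    snr_eff = snr / (1 + snr * est_error_var)         # estimation loss folded into SNR
    return (1 - pilot_fraction) * np.log2(1 + snr_eff)

for rho in (0.01, 0.02, 0.05, 0.10, 0.20, 0.40):
    print(f"pilot fraction {rho:4.2f} -> throughput proxy {throughput_proxy(rho):.3f} bit/s/Hz")
```

The optimum shifts with the channel's time and frequency selectivity, which is why adapting the pilot pattern to the channel, as proposed above, can recover capacity.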
|
19 |
Context-aware mechanisms for device discovery optimization / Mécanismes sensibles au contexte pour l'optimisation de la découverte des appareils
Ben Mosbah, Aziza, 28 November 2017
Research in communication networks aims to improve the capabilities and performance of network technologies, and to satisfy the ever-increasing demand for instant information access and exchange. For example, work on Fifth Generation (5G) Wireless Systems aims to increase data rates and spectral efficiency while lowering latency and energy consumption. Within this context, proximity-based networking is being considered in order to improve data sharing between nearby devices, regardless of the availability of additional infrastructure. An integral component of these solutions is the ability to quickly detect (or discover) other systems in the vicinity.
While system and service discovery has been a concept used in networks for some time, its adoption by wireless networks has increased the importance and relevance of this type of mechanism. Therefore, the goal of this thesis is to optimize the performance of the discovery process by using context-aware mechanisms. First, we start with an introductory description of the challenges faced by network users and how current solutions (i.e. Long Term Evolution (LTE)) are unable to cover their needs. Second, we present the architecture we will use to evaluate our proposals, namely the device-to-device (D2D) architecture defined by the Third Generation Partnership Project (3GPP) for use in LTE networks, with an emphasis on the description of the discovery process as defined in the standard specifications. Then, we present an analytical study, along with an implementation model to test and validate the performance of direct discovery. Building upon that analysis, we propose an adaptive transmission algorithm that optimizes the discovery process for static topologies. This contribution is used as the foundation for extended and enhanced algorithms targeting, first, scenarios where the availability of historic data allows user-density fluctuations to be predicted and, second, fully dynamic situations without external infrastructure or support, showing how context-aware mechanisms can provide almost optimal performance. All these contributions and analyses are supported and validated by simulation models and experiments that showcase the importance and correctness of our proposals in optimizing the performance and reliability of D2D direct discovery.
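As a rough illustration of why an adaptive transmission probability helps the discovery process described above (a toy slotted model under independent transmissions, not the algorithm of the thesis): in a given slot a device is discovered by its neighbours only if it announces itself while the others stay silent, so with n devices each transmitting with probability p the per-slot success probability p(1-p)^(n-1) is maximized near p = 1/n. This is why the transmission behaviour should adapt to the (estimated) device density.

```python
def discovery_success(p: float, n: int) -> float:
    """Probability that a given device announces itself collision-free in a slot:
    it transmits (p) while its n-1 neighbours stay silent ((1-p)^(n-1))."""
    return p * (1 - p) ** (n - 1)

for n in (5, 20, 100):
    fixed = discovery_success(0.5, n)          # non-adaptive choice p = 0.5
    adaptive = discovery_success(1.0 / n, n)   # density-aware choice p = 1/n
    print(f"n = {n:3d}: p = 0.5 -> {fixed:.4f},  p = 1/n -> {adaptive:.4f}")
```

A fixed transmission probability collapses as the device density grows, whereas the density-aware choice keeps the per-slot success probability close to its 1/e ceiling.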
|
20 |
Device-device communication and multihop transmission for future cellular networks
Amate, Ahmed Mohammed, January 2015
The next generation of wireless networks, i.e. 5G, aims to provide multi-Gbps data traffic in order to satisfy the increasing demand for high-definition video, among other high data rate services, as well as the exponential growth in mobile subscribers. To achieve this dramatic increase in data rates, current research is focused on improving the capacity of current 4G network standards, based on Long Term Evolution (LTE), before radical changes such as acquiring additional or new spectrum are exploited. The LTE network has a reuse factor of one, so neighbouring cells/sectors use the same spectrum, making cell edge users vulnerable to inter-cell interference. In addition, wireless transmission is commonly hindered by fading and pathloss. In this direction, this thesis focuses on improving the performance of cell edge users in LTE and LTE-Advanced (LTE-A) networks by initially implementing a new Coordinated Multi-Point (CoMP) algorithm to mitigate cell edge user interference. Subsequently, Device-to-Device (D2D) communication is investigated as the enabling technology for maximising Resource Block (RB) utilisation in current 4G and emerging 5G networks. It is demonstrated that applying, as an extension to the above, novel power control algorithms to reduce the required D2D TX power, together with multihop transmission for relaying D2D traffic, can further enhance network performance.

To be able to develop the aforementioned technologies and evaluate the performance of new algorithms in emerging network scenarios, a beyond-the-state-of-the-art LTE system-level simulator (SLS) was implemented. The new simulator includes Multiple-Input Multiple-Output (MIMO) antenna functionalities, comprehensive channel models (such as Wireless World Initiative New Radio II, i.e. WINNER II) and adaptive modulation and coding schemes to accurately emulate the LTE and LTE-A network standards. Additionally, a novel interference modelling scheme using the 'wrap around' technique was proposed and implemented that maintains the topology of flat-surfaced maps, allowing use with cell planning tools while obtaining accurate and timely results in the SLS compared to the few existing platforms.

For the proposed CoMP algorithm, adaptive beamforming was employed to reduce interference on the cell edge UEs by applying Coordinated Scheduling (CoSH) between cooperating cells. Simulation results show up to a 2-fold improvement in throughput, as well as an SINR gain for the cell edge UEs in the cooperating cells.

Furthermore, D2D communication underlaying the LTE network (and future generations of wireless networks) was investigated. The technology exploits the proximity of users in a network to achieve higher data rates with maximum RB utilisation (as the technology reuses the cellular RBs simultaneously), while taking some load off the Evolved Node B (eNB), i.e. by direct communication between User Equipment (UE). Simulation results show that the proximity and transmission power of D2D transmission yield high performance gains for a D2D receiver, demonstrated to be better than those of cellular UEs with better channel conditions or in close proximity to the eNB in the network. The interference from the simultaneous transmission, however, impedes the achievable data rates of cellular UEs in the network, especially at the cell edge.
Thus, a power control algorithm was proposed to mitigate the impact of interference in the hybrid network (a network consisting of both cellular and D2D UEs). It was implemented by setting a minimum SINR threshold so that the cellular UEs achieve a minimum performance, and equally a maximum SINR threshold to establish fairness for the D2D transmission as well. Simulation results show an increase in the cell edge throughput and a notable improvement in the overall SINR distribution of UEs in the hybrid network. Additionally, multihop transmission for D2D UEs was investigated in the hybrid network: traditionally, this scheme is implemented to relay cellular traffic in a homogeneous network. Contrary to most current studies, where D2D UEs are employed to relay cellular traffic, idle nodes are used here to relay D2D traffic, an approach unique to this thesis. Simulation results show an improvement in D2D receiver throughput with multihop transmission, which was significantly better than the performance of the same UEs, at the equivalent distance between the D2D pair, when using single-hop transmission.
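A back-of-the-envelope sketch of the multihop argument above, under assumed values (a log-distance path-loss model with exponent 3.5, 23 dBm transmit power, 10 MHz bandwidth, and an idle relay node exactly halfway between the D2D pair; none of these are the simulator's parameters): the end-to-end rate of a two-hop link is limited by its weaker hop and by the halved transmission time, but each hop sees a much smaller path loss, which can more than compensate.

```python
import math

def rate_bps(distance_m: float, tx_dbm: float = 23.0, bw_hz: float = 10e6,
             noise_dbm: float = -95.0, pl_exp: float = 3.5) -> float:
    """Shannon rate of one hop with a simple log-distance path-loss model."""
    pl_db = 38.0 + 10 * pl_exp * math.log10(max(distance_m, 1.0))   # assumed model
    snr = 10 ** ((tx_dbm - pl_db - noise_dbm) / 10)
    return bw_hz * math.log2(1 + snr)

d = 400.0                                   # D2D pair separation in metres (assumed)
single_hop = rate_bps(d)
# Two hops share the channel in time (factor 1/2); the end-to-end rate is the
# minimum of the two hop rates, here with the relay exactly halfway.
two_hop = 0.5 * min(rate_bps(d / 2), rate_bps(d / 2))
print(f"single hop: {single_hop / 1e6:6.2f} Mbit/s,  two hops via relay: {two_hop / 1e6:6.2f} Mbit/s")
```

For widely separated D2D pairs the per-hop SNR gain dominates the time-sharing penalty, which is the regime where relaying D2D traffic through idle nodes pays off.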
|