1

Channel Estimation Optimization in 5G New Radio using Convolutional Neural Networks / Kanalestimeringsoptimering i 5G NR med konvolutionellt neuralt nätverk

Adolfsson, David January 2023 (has links)
Channel estimation is the process of understanding and analyzing the wireless communication channel's properties. It helps optimize data transmission by providing essential information for adjusting encoding and decoding parameters. This thesis explores using a Convolutional Neural Network (CNN) for channel estimation in the 5G Link Level Simulator (5G-LLS) developed by Tietoevry. The objectives were to create a Python framework for channel estimation experimentation and to evaluate the CNN's performance against the conventional algorithms Least Squares (LS), Minimum Mean Square Error (MMSE), and Linear Minimum Mean Square Error (LMMSE). Two distinct channel model scenarios were investigated. The results suggest that the CNN outperforms LMMSE, LS, and MMSE in terms of Mean Squared Error (MSE) for both channel models, with LMMSE in second place. It lowered the MSE by 85% compared to the LMMSE for the correlated channel and by 78% for the flat-fading channel. In terms of overall system-level performance, as measured by Bit Error Rate (BER), the CNN only outperformed LS and MMSE; the CNN and the LMMSE yielded similar results, because the LMMSE's MSE was still low enough to correctly demodulate the symbols of the QPSK modulation scheme. The insights from this thesis enable Tietoevry to implement more machine learning algorithms and further develop channel estimation in 5G telecommunications and wireless networks through experiments in 5G-LLS. Given that the CNN did not improve the performance of the communication system as a whole, future studies should test a broader range of channel models and consider more complex modulation schemes. Studying other, more advanced machine learning techniques than CNNs is another avenue for future research.
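The thesis's Python framework is not reproduced here, but the two simplest baseline estimators it benchmarks against can be sketched in a few lines. The following is a minimal, hypothetical illustration of per-subcarrier LS estimation and a scalar LMMSE-style shrinkage of it (a real LMMSE uses the full channel correlation matrix); all names and parameter values are illustrative, not taken from the thesis.

```python
import random

def ls_estimate(y, x):
    """Least Squares estimate per pilot subcarrier: H_LS = Y / X."""
    return [yi / xi for yi, xi in zip(y, x)]

def lmmse_scalar(h_ls, noise_var, channel_var=1.0):
    """Scalar LMMSE shrinkage of the LS estimate (assumes uncorrelated
    subcarriers, which is a strong simplification)."""
    w = channel_var / (channel_var + noise_var)
    return [w * h for h in h_ls]

def mse(est, true):
    return sum(abs(e - t) ** 2 for e, t in zip(est, true)) / len(true)

# Toy experiment: known unit pilots, Rayleigh-like channel, AWGN.
random.seed(0)
pilots = [complex(1, 0)] * 64
h_true = [complex(random.gauss(0, 0.707), random.gauss(0, 0.707)) for _ in pilots]
noise_var = 0.1
sigma = (noise_var / 2) ** 0.5
y = [h * x + complex(random.gauss(0, sigma), random.gauss(0, sigma))
     for h, x in zip(h_true, pilots)]

h_ls = ls_estimate(y, pilots)
h_lmmse = lmmse_scalar(h_ls, noise_var)
```

With no noise, LS recovers the channel exactly; with noise, the LMMSE weight `w < 1` shrinks each estimate toward zero, trading a small bias for lower variance.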
2

Implementace mechanismů zajišťujících “RAN Slicing” v simulačním nástroji Network Simulator 3 / Implementation of mechanisms ensuring “RAN Slicing” in the simulation tool Network Simulator 3

Motyčka, Jan January 2021 (has links)
This thesis deals with network slicing technology in 5G networks, focusing mainly on the RAN part. The theoretical part presents the basic principles of 5G network slicing in the core network and in the RAN. The practical part contains a simulation scenario created in the NS3 simulator with the LENA 5G module. The results of this simulation are presented and discussed with emphasis on RAN slicing.
3

Performance Analysis Of Massive MIMO With Port Reduction / Prestandaanalys av massiv MIMO med portreduktion

Zhang, Tingrui January 2022 (has links)
In the centralized radio access network (C-RAN) architecture, the base-band unit (BBU) is connected to one or more remote radio units (RRUs) via a fronthaul (FH) interface. Upgrading base station antennas in C-RAN to support massive multiple-input multiple-output (MIMO) technology can improve network spectral efficiency and largely boost the capacity of the 5G system. These benefits also introduce new challenges for the FH interface, since the required FH capacity increases proportionally to the number of transceiver units (TXRUs) for traditional receiver processing at the BBU. To reduce the FH link load, different base-band splitting options between RRUs and the BBU are considered in practical C-RAN networks. In this project, we investigate three beamforming algorithms (MRC, DFT, and Enhanced) that are expected to reduce the number of streams on the FH link, and evaluate their performance for single-user MIMO in different mobility scenarios via system-level simulations. The results show that we reach the goal of reducing the number of streams to one-fourth of the number of TXRUs while maintaining relatively good performance. Additionally, we observe that the Enhanced algorithm performs best in the majority of scenarios.
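The report's beamforming implementations are not spelled out above, but the intuition behind stream reduction with MRC can be shown in a toy sketch: weighting each antenna's sample by the conjugate of its channel gain collapses N antenna streams into a single combined stream before the fronthaul. A minimal, hypothetical single-symbol example (channel values invented):

```python
def mrc_combine(h, y):
    """Maximum Ratio Combining: weight each antenna sample by the conjugate
    of its channel gain and normalize, collapsing N streams into one."""
    num = sum(hi.conjugate() * yi for hi, yi in zip(h, y))
    den = sum(abs(hi) ** 2 for hi in h)
    return num / den

# Noiseless check: each of 4 antennas observes y_i = h_i * s.
h = [complex(0.8, 0.3), complex(-0.2, 1.1), complex(0.5, -0.7), complex(1.0, 0.1)]
s = complex(0.707, -0.707)          # transmitted QPSK symbol
y = [hi * s for hi in h]            # received per-antenna samples
s_hat = mrc_combine(h, y)           # single combined stream, recovers s exactly
```

In the noiseless case the combiner recovers the symbol exactly; with noise, MRC maximizes the post-combining SNR, which is why one combined stream can replace four antenna streams on the FH link with little loss.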
4

Real-Time Inspired Hybrid Scheduler for 5G New Radio

Andersson, Tommy January 2022 (has links)
As an increasing portion of the world's communication moves towards cloud and wireless solutions, the requirements for high throughput and low delay increase. One step towards meeting higher requirements is the move from 4G Long Term Evolution (LTE) to 5G New Radio (NR). In order to utilize the potential of 5G NR, the software needs to be improved. With the goal of lowering the delay for delay-critical applications and services in 5G NR, this thesis studies a new scheduler inspired by Earliest Deadline First (EDF), a soft real-time scheduling algorithm. The proposed scheduler, called the Real-Time Inspired Hybrid Scheduler (RTIHS), uses two different schedulers depending on whether the network traffic is delay-critical or not. Delay-critical traffic is served by the new Deadline Inspired Scheduler (DIS) and other traffic by a traditional Round Robin (RR) scheduler. Delay-critical transmissions are prioritized using a constant that serves two functions: determining whether the transmission is on time, and how close the transmission is to its fixed deadline. Up until the deadline, the priority of a transmission is increased by a factor that depends on how close the current time is to the deadline. If the deadline has been missed, the priority is instead decreased with respect to how badly it was missed. RTIHS is implemented and tested in a state-of-the-art system simulator where services such as Cloud Gaming (CG), Video on Demand (VoD), and web browsing are evaluated. An existing technology named Low Latency, Low Loss, Scalable Throughput (L4S) is included in the evaluations to investigate how RTIHS scales. The performance of RTIHS is compared to a Delay Scheduler (DS) and an RR scheduler that act as the baseline. The results show that RTIHS performs better for CG in terms of delay and nominal rate than the baseline, especially when the network is under high load.
Using RTIHS instead of a DS shows an average increase in nominal rate of roughly 6% and an average decrease in delay of 16% for the average user. At the same time, RTIHS shows higher delay and lower throughput for services such as VoD and web browsing, making its performance for those services worse than the baseline, especially when the network load is high. In the same comparison, RTIHS shows 17% higher delay for web browsing and a 1% lower requested video rate than the DS for the average user. This is due to how the evaluated services are prioritized with the limited resources available: since RTIHS prioritizes CG more, fewer resources remain for the other services. The baseline is not as biased towards CG and therefore has a lower nominal rate and higher delay for that service but performs better for the others in comparison. Activating L4S mitigates the underwhelming performance of RTIHS for VoD and web browsing and further improves its performance for CG. This also holds for the baseline, which gains performance for CG and slightly gains performance for the other services. With L4S activated, the average increase in nominal rate for CG with RTIHS compared to the DS is 13% for the average user and the average delay decrease is 9%, while the average requested video rate for RTIHS is less than 1% lower than that of the DS and the delay for web browsing is 10% higher.
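The exact DIS priority formula is not given in the abstract; the following is a hypothetical sketch of the stated idea only — priority grows as a transmission approaches its deadline and decays with how badly the deadline was missed. The function name and the constant `k` are illustrative, not from the thesis.

```python
def dis_priority(base, now, deadline, k=2.0):
    """Deadline-inspired priority sketch (hypothetical):
    - before the deadline, priority grows as remaining slack shrinks;
    - after a miss, priority shrinks with how late the transmission is."""
    if now <= deadline:
        slack = deadline - now
        return base * (1.0 + k / (1.0 + slack))   # less slack -> higher priority
    lateness = now - deadline
    return base / (1.0 + lateness)                # later -> lower priority
```

A hybrid scheduler in the spirit of RTIHS would rank delay-critical transmissions by this value each scheduling interval, while non-critical traffic goes through plain round-robin.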
5

Extension of an Existing Simulator for Cellular Communication with Support for 5G NR : Porting of MIMO Channel Estimation Methods from a prototype to an existing Link-Level Simulator / Utökning av en Existerande Simulator för Telekommunikation med Stöd för 5G NR : Portering av Metoder för MIMO Channel Estimation från en Prototypsimulator till en Link-Level Simulator

Haj Hussein, Majed, Alnahawi, Abdulsalam January 2022 (has links)
Multiple Input Multiple Output (MIMO) and Orthogonal Frequency Division Multiplexing (OFDM) are two efficient technologies used in 5G New Radio (NR) to achieve higher data rates, low latency, and robustness against fading. At the receiver end, the data arrives distorted due to disturbances during transfer over the wireless channel. Channel estimation is the technique applied at the receiver to overcome this problem and mitigate the effect of those disturbances. The main objective of this thesis is to port an existing channel estimator from a prototype simulator for 5G to a complete link-level simulator that currently supports 4G traffic. Two channel estimation algorithms, Least Squares (LS) and Linear Minimum Mean Square Error (LMMSE), have been investigated and implemented in the link-level simulator based on a MIMO-OFDM system. The performance of the channel estimators is evaluated in terms of Bit Error Rate versus Signal-to-Noise Ratio. The effectiveness of the implemented algorithms is evaluated via simulation; the results show that each channel estimation algorithm suits a specific use case depending on channel properties and scenario but that, regardless of time complexity, the LMMSE performs better than the LS.
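The evaluation metric used here, BER versus SNR, can be illustrated with a simplified Monte Carlo sketch for Gray-coded QPSK over AWGN. Note this is only the measurement harness idea with a perfect channel (no channel estimation involved), not the thesis's simulator; symbol counts and seeds are arbitrary.

```python
import random

def qpsk_ber(snr_db, n_sym=20000, rng=random.Random(1)):
    """Monte Carlo bit-error rate of Gray-coded QPSK over AWGN,
    with snr_db interpreted as per-symbol Es/N0 (Es = 1)."""
    snr = 10 ** (snr_db / 10)
    sigma = (1 / (2 * snr)) ** 0.5            # per-dimension noise std
    errors = 0
    for _ in range(n_sym):
        bits = (rng.randint(0, 1), rng.randint(0, 1))
        # Gray mapping: one bit per I/Q dimension, amplitude 1/sqrt(2).
        sym = complex((1 - 2 * bits[0]) / 2 ** 0.5,
                      (1 - 2 * bits[1]) / 2 ** 0.5)
        r = sym + complex(rng.gauss(0, sigma), rng.gauss(0, sigma))
        dec = (0 if r.real > 0 else 1, 0 if r.imag > 0 else 1)
        errors += (dec[0] != bits[0]) + (dec[1] != bits[1])
    return errors / (2 * n_sym)
```

Sweeping `snr_db` and plotting `qpsk_ber(snr_db)` gives the classic waterfall curve; a real link-level simulator inserts the channel, the estimator, and the equalizer between `sym` and `dec`.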
6

Fuzzing Radio Resource Control messages in 5G and LTE systems : To test telecommunication systems with ASN.1 grammar rules based adaptive fuzzer / Fuzzing Radio Resource Control-meddelanden i 5G och LTE-system

Potnuru, Srinath January 2021 (has links)
5G telecommunication systems must be ultra-reliable to meet the needs of the next evolution in communication. The systems deployed must be thoroughly tested and must conform to their standards. Software and network protocols are commonly tested with techniques like fuzzing, penetration testing, code review, and conformance testing. With fuzzing, testers can send crafted inputs to monitor the System Under Test (SUT) for a response. 3GPP, the standardization body for telecom systems, produces new versions of specifications as part of continuously evolving features and enhancements. This leads to many versions of specifications for a network protocol like Radio Resource Control (RRC), and testers need to constantly update the testing tools and the testing environment. In this work, it is shown that by using the generic nature of RRC specifications, which are given in the Abstract Syntax Notation One (ASN.1) description language, one can design a testing tool that adapts to all versions of 3GPP specifications. This thesis introduces an ASN.1-based adaptive fuzzer that can be used for testing RRC and other network protocols based on the ASN.1 description language. The fuzzer extracts knowledge about ongoing RRC messages using protocol description files of RRC, i.e., the RRC ASN.1 schema from 3GPP, and uses that knowledge to fuzz RRC messages. The adaptive fuzzer identifies individual fields, sub-messages, and custom data types according to the specifications when mutating the content of existing messages. Furthermore, the adaptive fuzzer identified a previously unknown vulnerability in the Evolved Packet Core (EPC) of srsLTE and openLTE, two open-source LTE implementations, confirming its applicability to robustness testing of RRC and other network protocols.
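The fuzzer itself is not shown above, but its core idea — mutating individual fields of a decoded message while preserving its structure, so every mutant still parses as the right message type — can be sketched with nested dicts standing in for decoded ASN.1 SEQUENCEs. The field names, mutation choices, and probabilities below are purely illustrative and not taken from the thesis or any 3GPP schema.

```python
import random

def mutate(msg, rng, p=0.3):
    """Structure-aware mutation sketch: recurse through a decoded message
    (nested dicts stand in for ASN.1 SEQUENCEs) and perturb only leaf
    fields, leaving the message structure intact."""
    out = {}
    for key, val in msg.items():
        if isinstance(val, dict):
            out[key] = mutate(val, rng, p)          # recurse into sub-messages
        elif isinstance(val, int) and rng.random() < p:
            # Boundary and bit-flip values for integer fields.
            out[key] = rng.choice([0, -1, 2**31 - 1, val ^ 0xFF])
        elif isinstance(val, str) and rng.random() < p:
            out[key] = val * 50                     # oversized string payload
        else:
            out[key] = val
    return out

rng = random.Random(7)
rrc_msg = {"rrcConnectionRequest": {"ue_identity": 0x2A,
                                    "establishmentCause": "mo-Data"}}
mutants = [mutate(rrc_msg, rng) for _ in range(100)]
```

A real ASN.1-driven fuzzer would derive the field types and value ranges from the compiled schema instead of inspecting Python types, which is what lets it adapt automatically to new 3GPP specification versions.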
7

Optimization of Physical Uplink Resource Allocation in 5G Cellular Network using Monte Carlo Tree Search / Optimering av fysisk resurstilldelning för uppkoppling i 5G-cellulärt nätverk med hjälp av Monte Carlo Tree Search

Girame Rizzo, Gerard January 2022 (has links)
The Physical Uplink Control Channel (PUCCH), which is mainly used to transmit Uplink Control Information (UCI), is a key component of the 5G NR system. Compared to LTE, NR specifies a more flexible PUCCH structure to support various applications and use cases. In the literature, however, an optimized solution that exploits those degrees of freedom is missing, and only fixed heuristic solutions are implemented in current 5G networks. Consequently, the predefined PUCCH format configuration is inefficient because it proposes a one-size-fits-all solution. In short, the number of symbols dedicated to PUCCH resources is often pre-determined and fixed without considering the UE's specific needs and requirements. Failing to exploit the diversity of PUCCH format configurations and sticking to the one-size-fits-all solution translates into poor PUCCH resource allocation in the physical grid. To overcome this, a solution is presented that introduces a more efficient PUCCH re-distribution algorithm exploiting the same Physical Resource Block (PRB) domain. This leads to a combinatorial optimization problem with the objective of minimizing PRB utilization while maximizing the number of resources allocated and, in essence, the number of UEs served. For this purpose, we utilize a Monte Carlo Tree Search (MCTS) method to find the optimal puzzle on the grid, which offers clear advantages in search time benchmarked against an exhaustive search method. This puzzle-based technique allows a wide variety of cases and scenario-dependent solutions. Overall, the results indicate that the optimal solutions devised by MCTS in conjunction with the new resource allocation algorithm bring substantial improvement compared to the one-size-fits-all baseline.
In particular, this novel implementation, nonexistent to date in the 3GPP standard, reduces the dedicated PUCCH resource region by 1/6 without sacrificing any user's allocation, while reusing the remaining PRBs (an increase of up to 11.36%) for the UL data channel, or PUSCH. As future work, we expect to observe similar improvements in higher-layer metrics and KPIs once link-level reception details are implemented and simulated for UL control channels based on our resource allocation solution.
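The thesis's MCTS formulation is not reproduced above. As a generic illustration of the four MCTS phases (selection via UCT, expansion, random rollout, backpropagation) applied to a toy allocation problem — fitting as many "UEs" as possible into a fixed PRB budget — consider the sketch below. The per-UE sizes, capacity, constants, and iteration count are all invented for the example.

```python
import math, random

class Node:
    def __init__(self, state, parent=None):
        self.state = state            # tuple of 0/1 decisions made so far
        self.parent = parent
        self.children = {}            # action -> Node
        self.visits = 0
        self.value = 0.0

SIZES = [4, 3, 3, 2, 2, 1]            # hypothetical per-UE PRB demands
CAPACITY = 8                          # PRBs reserved for the control region

def reward(state):
    used = sum(s for s, take in zip(SIZES, state) if take)
    served = sum(state)
    return served if used <= CAPACITY else 0   # infeasible packings score 0

def rollout(state, rng):
    state = list(state)
    while len(state) < len(SIZES):
        state.append(rng.randint(0, 1))        # random completion of the plan
    return reward(state)

def mcts(iters=2000, c=1.4, seed=3):
    rng = random.Random(seed)
    root = Node(())
    for _ in range(iters):
        node = root
        # 1) Selection: descend via UCT while the node is fully expanded.
        while len(node.children) == 2 and len(node.state) < len(SIZES):
            node = max(node.children.values(),
                       key=lambda n: n.value / n.visits
                       + c * math.sqrt(math.log(node.visits) / n.visits))
        # 2) Expansion: add one untried child, unless terminal.
        if len(node.state) < len(SIZES):
            action = rng.choice([a for a in (0, 1) if a not in node.children])
            node.children[action] = Node(node.state + (action,), node)
            node = node.children[action]
        # 3) Simulation: random rollout from the new node.
        r = rollout(node.state, rng)
        # 4) Backpropagation: update statistics up to the root.
        while node is not None:
            node.visits += 1
            node.value += r
            node = node.parent
    # Extract the most-visited decision sequence as the final plan.
    node, plan = root, []
    while node.children:
        a, node = max(node.children.items(), key=lambda kv: kv[1].visits)
        plan.append(a)
    return plan

plan = mcts()
```

With sizes [4, 3, 3, 2, 2, 1] and capacity 8, the best packing serves four UEs (3+2+2+1); the search converges on high-reward plans far faster than enumerating all 2^6 leaves would scale for realistic grid sizes.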
8

Rekonfigurovatelný generátor 5G NR signálů na RFSoC FPGA / Reconfigurable 5G NR signal generator on RFSoC FPGA

Indrák, Dominik January 2020 (has links)
This work deals with the simulation of the basic structure of an OFDM modulator and demodulator for the upcoming 5G NR standard. The basic parts, including modulation, reference signal insertion, the Fourier transform, cyclic prefix insertion, AWGN, and multi-path propagation, are simulated in MATLAB. The work proposes an implementation of the modulator and demodulator on an RFSoC board and describes its configuration. The designed generator is implemented using the STEMLab Red Pitaya platform. A 5G OFDM signal is generated in MATLAB for transmission, and the received signal is evaluated in MATLAB.
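The MATLAB simulation is not reproduced here, but the basic OFDM modulator/demodulator structure described — subcarrier mapping, the (inverse) Fourier transform, and cyclic prefix insertion/removal — can be sketched in a few lines. A naive O(N^2) DFT stands in for the FFT; the subcarrier values and CP length are arbitrary.

```python
import cmath

def idft(X):
    """Inverse DFT (naive O(N^2)); stands in for the IFFT in the modulator."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def dft(x):
    """Forward DFT; stands in for the FFT in the demodulator."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def ofdm_modulate(symbols, cp_len):
    """Map subcarrier symbols to one time-domain OFDM symbol and prepend
    a cyclic prefix (a copy of the last cp_len samples)."""
    time = idft(symbols)
    return time[-cp_len:] + time

def ofdm_demodulate(samples, cp_len):
    """Strip the cyclic prefix and return to the frequency domain."""
    return dft(samples[cp_len:])

subcarriers = [1+0j, -1+0j, 1j, -1j, 1+0j, 1j, -1+0j, -1j]   # QPSK on 8 subcarriers
tx = ofdm_modulate(subcarriers, cp_len=2)
rx = ofdm_demodulate(tx, cp_len=2)
```

Over an ideal channel the demodulator recovers the subcarrier symbols exactly; the cyclic prefix is what turns multi-path convolution into a per-subcarrier complex gain in a real channel.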
9

A Systematic Framework For Analyzing the Security and Privacy of Cellular Networks

Syed Rafiul Hussain (5929793) 16 January 2020 (has links)
Cellular networks are an indispensable part of a nation's critical infrastructure. They not only support functionality that is critical for our society as a whole (e.g., business, public-safety message dissemination) but also positively impact us at a more personal level by enabling applications that often improve our quality of life (e.g., navigation). Due to deployment constraints and backward compatibility issues, the various cellular protocol versions were not designed and deployed with a strong security and privacy focus. Because of their ubiquitous presence connecting billions of users and their use for critical applications, cellular networks are lucrative attack targets for motivated and resourceful adversaries.

In this dissertation, we investigate the security and privacy of 4G LTE and 5G protocol designs and deployments. More precisely, we systematically identify design weaknesses and implementation oversights affecting the critical operations of the networks, and we design countermeasures to mitigate the identified vulnerabilities and attacks. Towards this goal, we developed a systematic model-based testing framework called LTEInspector. LTEInspector can be used to identify not only protocol design weaknesses but also deployment oversights. It leverages the combined reasoning capabilities of a symbolic model checker and a cryptographic protocol verifier by combining them in a lazy fashion. We instantiated LTEInspector with three critical procedures (i.e., attach, detach, and paging) of 4G LTE. Our analysis uncovered 10 new exploitable vulnerabilities along with 9 prior attacks of 4G LTE, all of which have been verified in a real testbed.

Since identifying all classes of attacks with a single framework like LTEInspector is nearly impossible, we show that it is possible to identify sophisticated security and privacy attacks by devising techniques specifically tailored to a particular protocol and by leveraging the findings of LTEInspector. As a case study, we analyzed the paging protocol of 4G LTE and the current version of 5G, and observed that by leveraging the findings from LTEInspector together with other side-channel information and a probabilistic reasoning technique, it is possible to mount sophisticated privacy attacks that can expose a victim device's coarse-grained location information and sensitive identifiers when the adversary is equipped only with the victim's phone number or another soft identity (e.g., a social networking profile). An analysis of LTEInspector's findings shows that the absence of broadcast authentication enables an adversary to mount a wide range of security and privacy attacks. We thus develop an attack-agnostic generic countermeasure that provides broadcast authentication without violating common-sense deployment constraints. Finally, we design a practical countermeasure for mitigating the side-channel attacks in the paging procedure without breaking backward compatibility.
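The dissertation's concrete broadcast-authentication countermeasure is not reproduced here. As a toy illustration of one standard approach to the underlying problem (authenticating broadcast messages without per-device keys), the sketch below implements TESLA-style delayed key disclosure over a one-way hash chain: the sender commits to the chain head, MACs each interval's messages with a still-secret key, and discloses that key later so receivers can verify it against the commitment. This is a generic textbook scheme, not the author's design; all names and values are illustrative.

```python
import hashlib, hmac

def H(b):
    return hashlib.sha256(b).digest()

def make_key_chain(seed, length):
    """One-way key chain: K[i-1] = H(K[i]); chain[0] is the public commitment."""
    chain = [H(seed)]
    for _ in range(length):
        chain.append(H(chain[-1]))
    chain.reverse()                   # chain[0] = commitment, chain[i] = interval-i key
    return chain

def broadcast(chain, i, payload):
    """Sender: MAC the payload with the (still secret) interval-i key."""
    tag = hmac.new(chain[i], payload, hashlib.sha256).digest()
    return payload, tag

def verify(commitment, i, payload, tag, disclosed_key):
    """Receiver: check the later-disclosed key hashes back to the commitment
    in exactly i steps, then check the MAC."""
    k = disclosed_key
    for _ in range(i):
        k = H(k)
    if k != commitment:
        return False                  # disclosed key is not on the chain
    expected = hmac.new(disclosed_key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

chain = make_key_chain(b"network-secret", 10)
msg, tag = broadcast(chain, 3, b"paging: example identifier")
ok = verify(chain[0], 3, msg, tag, chain[3])
```

The security argument depends on timing: a receiver only accepts a message if it arrived before the corresponding key could have been disclosed, which is the kind of loose time-synchronization constraint a deployed countermeasure has to respect.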
