Machine Learning assisted gNodeB Data Link Layer Capacity Management

In the uplink direction of 5G New Radio, signals are sent between Radio Units and Digital Units. The production of these signals is non-deterministic, so they are often produced in bursts. These bursts can exceed the Data Link Layer capacity, which causes packet losses. Burstiness can be controlled by delaying signals over time; however, excessive delays must be avoided, since the processing of signals has to complete within strict time constraints. This paper proposes two machine-learning-based algorithms that aim to avoid packet losses by introducing delays to signals. One algorithm is based on the symbol number of the signals, and the other uses a queue-based approach. Only the symbol-based algorithm was thoroughly evaluated. Visualizations of test data, as well as lab tests, showed that the symbol-based algorithm was able to efficiently delay signals in order to reduce the maximum load on the Data Link Layer.
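The shaping idea described in the abstract, spreading burst arrivals over later time slots while bounding how long any one signal may be delayed, can be sketched as follows. This is a minimal illustrative sketch under assumed names and a discrete-slot model, not the thesis's actual symbol-based or queue-based algorithm:

```python
def smooth_burst(arrivals, capacity, max_delay):
    """Delay signals so that no time slot carries more than `capacity`
    signals, while never delaying a signal by more than `max_delay` slots.

    arrivals  -- list of integer arrival slots (may contain bursts)
    capacity  -- max signals per slot the Data Link Layer can carry
    max_delay -- hard deadline on how far a signal may be pushed back
    Returns the slot each signal is actually scheduled in.
    """
    load = {}       # slot -> number of signals already scheduled there
    schedule = []
    for t in arrivals:
        slot = t
        # Push the signal to a later slot while the current one is full,
        # but never past its processing deadline t + max_delay.
        while load.get(slot, 0) >= capacity and slot < t + max_delay:
            slot += 1
        load[slot] = load.get(slot, 0) + 1
        schedule.append(slot)
    return schedule
```

For example, a burst of four signals arriving in slot 0 with a per-slot capacity of 2 is spread as `smooth_burst([0, 0, 0, 0], 2, 3)` → `[0, 0, 1, 1]`, halving the peak load at the cost of a one-slot delay for two signals.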

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:uu-515095
Date January 2023
Creators Axelsson, Adam
Publisher Uppsala universitet, Avdelningen för systemteknik
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess
Relation UPTEC IT, 1401-5749 ; 23040