71. Analysis of LTE Radio Frame by eliminating Cyclic Prefix in OFDM and comparison of QAM and Offset-QAM. Selvakumar, Vinodhkumar; Nemalladinne, Samuel; Arumugam, Premkumar. January 2012.
Spectral efficiency is a key factor in the development of future wireless communications. Orthogonal Frequency Division Multiple Access (OFDMA) is the multiple access technology used at the physical layer of the latest wireless communication standards, and any improvement that overcomes a weakness of the present system is a candidate for future systems. Long Term Evolution (LTE), one of the 4th-generation wireless technologies, is taken as the reference system in this thesis. The main concern of this thesis is to analyze the LTE radio frame. We designed and simulated an OFDM system with a cyclic prefix, verified its Bit Error Rate (BER) over a range of Signal-to-Noise Ratio (SNR) values, and then investigated the OFDM system with the cyclic prefix removed. Eliminating the cyclic prefix improves bandwidth efficiency, although keeping it brings other advantages; filter banks are used to compensate for those advantages when the cyclic prefix is removed. Introducing an offset in QAM results in less distortion and smaller amplitude fluctuations. We designed, simulated and compared QAM with Offset-QAM, and their BER versus SNR curves were verified through MATLAB simulations.
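As a rough illustration of the kind of comparison described above, the sketch below builds a small OFDM link in Python/NumPy (not the authors' MATLAB code) and measures BER with and without a cyclic prefix over an AWGN channel. The 64-subcarrier FFT and 16-sample prefix are illustrative sizes, not LTE parameters; over a flat AWGN channel the two cases perform alike, and the prefix's benefit would only show up once a multipath channel and equalization are added.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FFT = 64       # subcarriers per OFDM symbol (illustrative, not the LTE value)
CP_LEN = 16      # cyclic prefix length in samples (illustrative)
N_SYMBOLS = 200  # OFDM symbols per SNR point

def qam4_mod(bits):
    # Map bit pairs to Gray-coded 4-QAM (QPSK) symbols with unit average power.
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def qam4_demod(sym):
    # Hard decision back to bits.
    out = np.empty((sym.size, 2), dtype=int)
    out[:, 0] = (sym.real < 0).astype(int)
    out[:, 1] = (sym.imag < 0).astype(int)
    return out.reshape(-1)

def ofdm_ber(snr_db, use_cp=True):
    bits = rng.integers(0, 2, size=N_SYMBOLS * N_FFT * 2)
    syms = qam4_mod(bits).reshape(N_SYMBOLS, N_FFT)
    tx = np.fft.ifft(syms, axis=1) * np.sqrt(N_FFT)      # unit-power time samples
    if use_cp:
        tx = np.hstack([tx[:, -CP_LEN:], tx])            # prepend cyclic prefix
    # AWGN channel only; CP gains appear once multipath is added.
    noise_var = 10 ** (-snr_db / 10)
    noise = np.sqrt(noise_var / 2) * (rng.standard_normal(tx.shape)
                                      + 1j * rng.standard_normal(tx.shape))
    rx = tx + noise
    if use_cp:
        rx = rx[:, CP_LEN:]                              # strip cyclic prefix
    rx_syms = np.fft.fft(rx, axis=1) / np.sqrt(N_FFT)
    return np.mean(qam4_demod(rx_syms.reshape(-1)) != bits)

for snr in (0, 5, 10):
    print(snr, "dB  with CP:", ofdm_ber(snr, True), " without CP:", ofdm_ber(snr, False))
```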
72. Frequency Domain Link Adaptation for OFDM-based Cellular Packet Data. Ruberg, Anders. January 2006.
In order to be competitive with emerging mobile systems and to satisfy the ever-growing demand for higher data rates, the 3G consortium, the 3rd Generation Partnership Project (3GPP), is currently developing concepts for a long term evolution (LTE) of the 3G standard. The LTE concept at Ericsson is based on Orthogonal Frequency Division Multiplexing (OFDM) as the downlink air interface. OFDM enables the use of frequency domain link adaptation, i.e., selecting the most appropriate transmission parameters according to current channel conditions, in order to maximize the throughput and keep the delay at a desired level. The purpose of this thesis work is to study, implement and evaluate different link adaptation algorithms. The main focus is on modulation adaptation, where the differences in performance between time domain and frequency domain adaptation are investigated. The simulations in this thesis are run with a simulator developed at Ericsson. Simulations show in general that the cell throughput is enhanced by an average of 3% when using frequency domain modulation adaptation. When using the implemented frequency domain power allocation algorithm, an average gain of 23-36% is seen in the users' 5th-percentile throughput. It should be noted that the simulations use a realistic web traffic model, which makes the channel quality estimation (CQE) difficult. The CQE has a great impact on the performance of frequency domain adaptation. Throughput improvements are expected when using an improved CQE or interference avoidance schemes. The gains with frequency domain adaptation shown in this thesis may be too small to motivate the extra signalling overhead required. The complexity of the implemented frequency domain power allocation algorithm is also very high compared to the performance enhancement seen.
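The core idea of frequency domain modulation adaptation can be shown with a small sketch. The Python snippet below is not the Ericsson simulator; it simply picks the highest-order modulation that each subband's SNR supports and compares the result with picking one modulation for the whole band from the average SNR. The modulation set and SNR thresholds are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative modulation set: (bits per symbol, minimum SNR in dB to use it).
MOD_SET = [(0, float("-inf")), (2, 5.0), (4, 11.0), (6, 17.0)]  # off, QPSK, 16QAM, 64QAM
THRESHOLD = {bits: thr for bits, thr in MOD_SET}

def pick_modulation(snr_db):
    # Highest-order modulation whose SNR threshold is met.
    return max(bits for bits, thr in MOD_SET if snr_db >= thr)

def bits_per_use(subband_snr_db, frequency_domain=True):
    if frequency_domain:
        # Frequency domain adaptation: each subband gets its own modulation.
        return sum(pick_modulation(s) for s in subband_snr_db)
    # Time domain adaptation: one modulation chosen from the average SNR is applied
    # to every subband; subbands that cannot support it contribute nothing here.
    mod = pick_modulation(float(np.mean(subband_snr_db)))
    return sum(mod for s in subband_snr_db if s >= THRESHOLD[mod])

# A frequency-selective channel snapshot: per-subband SNRs spread around 10 dB.
snrs = 10 + 6 * rng.standard_normal(12)
print("frequency domain adaptation:", bits_per_use(snrs, True), "bits per channel use")
print("time domain adaptation:     ", bits_per_use(snrs, False), "bits per channel use")
```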
73. Network Capacity, Coverage Estimation and Frequency Planning of 3GPP Long Term Evolution. Zhang, Liang. January 2010.
The recent increase in mobile data usage and the emergence of new applications such as online gaming, mobile TV, Web 2.0 and streaming content have greatly motivated the 3rd Generation Partnership Project (3GPP) to work on the Long Term Evolution (LTE). LTE is the latest standard in the mobile network technology tree. It inherits and develops the GSM/EDGE and UMTS/HSPA network technologies and is a step toward the 4th generation (4G) of radio technologies, designed to increase the capacity and speed of 3G mobile communication networks. In this thesis, the LTE system capacity and coverage are investigated and a model is proposed based on Release 8 of the 3GPP LTE standards. After that, the frequency planning of LTE is also studied. The results cover the interference-limited coverage calculation, the traffic capacity calculation and radio frequency assignment. The implementation is carried out on the WRAP software platform for LTE radio planning.
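To make the coverage side of such a study concrete, the sketch below runs a simplified downlink link budget and turns the maximum allowed path loss into a cell range and a site count. It is a stand-in for the WRAP-based planning described in the thesis: the transmit power, noise figure, margins and the 128.1 + 37.6*log10(d) propagation model are generic assumed values, not the thesis's inputs.

```python
import math

# Downlink link budget with illustrative values (not the parameters used in the thesis).
tx_power_dbm        = 46.0   # eNodeB transmit power
tx_antenna_gain_dbi = 15.0
ue_noise_figure_db  = 7.0
bandwidth_hz        = 10e6
required_snr_db     = -1.0   # SNR needed at the cell edge for the lowest MCS
interference_margin = 3.0    # dB, accounts for inter-cell interference rise
shadow_margin_db    = 8.0    # log-normal shadowing margin

thermal_noise_dbm = -174 + 10 * math.log10(bandwidth_hz)
noise_floor_dbm = thermal_noise_dbm + ue_noise_figure_db + interference_margin
rx_sensitivity_dbm = noise_floor_dbm + required_snr_db
max_path_loss_db = (tx_power_dbm + tx_antenna_gain_dbi
                    - rx_sensitivity_dbm - shadow_margin_db)
print(f"maximum allowed path loss: {max_path_loss_db:.1f} dB")

# Simple propagation model PL(d) = 128.1 + 37.6*log10(d_km), a common macro-cell
# assumption around 2 GHz; invert it to get the cell range.
cell_range_km = 10 ** ((max_path_loss_db - 128.1) / 37.6)
print(f"estimated cell range: {cell_range_km:.2f} km")

# Site count for a target area, approximating each site by one hexagonal cell
# whose circumradius equals the range (regular hexagon area = 3*sqrt(3)/2 * R^2).
area_km2 = 100.0
cell_area_km2 = 3 * math.sqrt(3) / 2 * cell_range_km ** 2
print(f"sites needed for {area_km2:.0f} km^2: {math.ceil(area_km2 / cell_area_km2)}")
```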
74. Investigation and development of a verification protocol test tool for LTE. Georgis, Jolian. January 2011.
At Ericsson's IoDT test department, testing tools are used to control the traffic between the UE and the eNB and to detect typical problems. Storing these problems in a log file helps mobile vendors eliminate them by developing fixes for future products. The IoDT group needed the test results processed further, into a more specific form. The purpose of this thesis is to develop the LogTool so that the IoDT testing group can read the log files more easily; the results extracted from the log files simplify the group's work. To achieve this, the LogTool was extended with a new Analyzer that parses the log file and presents the results in a more readable way. The Analyzer was written in Java as an Eclipse RCP plug-in. The new Analyzer prints the results in a simpler form for the CQI and MCS tests. It was designed and implemented in a standardized way that allows it to be extended in the future without losing its functionality, in line with the requirements of Ericsson and Linköping University. This document guides the reader through the steps used to accomplish the project, with illustrative figures, methods and code examples, to give a closer view of the work.
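The flavor of such an analyzer can be sketched in a few lines. The snippet below is written in Python rather than the thesis's Java/Eclipse RCP code, and the log line format, field names and regular expression are invented for illustration; the real log format used at Ericsson is not shown in the abstract.

```python
import re
from collections import defaultdict

# Hypothetical log line format; this regex and the field names are purely illustrative.
LINE_RE = re.compile(r"ue=(?P<ue>\d+)\s+cqi=(?P<cqi>\d+)\s+mcs=(?P<mcs>\d+)")

def analyze(log_lines):
    """Collect per-UE CQI and MCS samples from log lines and print simple averages."""
    samples = defaultdict(lambda: {"cqi": [], "mcs": []})
    for line in log_lines:
        match = LINE_RE.search(line)
        if not match:
            continue  # skip lines that do not carry CQI/MCS information
        samples[match["ue"]]["cqi"].append(int(match["cqi"]))
        samples[match["ue"]]["mcs"].append(int(match["mcs"]))
    for ue, s in sorted(samples.items()):
        mean_cqi = sum(s["cqi"]) / len(s["cqi"])
        mean_mcs = sum(s["mcs"]) / len(s["mcs"])
        print(f"UE {ue}: mean CQI {mean_cqi:.1f}, mean MCS {mean_mcs:.1f} "
              f"({len(s['cqi'])} samples)")

analyze([
    "12:00:01 ue=1 cqi=9 mcs=15",
    "12:00:02 ue=1 cqi=11 mcs=17",
    "12:00:02 ue=2 cqi=5 mcs=8",
])
```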
75. LTE Uplink Modeling and Channel Estimation. Ahmed, Mohsin Niaz. January 2011.
This master's thesis investigates the uplink transmission from the User Equipment (UE) to the base station in LTE (Long Term Evolution), and channel estimation using pilot symbols with parameters defined in the 3GPP (3rd Generation Partnership Project) specifications. The purpose of the thesis was to implement a simulator that can generate the uplink signal as it is generated by the UE. LTE is the name given to the long-term evolution of the Third Generation (3G) mobile system. This thesis focuses on the LTE uplink, where single-carrier frequency division multiple access (SC-FDMA) is used as the multiple access technique. Its advantage over orthogonal frequency division multiple access (OFDMA), which is used in the downlink, is better peak power characteristics; in the uplink, a lower peak-to-average power ratio gives better power efficiency in mobile terminals. To assess the performance of uplink transmission, realistic channel models for the wireless communication system are essential. The channel models used are those proposed by the International Telecommunication Union (ITU), and correct knowledge of these models is important for testing, optimization and performance improvement of signal processing algorithms. The channel estimation techniques used are Least Squares (LS) and Linear Minimum Mean Square Error (LMMSE) for the different channel models. The performance of these algorithms has been measured in terms of Bit Error Rate (BER) versus Signal-to-Noise Ratio (SNR).
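A compact way to see the difference between the two estimators is the Python/NumPy sketch below. It is not the thesis simulator: the subcarrier count, comb-type pilot pattern and 8-tap channel are assumed values, and the LMMSE filter uses the channel correlation implied by that assumed power delay profile.

```python
import numpy as np

rng = np.random.default_rng(2)

N = 64                        # subcarriers (illustrative, not an LTE grid)
pilots = np.arange(0, N, 4)   # comb-type pilot positions (illustrative)
L = 8                         # number of channel taps (illustrative)
snr_db = 10
noise_var = 10 ** (-snr_db / 10)

# Random frequency-selective channel with a uniform power delay profile.
h_time = (rng.standard_normal(L) + 1j * rng.standard_normal(L)) / np.sqrt(2 * L)
H = np.fft.fft(h_time, N)

# Received pilots: unit-power pilot symbols (all ones) through the channel plus noise.
x = np.ones(N, dtype=complex)
noise = np.sqrt(noise_var / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
y = H * x + noise

# Least-squares estimate at the pilot positions: H_LS = Y / X.
H_ls = y[pilots] / x[pilots]

# LMMSE smoothing of the LS estimate: W = R_hh (R_hh + sigma^2 I)^-1, where R_hh is
# the channel correlation across pilot subcarriers implied by the power delay profile.
taps = np.arange(L)
P = np.full(L, 1 / L)                              # tap powers, matching h_time above
df = pilots[:, None] - pilots[None, :]
R_hh = (P * np.exp(-2j * np.pi * df[:, :, None] * taps / N)).sum(axis=2)
W = R_hh @ np.linalg.inv(R_hh + noise_var * np.eye(len(pilots)))
H_lmmse = W @ H_ls

print("LS MSE:   ", np.mean(np.abs(H_ls - H[pilots]) ** 2))
print("LMMSE MSE:", np.mean(np.abs(H_lmmse - H[pilots]) ** 2))
```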
76. A distributed, load-aware, power and frequency bargaining protocol for LTE-based networks. Sajid, Muhammad; Siddiqui, Imran. January 2012.
In this thesis a distributed, dynamic, load-aware, joint power and frequency allocation protocol for 4G networks is presented, along with system-level simulation results. In all cellular networks, a key limiting factor for throughput is inter-cell interference, especially at the cell edges. Several methods have been proposed and adopted in each mobile network generation to cancel or suppress its effects, but each method has its drawbacks in terms of receiver complexity or additional control nodes. The protocol presented here does not impose any architectural changes. In 4G networks such as LTE, the choice of OFDMA for the air interface has paved the way for selective frequency and power allocation in the available spectrum. Taking advantage of this, fractional frequency reuse (FFR) has been proposed in OFDMA-based mobile networks in order to reduce the throughput loss at the cell edges due to inter-cell interference. In FFR, center users lose the part of the available spectrum that is dedicated to the edge users. Our protocol aims to minimize this loss for center users incurred by FFR, at the cost of minimal degradation at the edges. An eNodeB, only when overloaded, requests its neighbours' edge band to be used for its center users at a reduced power level. This is done via a small message exchange between the eNodeBs. The neighbours of the overloaded eNodeB solve a small local knapsack problem to decide whether band lending is feasible or not. A distinguishing feature of this protocol is the power level adjustment for the borrowed band, which is mutually decided by the borrower and the lender. The band is released when it is no longer needed or when it causes unacceptable loss to the lender. The implementation is done in a Matlab-based LTE system-level simulator. The starting point for implementing our protocol in the simulator was the implementation of FFR-3 functionality, a prerequisite and the baseline for comparison with our protocol. Results are compared among three setups, reuse-1, FFR-3 and our protocol, by varying the number of overloaded eNodeBs across several scenarios; the comparison is based on the center users' and edge users' throughput. An estimation of the time and protocol overhead is also presented. We have observed center users' throughput gains of up to 46%, at the cost of a 9% edge users' throughput loss, when compared to the classic FFR-3 scheme. The overall system throughput gain reaches 26% in the heavily loaded scenario. The utility of the protocol for an LTE system is evident from the results and is supported by the dynamic and decentralized nature of the protocol, which ensures better utilization of the spectrum by temporarily allocating more bandwidth where it is needed most.
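The abstract does not spell out the lender's knapsack formulation, so the sketch below is only one plausible reading of it: the lender checks whether, after giving up the requested resource blocks, it can still pack enough of its own edge users' demands into the remaining edge band. The demand figures, the 85% service threshold and the feasibility rule are assumptions for illustration, not the thesis's actual criteria.

```python
def knapsack(values, weights, capacity):
    """Classic 0/1 knapsack by dynamic programming; returns the best achievable value."""
    best = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

def lending_feasible(edge_user_demands, edge_band_rbs, lend_rbs, min_served_share=0.85):
    """Decide whether lending `lend_rbs` resource blocks from the edge band is acceptable.

    The lender keeps (edge_band_rbs - lend_rbs) RBs and checks how much of its own
    edge demand (value = weight = RBs per user) still fits; lending is accepted only
    if the served share of edge demand stays above `min_served_share`.
    """
    remaining = edge_band_rbs - lend_rbs
    if remaining < 0:
        return False
    served = knapsack(edge_user_demands, edge_user_demands, remaining)
    return served >= min_served_share * sum(edge_user_demands)

# Example: a 25-RB edge band, edge users needing 3 to 7 RBs each.
demands = [3, 5, 7, 4, 6]
print("lend 2 RBs :", lending_feasible(demands, edge_band_rbs=25, lend_rbs=2))
print("lend 10 RBs:", lending_feasible(demands, edge_band_rbs=25, lend_rbs=10))
```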
77. Time and Frequency Synchronization and Cell Search in 3GPP LTE. Ke, Hung-Shiun. 05 August 2011.
Long Term Evolution (LTE), developed by the Third Generation Partnership Project (3GPP), is expected to be the standard for the fourth generation (4G) of wireless communication systems. LTE supports Frequency Division Duplex (FDD) and Time Division Duplex (TDD), and both are based on an Orthogonal Frequency Division Multiplexing (OFDM) system in the downlink. OFDM systems are sensitive to timing and frequency offsets; therefore, synchronization plays an important role in OFDM systems.
In this thesis, we study the synchronization problems of an LTE FDD baseband receiver. In particular, we develop a complete procedure to deal with these problems. The basic design concept and procedure are as follows: the receiver estimates and compensates the timing and frequency offset by using the repetition property of the cyclic prefix. Meanwhile, the receiver also detects the cyclic prefix mode (i.e., the length of the cyclic prefix). After the frequency offset has been compensated, the receiver performs cell search. To this end, we multiply each subcarrier by the synchronization sequence provided by the LTE specification and transform the result into the time domain. We then estimate the channel energy in the time domain to detect the Cell Identity (Cell ID). Using computer simulations, we verify that the designed receiver performs well.
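The cyclic prefix correlation step can be illustrated with the short sketch below. It builds one OFDM symbol, applies a fractional carrier frequency offset, and recovers both the symbol timing and the offset from the correlation between the prefix and the matching samples one FFT length later. The FFT size, prefix length, offset and noise levels are illustrative rather than LTE values, and the cell search stage is not included.

```python
import numpy as np

rng = np.random.default_rng(3)

N_FFT, CP = 128, 9      # illustrative sizes, not the LTE FFT or prefix lengths
cfo = 0.13              # fractional carrier frequency offset, in subcarrier spacings
start = 37              # true start of the OFDM symbol within the captured samples

def pad(n):
    # Low-power filler samples around the symbol of interest.
    return 0.1 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# One OFDM symbol with a cyclic prefix, embedded between filler segments.
X = (rng.standard_normal(N_FFT) + 1j * rng.standard_normal(N_FFT)) / np.sqrt(2)
x = np.fft.ifft(X) * np.sqrt(N_FFT)
symbol = np.concatenate([x[-CP:], x])
r = np.concatenate([pad(start), symbol, pad(50)])
r = r * np.exp(2j * np.pi * cfo * np.arange(r.size) / N_FFT)     # apply the CFO
r = r + 0.05 * (rng.standard_normal(r.size) + 1j * rng.standard_normal(r.size))

# Cyclic prefix correlation: the prefix repeats exactly N_FFT samples later, so
# correlate r[d .. d+CP) with r[d+N_FFT .. d+N_FFT+CP) for every candidate start d.
metric = np.array([np.sum(r[d:d + CP] * np.conj(r[d + N_FFT:d + N_FFT + CP]))
                   for d in range(r.size - N_FFT - CP)])
d_hat = int(np.argmax(np.abs(metric)))            # timing estimate (metric peak)
cfo_hat = -np.angle(metric[d_hat]) / (2 * np.pi)  # fractional CFO from the peak phase

print("timing estimate:", d_hat, "(true:", start, ")")
print("CFO estimate:   ", round(float(cfo_hat), 3), "(true:", cfo, ")")
```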
78. HARQ Packet Scheduling Based on RTT Estimation in LTE Networks. Li, Yi-Wei. 15 February 2012.
In an LTE (Long-Term Evolution) network, HARQ (Hybrid Automatic Repeat reQuest) is used to reduce the error probability of retransmitted packets. However, HARQ cannot guarantee the delay constraints of real-time traffic when RBs (Resource Blocks) are allocated improperly. To keep retransmitted real-time packets from exceeding their delay constraints, we propose an HARQ scheduling scheme based on RTT (Round-Trip Time) estimation. In this scheme, traffic is classified into real-time and non-real-time queues, and the real-time queue is further divided into four sub-queues according to the number of retransmissions: the first-transmission queue, the first-retransmission queue, the second-retransmission queue, and the third-retransmission queue. For the four real-time queues, we estimate the RTT and compute the number of RBs required to satisfy the delay constraints. To prevent starvation of non-real-time traffic, after allocating RBs for real-time traffic, the remaining RBs are allocated to non-real-time traffic according to their MBR (Minimum Bit Rate). To analyze the proposed scheduling scheme, we build a mathematical model to derive the success probability of retransmitted packets and the expected number of packet retransmissions. Finally, we compute the average packet delay, average packet loss rate, and throughput for both real-time and non-real-time traffic while varying the packet error probability and the delay constraints of real-time traffic.
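As a rough sketch of the allocation order such a scheme implies, the snippet below serves the real-time HARQ sub-queues first (retransmissions before first transmissions, since they are closest to their delay bound) and hands the leftover RBs to non-real-time users. The priority order, the demand numbers and the round-robin sharing are assumptions for illustration; the thesis's RTT-based computation of the required RBs is not reproduced here.

```python
def allocate_rbs(total_rbs, rt_queue_demands, nrt_demands):
    """Allocate RBs for one TTI: real-time HARQ sub-queues first, highest
    retransmission count first, then leftover RBs to non-real-time users.

    rt_queue_demands: RBs needed by [1st tx, 1st retx, 2nd retx, 3rd retx] queues.
    nrt_demands: RBs requested by each non-real-time user.
    """
    allocation = {"rt": [0, 0, 0, 0], "nrt": [0] * len(nrt_demands)}
    left = total_rbs
    # Serve retransmission queues before first transmissions.
    for q in (3, 2, 1, 0):
        grant = min(rt_queue_demands[q], left)
        allocation["rt"][q] = grant
        left -= grant
    # Remaining RBs go to non-real-time users, round-robin one RB at a time.
    while left > 0 and any(allocation["nrt"][i] < nrt_demands[i]
                           for i in range(len(nrt_demands))):
        for i, want in enumerate(nrt_demands):
            if left == 0:
                break
            if allocation["nrt"][i] < want:
                allocation["nrt"][i] += 1
                left -= 1
    return allocation, left

alloc, spare = allocate_rbs(50, rt_queue_demands=[20, 10, 6, 4], nrt_demands=[8, 12])
print(alloc, "unused RBs:", spare)
```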
79. Investigation of Automated Terminal Interoperability Test / Undersökning av automatiserad interoperabilitetstest av mobila terminaler. Brammer, Niklas. January 2008.
In order to develop and secure the functionality of its cellular communication systems, Ericsson runs numerous R&D and I&V activities. One important aspect is interoperability with mobile terminals from different vendors on the world market. Ericsson therefore co-operates with mobile platform and user equipment manufacturers. These companies visit the interoperability development testing (IoDT) laboratories in Linköping to test their development products and prototypes in order to certify compliance with Ericsson's products. The knowledge exchange is mutual: Ericsson as well as the user equipment manufacturers benefit from the co-operation.

The goal of this master's thesis, performed at Ericsson AB, is to suggest areas in which the IoDT testing can be automated in order to minimize time-consuming and tedious work tasks. The search is primarily aimed at replacing manual tasks in use today.

The thesis suggests a number of IoDT tasks that might be subject to automation, and one of these is chosen for implementation. The task chosen is network verification after the base station controller software upgrade procedure. This is not a core IoDT function, but it implies a lot of work and is performed often.

The automation project is also intended to act as a springboard for future automation within IoDT. The forthcoming LTE standard will require a lot of IoDT testing, and therefore the automation capabilities should be investigated. The thesis shows that automation work is possible and that the start-up process is straightforward. Existing tools are easy to use and well supported. The automated network verification test scope has been successful.
80. Channel Quality Information Reporting and Channel Quality Dependent Scheduling in LTE. Eriksson, Erik. January 2007.
Telecommunication systems are under constant development. Currently 3GPP is working on an evolution of the 3G standard, under the name 3G Long Term Evolution (LTE). Among the goals are higher throughput and higher peak bit rates. A crucial part of achieving the higher performance is channel dependent scheduling (CDS), i.e., assigning users when they have favorable channel conditions. Channel dependent scheduling demands accurate and timely channel quality reports. These channel quality indication (CQI) reports can take up a large part of the allocated uplink. This thesis focuses on the potential gains from channel dependent scheduling in contrast to the loss in uplink capacity due to reporting overhead.

System simulations show that the gain from channel dependent scheduling is substantial but highly dependent on the cell layout. The gain from CDS in both frequency and time, compared with CDS in the time domain only, is also large, around 20%. With a full uplink there can still be a considerable gain in downlink performance if a large overhead is used for channel quality reports. This comes at a loss in uplink performance, and if the uplink becomes too limited it will severely affect both uplink and downlink performance.

How to schedule and transmit CQI reports is also considered. A suggested technique is to transmit the CQI reports together with uplink data. With a web traffic model, simulations show that a high uplink load is required to get the reports often enough. The overhead also becomes unnecessarily large if the report size only depends on the allocated capacity.
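To illustrate what channel dependent scheduling means in practice, the sketch below compares round-robin, max-CQI and proportional fair scheduling over a toy set of per-user, per-RB rates drawn from a random fading-like distribution. The rate distribution, user count and averaging constant are assumptions for illustration, not the thesis simulator's settings; the point is only that exploiting per-RB channel quality raises cell throughput, while proportional fair trades some of that gain for fairness.

```python
import numpy as np

rng = np.random.default_rng(4)

N_USERS, N_RBS, N_TTIS = 8, 25, 500

def simulate(scheduler, tc=50.0):
    """Run a toy downlink scheduler and return the total bits served per user."""
    avg = np.full(N_USERS, 1e-3)     # smoothed served rate, avoids division by zero
    total = np.zeros(N_USERS)
    for _ in range(N_TTIS):
        # Achievable rate per user and RB, standing in for what CQI reports convey.
        rate = rng.gamma(2.0, 1.0, size=(N_USERS, N_RBS))
        tti_served = np.zeros(N_USERS)
        for rb in range(N_RBS):
            if scheduler == "round_robin":
                u = rb % N_USERS
            elif scheduler == "max_cqi":
                u = int(np.argmax(rate[:, rb]))
            else:  # "pf": proportional fair, instantaneous rate over average rate
                u = int(np.argmax(rate[:, rb] / avg))
            tti_served[u] += rate[u, rb]
        total += tti_served
        avg = (1 - 1 / tc) * avg + (1 / tc) * tti_served
    return total

for sched in ("round_robin", "max_cqi", "pf"):
    served = simulate(sched)
    print(f"{sched:12s} cell throughput {served.sum():9.0f}  worst user {served.min():7.0f}")
```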