One of the key technologies enabling Long-Term Evolution (LTE) is Orthogonal Frequency Division Multiplexing (OFDM). OFDM combines modulation and multiplexing and is a particular case of Frequency Division Multiplexing (FDM). In practice, the performance of a wireless communication system is governed mainly by the wireless channel, which is dynamic and unpredictable and therefore difficult to analyze exactly. This is why channel estimation is one of the most important aspects of an OFDM system.
In LTE downlink (DL) transmission, the user equipment (UE) feeds back quantized channel state information (CSI), whereas in uplink (UL) transmission the base station (BS) can estimate the UE's channel directly. To work with non-quantized channel estimates, we restrict ourselves to the UL. The BS then performs link adaptation, which introduces a delay owing to the time the physical layer needs to process the information.
The study revealed that the main drop in throughput occurs at delays of 1 and 2 Transmission Time Intervals (TTIs), while 3GPP noted a time delay of 5 TTIs.
Our aim is to mitigate the effect of this time delay and thereby increase the throughput of 4G LTE at relatively high UE velocities. This study therefore employs Support Vector Machines (SVMs) for regression, interpolating between pilot symbols for channel estimation and extrapolating to predict the channel state of future frames. Our proposed method was implemented in the Vienna LTE Advanced Uplink Link Level Simulator, developed by the Vienna University of Technology (TU Wien), which provides a realistic LTE environment with realistic channel conditions.
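The idea of SVM regression over pilot symbols can be sketched as follows. This is a minimal illustration, not the paper's implementation: the pilot spacing, the toy fading model, and the SVR hyperparameters (`C`, `epsilon`, `gamma`) are all assumptions chosen for demonstration. Since standard SVR handles real-valued targets, the complex channel gain is fitted as two separate regressions on its real and imaginary parts, a common workaround.

```python
import numpy as np
from sklearn.svm import SVR  # epsilon-SVR with RBF kernel

rng = np.random.default_rng(0)

n_sym = 64                            # symbols in the current slot (illustrative)
pilot_idx = np.arange(0, n_sym, 8)    # hypothetical pilot positions

# Toy slowly varying complex channel gain, observed noisily at the pilots
t = np.arange(n_sym)
h_true = np.exp(1j * 2 * np.pi * 0.01 * t) * (1 + 0.1 * np.sin(2 * np.pi * t / 40))
noise = 0.01 * (rng.standard_normal(pilot_idx.size)
                + 1j * rng.standard_normal(pilot_idx.size))
h_pilot = h_true[pilot_idx] + noise

X = pilot_idx.reshape(-1, 1).astype(float)

def fit_svr(y):
    # Hyperparameters are illustrative, not tuned values from the paper
    model = SVR(kernel="rbf", C=10.0, epsilon=0.005, gamma=0.01)
    model.fit(X, y)
    return model

svr_re = fit_svr(h_pilot.real)
svr_im = fit_svr(h_pilot.imag)

# Interpolate across the slot and extrapolate 8 symbols into the future
query = np.arange(n_sym + 8).reshape(-1, 1).astype(float)
h_hat = svr_re.predict(query) + 1j * svr_im.predict(query)
```

The same fitted regressors serve both purposes described above: predictions at data-symbol indices inside the slot are the interpolated channel estimates, while predictions beyond `n_sym` extrapolate the channel state, compensating for the link-adaptation delay.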
This research was published and presented at the 20th International ITG Workshop on Smart Antennas on 11 March 2016 in Munich, Germany.
23 May 2016