
The LTE Radio Interface - Key Characteristics and Performance

Anders Furuskär, Tomas Jönsson, and Magnus Lundevall, Ericsson Research, Sweden
Abstract - Mobile broadband usage is taking off, demanding improved services and increased capacity of mobile networks. To meet these requirements, 3GPP has defined LTE (the 3GPP Long Term Evolution). This paper presents some key characteristics of the LTE radio interface, including physical layer and radio resource management functions, and evaluates their impact on system performance. Compared to a reference system with more basic characteristics, represented by Mobile WiMAX, results point to a combined gain in spectrum efficiency of 60% in downlink and 100% in uplink. Cell-edge bitrate gains are about 100% in both downlink and uplink. A closer analysis of the individual system characteristics indicates that these performance differences are due to rather uniform contributions from a set of distinctive features.

Index Terms - LTE, Performance, WiMAX

I. INTRODUCTION

Usage of mobile broadband services, supported by the introduction of High Speed Packet Access (HSPA), is taking off. To meet the future increased demand for such services, corresponding improvements in the supply of services are required, including higher bit rates, lower delays, and higher capacity. This is the target of 3GPP's two radio access networks HSPA and LTE [1], of which the latter is the focus of this paper.

LTE brings unprecedented performance. Examples include peak data rates exceeding 300 Mbps, delays below 10 ms, and manifold spectrum efficiency gains over early 3G system releases. Further, LTE can be deployed in new and existing frequency bands, has a flat architecture with few nodes, and facilitates simple operation and maintenance. While targeting a smooth evolution from legacy 3GPP and 3GPP2 systems, LTE also constitutes a major step towards IMT-Advanced systems. In fact, LTE includes many of the features originally considered for future fourth-generation systems.

General LTE concept descriptions are available in [1]. In this paper, the focus is on key characteristics of the LTE radio interface. A set of such key characteristics is both qualitatively discussed and quantitatively evaluated in terms of downlink and uplink user data rates and spectrum efficiency, generated by means of system-level simulations. For reference, the LTE characteristics are compared to more conventional solutions, represented by corresponding functionalities in Mobile WiMAX with Partial Usage of SubChannels (PUSC) [2].

The paper is outlined as follows: after an introduction to the basic structure of the LTE radio interface in Section II, Section III provides a qualitative discussion of distinctive features of the evaluated system concepts and their impact on performance. Models and assumptions are summarized in Section IV, followed by numerical results in Section V. Finally, a summary is provided in Section VI.

II. AN OVERVIEW OF THE LTE RADIO INTERFACE

Comprehensive descriptions of the LTE radio interface are available in [1]. In short, LTE is based on Orthogonal Frequency-Division Multiplexing (OFDM). The numerology includes a subcarrier spacing of 15 kHz, support for bandwidths up to 20 MHz, and a resource allocation granularity of 180 kHz x 1 ms (a so-called resource block pair). In the uplink, a precoder is used to limit peak-to-average power ratios, and thereby reduce terminal complexity. Based on channel quality, modulation (up to 64QAM) and channel coding rates are dynamically selected. FDD, TDD, and half-duplex FDD are all supported. A variety of antenna concepts targeting different scenarios is included: transmit diversity for improved robustness of control channels, beamforming for improved channel quality in general, and multi-stream (MIMO) transmission for improved data rates in scenarios with good channel quality. On the MAC layer, dynamic scheduling is done on a resource block pair basis, based on QoS parameters and channel quality. Retransmissions are handled with two loops: a fast inner loop taking care of most errors, complemented by a very robust outer loop for residual errors.

III. KEY LTE CHARACTERISTICS

Some of the more fundamental features discussed in the previous section are not unique to LTE. For example, OFDM, multi-antenna transmission, and adaptive modulation and coding are standard techniques used by many systems. On a more detailed level, however, LTE distinguishes itself by using more sophisticated solutions than other systems. A list of such characteristics is presented in Table I. For reference, the corresponding solutions used in more basic systems, here represented by Mobile WiMAX Wave 2, are also listed. It should be noted that there are several other features differing between these systems which are not listed, e.g. control signaling robustness, higher layer overhead, and mobility aspects.
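As a concrete illustration of the numerology in Section II, the following minimal Python sketch estimates the number of resource block pairs in a carrier and a rough peak data rate. It is not from the paper: the 90% bandwidth utilization, the 12 x 14 resource-element grid, and the 25% overhead factor are illustrative assumptions.

```python
SUBCARRIER_SPACING_HZ = 15e3
RB_BANDWIDTH_HZ = 180e3   # 12 subcarriers x 15 kHz
SUBFRAME_S = 1e-3         # a resource block pair spans 1 ms

def resource_blocks(carrier_bw_hz: float, utilization: float = 0.9) -> int:
    """Resource blocks fitting in the carrier; guard bands are
    approximated by a ~90% utilization factor (assumption)."""
    return int(carrier_bw_hz * utilization // RB_BANDWIDTH_HZ)

def rough_peak_rate_bps(carrier_bw_hz: float, layers: int = 2,
                        bits_per_symbol: int = 6,    # 64QAM
                        overhead: float = 0.25) -> float:
    """Very rough peak rate: resource elements per second, times 64QAM
    bits, times MIMO layers, minus an assumed 25% overhead for
    reference signals and control."""
    re_per_rb_pair = 12 * 14   # subcarriers x OFDM symbols per 1 ms (assumption)
    re_per_s = resource_blocks(carrier_bw_hz) * re_per_rb_pair / SUBFRAME_S
    return re_per_s * bits_per_symbol * layers * (1 - overhead)

print(resource_blocks(10e6))                       # -> 50 resource blocks
print(rough_peak_rate_bps(20e6, layers=4) / 1e6)   # ~302 Mbps, cf. ">300 Mbps"
```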


TABLE I
LTE KEY CHARACTERISTICS
(one entry per function; the slogan in parentheses is used in Figs. 1-2)

Multiple access (MA):
- LTE: OFDM in DL, DFT-spread OFDM in UL
- Mobile WiMAX Wave 2: OFDM in DL and UL
- Performance impact: DFT-spread OFDM reduces the peak-to-average power ratio and thereby terminal complexity; it requires a one-tap equalizer in the base station receiver.

Uplink power control (PC):
- LTE: Fractional pathloss compensation
- Mobile WiMAX Wave 2: Full pathloss compensation
- Performance impact: Fractional pathloss compensation enables a flexible trade-off between average and cell-edge data rates.

Scheduling (Scheduling):
- LTE: Channel-dependent in time and frequency domain
- Mobile WiMAX Wave 2: Channel-dependent in time domain
- Performance impact: Access to the frequency domain yields larger scheduling gains.

MIMO scheme (MIMO):
- LTE: Horizontal encoding (multiple codewords), closed loop with precoding
- Mobile WiMAX Wave 2: Vertical encoding (single codeword)
- Performance impact: Horizontal encoding enables per-stream link adaptation and successive interference cancellation (SIC) receivers.

Modulation and coding scheme granularity (MCS):
- LTE: Fine granularity (1-2 dB apart)
- Mobile WiMAX Wave 2: Coarse granularity (2-3 dB apart)
- Performance impact: Finer granularity enables better link adaptation precision.

Hybrid ARQ (HARQ):
- LTE: Incremental redundancy
- Mobile WiMAX Wave 2: Chase combining
- Performance impact: Incremental redundancy is more efficient (lower SNR required for a given error rate).

Frame duration (CQI delay):
- LTE: 1 ms subframes
- Mobile WiMAX Wave 2: 5 ms frames
- Performance impact: Shorter subframes yield lower user-plane delay and reduced channel quality feedback delays.

Overhead / control channel efficiency (OH / CCH eff):
- LTE: Relatively low overhead (while control channels are robust)
- Mobile WiMAX Wave 2: Relatively high overhead
- Performance impact: Lower overhead improves performance.
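As an illustration of the hybrid ARQ row above, the following toy comparison uses the standard mutual-information approximation, which is an assumption of this sketch rather than the paper's link-level model: with k transmissions at linear SNR s, Chase combining accumulates SNR, giving log2(1 + k*s) bits, while incremental redundancy accumulates mutual information, giving k*log2(1 + s) bits, which is never smaller.

```python
import math

def chase(snr_lin: float, k: int) -> float:
    """Chase combining: k identical transmissions accumulate SNR."""
    return math.log2(1 + k * snr_lin)

def incremental_redundancy(snr_lin: float, k: int) -> float:
    """IR: each transmission carries new parity, accumulating information."""
    return k * math.log2(1 + snr_lin)

# At 0 dB (s = 1): Chase 1.00 / 1.58 / 2.32 bits vs. IR 1.00 / 2.00 / 4.00 bits.
for k in (1, 2, 4):
    print(k, round(chase(1.0, k), 2), round(incremental_redundancy(1.0, k), 2))
```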

IV. MODELS AND ASSUMPTIONS

Models and assumptions are aligned with the NGMN recommendations in [3]; Table II contains a brief summary. The evaluation methodology is based on time-dynamic, multi-cell system simulations.

V. NUMERICAL RESULTS

This section presents downlink and uplink user throughput and spectrum efficiency for a selection of system configurations and scenarios. More specifically, the following subsections cover (A) baseline configurations with 2x2 and 1x2 antenna configurations, (B) more advanced multi-antenna configurations, and (C) results for file transfer (non-full-buffer) traffic models.

A. LTE and Mobile WiMAX Baseline Scenario

Downlink user throughput and spectrum efficiency figures for LTE FDD, LTE TDD, and Mobile WiMAX are summarized in Fig. 1. Note that in this special case, as there are on average 10 full-buffer users per sector and the spectrum allocation is 10 MHz, the spectrum efficiency, measured in bps/Hz/sector, and the average user throughput, measured in Mbps, are numerically the same. For the TDD systems, the spectrum efficiency is calculated by down-scaling the denominator (the system bandwidth) with the relative time utilization in the direction in question, measured in data symbols; a sketch of this normalization follows below. Distributions of user throughput normalized with spectrum allocation and TDD utilization are also presented. It is seen that LTE is some 60% better than Mobile WiMAX in the average metrics, and about a factor of two better in cell-edge performance. The reasons for these differences are a combination of the distinctive features presented in Table I.
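As promised above, a minimal sketch of the TDD normalization. The LTE TDD downlink cell throughput value is back-computed from Fig. 1 for illustration, and guard-period details are ignored.

```python
def spectrum_efficiency(cell_tp_bps: float, bw_hz: float,
                        dir_fraction: float = 1.0) -> float:
    """bps/Hz/sector; for TDD the bandwidth denominator is scaled by
    the fraction of time (data symbols) spent in the given direction."""
    return cell_tp_bps / (bw_hz * dir_fraction)

bw = 10e6
dl_fraction_lte_tdd = 4 / (4 + 3)   # from the 4:3 DL:UL split in Table II

print(spectrum_efficiency(9.7e6, bw, dl_fraction_lte_tdd))  # ~1.70, cf. Fig. 1
print(spectrum_efficiency(17.3e6, bw))                      # 1.73 for LTE FDD DL
```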
TABLE II
MODELS AND ASSUMPTIONS

- Traffic model: a) full buffer (10 users per sector) or b) file transfer (100 KB fixed file size) with variable load
- User location: uniform distribution
- Site-to-site distance: 500 m
- Carrier frequency: 2.0 GHz
- Carrier bandwidth: 10 MHz
- Distance-dependent pathloss: L = I + 37.6 log10(R) + P, with R in km, I = 128.1 for 2 GHz, and P = 20 dB penetration loss
- Lognormal shadowing: 8 dB standard deviation, 50 m correlation distance, 0.5 correlation between sites
- Channel model: 3GPP SCM, Urban Macro High Spread (15 deg), extended to 10 MHz
- Terminal speed: 3 km/h
- BS / terminal power: 46 dBm / 23 dBm
- Antenna configurations: BS: 2-4 transmit and receive; terminal: 1 transmit, 2-4 receive
- Scheduler: LTE: DL: proportional fair in time and frequency, UL: quality-based FDM; WiMAX: DL: proportional fair in time domain, UL: FDM
- MIMO: LTE: codebook-based precoded adaptive-rank MIMO; WiMAX: dynamic switching between spatial multiplexing MIMO and STC
- Power control: LTE: open loop with fractional pathloss compensation (α = 0.8), SNR target 10 dB at cell edge; WiMAX: open loop, SNR target 15 dB (full pathloss compensation)
- Receiver type: LTE: MMSE with SIC in DL; WiMAX: MMSE
- TDD asymmetry (DL:UL): LTE 4:3, WiMAX 22:15
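The uplink power control assumptions in Table II can be made concrete with a small sketch of open-loop fractional pathloss compensation, P = min(Pmax, P0 + alpha*PL). The rule itself is the standard open-loop form; the cell-edge distance and the back-computation of P0 (so that the cell-edge terminal transmits at full power) are illustrative assumptions of this sketch, and mapping to the 10 dB SNR target would additionally require noise and interference figures.

```python
import math

def pathloss_db(r_km: float) -> float:
    """Table II model: L = 128.1 + 37.6*log10(R) + 20 dB penetration."""
    return 128.1 + 37.6 * math.log10(r_km) + 20.0

def tx_power_dbm(pl_db: float, alpha: float, p0_dbm: float,
                 pmax_dbm: float = 23.0) -> float:
    return min(pmax_dbm, p0_dbm + alpha * pl_db)

edge_pl = pathloss_db(0.29)      # rough cell edge for 500 m site-to-site distance
p0_frac = 23.0 - 0.8 * edge_pl   # edge user at Pmax with alpha = 0.8 (LTE)
p0_full = 23.0 - 1.0 * edge_pl   # edge user at Pmax with alpha = 1 (full comp.)

for r in (0.05, 0.15, 0.29):
    pl = pathloss_db(r)
    lte = tx_power_dbm(pl, alpha=0.8, p0_dbm=p0_frac)
    full = tx_power_dbm(pl, alpha=1.0, p0_dbm=p0_full)
    print(f"r={r:.2f} km  PL={pl:5.1f} dB  alpha=0.8: {lte:5.1f} dBm  alpha=1: {full:5.1f} dBm")
```

With alpha = 1 all users are received at equal power, favoring the cell edge; with alpha = 0.8 the pathloss of users near the base station is only partially compensated in the receive direction, so they are received stronger and achieve higher data rates, which is the average-versus-cell-edge trade-off named in Table I.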

Fig. 1. Summary of baseline downlink normalized user throughput and spectrum efficiency results, and feature analysis. Mean cell / mean user / cell-edge throughput (bps/Hz): LTE FDD 1.73/0.173/0.052, LTE TDD 1.70/0.170/0.050, Mobile WiMAX TDD 1.06/0.106/0.028. Feature analysis, individual impact on LTE: Scheduling -15%, CQI delay -13%, OH / CCH eff -11%, MIMO precoding -4%, MIMO vertical/no SIC -2%, MCS -2%, HARQ -1%; accumulated, the "LTE WiMAX-like" configuration lands at -40%, next to WiMAX PUSC at -39%.

Fig. 2. Summary of baseline uplink normalized user throughput and spectrum efficiency results, and feature analysis. Mean cell / mean user / cell-edge throughput (bps/Hz): LTE FDD 1.05/0.105/0.049, LTE TDD 0.98/0.098/0.045, Mobile WiMAX TDD 0.43/0.043/0.018. Feature analysis, individual impact on LTE: OH / CCH eff -37%, CQI delay -14%, PC -11%, Scheduling -9%, MCS -6%, HARQ -1%, MA (OFDM) -1%; accumulated, the "LTE WiMAX-like" configuration lands at about -59%, next to WiMAX PUSC at about -60% (0.43 bps/Hz/cell).

The individual impact of each such feature has been assessed by replacing, in the simulations, the LTE functionality with the corresponding WiMAX functionality. The result is shown in the lower bar graph of Fig. 1. The percentage figure to the left represents the individual feature impact, and the percentage figure to the right the accumulated impact of the features combined. It is seen that the total difference is not due to a single distinctive feature, but rather to a combination of distinctive features, headed by frequency domain scheduling, faster channel quality feedback, and control channel efficiency. Note also that when all distinctive features are replaced, the performance is the same, confirming that these features are indeed the reason for the overall difference in performance. A similar analysis can be made for the cell-edge metric.

Similar results for the uplink are summarized in Fig. 2. In this direction, it is seen that LTE is more than a factor of two better than Mobile WiMAX in both average and cell-edge metrics. Also here, the distinctive features jointly make up the total performance difference, led by control channel efficiency (OH), faster channel quality feedback, and more flexible power control. The small difference between LTE FDD and TDD is due to the TDD guard period and differences in channel quality feedback delays.
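The largest single downlink contributor in this analysis, frequency-domain scheduling, can be illustrated with a toy proportional-fair sketch. This is not the paper's simulator: the exponential per-resource-block rates and the frozen throughput averages are illustrative assumptions.

```python
import random

def pf_time_frequency(rates, avg):
    """Per resource block, pick the user maximizing instantaneous/average
    rate; rates[u][rb] is user u's achievable rate on resource block rb."""
    n_rb = len(rates[0])
    return [max(range(len(rates)), key=lambda u: rates[u][rb] / avg[u])
            for rb in range(n_rb)]

def pf_time_only(rates, avg):
    """Pick one user for the whole frame based on its wideband rate."""
    wideband = [sum(r) for r in rates]
    best = max(range(len(rates)), key=lambda u: wideband[u] / avg[u])
    return [best] * len(rates[0])

random.seed(0)
n_users, n_rb = 10, 50
rates = [[random.expovariate(1.0) for _ in range(n_rb)] for _ in range(n_users)]
avg = [1.0] * n_users   # equal throughput history, held fixed for simplicity

tf = pf_time_frequency(rates, avg)
t = pf_time_only(rates, avg)
gain = (sum(rates[u][rb] for rb, u in enumerate(tf)) /
        sum(rates[u][rb] for rb, u in enumerate(t)))
print(f"frequency-selective scheduling gain in this toy run: {gain:.2f}x")
```

Because the time-and-frequency scheduler can ride each user's per-resource-block fading peaks instead of committing the whole frame to one user, it serves more data per frame; the toy run overstates the gain relative to a real deployment, but shows the mechanism behind the -15% scheduling entry in Fig. 1.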

Fig. 3. Summary of downlink results with additional antenna concepts. Mean cell / mean user / cell-edge throughput (bps/Hz): LTE 2x2 1.73/0.173/0.052, LTE 4x2 2.05/0.205/0.063, LTE 4x4 2.82/0.282/0.077.

Fig. 4. Summary of uplink results with additional antenna concepts. Mean cell / mean user / cell-edge throughput (bps/Hz): LTE 1x4 1.52/0.152/0.074, LTE 1x4 MM2 (MU-MIMO, 2 users) 1.68/0.168/0.077, WiMAX 1x4 0.72/0.072/0.032, WiMAX 1x4 MM2 0.86/0.086/0.032.

B. Results with Additional Antenna Concepts

In this section, results with more advanced antenna concepts are presented. Downlink results are summarized in Fig. 3. It is seen that both average and cell-edge performance are improved by using 4x2 and 4x4 MIMO solutions. For the uplink, results with four receive antennas are presented in Fig. 4. In addition to receive diversity results, results for multi-user MIMO, with two users multiplexed, are shown. A significant performance increase is achieved already with 4-branch receive diversity; the additional gain provided by MU-MIMO is smaller. With two receive antennas, the MU-MIMO gains are even smaller.

C. Results for File Transfer Traffic Models

In addition to the full buffer traffic model, evaluations with a file transfer traffic model have also been performed for LTE. For simplicity, a fixed file size of 100 KB is assumed.

Although simple, this model captures a number of realistic phenomena not covered by the full buffer model. These include the equal-buffer effect of users with low data rates dominating the link usage, and the effect of interference variations caused by transmitters switching on and off. The file transfer model also makes it possible to study achievable user data rates under varying load conditions. A number of traffic load levels, realized by different session arrival intensities, are evaluated, and user bitrates are logged. The user bitrate is measured as the file size divided by the time between arrival in the system and successful reception; queuing delays are hence included. The baseline system configurations are assumed (2x2 DL and 1x2 UL). Results in the form of 5th, 50th, and 95th percentile user bitrates as a function of served traffic per sector are presented in Fig. 5. It is seen that very high user bitrates are achieved. At low load (1-2 Mbps/sector), the cell-edge bitrate exceeds 20 Mbps in downlink, and is almost 10 Mbps in uplink. Average values are about a factor of two higher, and the 95th percentile values are not far from the theoretical peak data rates (72 Mbps in DL and 26 Mbps in UL with the overhead assumptions made). Further, the capacity, here measured as the maximum sector throughput for a certain 5th percentile bitrate (e.g. 1 Mbps), is not very much lower than in the full buffer case.

The small difference indicates a fair system. It may be noted that, in theory, with a resource-fair scheduler, the full buffer capacity is determined by the arithmetic mean of the user bitrates R, here denoted mean(R). The equal buffer capacity, on the other hand, may be estimated by the inverse of the average normalized delay (D = 1/R), and is hence given by 1/mean(1/R), the harmonic mean, which by the AM-HM inequality is less than or equal to mean(R); a small sketch follows below.
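A minimal sketch of the two capacity estimates, using illustrative per-user bitrates rather than simulation results:

```python
def arithmetic_mean(rates):
    """Full-buffer capacity estimate with a resource-fair scheduler."""
    return sum(rates) / len(rates)

def harmonic_mean(rates):
    """Equal-buffer (file transfer) capacity estimate, 1/mean(1/R)."""
    return len(rates) / sum(1.0 / r for r in rates)

rates_mbps = [2.0, 10.0, 30.0, 60.0]   # illustrative per-user bitrates
print(arithmetic_mean(rates_mbps))      # 25.5 -> full-buffer estimate
print(harmonic_mean(rates_mbps))        # ~6.2 -> equal-buffer estimate
```

The harmonic mean is dragged down by the slowest users, which is exactly the equal-buffer effect of low-rate users dominating the link usage described above.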

Fig. 5. Downlink and uplink bitrate percentiles (5th, 50th, and 95th percentile of user bitrate, Mbps) versus served traffic (Mbps/cell) for file transfer traffic.

The above file transfer results, especially the bitrates achieved at low to moderate load, where file transfers are completed in tens of milliseconds, depend on the initial selection of modulation, coding, and MIMO scheme. These settings have not been optimized. Further, the control channel overhead assumptions are pessimistic for scenarios with low to moderate load, as typically quite few users are scheduled in the same frame.

VI. SUMMARY

The results presented indicate that, in normalized metrics, LTE, with its more sophisticated radio interface, outperforms the more basic Mobile WiMAX Wave 2 in both downlink and uplink, and for both FDD and TDD operation. In the downlink, LTE is about 60% better in spectrum efficiency and average user throughput, and 100% better in cell-edge user throughput. In the uplink, LTE is 100% better in both average and cell-edge performance. Utilizing the full LTE potential (4x4 MIMO, 20 MHz carriers, FDD), the differences are even greater. The large gains for LTE cannot be attributed to a single feature, but are rather the effect of a number of distinctive characteristics, each contributing to the overall gain. With non-full-buffer traffic, very high user bitrates are achieved for LTE. In low to moderate load scenarios, bitrates of tens of Mbps are achievable at the cell edge, and theoretical peak data rates are approached closer to the base station. In general, absolute performance values depend largely on the scenario, models, and assumptions used, here aligned with the NGMN recommendations. Although these are relevant and well designed, different results are achieved in other scenarios and under different assumptions. For example, refined base station antenna models, based on realistic antenna patterns and modeling vertical antenna diagrams, typically yield improved absolute spectrum efficiency and user throughput values [4].

REFERENCES
[1] E. Dahlman et al., 3G Evolution: HSPA and LTE for Mobile Broadband, Academic Press, Oxford, UK, 2007.
[2] WiMAX Forum, Mobile System Profile, Release 1.0 Approved Specification, Revision 1.4.0.
[3] NGMN, "NGMN Radio Access Performance Evaluation Methodology," Version 1.2, June 2007, www.ngmn.org.
[4] F. Gunnarsson et al., "Downtilted Base Station Antennas - A Simulation Model Proposal and Impact on HSPA and LTE Performance," in IEEE VTC 2008 Fall.


