QoS functions provided by any network, whether wired or wireless, are all based on
standards (IETF RFC, IEEE 802, 3GPP TS, etc.). They may work differently, using different
standards, depending on whether the network is wired (Ethernet/IP/MPLS) or wireless
(LTE/WiBro/Wi-Fi). But basically, what QoS is about is that traffic quality is guaranteed
(i) if you pay more, or (ii) for high-priority traffic (e.g. voice or video traffic, which is by
nature more sensitive to delay than Internet traffic).
Practically, (i) does not sound very likely because no network operator offers a service plan
that guarantees a certain level of QoS to those who pay more. But (ii) sounds like a more
practical and sensible reason for most network operators to have QoS functions. In a wired
network, the most common use of QoS would be for VoIP or IPTV services. I've been
using KT IPTV. KT provides a higher QoS level for its IPTV (Live & VoD) traffic than for its
Internet traffic (with differentiated treatment, e.g. 802.1p at L2, DSCP for IP, and the EXP field
of the MPLS header for MPLS), guaranteeing the quality of the IPTV traffic even when
Internet traffic is very heavy. So, I can watch PSY dancing without any service interruption,
which makes me a very satisfied subscriber of KT.
Now, we will look into QoS in LTE, a wireless network. We will go over only the basic
features of LTE QoS this time, and will revisit it for a more in-depth description in later
posts.
As you may recall, when a UE attaches to an LTE network, an EPS bearer connecting the
UE to a P-GW (UE - eNB - S-GW - P-GW) is created as a combination of one logical
channel and two GTP tunnels. Each UE can have more than one EPS bearer depending on
the services in use (e.g. three if using Internet, IPTV and VoIP; the number of bearers is
determined by the policy of the network operator). There are two types of EPS
bearers, default and dedicated, depending on when they are created.
The P-GW applies QoS at the SDF level and maps each SDF to an EPS bearer; from the
P-GW to the UE, QoS is then applied at the EPS bearer level, where the individual SDFs
are no longer visible.
We will now look a little further into the LTE QoS we discussed last time, and learn what
the QoS parameters are for.
There are two types of EPS bearers: default and dedicated. In the LTE network, the EPS
bearer QoS is controlled using the following LTE QoS parameters:
For an EPS bearer, having a GBR resource type means the bandwidth of the bearer is
guaranteed. Obviously, a GBR type EPS bearer has a "guaranteed bit rate" associated (GBR
will be further explained below) as one of its QoS parameters. Only a dedicated EPS bearer
can be a GBR type bearer and no default EPS bearer can be GBR type. The QCI of a GBR
type EPS bearer can range from 1 to 4.
For an EPS bearer, having a non-GBR resource type means that the bearer is a best effort
type bearer and its bandwidth is not guaranteed. A default EPS bearer is always a Non-GBR
bearer, whereas a dedicated EPS bearer can be either GBR or non-GBR. The QCI of a non-
GBR type EPS bearer can range from 5 to 9.
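The QCI-to-resource-type rules above can be condensed into a small sketch (the function names are my own; only the QCI ranges and the default/dedicated rules come from the text):

```python
def resource_type(qci):
    """Map a standardized QCI (1..9) to its resource type."""
    if 1 <= qci <= 4:
        return "GBR"        # guaranteed-bit-rate bearers
    if 5 <= qci <= 9:
        return "Non-GBR"    # best-effort bearers
    raise ValueError("standardized QCI values range from 1 to 9")

def allowed_bearer_types(qci):
    """A default bearer is always Non-GBR; a dedicated bearer can be either."""
    if resource_type(qci) == "GBR":
        return ["dedicated"]
    return ["default", "dedicated"]
```

So a bearer with QCI 1 (voice) must be a dedicated GBR bearer, while QCI 9 (Internet) can live on the default bearer.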
QCI = 1 : Resource Type = GBR, Priority = 2, Packet Delay Budget = 100 ms, Packet Error
Loss Rate = 10^-2, Example Service = Voice
QCI = 9 : Resource Type = Non-GBR, Priority = 9, Packet Delay Budget = 300 ms, Packet
Error Loss Rate = 10^-6, Example Service = Internet
QoS to be guaranteed for an EPS bearer or SDF varies depending on the QCI values
specified.
QCI, though a single integer, represents node-specific parameters that give the details of
how an LTE node handles packet forwarding (e.g. scheduling weights, admission thresholds,
queue thresholds, link layer protocol configuration, etc). Network operators have their LTE
nodes pre-configured to handle packet forwarding according to the QCI value.
By pre-defining the performance characteristics of each QCI value and having them
standardized, network operators can ensure that the same minimum level of QoS required
by the LTE standards is provided to different services/applications in an LTE network
consisting of various nodes from multiple vendors.
QCI values seem to be mostly used by eNBs in controlling the priority of packets delivered
over radio links. That's because, practically, it is not easy for an S-GW or P-GW on a wired
link to process packets and also forward them based on the QCI characteristics at the same
time. (As you may know, a Cisco or Juniper router would not care about delay or error loss
rate when it processes QoS for packets. It would merely decide which packet to send first
through scheduling (WFQ, DWRR, SPQ, etc.) based on the priority of the packets
(802.1p/DSCP/MPLS EXP).)
When a new EPS bearer is needed in an LTE network with insufficient resources, an LTE
entity (e.g. P-GW, S-GW or eNB) decides, based on ARP (an integer ranging from 1 to 15,
with 1 being the highest level of priority), whether to:
remove the existing EPS bearer and create a new one (e.g. removing an EPS bearer with
low priority ARP to create one with high priority ARP); or
refuse to create a new one.
So, the ARP is considered only when deciding whether to create a new EPS bearer or not.
Once a new bearer is created and packets are delivered through it, the ARP does not affect
the priority of the delivered packet, and thus the network node/entity forwards the packets
regardless of their ARP values.
One of the most representative examples of using the ARP is an emergency VoIP call: an
existing EPS bearer can be removed if a new one is required for an emergency 119 (911
in the US, 112 in the EU, etc.) VoIP call.
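The ARP admission decision described above can be sketched as follows. This is an illustrative toy, not an actual eNB/P-GW algorithm: it ignores the pre-emption capability/vulnerability flags that the full ARP parameter also carries, and models capacity as a simple bearer count.

```python
def admit_new_bearer(new_arp, active_arps, capacity):
    """Decide what to do when a new EPS bearer is requested.

    new_arp/active_arps are ARP priority levels: integers 1..15,
    where 1 is the highest priority. Returns a tuple
    (decision, preempted_arp_or_None).
    """
    if len(active_arps) < capacity:
        return ("admit", None)            # enough resources: just create it
    lowest = max(active_arps)             # largest number = lowest priority
    if new_arp < lowest:                  # the newcomer outranks someone
        active_arps.remove(lowest)        # tear down the low-priority bearer
        return ("admit", lowest)
    return ("reject", None)               # nothing can be displaced
```

An emergency-call bearer (ARP 1) would therefore displace an existing low-priority bearer when the cell is full.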
GBR (UL/DL)
This parameter is used for a GBR type bearer and indicates the bandwidth (bit rate) to be
guaranteed by the LTE network. It does not apply to a non-GBR bearer, which has no
guaranteed bandwidth. (UL is for uplink traffic and DL is for downlink traffic.)
MBR (UL/DL)
MBR is used for a GBR type bearer, and indicates the maximum bit rate allowed in the LTE
network. Any packets arriving on the bearer in excess of the specified MBR will be
discarded.
APN-AMBR (UL/DL)
After reading the foregoing paragraph, you may wonder why a non-GBR type bearer has no
"bandwidth limit". In the case of non-GBR bearers, it is the total bandwidth of all the
non-GBR EPS bearers in a PDN that is limited, not the individual bandwidth of each bearer.
This restriction is controlled by APN-AMBR (UL/DL). As seen in the figure above, there
are two non-GBR EPS bearers, and their maximum aggregate bandwidth is specified by the
APN-AMBR (UL/DL). This parameter is applied at the UE (for UL traffic only) and at the
P-GW (for both DL and UL traffic).
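To see what "limiting the aggregate, not each bearer" means, here is a toy calculation (my own illustration; real enforcement at the UE/P-GW is done by policing or shaping actual packet flows, not by pre-computing rates like this):

```python
def cap_to_apn_ambr(bearer_rates_mbps, apn_ambr_mbps):
    """Scale the non-GBR bearers' rates so their sum stays within APN-AMBR.

    Each individual bearer is unlimited; only the aggregate is capped.
    """
    total = sum(bearer_rates_mbps)
    if total <= apn_ambr_mbps:
        return list(bearer_rates_mbps)        # aggregate cap not reached
    scale = apn_ambr_mbps / total             # proportional share of the cap
    return [rate * scale for rate in bearer_rates_mbps]
```

Two non-GBR bearers pushing 30 and 10 Mbps against a 20 Mbps APN-AMBR would end up with 15 and 5 Mbps respectively.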
UE-AMBR (UL/DL)
In the figure above, APN-AMBR and UE-AMBR look the same. But, please take a look at the
one below.
A UE can be connected to more than one PDN (e.g. PDN 1 for Internet, PDN 2 for VoIP using
IMS, etc.), and it has one IP address for each of its PDN connections. Here, UE-AMBR
(UL/DL) indicates the maximum bandwidth allowed for all the non-GBR EPS bearers
associated with the UE, no matter how many PDN connections the UE has. Since different
PDNs are connected through different P-GWs, this parameter is applied by the eNB only.
Reference Signal
Downlink
Most of the channels (e.g. PDSCH, PDCCH, PBCH, etc.) carry specific information (a
sequence of bits) and have some higher-layer channel connected to them, but the
Reference Signal is a special signal that exists only at the PHY layer. It does not deliver
any specific information. The purpose of the Reference Signal is to provide the reference
point for the downlink power.
When a UE tries to figure out the DL power (i.e. the power of the signal from an eNode B),
it measures the power of this reference signal and takes it as the downlink cell power.
Another important role of the reference signal is to help the receiver demodulate the
received signal. Since the reference signal is made up of data known to both the transmitter
and the receiver, the receiver can figure out how the communication channel distorts the
data by comparing the decoded received reference signal with the predefined reference
signal, and use the result of this comparison to equalize (post-process) the received user
data. The process by which the receiver performs this comparison and figures out the
characteristics of the communication channel is called 'Channel Estimation', which is one of
the most critical parts of many high-end wireless communication systems like LTE. (If you
are really interested in the detailed procedure, I would strongly suggest that you study the
basic concept of channel estimation.)
As LTE evolves into higher releases, we get more and more reference signals, each mapped
to a specific antenna port. And we get more and more confused as a result :)
To implement this signal, you need to go through two steps - signal generation and
resource allocation. The details of signal generation and resource allocation vary with the
type of reference signal. On this page, I will focus mostly on the Cell Specific Reference
Signal to give you the general idea.
Signal generation is done by the following procedure. You will notice that the Cell ID is a key
parameter for the sequence, so you can guess that the sequence will be unique for each Cell
ID.
Another thing you will notice here is that the downlink reference signal is a kind of Gold
sequence, whereas most of the UL reference signals and the DL synchronization signals are
based on the Zadoff-Chu sequence. (c(n) is a Gold sequence (pseudo-random sequence),
and you can see that the reference signal is generated by combining two m-sequences.)
The following equation is based on 36.211 6.10.1.1.
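For reference, the length-31 Gold sequence generator c(n) that 6.10.1.1 builds on (defined in 36.211 Section 7.2) can be sketched in a few lines. Only the recurrences and the Nc = 1600 offset come from the spec; deriving c_init from cell ID, slot and symbol number is omitted here.

```python
NC = 1600  # fixed offset from 36.211 Section 7.2

def gold_sequence(c_init, length):
    """Generate c(0)..c(length-1) of the 36.211 pseudo-random sequence."""
    x1 = [1] + [0] * 30                           # x1 has a fixed initial state
    x2 = [(c_init >> i) & 1 for i in range(31)]   # x2 is seeded with c_init
    for n in range(NC + length - 31):             # advance both m-sequences
        x1.append((x1[n + 3] + x1[n]) % 2)
        x2.append((x2[n + 3] + x2[n + 2] + x2[n + 1] + x2[n]) % 2)
    return [(x1[n + NC] + x2[n + NC]) % 2 for n in range(length)]
```

For the CRS, c_init is a function of the cell ID (among other things), which is why each cell ends up with its own unique sequence.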
Once you have generated the sequence, the next step is to allocate each data point of the
sequence to specific resource elements. That is done by the following process. The
resulting locations are as shown in the Reference Signal section of the Downlink Frame
Structure page. The following equation is based on 36.211 6.10.1.2.
< LTE Downlink Reference Signal - SISO - Location based on Physical Cell ID >
If you examine the location of the reference signals (black cells) shown above, you may ask:
"According to the RE mapping formula, the reference signal shift is determined by 'CellID
mod 6', which implies that the reference signal location should repeat at every 6 Cell ID
interval; but according to the grid shown above, it seems the location repeats at every 3
Cell ID interval, not 6. Because of this, a UE may experience some degree of RS interference
at every 3 Cell ID interval even though the RS location is shifted by 'CellID mod 6'." (You
may see this kind of interference in Intra Frequency Interference between LTE and LTE with
Varying Physical Cell ID (PCI).)
Actually, this is a kind of illusion. In reality the location repeats at every 6 Cell IDs, but the
reference signal repeats at every 3 REs (resource elements) in the frequency domain
(vertical direction), and all the reference signals are marked in the same color (black).
That's why it looks as if it repeats at every 3 Cell IDs. To remove this confusion, I marked
the RS on a single symbol in a different color for each of the 6 REs in the vertical direction
(I picked only one symbol; it was too much work to mark different colors manually in
Windows Paint :). Now you can (hopefully) see the shift with a period of 6.
< LTE Downlink Reference Signal - SISO - Location based on Physical Cell ID >
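The shift behaviour just described can be checked numerically with a small sketch of the 6.10.1.2 frequency mapping (my own simplification for one OFDM symbol and one antenna port; v is the fixed per-port/per-symbol offset, 0 or 3):

```python
def crs_subcarriers(cell_id, n_rb=6, v=0):
    """Subcarrier indices carrying CRS in one symbol: k = 6m + (v + v_shift) mod 6."""
    v_shift = cell_id % 6                 # the cell-specific part of the shift
    return [6 * m + (v + v_shift) % 6 for m in range(2 * n_rb)]

# The RE pattern repeats over cell IDs with period 6, not 3:
assert crs_subcarriers(0) == crs_subcarriers(6)
assert crs_subcarriers(0) != crs_subcarriers(3)
```

Within one symbol the CRS REs are always 6 subcarriers apart; it is only the cell-specific shift of the whole comb that has period 6.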
Reference Signal (Antenna Port Number) vs Transmission Mode
Reference Signals are used for various purposes, and the type of reference signal being
used varies depending on the transmission mode. Some of the possible combinations of
reference signal and transmission mode are as follows.
Reference Signal (Antenna Port)
TM | No of CW | No of Layers | No of Tx | No of Rx | DCI Format | Control CH | PDSCH | CSI Meas | UE Specific
TM1 | 1 | 1 | 1 | 1 | 1, 1A | p0 | p0 | p0 | N/A
TM2 | 1 | 2 | 2 | 2 | 1, 1A | p0,p1 | p0,p1 | p0,p1 | N/A
TM2 | 1 | 2 | 4 | 2 | 1, 1A | p0,p1,p2,p3 | p0,p1,p2,p3 | p0,p1,p2,p3 | N/A
TM7 | 1 | 1 | 1 | 1 | 1A | p0 | p0 | p0 | N/A
TM7 | 1 | 1 | 1 | 1 | 1 | p0 | p5 | ? | p5
TM7 | 1 | 1 | 2 | 2 | 1 | p0,p1 | p5 | ? | p5
TM7 | 1 | 1 | 4 | 2 | 1 | p0,p1,p3,p4 | p5 | ? | p5
TM8 | 1 | 2 | 2 | 2 | 2B | p0 | p7 or p8 | ? | p7 or p8
TM8 | 1 | 2 | 2 | 2 | 1A | p0,p1 | p7 or p8 | ? | p7 or p8
TM8 | 1 | 2 | 4 | 2 | 1A | p0,p1,p3,p4 | p7 or p8 | ? | p7 or p8
Note : ',' indicates "AND". (E.g. p0, p1 means that p0 AND p1 are used.)
Note : "UE Specific" means "UE Specific Reference Signal (UE Specific Antenna Ports)",
also called "DMRS (Demodulation Reference Signal)".
Note : "No of Tx" means the number of Tx antennas on the eNodeB and "No of Rx" means
the number of Rx antennas on the UE.
Note : TM9 can have many more combinations, but I listed only the combinations I have
seen so far (Jun 2014).
The highest-level view from 36.211 for FDD LTE is as follows. It only shows the structure of
one frame in the time domain. It does not show any structure in the frequency domain.
Some high-level facts you can get from this figure are:
i) The time duration of one frame (one radio frame, one system frame) is 10 ms. This means
that we have 100 radio frames per second.
ii) The number of samples in one frame (10 ms) is 307200 (307.2 K) samples. This means
that the number of samples per second is 307200 x 100 = 30.72 M samples.
iii) The number of subframes in one frame is 10.
iv) The number of slots in one subframe is 2. This means that we have 20 slots within one
frame.
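These numbers are easy to sanity-check (pure arithmetic on the figures listed above, at the 30.72 MHz sampling rate):

```python
# Time-domain bookkeeping for one FDD LTE radio frame at 30.72 MHz sampling.
FRAME_MS = 10
SAMPLES_PER_FRAME = 307_200
SUBFRAMES_PER_FRAME = 10
SLOTS_PER_SUBFRAME = 2

frames_per_second = 1000 // FRAME_MS                        # 100 frames/s
samples_per_second = SAMPLES_PER_FRAME * frames_per_second  # samples/s
slots_per_frame = SUBFRAMES_PER_FRAME * SLOTS_PER_SUBFRAME

assert frames_per_second == 100
assert samples_per_second == 30_720_000    # 30.72 Msamples/s
assert slots_per_frame == 20
```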
So is one slot the smallest structure in the time domain? No; if you magnify this frame
structure one step further, you get the following figure.
Now you see that one slot is made up of 7 small blocks called 'symbols'. (One symbol is a
certain time span of signal that carries one point in the I/Q constellation.)
And you see even smaller structures within a symbol. At the beginning of the symbol you
see a very small span called the 'Cyclic Prefix', and the remaining part is the real symbol
data.
There are two different types of Cyclic Prefix. One is the normal Cyclic Prefix and the other
is the 'Extended Cyclic Prefix', which is longer than the normal Cyclic Prefix. (Since the
length of one slot is fixed and cannot be changed, if we use the Extended Cyclic Prefix, the
number of symbols that can be accommodated within a slot has to decrease. So we can
have only 6 symbols per slot if we use the Extended Cyclic Prefix.)
If you magnify a subframe to show the exact timing and samples, it can be illustrated as
below. The lengths shown in this illustration do not vary with the sampling rate, but the
number of samples in each symbol and CP does vary with the sampling rate. The numbers
of samples shown in this illustration are based on a 30.72 MHz sampling rate.
The following shows the overall subframe structure from "LTE Resource Grid". (I realized
that this site is not available any more. Fortunately, another expert recently put great effort
into creating another resource grid application and allowed me to share it with everybody.
Here goes Sandesh Dhagle's Resource Grid.)
Now let's magnify the structure even further, but this time expand in frequency domain, not
in time domain. You will get the following full detail diagram.
< FDD LTE Frame Structure with Focus On Physical Channels >
The first thing you have to be very familiar with as an engineer working on LTE is the
channel map shown above.
We can represent an LTE signal in a two-dimensional map as shown above. The horizontal
axis is the time domain and the vertical axis is the frequency domain. The minimum unit on
the vertical axis is a subcarrier and the minimum unit on the horizontal axis is a symbol. For
both the time domain and the frequency domain, there are multiple hierarchies of units,
meaning that a combination of multiple smaller units becomes a larger unit.
Q> What is the spacing between a subcarrier and the next subcarrier? A> 15 kHz
Q> What is the number of channels (subcarriers) for a 20 MHz LTE band? A> 1200
subcarriers.
Q> What is the number of channels (subcarriers) for a 10 MHz LTE band? A> 600
subcarriers.
Q> What is the number of channels (subcarriers) for a 5 MHz LTE band? A> 300 subcarriers.
Got any feeling for subcarriers and their relation to the system bandwidth?
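The relation hinted at above is linear, and it also shows why the subcarriers never fill the nominal channel bandwidth (the only inputs below are the numbers quoted in the Q&A; the rest is arithmetic):

```python
SUBCARRIER_SPACING_KHZ = 15
SUBCARRIERS = {5: 300, 10: 600, 20: 1200}   # MHz band -> subcarrier count

def occupied_khz(bw_mhz):
    """Spectrum actually occupied by the subcarriers (guard bands excluded)."""
    return SUBCARRIERS[bw_mhz] * SUBCARRIER_SPACING_KHZ

# A 20 MHz LTE carrier occupies 1200 x 15 kHz = 18 MHz; the remaining
# 2 MHz is left as guard band at the channel edges.
assert occupied_khz(20) == 18_000
```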
Now let's look at the basic units of the horizontal axis, which is the time domain. The
minimum unit of the time domain is a symbol, which amounts to 66.7 us. Regardless of
bandwidth, the symbol length does not change. In the time domain, we have a couple of
other structures as well. The largest unit in the time domain is a frame, which is 10 ms in
length. Each frame consists of 10 subframes, each of which is 1 ms in length. Each
subframe consists of 2 slots, each of which is 0.5 ms in length. Each slot consists of 7
symbols, each of which is 66.7 us.
With this in mind, let's think about the scale in reverse direction.
Now let's look at the units which are made up of both the time domain (horizontal axis) and
the frequency domain (vertical axis). Let's call this type of unit a two-dimensional unit.
The minimum two-dimensional unit is the resource element, which is made up of one
symbol in the time domain and one subcarrier in the frequency domain. Another
two-dimensional unit is the resource block (RB), which is made up of one slot in the time
domain and 12 subcarriers in the frequency domain. The Resource Block (RB) is the most
important unit in LTE, both on the protocol side and on the RF measurement side.
Now here go the questions. It's time to combine all the units we covered. The following
questions are very important for reading any of the LTE specifications.
Q> How many resource blocks are in a 20 MHz band? A> 100 resource blocks.
Q> How many resource blocks are in a 10 MHz band? A> 50 resource blocks.
Q> How many resource blocks are in a 5 MHz band? A> 25 resource blocks.
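The RB counts follow directly from the subcarrier counts in the earlier Q&A, since one RB is 12 subcarriers wide:

```python
SUBCARRIERS = {5: 300, 10: 600, 20: 1200}   # MHz band -> subcarrier count

def resource_blocks(bw_mhz):
    """RB count = subcarrier count / 12 (one RB spans 12 subcarriers)."""
    return SUBCARRIERS[bw_mhz] // 12

assert [resource_blocks(b) for b in (5, 10, 20)] == [25, 50, 100]
```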
I have seen this type of mapping so many times from so many different sources, but do I
really understand all the details of the map? No, not yet. It will take several years to
understand every aspect of the map.
Probably what I would do as a first step is to describe each part of the map in verbal form.
The following are examples for various TDD UL/DL configurations. I got all of the following
examples using Sandesh Dhagle's Resource Grid.
The following are examples showing the radio resource grid with different Special
Subframe Configurations. In these examples, just pay attention to how the symbol structure
in subframe 0 and subframe 6 varies.
In 3GPP Rel 13, a new frame structure (Frame Structure Type 3) was introduced; the major
application of this type is LAA. I don't find much detail on this as of 36.211 V13.1.0. The
following description is all for now. I think the green part is the same as the existing frame
types (Type 1 / Type 2) and the blue part is unique to Type 3.
Frame structure type 3 is applicable to LAA secondary cell operation with normal cyclic
prefix only. Each radio frame is Tf = 307200·Ts = 10 ms long and consists of 20 slots of
length Tslot = 15360·Ts = 0.5 ms, numbered from 0 to 19. A subframe is defined as two
consecutive slots where subframe i consists of slots 2i and 2i+1.
The 10 subframes within a radio frame are available for downlink transmissions. Downlink
transmissions occupy one or more consecutive subframes, starting anywhere within a
subframe and ending with the last subframe either fully occupied or following one of the
DwPTS durations in Table 4.2-1.
Another difference in Rel 13 is in the following table. As you see here, in Rel 13 the length
of UpPTS became parameterized (it became a variable). If you set X = 0, the table is
identical to the Frame Type 2 case. I think X would take different values in Frame Type 3,
but I haven't found any details about the value range of this parameter X yet (I will update
as I find more information).
< 36.211 Table 4.2-1: Configuration of special subframe (lengths of DwPTS/GP/UpPTS) >
Now I will talk about the details of the various types of physical channels that are embedded
into the frame structure shown above. The description on this page is just an overview of
each physical channel. It is too much to put all the details of every physical channel on a
single page. I recommend that you use this as a summary (cheat sheet) for each channel
and refer to the other pages linked under each description if you want further details.
This is one of the most confusing areas of the map because multiple channels are located in
this area. On the first symbol is the PCFICH, but the PCFICH takes only part of the resource
blocks on the first symbol, not all of them. The PHICH is carried in this area as well. And the
remaining space not occupied by the PCFICH and PHICH is allocated to the PDCCH.
PHICH
Carries H-ARQ feedback for the received PUSCH.
After the UE transmits data in the UL, it waits for the PHICH carrying the ACK.
It is like the E-HICH in HSPA.
Sometimes several PHICHs constitute a PHICH group using the same resource
elements.
Refer to Physical Layer : PHICH and Matlab Toolbox : PHICH for the details
PRACH
It carries the random access preamble.
It occupies 72 subcarriers (6 RBs) of bandwidth in the frequency domain.
Within this channel is the Random Access Preamble, which is generated with a
Zadoff-Chu sequence.
Refer to RACH page and Matlab Toolbox : PRACH page for the details.
The SSS is a specific physical layer signal that is used for radio frame synchronization. It
has the characteristics listed below.
Mapped to 72 active subcarriers (6 resource blocks), centered around the DC
subcarrier in slot 0 (subframe 0) and slot 10 (subframe 5) in FDD.
The SSS sequence in subframe 0 and the one in subframe 5 are different from
each other.
Made up of a length-62 scrambling sequence (based on m-sequences).
The values in odd-indexed and even-indexed resource elements are generated by
different equations.
Used for downlink frame synchronization.
One of the critical factors determining the Physical Cell ID.
Refer to Physical Layer : SSS and Matlab Toolbox : SSS for the details
This may not be a big issue in most cases, since it will already be working fine on most of
the devices you are given for test; otherwise they would not have been given to you for
test. However, if you are a developer working at an early stage of LTE chipset development
(especially in the baseband area), this will be one of the first signals you have to implement.
As described above, most of the channels (e.g. PDSCH, PDCCH, PBCH, etc.) carry specific
information (a sequence of bits) and have a higher-layer channel connected to them, but
the Reference Signal is a special signal that exists only at the PHY layer. It does not deliver
any specific information; its purpose is to provide the reference point for the downlink
power. When a UE tries to figure out the DL power (i.e. the power of the signal from an
eNode B), it measures the power of this reference signal and takes it as the downlink cell
power.
These reference signals are carried by multiple specific resource elements in each slot, and
the locations of those resource elements are determined by the antenna configuration.
In the figures below, red/blue/green/yellow cells are the parts where the reference signal is
carried, and the resource elements marked in gray are the ones reserved for the reference
signal but not carrying the reference signal for that specific antenna. (The following
illustration is based on 36.211 Figure 6.10.1.2-1: Mapping of downlink reference signals
(normal cyclic prefix).)
< LTE Cell Specific Reference Signal (CRS) >
There are two different types of reference signal: the Cell Specific Reference Signal and the
UE Specific Reference Signal.
Cell Specific Reference Signal : This reference signal is transmitted in every
subframe and it spans the whole operating bandwidth. It is transmitted on antenna
ports 0, 1, 2, 3.
UE Specific Reference Signal : This reference signal is transmitted only within the
resource blocks allocated to a specific UE and is transmitted on antenna
port 5.
Are the resource elements for the cell-specific reference signal fixed?
One of the values determining this sequence is the Physical Cell ID, meaning that the
physical cell ID influences the value of the reference signal as well.
At the initial phase of LTE deployment, you would not have doubted that every LTE subframe
carries the CRS (Cell Specific Reference Signal) in it. But as we see more diverse types of
subframe structure (FDD - frame structure type 1, TDD - frame structure type 2, LAA -
frame structure type 3), I see my confidence getting weaker. The question is: 'Does every
frame type (type 1, 2, 3) transmit the CRS?'
The short answer is 'YES'. The more detailed answer is as follows, as stated in 36.211 -
6.10.1 Cell-specific Reference Signal (CRS).
CRS is transmitted in all downlink subframes for frame structure type 1,
CRS is transmitted in all downlink subframes and DwPTS for frame structure type 2,
CRS is transmitted in non-empty subframes for frame structure type 3
Following is based on 36.211 Figure 6.10.5.2-1: Mapping of CSI reference signals (CSI
configuration 0, normal cyclic prefix)
The following is a snapshot showing all the channels described above. Of course this is not
meant to give you detailed information; it is to give you an overall picture of a whole frame.
Would you be able to identify the locations of each channel described above? Just try it; it
will be good practice.
Each component in this grid has its own role and is used in various different contexts. If you
are interested in how each of these channels is used in the real communication process,
refer to the following sections in the Quick Reference page.
Cell ID Detection and System Information Detection
Uplink Data Transmission Scheduling - Persistent Scheduling
Uplink Data Transmission Scheduling - Non Persistent Scheduling
Downlink Data transmission Process
Channel Coding Processing for DL SCH/PCH/MCH
Physical Channel Processing
The following diagram shows the overall sequence of uplink/downlink data transmission.
You should be able to associate the data transmission sequence diagram with the specific
location of each channel in the DL/UL frame structure.
Now let's look at another example, which might look more complicated and confusing but
hopefully more interesting :). This shows an example of what happens during the initial
process (RACH process) after you turn on your mobile phone.
Again, the log and background RB map are from the Amarisoft LTE Network simulator. All
the labels were added manually. (If you roll the mouse pointer over each channel it shows
some detailed information, but it does not show the exact contents. This is understandable,
because a physical channel by itself does not have any detailed knowledge of its contents.)
< LTE RB Map - Example 02 - 1 >
How did I figure out all the details printed on the labels shown above? They came from the
text-based log shown below.
It took me almost an hour to put all the labels shown above in place based on the log
below. However, this can be good practice if you are in the learning phase of the LTE
protocol.. or you HAVE TO go through this tedious process when you are in a
troubleshooting situation.
Downlink Interference
I want to talk a little bit about interference between multiple cells (Inter Cell Interference).
On this page, I will show you some measurement results, but don't pay too much attention
to the exact measurement values. You would get different values in different situations, and
the values shown here would differ from the values measured by a UE in a live network. But
the general trend and characteristics of the interference would apply to most cases. You
should be well aware of these properties, especially if you are working on test cases
related to various mobility issues (Cell Selection, Reselection, Measurement Report,
Handover, Redirection, etc.).
< Inter Frequency Interference between LTE and LTE with Varying Channel Power
>
In this example, I set up two cells in different bands (inter-frequency, inter-band) as
follows. I used BTS1 as the serving cell and BTS2 as a neighbouring interfering cell, and I
used a vector signal analyzer as a kind of DUT and measured the EVM (Error Vector
Magnitude) detected for the serving cell.
BTS 1 = Band 4
BTS 2 = Band 17
Test Variable : Cell power of BTS2 changes
Measurement : BTS1
As you can see in the result shown below, there is almost no difference in the EVM
measured for BTS1 regardless of the power of the interfering cell. (In this example, the
frequencies of the two cells are very far from each other, so interference from the other cell
is negligible; if the frequency of the neighbouring cell were closer, you would see stronger
interference than in this example.)
< Intra Frequency Interference between LTE and LTE with Varying Channel Power
>
In this example, I set up two cells with the same frequency (intra-frequency) as follows. I
used BTS1 as the serving cell and BTS2 as a neighbouring interfering cell, and I used a
vector signal analyzer as a kind of DUT and measured the EVM (Error Vector Magnitude)
detected for the serving cell. I kept the BTS1 power fixed, increased the power of BTS2
step by step, and checked how the measured EVM varies.
BTS 1 = Band 4
BTS 2 = Band 4
Test Variable : Cell power of BTS2 changes
Measurement : BTS1
Just by looking at the constellation, you can intuitively see that the EVM gets larger (the
constellation gets worse) as the interfering cell power gets higher.
The EVM result shown here is a little extreme. According to this result, the DUT (vector
signal analyzer) fails to decode the signal if the cell power difference between the two cells
is less than 20 dB. But in a live network the situation would be much better than this: the
isolation between the two cells would be much better than in this test environment, and a
real UE (mobile phone) can decode signals much worse than this, since its chipset performs
channel estimation and uses various error correction techniques. The point, however, is
that a UE will experience pretty serious interference when it sees multiple cells with the
same frequency around it.
You can see an obvious (outstanding) difference if you compare this result with the
previous case (the inter-frequency case).
< Intra Frequency Interference between LTE and LTE with Varying Physical Cell ID
(PCI) >
In this example, I set up two cells with the same frequency (intra-frequency) as follows. I
used BTS1 as the serving cell and BTS2 as a neighbouring interfering cell, and I used a
vector signal analyzer as a kind of DUT and measured the EVM (Error Vector Magnitude)
detected for the serving cell.
BTS 1 = Band 4
BTS 2 = Band 4
Test Variable : PCI of BTS2 changes
Measurement : BTS1
I kept the PCI of BTS1 fixed and changed the PCI of BTS2 to various values to see how the
measured EVM changes. This is to check how the locations of the reference signals of the
serving cell and the neighbouring cell influence the interference. (If you are not sure how
the PCI is related to the reference signal location, refer to the Downlink Reference Signal
page.)
As shown in the following result, there is almost no interference between the two cells
when the cell power difference between the two cells is very large.
But when the cell power difference between the two cells is relatively small (though the
difference in this example would still be very big in a live network), you can see the DUT
(signal analyzer) fail to measure the signal when the PCIs are configured in such a way that
the reference signal locations of the two cells are the same.