Network planning and testing for LTE-Advanced

15 Nov
Streaming video, gaming, advanced applications, and more are putting demands on today’s wireless networks and increasing the need for capacity and tower density.  In response, carriers are looking at options, such as Wi-Fi underlay and backhaul, to limit the load on networks.
Simultaneously, various carriers are conducting trials and looking to introduce LTE-Advanced to the masses in 2013, which promises a performance leap by bringing more low-powered nodes closer to the user.  However, open questions around standards and how each carrier will handle network handovers complicate the deployment of heterogeneous network components, such as smaller cell sites (e.g., picocells, femtocells, etc.).

Why LTE-Advanced?
LTE-Advanced (LTE-A) is a 3rd Generation Partnership Project (3GPP) specification created in response to International Telecommunication Union (ITU) requirements for International Mobile Telecommunications-Advanced (IMT-Advanced) systems.  These requirements define what a fully compliant 4th-generation mobile communications system needs to satisfy, most importantly:
•    Peak speed requirements of 100 Mb/s for high-mobility communication (such as from trains and cars); and
•    Peak speed requirements of 1 Gb/s for low-mobility communication (such as pedestrians and stationary users).

Even though Mobile WiMAX and LTE don’t meet these objectives, they are considered “4G” technologies since they perform significantly better than initial 3rd-generation systems and are early versions of the fully IMT-Advanced-compliant Mobile WiMAX Release 2 and LTE-Advanced.

For the LTE-Advanced system, 3GPP has further required the following:
•    Higher spectral efficiency (from a maximum of 16 bps/Hz in LTE to 30 bps/Hz in LTE-A);
•    Improved performance at cell edges (e.g. for downlink 2×2 MIMO at least 2.40 bps/Hz/cell).

What is new in LTE-A
To meet these requirements, LTE-A systems have some improvements compared to LTE, namely, carrier aggregation (CA), improved multi-antenna techniques and support for relay nodes (RN).  We will look at each one.

Carrier Aggregation
LTE-A systems need considerably more signal bandwidth to meet the requirement of a significant throughput increase compared to 3G and LTE.  Since LTE-A also needs to maintain backward compatibility with LTE terminals, the LTE-A bandwidth increase is performed by aggregating multiple LTE carriers into a single LTE-A signal.  Each LTE carrier that comprises the LTE-A signal is called a component carrier (CC).  A single LTE-A signal can consist of component carriers with different bandwidths as defined in the LTE specification.  This allows for effective utilization of the available spectrum.

Figure 1. The LTE-A component carriers can be of different bandwidth.  The downlink (DL) can have extra carriers compared to the uplink (UL).

Component carriers can be aggregated contiguously, with spectrum gaps between them (non-contiguously), or even across multiple bands (inter-band).  The current standard limits aggregation to no more than three bands within 0.3 – 6.0 GHz.  In the future, each CC could be located in a separate band.  This again allows for effective utilization of the available spectrum.

Figure 2. The LTE-A Component Carriers can be discontinuous and even in different bands.

LTE terminals utilize only one of these carriers, while LTE-A terminals can utilize up to five CCs.  Per the LTE specification, the maximum bandwidth of a single CC is 20 MHz.  Therefore, the maximum LTE-A bandwidth achievable when aggregating five CCs is 100 MHz.  It is also important to note that the downlink (DL) and uplink (UL) do not need to be symmetrical, and the downlink can have the same number of CCs as, or more CCs than, the uplink transmission direction.  This again allows for effective utilization of the available spectrum, and optimization of the channel based on the throughput required by the user, which is often also asymmetrical.
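The aggregation limits above can be summarized in a short sketch (the helper function and its name are ours, for illustration only):

```python
# Sketch of the LTE-A carrier aggregation limits described above.
# The allowed CC bandwidths are the standard LTE channel bandwidths (MHz).
LTE_CC_BANDWIDTHS_MHZ = {1.4, 3, 5, 10, 15, 20}
MAX_CCS = 5  # an LTE-A terminal can aggregate up to five CCs

def aggregated_bandwidth_mhz(ccs):
    """Return the total LTE-A bandwidth for a list of CC bandwidths (MHz)."""
    if len(ccs) > MAX_CCS:
        raise ValueError("LTE-A allows at most five component carriers")
    for bw in ccs:
        if bw not in LTE_CC_BANDWIDTHS_MHZ:
            raise ValueError(f"{bw} MHz is not a valid LTE channel bandwidth")
    return sum(ccs)

print(aggregated_bandwidth_mhz([20, 20, 20, 20, 20]))  # maximum case: 100 MHz
print(aggregated_bandwidth_mhz([20, 10, 5]))           # mixed bandwidths: 35 MHz
```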

Figure 3. The LTE-A component carriers (transmitted from the same tower) may form independent cells with different footprints.

LTE-A radio resource control (RRC) is handled by one of the CCs.  This CC is called the LTE-A Primary CC (PCC).  The other CCs are called LTE-A Secondary CCs (SCCs).  Some additional RRC messages are introduced in LTE-A to support this division.  Each CC, however, forms an independent cell with potentially different coverage.  This is due to the freedom to adjust the transmit power of each CC, but also because different bands and antennas may be used for different CCs.  The positive aspect of this is that a heterogeneous network can be formed this way (as we will see later), but the downside is that the LTE-A terminal might not always be able to aggregate all the CCs.

Improved multi-antenna techniques
To meet the requirement of increased spectral efficiency (throughput per bandwidth), LTE-A had to build on LTE multi-antenna techniques.  In high signal-to-noise environments, LTE uses a spatial multiplexing technique called multiple input multiple output (MIMO).  MIMO allows higher-throughput communication by using two or more transmit (Tx) streams received (Rx) by two or more antennas at the same time while occupying the same bandwidth.  Each transmit antenna uses a different reference signal, which allows the receiver to separate the signals.  If the antennas are appropriately spaced on the tower and on the terminal, the propagation paths between the transmitting and receiving antennas can be sufficiently different spatially to provide higher throughput with the same time/frequency resources.

LTE-A increases the maximum number of DL antennas from the four present in LTE to eight, and the maximum number of UL antennas from the two present in LTE to four.  This results in almost double the spectral efficiency in high signal-to-noise environments.
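As a rough illustration of this scaling, an idealized Shannon-capacity model (an assumption for illustration; real MIMO gains are lower due to channel correlation and overhead) shows spectral efficiency growing linearly with the number of spatial streams:

```python
import math

def mimo_spectral_efficiency(n_streams, snr_db):
    """Idealized spectral efficiency (bps/Hz): each spatial stream is
    modeled as an independent Shannon channel at the given per-stream SNR."""
    snr = 10 ** (snr_db / 10)
    return n_streams * math.log2(1 + snr)

# In a high-SNR environment, doubling the stream count (4 -> 8 DL streams)
# roughly doubles the achievable spectral efficiency:
print(mimo_spectral_efficiency(4, 20))  # ~26.6 bps/Hz
print(mimo_spectral_efficiency(8, 20))  # ~53.3 bps/Hz
```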

Support for Relay Nodes
One of the hardest things to achieve in a cellular network is good performance at the edge of a cell, just before a new cell begins.  LTE-A addresses this by using a mix of large and small cell sizes.  This mixed layout is called a heterogeneous network (HetNet).  We mentioned above how the carrier aggregation feature allows for adjustments to the footprint of every component carrier in the base station (enhanced Node B or eNB).  An additional LTE-A feature that helps achieve improved cell edge performance is the relay node (RN).  These are small, low-power base stations that don’t require dedicated backhaul.

RNs are typically deployed at the edge of the cell to increase capacity and throughput there.  Instead of having dedicated backhaul (e.g. fiber, cable, radio link), RNs use a slightly different part of the LTE-A air interface (defined as Un) than that used by normal terminals (known as Uu).  This allows the RN to communicate with the donor/anchor base station (Donor eNB) for backhaul purposes and provide a local terminal its expected Uu link.  RNs can use the same frequency or a different frequency for communication with the donor base station and the terminals.  In the case of an RN operating on the same frequency as its donor base station, steps need to be taken (explained later) to prevent the RN from interfering with itself.  For example, it could be a significant problem if the RN were to transmit to the terminals at the same time it’s trying to receive from the donor base station on the same frequency.

Figure 4. The RN uses the Un interface to communicate with the donor base station and the Uu interface to communicate with the terminals.

From the terminal perspective, RNs are fully fledged base stations.  They transmit their own cell identification information (known as a Cell Id) and handle all aspects of the air interface (sync and reference signal, scheduling, control channels, etc.) up to mobility management (handovers).  Donor base stations hide (i.e., abstract) RNs from the rest of the network.

LTE-A also employs advanced interference mitigation techniques between the elements of this heterogeneous network.  These techniques include intelligent node association and the adaptive time/frequency resource allocation (e.g., enhanced time-domain adaptive resource partitioning).  This provides dynamic network load balancing at different locations and times of day.

LTE-A network deployment considerations
Here we review some items to consider when deploying LTE-A networks.

Carrier Placement
LTE places carriers on a 100 kHz raster (the step between possible signal center frequencies).  Per the standard, the LTE OFDM subcarriers are spaced at 15 kHz.  So, in order to maintain subcarrier orthogonality when aggregating multiple component carriers in the same band, adjacent CCs need to be spaced on a 300 kHz raster (the smallest step that is a multiple of both the 100 kHz channel raster and the 15 kHz subcarrier spacing).
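This raster check can be sketched in a few lines (the center frequencies below are made-up examples, not assignments from the article):

```python
def valid_cc_spacing(center_freqs_khz):
    """Check that adjacent in-band CC centers are spaced by a multiple of
    300 kHz, preserving 15 kHz subcarrier orthogonality (300 kHz is the
    least common multiple of the 100 kHz raster and 15 kHz spacing)."""
    freqs = sorted(center_freqs_khz)
    return all((b - a) % 300 == 0 for a, b in zip(freqs, freqs[1:]))

# Two contiguous CCs spaced 19.8 MHz apart (a multiple of 300 kHz):
print(valid_cc_spacing([2_110_000, 2_129_800]))  # True
print(valid_cc_spacing([2_110_000, 2_129_850]))  # False (50 kHz off-raster)
```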

It is also advantageous for component carriers to be placed symmetrically within the band due to the type of receiver (zero IF) used in most terminal and user devices.  If the component carriers are asymmetrical within the band, LTE-A resource elements (frequencies) in the middle of the band will fall under a zero-IF receiver’s internal mask and be lost for communication.

Figure 5. The component carriers need to be symmetric within a band.

If the CCs are available in multiple bands, then the ones with better propagation characteristics (typically lower RF frequencies) should be dedicated for the macro coverage or the cell edge coverage and their antennas properly configured for larger footprint use.  The CCs with physical attributes promoting shorter propagation are better suited for the close-in coverage, and should be dedicated to increasing the terminal’s data throughput.

Figure 6. The Component Carrier with the lower frequency is used for the macro (cell edge) coverage and the higher frequency Component Carrier is used for the throughput increase.

RF implementation of carrier aggregation can be very challenging depending on the spectral distance between the component carriers.  If the CCs are in the same band, interference from other signals in the spectrum between the CCs creates significant problems.  If a particular band’s (e.g. PCS) UL frequencies are fairly close to the DL frequencies and the CCs are spread over most of the band, then sufficiently separating DL and UL by filtering can be a problem.  Also, component-carrier inter-modulation products could be present that interfere with the system’s ability to achieve efficient full-duplex communication.

MIMO Antennas
On the base station side, each sector could have eight MIMO antennas on the tower.  With this large number of antennas, sectorization on the tower requires close attention.  For example, for a particular tower, would three sectors with eight antennas each, or six sectors with four antennas each, produce a better solution?  Further, instead of having eight sets of expensive and lossy cables going up the tower for each sector, it might start making more sense to use remote radio heads on the tower in the proximity of the antennas.

On the terminal side, having four antennas properly spaced is a challenge for mobile phones.  It is also challenging to meet the LTE-A processing power and throughput needs in this form factor.  It is therefore expected that the initial LTE-A terminals will be laptop and tablet computers.

Relay Node Carrier Assignment
If the RN uses the same component carrier for communication with the donor base station and the terminals, self-interference can result: RN transmission to the donor base station interferes with RN reception from the terminals, and vice versa.  To avoid this, the RN antennas intended to communicate with the donor base station should be isolated (separated, distanced) from the RN antennas meant for communication with terminals.

If this is not practical (which is often the case), the RN needs to use a different component carrier than the donor base station for communication with the terminals.  Another solution is to schedule the backhaul and access communication in the RN at different times.
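The time-multiplexing option can be sketched as a simple subframe pattern (the pattern and function below are illustrative assumptions, not the 3GPP-defined relay subframe configuration):

```python
# Hedged sketch: time-multiplex an in-band relay's backhaul (Un) and
# access (Uu) traffic so the RN never transmits to terminals while it is
# receiving from its donor eNB on the same carrier.
BACKHAUL, ACCESS = "Un", "Uu"

def relay_schedule(n_subframes, backhaul_every=4):
    """Reserve every Nth subframe for the Un backhaul link; the remaining
    subframes serve local terminals over Uu."""
    return [BACKHAUL if i % backhaul_every == 0 else ACCESS
            for i in range(n_subframes)]

print(relay_schedule(8))  # ['Un', 'Uu', 'Uu', 'Uu', 'Un', 'Uu', 'Uu', 'Uu']
```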

Relay Nodes vs Pico Cells
A key decision the network planner must make is whether to augment the macro cell with RNs or to deploy pico cells.  The biggest factor is usually the availability of the backhaul a pico cell would need where it is intended to be deployed.  If increased capacity is required in the macro cell and the backhaul to the small cell location is less expensive than the LTE spectrum, you would typically want to place a pico cell there, leaving the spectrum that would be consumed by the backhaul available to serve other terminals.

If, on the other hand, a small cell’s purpose is to patch a coverage gap, there are no significant capacity issues in the macro cell yet, or another form of backhaul isn’t readily available, it is probably appropriate to install an RN to serve the area.  As discussed, this will use part of the unused macro cell spectral capacity for its backhaul, which may be of little concern in more rural/less dense areas.  If in the future macro cell utilization begins to approach its capacity, then converting this RN into a pico cell and providing a dedicated backhaul for it might be necessary.

Other Areas for Consideration with LTE-A
Wi-Fi Underlay
If pico cells are required to augment the macro LTE-A base stations, it is prudent to look at the possibility of Wi-Fi filling this role.  The majority of current mobile terminals are equipped with Wi-Fi, and more spectrum is available for Wi-Fi than for cellular communications.  Wi-Fi has also recently been significantly improved through IEEE standardization work.  The IEEE 802.11ac standard will enable speeds of up to 1 Gbit/s in the 5-GHz band, which is on par with LTE-A.  This new standard is expected to be finalized by the end of 2012 and fully approved by the end of 2013.  Integration of Wi-Fi into cellular networks is also being standardized, as is roaming between the two technologies; the primary standard being developed for this roaming is IEEE 802.11u.  Maintaining cellular quality of service while roaming on Wi-Fi networks is also being addressed.

Wi-Fi backhaul
If, due to capacity issues, pico cells need to be installed, but there is no backhaul capability for them, then point-to-point Wi-Fi links might be the right solution.  Existing 802.11n devices can probably handle most LTE-A pico cell requirements, but, if not, the above-mentioned 802.11ac should be available in time for the first LTE-A deployments.

Receiver/Scanner and Test Tools
Initial field tests before the LTE-A signals start to be transmitted are no different than for any other cellular technology.  If a new part of the spectrum is to be utilized, we first need to make sure that no interference from any other system is present in it.  This can be accomplished by performing a drive test in the area where the LTE-A deployment is to be made, using a scanning receiver that just measures power in the part of the spectrum to be used by the LTE-A system.  If interference is discovered, then a more detailed drive test might need to be performed to localize and try to identify it using the spectrum analyzer feature of the scanning receiver.  Identified interference should be removed in order to properly deploy the LTE-A system in this spectrum.  Alternative tools for this phase of deployment are a fast and sensitive spectrum analyzer or a specialized interference hunting tool with directional antennas.

If LTE-A carrier aggregation is utilized, it is prudent to check for in-band and out-of-band interference due to inter-modulation products that can be created when multiple component carriers are transmitting.  The same tools as above can be used for that.

An LTE-A deployment is planned by determining the position and configuration of each LTE-A base station.  If a new base station location or a new spectrum region is to be used, then it might be necessary to test the base station RF propagation characteristics to properly plan the network.  This can be done using an omni-directional test transmitter radiating in the still-empty LTE-A spectrum, mounted where the base station antenna is to be located.  A local drive test with scanning receivers measuring the power of this transmitter can be conducted to determine the amount of path loss for this base station.  Instead of a scanning receiver, a fast and sensitive spectrum analyzer could also be used for this test function.
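The path-loss calculation behind this drive test is straightforward (the power figures below are assumed for illustration, not values from the article):

```python
def path_loss_db(tx_eirp_dbm, measured_rssi_dbm):
    """Path loss between the test transmitter and the drive-test receiver:
    transmitted EIRP minus the power measured at the receiver (both in dBm)."""
    return tx_eirp_dbm - measured_rssi_dbm

# Illustrative numbers: a 43 dBm EIRP test transmitter measured at -95 dBm
# by the scanning receiver at one drive-test point.
print(path_loss_db(43, -95))  # 138 dB of path loss at that point
```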

Once the spectrum is clear of interference and the LTE-A network plan implemented, its actual performance needs to be determined and optimized before it is ready for commercial service.  This is accomplished by measuring the power and the quality of signals coming from particular base stations.  Most of the synchronization, common reference, and broadcast signals are carried over from LTE to LTE-A (intentionally, for backward-compatibility reasons).  So, as in LTE, base station reference signals are used for the power measurement, still called reference signal received power (RSRP).  Either reference signal received quality (RSRQ) or carrier to interference and noise ratio (CINR) is used for the base-station quality measurement.  The LTE-A physical cell identities (PCIDs) detected, as in LTE, identify the base station to which the measurements belong.  A typical LTE scanning receiver would support these LTE-A measurements and base station identities, but various LTE phone-based tools could support them too, with a tradeoff between the capability, price, and size of the equipment.
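As a sketch of how these quality metrics relate, using the standard LTE definition of RSRQ (which carries over to LTE-A); the example figures are assumptions for illustration:

```python
import math

def rsrq_db(rsrp_dbm, rssi_dbm, n_rb):
    """RSRQ = N * RSRP / RSSI (linear), expressed in dB, where N is the
    number of resource blocks over which the wideband RSSI is measured."""
    rsrp_mw = 10 ** (rsrp_dbm / 10)
    rssi_mw = 10 ** (rssi_dbm / 10)
    return 10 * math.log10(n_rb * rsrp_mw / rssi_mw)

# Illustrative values: RSRP of -100 dBm and wideband RSSI of -75 dBm
# measured over 50 resource blocks (a 10 MHz carrier):
print(round(rsrq_db(-100, -75, 50), 1))  # -8.0 dB
```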

The RSRP and CINR should be measured in particular between the cells or at the cell edges, as well as inside buildings.  Measurements in these areas will indicate whether additional optimization of base station parameters is necessary, or whether installation of relay nodes / pico cells there would increase network performance.

Once the LTE-A network is operational with real end-user traffic, it should be evaluated for uplink and downlink throughput as well as bit error rate.  Phone-based test tools are usually best for this.  Based on these measurements, further network optimization can be undertaken.  The same scanner- and phone-based test tools can be used for ongoing network maintenance and re-optimization over the network’s lifecycle.

This is all good news for the carriers, because many of their existing LTE deployment tools, such as Rohde & Schwarz’s ROMES and PCTEL’s SeeGull, can be used directly with some changes to how they are configured.  Any of the more advanced tools that also include decoding of the layer 3 messages, such as QRC’s ICS-Qp and ICS, as well as ROMES, should be updated to support the additional information present in LTE-A.  That many of these tools can be utilized with fairly minor upgrades represents a significant cost savings in rolling out LTE-A.

Positioning of Relay Nodes/Pico Cells
To determine if an LTE-A macro cell needs an underlay of RNs or pico cells, we need to check if there are coverage gaps, low quality-of-service areas, or capacity bottlenecks in that macro cell.  If we have problems like these, we should try to determine where in the macro cell’s coverage area they occur.  Field measurements (e.g. drive tests) with scanning receivers are one way to determine this.  Areas with low DL signal strength would indicate coverage gaps; areas with low DL signal quality (high interference) would indicate low quality of service; and areas with high DL signal quality but lots of UL spectral activity would indicate high traffic that could cause a capacity crunch in the macro cell.  Once these areas are identified, it can be evaluated whether covering them with an RN or a pico cell would be cost effective.
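The triage logic described here could be sketched as follows (all thresholds are illustrative assumptions, not values from the article):

```python
# Hedged sketch of classifying drive-test bins by DL strength, DL quality,
# and UL activity, as described above. Thresholds are made-up examples.
def classify_area(dl_rsrp_dbm, dl_cinr_db, ul_activity):
    """Classify one drive-test measurement bin; ul_activity is a 0..1
    fraction of the UL spectrum observed as occupied."""
    if dl_rsrp_dbm < -110:
        return "coverage gap"                # candidate for an RN or pico cell
    if dl_cinr_db < 0:
        return "low quality of service"      # high interference
    if ul_activity > 0.8:
        return "potential capacity crunch"   # heavy traffic on good signal
    return "ok"

print(classify_area(-115, 5, 0.2))   # coverage gap
print(classify_area(-90, -3, 0.2))   # low quality of service
print(classify_area(-85, 12, 0.9))   # potential capacity crunch
```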

Heterogeneous Network Interference Management
Additional improvements in terminal performance in the heterogeneous network can be achieved by employing terminals with advanced receivers that can cancel the interference from the overhead LTE-A channels (e.g. sync, broadcast, common reference signal) transmitted by the macro (donor) cell.  It is not clear at this time when these advanced receivers will be commercially available.

Conclusion
After reviewing the major new aspects of LTE-A technology, we can conclude that preparation for carrier aggregation is probably the first and most important challenge.  Appropriate spectrum needs to be obtained/cleared and then allocated to cells in an optimal way.  The next biggest priority from the planning perspective will be how to improve the network with the use of RNs.  By the time service providers are ready to begin implementation of LTE-A, many LTE networks will probably already have an underlay of small cells (LTE pico or Wi-Fi), and RNs should be planned for locations where such cells are needed but there is no appropriate backhaul for them.  Finally, LTE scanners/receivers will be appropriate for initial LTE-A field tests.

Source: http://www.edn.com/design/test-and-measurement/4401353/Network-planning-and-testing-for-LTE-Advanced