
Experiment: In search of NB-IoT, 4G & 5G signals on the air

21 Sep

It seems that at least every decade, wireless telecommunications makes a significant leap in the form of a new generation of air-interface technology which puts the latest developments in radio technology into the consumer’s hands. Right now, we are actually on the precipice of two new technologies which have the potential to improve quality of service over 4G/LTE-A technologies in densely populated areas and extend service to low-cost low-powered sensor nodes – the two technologies being 5G and NB-IoT respectively.

I was prompted to take a closer look at these technologies when a colleague mentioned them in passing over a lunchtime conversation, which coincided with the RoadTest for a Siretta SNYPER-LTE cellular network analyser. I put in an application but unfortunately was not successful, which was a bit of a disappointment. At least I could still look at the spectrum with my Tektronix RSA306 Real-Time Spectrum Analyser.

Getting Ready for NB-IoT and 5G

Narrowband IoT (shortened to NB-IoT) is an LTE technology designed for low-power wide-area network (LPWAN) applications. It brings a lower-rate, narrower-bandwidth service which reduces cost and complexity of compatible radios and reduces power budget. This means it competes with the likes of LoRa, Sigfox and other similar technologies. Its technical specs include a 250kbit/s throughput, single-antenna configuration with 180kHz bandwidth in half-duplex mode and device transmit powers of 20/23dBm.
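To put those figures in perspective, here is a back-of-the-envelope sketch of what they mean for a battery-powered sensor. The payload size and the no-overhead assumption are mine, not from the standard:

```python
def nbiot_airtime_energy(payload_bytes, rate_bps=250e3, tx_dbm=23):
    """Rough airtime and transmit energy for one NB-IoT uplink burst.

    Ignores protocol overhead, coding and retransmissions, so this is
    a lower bound rather than a realistic field figure.
    """
    airtime_s = payload_bytes * 8 / rate_bps
    tx_watts = 10 ** (tx_dbm / 10) / 1000   # 23 dBm is roughly 200 mW
    return airtime_s, airtime_s * tx_watts

# A 100-byte sensor report: ~3.2 ms on air, well under a millijoule.
airtime, energy = nbiot_airtime_energy(100)
```

Even allowing an order of magnitude for overhead and retries, infrequent reports at this duty cycle are compatible with years of battery life.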

The big draw of NB-IoT compared to the other competing technologies is that it can be enabled simply by updating BTS firmware and configurations. Telcos are already in a prime position, having the hardware, network infrastructure, dedicated/protected spectrum and business already established while competing networks often are still building out coverage using unlicensed bands. Furthermore, the NB-IoT standard solves key interoperability, cost and power budget issues with full-function cellular modules which may accelerate the adoption of IoT devices using this form of connectivity.

Not wanting to be left behind, Australia's Optus, Vodafone and Telstra trialled NB-IoT in 2016 and 2017. Of them, the latter two have deployed NB-IoT, with full service by October 2017 (Vodafone) and January 2018 (Telstra), and coverage still being extended. Optus, however, does not seem to have a commercial NB-IoT service at this time. Despite this, NB-IoT-capable equipment remains relatively scarce in the consumer space, with development boards only recently becoming available.

In contrast, 5G is more widely publicised as the successor to LTE, offering higher speeds and lower latencies which are often claimed to be the enabler of many new wireless applications (although this remains to be seen). There has been a lot of confusion about capabilities and coverage, as 5G can be deployed in the sub-6GHz bands, where performance is often described as an “improved” LTE-A, as well as in millimeter-wave bands, which offer much wider bandwidth and throughput but have very poor propagation characteristics. Present-day 5G handsets are not “standalone” yet, operating in “NSA” mode which relies on 4G network radio hardware. This may persist for a few years and is perhaps not surprising, as many MVNOs still do not offer VoLTE and thus LTE-capable phones still fall back to 3G for circuit-switched calling.

Regardless of the practicalities of deploying a technology that is still in evolution, both Telstra and Optus have made some rather public announcements of introducing 5G services in select areas in a fight for bragging rights which seems reminiscent of the 4G LTE roll-out. Notably absent is Vodafone, who perhaps are being more careful after investing heavily in their LTE refresh after Vodafail, although their joint venture with TPG has secured some spectrum.

In the Sydney area, the present Telstra map looks like this, showing isolated pockets of 5G coverage:

Meanwhile, it would seem that Optus has split their Sydney area maps into districts, where it seems one or two towers in a few select suburbs have been upgraded, likely to support their limited 5G Wireless Broadband service which is attempting to challenge the NBN.


While there don't appear to be many active sites yet, a lot of work is being done behind the scenes to prepare sites for activation.

Near to where I live, this Telstra tower had a crane in attendance for about four days. I suspect this was to prepare for the activation of 5G – especially when you see the following ads being taken out in the notices section of local papers:


This does imply that Telstra uses Service Stream, while Optus uses Metasite to work on some of their sites.

I suppose it makes sense that deployment is already underway, especially now that early 5G-capable handsets are starting to appear which may provide the added performance and prestige that the high end of the market might demand (and be willing to pay for). However, aside from cost, there have been some reported downsides, with some 5G handsets having shorter battery life due to greater power consumption.

Later on down the track, I suppose the network may be refreshed with new BTS hardware and antennas to support mmWave and standalone-5G deployments, while high-end users are likely to have replaced their handsets to take advantage of these advances. Mainstream users (such as myself) will still have to wait a few years for it to “trickle down”, but the benefits may be felt as the LTE network has some load shifted over to 5G. That would be especially welcome where I am as the NBN is still not here and LTE congestion is a real phenomenon.

On the Air

So I thought it would be a good idea to get out the spectrum analyser to see what the signals nearby looked like on a band-by-band basis.

700MHz (Band 28)

The “digital dividend” band, opened up by the change to all-digital TV broadcasting, is also often known as 4GX (Telstra) or 4G Plus (Optus). Band 28 support has also become the “in-joke” of OzBargainers whenever anyone posts a deal about a mobile phone, as it wasn't widely supported by budget mainstream phones (especially imported ones).

In this band, there is a 10MHz carrier at 763MHz (Optus) and a 20MHz wide carrier at 778MHz (Telstra). Because these are FDD-LTE, the paired uplink carriers are of equivalent width at 708MHz and 723MHz respectively. But do you see that on the right side?
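The pairing falls straight out of each band's fixed duplex spacing (55MHz for Band 28). A quick sketch using the 3GPP duplex spacings for the FDD bands covered in this post:

```python
# Downlink-to-uplink duplex spacings (MHz) per the 3GPP band plan
# for the FDD bands discussed in this post.
DUPLEX_SPACING_MHZ = {28: 55, 5: 45, 8: 45, 3: 95, 1: 190, 7: 120}

def uplink_centre(band, downlink_mhz):
    """Return the paired uplink centre frequency for an FDD carrier."""
    return downlink_mhz - DUPLEX_SPACING_MHZ[band]

print(uplink_centre(28, 763))   # Optus 10 MHz carrier -> 708
print(uplink_centre(28, 778))   # Telstra 20 MHz carrier -> 723
```

The same table reproduces every downlink/uplink pair quoted in the later band sections.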

The carrier at about 787.200MHz is the Telstra NB-IoT service, plainly visible on a spectrum analyser. The choice of the 700MHz band would ensure greater propagation than a higher band, but whether this frequency is well-supported by all NB-IoT radios is perhaps unknown.

850MHz (Band 5)

The 850MHz band was home to Telstra’s “NextG” 3G service as well as Vodafone’s LTE service (as they don’t have any 700MHz allocation).

In the low part of the band, we can see some digital trunking radio which still lives near the 850MHz band. The 10MHz wide Vodafone LTE carrier (875MHz, paired with 830MHz) can be seen next to two 5MHz Telstra NextG 3G carriers (885MHz paired with 840MHz). The carriers which have “rounded” shoulders are easily distinguished as 3G.

900MHz (Band 8)

The 900MHz band was formerly home to mostly GSM services, but since the 2G shutdown, it has been refarmed for 3G use mainly by Optus with Vodafone LTE (and in some places, Telstra).

The 8MHz wide Optus allocation is at the lower end of the band at 947.6MHz paired with 902.6MHz, split across two carriers. The Vodafone allocation at 955.9MHz is 8MHz wide and paired with 910.9MHz according to ACMA, and seems to be split across several carriers. There is an interesting “shard” on the right hand side – this appears to be Vodafone's NB-IoT service.

Its frequency is approximately 959.800MHz and has a very similar spectral characteristic to the Telstra carrier identified earlier.
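Spotting these narrow “shards” by eye is easy on a live trace, but it can also be automated. A minimal sketch over synthetic data (not actual RSA306 captures – the bin spacing and levels here are invented for illustration):

```python
def find_narrow_carriers(freqs_hz, power_dbm, thresh_db=10.0):
    """Return the centre of each contiguous run of bins sitting
    thresh_db above the median noise floor of the sweep."""
    floor = sorted(power_dbm)[len(power_dbm) // 2]   # median as noise floor
    carriers, run = [], []
    for f, p in zip(freqs_hz, power_dbm):
        if p > floor + thresh_db:
            run.append(f)
        elif run:
            carriers.append(sum(run) / len(run))
            run = []
    if run:
        carriers.append(sum(run) / len(run))
    return carriers

# Synthetic sweep: -110 dBm floor with a ~180 kHz wide "shard" at 959.8 MHz.
freqs = [958e6 + i * 1e3 for i in range(3000)]       # 1 kHz bins
power = [-80.0 if 959.71e6 < f < 959.89e6 else -110.0 for f in freqs]
print(find_narrow_carriers(freqs, power))            # ~[959.8e6]
```

A real implementation would also check that the run's width is close to 180 kHz before calling it NB-IoT, but the idea is the same.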

1800MHz (Band 3)

The 1800MHz band was the home of 4G at its introduction and is one of the bands where every carrier has some allocation.


The first carrier belongs to Telstra, which has a 12MHz allocation centred at 1811.25MHz (paired with 1716.25MHz) carrying a 10MHz wide carrier. This is followed by Vodafone with 15MHz allocations at 1827.5MHz (paired with 1732.5MHz) and 1842.5MHz (paired with 1747.5MHz), which they seem to be using as 10+20MHz. Rounding out the band is Optus with 15MHz at 1857.5MHz paired with 1762.5MHz.

2100MHz (Band 1)

The 2100MHz band is the upper band which was used by early 3G handsets, but has also been refarmed for LTE to some extent, making it rather messy to look at.


Vodafone has a 14MHz allocation at 2117.5MHz paired with 1927.5MHz which seems to have a 15MHz LTE carrier in it. This is followed by a 5MHz Telstra 3G carrier at 2127.5MHz paired with 1937.5MHz, then a 20MHz allocation to Optus centred at 2140MHz paired with 1950MHz which seems to be carrying a 10MHz LTE carrier and a 3G carrier. Next is a 10MHz wide Telstra LTE carrier at 2155MHz paired with 1965MHz. Rounding out the upper part of the band seems to be a pair of 3G carriers from Vodafone which sit in a 9MHz allocation at 2165MHz paired with 1975MHz.

2300MHz (Band 40)

Band 40 is used exclusively by Optus for their TDD-LTE service, initially deployed to serve their home wireless broadband product, but it now seems to allow connection from any capable device. As this is TDD, there is no paired frequency; both directions share the same frequencies.


They have four separate 20MHz wide carriers, with compatible devices using carrier aggregation to achieve higher speeds. I believe their total allocation was 98MHz, but the upper section (near 2.4GHz) remains unused, possibly due to interference from/to 2.4GHz ISM band devices. I actually get pretty decent 100Mbit/s service using 2x2CA on this band when it's not congested, which is one reason why Optus outperforms Vodafone by a big margin where I am.

2600MHz (Band 7)

Band 7 initially seemed confined to high-density areas such as train stations, but now covers a wider area. This band carries equal 20MHz carriers where I am at the moment.


Telstra owns 40MHz of bandwidth at 2650MHz paired with 2530MHz. Optus has 20MHz of bandwidth at 2680MHz paired with 2560MHz. It is said that TPG has 10MHz of spectrum in Band 7, but I don't think I've seen their signal from where I am.

3400-3700MHz (5G/Sub-6)

Given that all of these bands are already in use – where is 5G going to fit in the “sub-6” scheme? According to the best information I could find, 5G is being deployed into the 3400-3700MHz range. Higher frequencies normally mean poorer penetration, so that is probably not the best news for indoor coverage. Worse still, it is basically taking over the spectrum from the pre-WiMAX wireless internet service Unwired (later, VividWireless).


While I wasn’t in a coverage area, I decided to see if I could see the signal … ultimately from home, all I saw was bleed-through noise from 4G carriers in the 2600MHz band.

I decided to carry my gear into the city, to a location where it is covered by both Optus and Telstra 5G to see if the signal can be seen.

The sweep is 1GHz wide and took some time with peak hold on the traces, but the 5G signal was fairly weak, with lots of noise perhaps from intermodulating signals. The lower 5G carrier isn't so obvious – the upper one is slightly more visible.

Ultimately, it took until the 18th September 2019 for the details to turn up in ACMA's RRL database – Optus is at 3458.8MHz with a 60MHz slice while Telstra is at 3605MHz with a 60MHz slice, both operating transmit/receive on the same set of frequencies.

Wait a Minute?

If we remember what happened at the introduction of Unwired, the choice of these frequencies is rather unfortunate for satellite enthusiasts. Extended C-band (large dish) services rely on the frequency range of about 3400-4200MHz, with regular C-band occupying 3700-4200MHz.

With the carriers being within the extended C-band range and transmitted terrestrially, it is very likely that even a small amount of spill-over will cause LNBs (which have very high gains, being designed to receive the very weak signals from geostationary satellites) to saturate and operate non-linearly, causing reception problems for certain frequency ranges or perhaps the whole band. At 60MHz wide, a single carrier could wipe out a few MCPC services in one fell swoop.
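A rough link budget shows why spill-over is plausible. Every number here is an assumption chosen for illustration (tower EIRP, distance, off-axis pickup and the LNB overload point), not a measured figure:

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB."""
    return 20 * math.log10(dist_m) + 20 * math.log10(freq_hz) - 147.55

tower_eirp_dbm = 60       # assumed macro-cell EIRP for a 5G carrier
distance_m = 500          # assumed tower-to-dish distance
offaxis_gain_dbi = 0      # assume no dish gain towards the tower
overload_dbm = -40        # hypothetical LNB input overload point

at_lnb = tower_eirp_dbm - fspl_db(3.6e9, distance_m) + offaxis_gain_dbi
print(f"Signal at LNB input: {at_lnb:.1f} dBm")   # about -37.6 dBm
# Even with zero antenna gain towards the tower, this sits above the
# assumed overload point, so saturation is entirely plausible.
```

Swap in your own figures for the tower and LNB; the point is that tens of dBm of EIRP a few hundred metres away leaves very little margin for a receiver built around signals arriving at well below -100 dBm.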

While there are not many services that reach Australia in the extended portion of the band, even OCS “band-stack” LNBs which operate from 3700-4200MHz may not be sufficiently engineered to reject the signals, which are a lot closer than back in the Unwired days when ~3500MHz with a bandwidth of 10MHz was used.

While the “big ugly dish” is becoming less relevant in a world of IPTV and video-on-demand, it seems rather disappointing that yet another one of the technologies I’ve grown to understand is becoming “extinct”.

It’s also interesting to see that the NBN has been trialling fixed wireless in the 3.5GHz band (B42), so there may well be a collision between 5G sub-6GHz deployment and NBN LTE Fixed Wireless services … which would only increase the potential headaches to a C-band satellite user.


The radio bands are chock-full of 3G and LTE carriers, with NB-IoT and 5G recently joining the mix after the death of GSM. But it seems our insatiable appetite for mobile data bandwidth means that we will soon be using even more spectrum than ever before, in the form of millimeter-wave 5G radio interfaces. Given their limited propagation characteristics, it will still be a number of years until they become mainstream; until then, it seems that sub-6GHz will be the “interim” technology that carries the 5G flag, even though it operates at microwave frequencies that are not the most favourable for propagation.

Unfortunately, it seems when the 5G sub-6GHz services are switched on, users of C-band satellite systems may experience the same problems they did when Unwired was in use. It seems that the relentless march of technology continues … for better or for worse.


802.11ac Adjacent Channel Interference (ACI)

4 Aug
I was reading this article on the development of 5G cellular technologies when this bit on OFDM deficiencies and the need for new waveforms to support higher capacities and user densities caught my attention (emphasis added by me):

“4G and 4G+ networks employ a type of waveform called orthogonal frequency division multiplexing (OFDM) as the fundamental element in the physical layer (PHY). In fact, almost all modern communication networks are built on OFDM because OFDM improved data rates and network reliability significantly by taking advantage of multi-path, a common artifact of wireless transmissions. However, as time and demands progress, OFDM technology suffers from out-of-band spectrum regrowth resulting in high side lobes that limit spectral efficiency. In other words, network operators cannot efficiently use their available spectrum because two users on adjacent channels would interfere with one another. OFDM also suffers from high peak-to-average ratio of the power amplifier, resulting in lower battery life of the mobile device. To address OFDM deficiencies, researchers are investigating alternative methods including generalized frequency division multiplexing, filter bank multi-carrier, and universal filter multi-carrier. Researchers speculate that using one of these approaches over OFDM may improve network capacity by 30 percent or more while improving the battery life for all mobile devices.”

This aligns with most Wi-Fi professionals’ recommendations to deploy 5 GHz radios on non-adjacent channels to avoid that dreaded adjacent channel interference (ACI). 
And if you look at an OFDM Wi-Fi transmit spectral mask, either the limits defined in the standard or a live view on a spectrum analyzer, you will see rather significant side lobes that can impact adjacent channels (and beyond, depending on proximity and power levels). I have considered including discussion of OFDM spectral masks in my 802.11ac presentations and writings because, as channel widths get wider, so too do their side lobes: the frequency distance from the channel centre at which the relative power level must be reduced for compliance increases as well. Here is an illustration that I put together over a year ago but never published, keeping it in the appendix of my 11ac presentation. It shows how ACI can increase due to the spectral mask differences as channel widths get larger. I have inlaid two 20 MHz spectral masks inside the 40 MHz mask, and two 40 MHz masks inside the 80 MHz mask. Essentially, the side lobe power limits scale with the size of the main signal lobe; as the main lobe gets larger, so too does the allowed power in the side lobes.
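The mask breakpoints from the standard make the point numerically. A sketch that linearly interpolates the 802.11 transmit masks (values in dBr relative to the in-band level, offsets in MHz from the channel centre):

```python
# Transmit spectral mask breakpoints (offset MHz, dBr) for 20/40/80 MHz
# channels, per the 802.11ac specification.
MASKS = {
    20: [(0, 0), (9, 0), (11, -20), (20, -28), (30, -40)],
    40: [(0, 0), (19, 0), (21, -20), (40, -28), (60, -40)],
    80: [(0, 0), (39, 0), (41, -20), (80, -28), (120, -40)],
}

def mask_dbr(width_mhz, offset_mhz):
    """Allowed relative power (dBr) at a given offset from channel centre,
    linearly interpolated between the mask breakpoints."""
    pts = MASKS[width_mhz]
    if offset_mhz >= pts[-1][0]:
        return float(pts[-1][1])
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= offset_mhz <= x1:
            return y0 + (y1 - y0) * (offset_mhz - x0) / (x1 - x0)

print(mask_dbr(20, 50), mask_dbr(80, 50))
```

At a fixed 50 MHz offset, the 80 MHz mask permits roughly 18 dB more relative power than the 20 MHz mask – exactly the widening-side-lobe effect described above.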

Spectral Mask Comparison of 20, 40, and 80 MHz Wi-Fi Channels
And below is a capture from a spectrum analyzer approximately 10 feet away from an 802.11ac AP operating in 80 MHz mode with a large amount of traffic. Notice the high signal level in adjacent channels (52-64, and likely the as-yet-unapproved U-NII-2B band).

Spectrum Analysis Capture of an 802.11ac 80 MHz Waveform
This is why you need a minimum of 10 feet of separation between radios operating in the same frequency band (unless other shielding mechanisms are used, which increase cost), as well as the recommendation to have adjacent 5 GHz radios operating on non-adjacent channels. This will become a bigger issue as we deploy more 5 GHz radios to handle capacity and user-density demands. More manufacturers are considering software-defined radios (SDR) as well as multi-radio APs with more than one radio operating in the 5 GHz band. You should carefully research and verify (through real-world testing) these solutions to ensure that interference within the AP is not an issue. As always, the better you understand what's going on at the physical layer, the better wireless engineer and architect you will be.


Signal Analyzers

4 Aug


 This manual provides documentation for the following Analyzer: N9020A MXA Signal Analyzer


Download: N9020-90113


Notice: This document contains references to Agilent. Please note that Agilent’s Test and Measurement business has become Keysight Technologies. For more information, go to



Are signaling bottlenecks making LTE a victim of its own success?

5 May

LTE networks are being overwhelmed by data traffic and with signaling in the core taking the brunt of unprecedented demand, it looks as though Diameter, the ETSI and 3GPP standard, may not, on its own, be up to the task of managing a more complex and burgeoning network. So what’s the alternative? Is there a newer and better solution out there?

Welcome to Part 2 of this extended TelecomTV article. Yesterday, in Part 1, we reported that subscribers, sometimes a few, frequently a lot, are complaining that they are not getting what they were led to expect they would get from signing-up to expensive and lengthy ‘4G’ service contracts. They say they are paying premium prices for what can be less-than-robust, underperforming network services as well as apps that fall over and drop-out and data (and even voice) calls that are often actually better over much cheaper 3G networks.

Robin Kent, Director of European Operations at UK-headquartered Adax, a provider of packet processing, security and network infrastructure for all-IP networks, says the problem can be laid directly at the feet of the Diameter signaling standard and protocol.

He told me, “While LTE is delivering on its promise of providing users with a quicker, more data-centric mobile broadband service, the backbone of the network is not being properly supported. Diameter signaling is now the bottleneck in LTE performance, because the transport that Diameter runs over is insufficient”. Trenchant stuff.

Long Term Evolution, which is of course what the acronym LTE stands for, is exactly that – a standard, based on GSM/EDGE and UMTS/HSPA network technologies and using a different radio interface in concert with core network improvements to permit evolutionary progression towards true 4G.

Thus network operators do not have to scrap their expensively constructed old networks and take another huge capex hit; instead they can keep them and tweak them to provide 4G-like services and apps. That is one of the reasons LTE is increasingly popular, and why the network signaling that manages data sessions and enforces traffic policies is having to accommodate volumes and complexities that are drastically degrading performance and limiting growth. It's an untenable situation.

Some in the industry compare the complexities of Diameter signaling to the problems exhibited by Signaling System 7 (SS7) when the first mobile networks were introduced. SS7 was the source of many difficulties early on but matured to become robust and reliable. The question is whether or not Diameter signaling can do the same.

For SS7's early problems pale into comparative insignificance when measured against the 77 distinct Diameter signaling interfaces that are assigned to specific IMS and LTE network elements.
And then there's SCTP (the Stream Control Transmission Protocol), which focuses on the transport and session layers rather than the network layer and operates on top of IP. It is important and is supported in the Linux kernel but, Robin Kent contends, SCTP “just can't cope with the specific signaling needs that the LTE network demands.

Larger volumes of data transfer and consumption mean that the strain on the network is being felt at various levels, and critical, evolved signaling and transport solutions are needed to match the demands that are being placed on the network”.

He adds, “The emergence of new technology usually breeds other new technologies, largely by way of support, but with recent network developments and the uptake of LTE, current signaling transport technology is no longer sufficient because of the data pressures that are being placed on the network. Transport protocols such as SCTP, in its existing form, are insufficient and operators have to address the problem that this will pose on LTE performance”.

What is needed then is a strong, high-performing transport layer and Robin Kent is, understandably, quick to point out that his company’s product, Adax SCTP for Telecom (Or SCTP/T) was designed specifically to meet the demands of LTE and IMS wireless networks for thousands of simultaneous connections. That said, and of course, Adax is not alone in seeking to provide answers to the overwhelmed Diameter signaling problem and there are other products by other manufacturers out there on the market.

And, regardless of the manufacturer, the solution to be applied must be able to do much the same things: to perform vigilant and almost instant in-service quality monitoring of the signaling link and be able to detect degradation of link quality at very, very short, programmable intervals.

That's because the flat all-IP architecture of the LTE network requires a very large number of signaling connections as well as signaling concentration for efficient routing. Diameter signaling is the choke-point in LTE signaling performance and, while the Linux-supplied SCTP for Diameter is a convenient, apparently economical and freely available solution, it simply is not up to a task it was not designed to perform.

For LTE subscribers the technology is all about the perceived quality of experience they receive; for the network operators, it is about maximising that experience through pro-active, immediate and flexible management of network capacity, including the capability to conduct near-instantaneous data analysis, packet processing and DPI, and the freeing-up of host CPU resources to remove bottlenecks.

Diameter signaling cannot do this unaided, and LTE congestion, service attenuation and loss, and consumer frustration will inevitably increase unless something is done to address the matter now.

The philosopher and transcendentalist Henry David Thoreau, musing in his cabin at Walden Pond in Massachusetts in the mid-19th century, wrote, “We must heap up a great pile of doing, for a small diameter of being”. Now, that's what I call foresight.


LTE speeds dive in the US as users crank up the video

3 Mar

It’s not how fast your network is, it’s probably how many people are on it and what they’re doing that counts as new figures show a dramatic fall-off in average LTE speeds in the US. By I.D. Scales.


Late last week OpenSignal reported a dramatic decline in LTE speeds, as experienced by users, across all the LTE networks in the US. This news hasn't exactly been top of the pile at Mobile World Congress for obvious reasons, and because I've mentioned it to a few people by way of opening a conversation and they hadn't heard (especially the WiFi people), I thought I'd retail the story now even though it's gone a bit whiskery.

OpenSignal is a crowdsourced network metrics operation. It claims 6 million people have downloaded its app, which gathers performance stats from the handset's point of view (throughput, geolocation, time of day/week/year, jitter etc.) and sends the data back to OpenSignal, which compiles meaningful statistics about comparative network performance – in this case, LTE.
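The compilation step is conceptually simple. A toy sketch of the aggregation (the field names and sample values here are hypothetical, not OpenSignal's actual schema):

```python
from collections import defaultdict

# Hypothetical crowdsourced samples: (network, measured download Mbit/s).
samples = [
    ("T-Mobile", 12.0), ("T-Mobile", 10.4),
    ("AT&T", 9.1), ("AT&T", 8.7),
    ("Sprint", 4.5), ("Sprint", 3.9),
]

def average_by_network(samples):
    """Compile a per-network mean download speed from raw samples."""
    buckets = defaultdict(list)
    for network, mbps in samples:
        buckets[network].append(mbps)
    return {net: sum(v) / len(v) for net, v in buckets.items()}

averages = average_by_network(samples)
# roughly {'T-Mobile': 11.2, 'AT&T': 8.9, 'Sprint': 4.2}
```

The real pipeline would also weight by time, location and sample counts, but the end product is this kind of per-network summary.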

We all know that any method of measuring network performance will draw fire from one quarter or another, usually because the results don’t support a particular world view. OpenSignal points out that the usual measuring method might involve testing the actual network by driving a test car about to record congestion, signal strength and so on.


So there was considerable surprise when OpenSignal's latest figures summary (released in the run-up to Mobile World Congress) gave the US a disappointing average LTE speed compared to other territories. Far from ‘leading the world' in mobile broadband, as we're often told, according to these figures the US is in reality a laggard in user experience. It came 15th out of 16 countries surveyed, with the average data rate dropping to 6.5 Mbit/s from 9.6 Mbit/s. This is LTE we're talking about.

The reason for the drop, of course, is almost certainly user numbers and video consumption going up fast as users learn to flex the new speeds but end up Easter Islanding the network.

Verizon and Sprint were lowest at 7.6 Mbit/s and 4.2 Mbit/s respectively. T-Mobile was best at 11.2 Mbit/s, with AT&T next at 8.9 Mbit/s. All these speeds were well below the other countries' averages.

Leaders were the familiar line-up of South Korea, Hong Kong and Sweden.


LTE network speeds, according to the latest OpenSignal report

24 Feb

The United States trails 13 countries when it comes to LTE network speeds, according to the latest OpenSignal report. The report found that average LTE network speeds in this country have declined 32% this year. Australia posted the fastest LTE speeds, with an average download speed of 24.5 megabits per second. Other countries with faster LTE speeds than the 6.5 Mbps posted by the United States were (in order) Italy, Brazil, Hong Kong, Denmark, Canada, Sweden, South Korea, the United Kingdom, France, Germany, Mexico, Russia and Japan.

The United States suffered the biggest decline in network speeds of any country, as operators struggled to keep pace with increasing data downloads. Last year the U.S. ranked 8th in the OpenSignal study, with an average LTE network download speed of 9.6 Mbps.

Many of the nations with faster speeds than the United States do not have as much LTE coverage. Verizon Wireless and AT&T Mobility, which together have roughly 200 million subscribers, are both nearing completion of their LTE roll outs with more than 300 million potential customers covered. Sprint and T-Mobile US both have substantial footprints as well, having recently surpassed 200 million pops covered. “The [United States] performs well on our coverage metric, with the average user experiencing LTE coverage 67% of the time, with Australia, the fastest country, on 58%,” OpenSignal said in a press release.

When it comes to domestic network speeds, T-Mobile US had the best performance among the carriers. The carrier posted average download speeds of 11.21 Mbps, with AT&T Mobility No. 2 at 8.9 Mbps. Verizon Wireless clocked in at 7.8 Mbps and Sprint’s average download speed was 4.2 Mbps. Sprint currently has the least amount of spectrum dedicated to its network at just 10 megahertz in most markets, while the others provide at least double that amount.

The State of LTE

Network operators around the world are working hard to convince their users to make the jump to LTE. The term “4G” acts as a convenient label for marketers to emphasise the superiority of this new standard over its predecessors, but just how standard or consistent is the experience of users on LTE?

The OpenSignal app allows users to contribute to our impartial coverage maps of mobile networks. We took data from those of our 6 million users who have LTE and focussed on their experience of two key metrics: download speed, and the proportion of time spent with LTE access. All data included in this report comes from the second half of 2013.

We found that not all LTE networks are created equal, indeed there is an extremely broad range of experience across both metrics. Only about a quarter of networks surveyed achieve both good coverage and fast speeds; clearly there remains much work before LTE lives up to its full potential.

Transmit Signal Leakage in LTE Frequency-Division Duplexing Applications

13 Feb

In today’s high-speed wireless communication standards like LTE, the performance of both base transceiver stations (BTS) and user equipment (UE) transceivers is crucial. LTE supports time-division duplexing (TDD) as well as frequency-division duplexing (FDD). In this post, we look at transmit signal leakage problems that can occur in FDD applications. To do so, we show a numerical analysis using specifications from the Nutaq Radio420X and the standard LTE performance requirements.

Frequency-division duplexing and isolation

FDD implies that the transmitter and receiver operate at different carrier frequencies, allowing for constant and simultaneous transmission and reception. In full-duplex FDD mode, the transmitter signal leakage must be taken into account (this does not apply to TDD or half-duplex modes). The receiver is constantly exposed to this transmit signal leakage and its sensitivity can drop drastically if improper isolation is used. Most of the isolation is obtained with a good PCB layout and shielding, but one will always have to use effective filters/duplexers in order to achieve optimal isolation.

The Radio420X’s receiver has a software-selectable band-pass filter bank. Its filters typically have 40 dB of rejection on either side of the bandwidth. Figure 1 shows a simplified block diagram of the Radio420X transceiver section.

Figure 1 - Simplified Radio420X transceiver block diagram

Figure 1 – Simplified Radio420X transceiver block diagram

Transmit signal leakage

Clearly, the fundamental components of the transmit signal can interfere with the received signal, but this is not the only concern. The transmit signal will also generate out-of-band phase noise that falls within the receiver band. This unwanted power affects the receiver sensitivity by raising its noise floor, as shown in Figure 2.

Figure 2 - Out-of-band phase noise effects on sensitivity

Figure 2 – Out-of-band phase noise effects on sensitivity

Example calculations

Let’s look at a numerical example using LTE Band 1, which operates within the following frequencies:

  • Uplink (UE transmit): 1920 – 1980 MHz
  • Downlink (UE receive): 2110 – 2170 MHz

Assume that we want to operate in full-duplex FDD using carrier frequencies 1920 and 2110 MHz for a UE transceiver. The Radio420X’s specifications will be used in the following calculations.

First, we determine how much power will leak into the Rx path when operating at maximum output power. We know that the fundamental Tx component will be attenuated by 40 dB when it reaches the band-pass filter. However, the first variable amplifier of the Rx chain sits before the filter and is set to a maximum gain of +18 dB for best sensitivity. Its OP1dB is 20 dBm, so any input signal greater than 2 dBm will saturate this amplifier and block the whole receiving process. Thus, we need a minimum of 16 dB of Tx/Rx isolation to avoid this situation. Knowing that PCB trace isolation is better than 55 dB, the only concern is antenna isolation (the Radio420X uses two antennas instead of a duplexer). At 1960 MHz, 30 dB of antenna isolation is achieved with a horizontal separation of 12 cm (for a -5 dB gain in the direction of the other antenna) or a vertical separation of 17 cm [1], which is easily realized.
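The saturation arithmetic above can be checked in a few lines (a back-of-the-envelope sketch using the figures quoted in the text, not vendor code):

```python
# Back-of-the-envelope check of the Rx saturation figures quoted above
tx_max_dbm = 18.0         # maximum Radio420X output power
rx_amp_gain_db = 18.0     # first Rx variable amplifier, maximum gain
rx_amp_op1db_dbm = 20.0   # amplifier output 1 dB compression point

# Input-referred 1 dB compression point: any stronger input saturates the amp
ip1db_dbm = rx_amp_op1db_dbm - rx_amp_gain_db   # 2 dBm
# Minimum Tx/Rx isolation keeping the leaked Tx signal below that point
min_isolation_db = tx_max_dbm - ip1db_dbm       # 16 dB
print(ip1db_dbm, min_isolation_db)
```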

The second concern with transmit signal leakage is its out-of-band phase noise, which can enter the Rx band and degrade sensitivity. The Radio420X shows a typical phase noise of -140 dBm/Hz at a 20 MHz offset from a 2000 MHz carrier, measured at 0 dBm of output power. Assuming the phase noise remains constant at greater offsets, -122 dBm/Hz of transmitted noise spectral power density (for 18 dBm of output power) reaches the Rx band. The receiver sensitivity, -103 dBm, is measured within a 200 kHz bandwidth at a 5 dB signal-to-noise ratio (SNR). For the transmitter to degrade sensitivity by no more than 0.5 dB, the transmitter noise power must sit 9 dB below the noise floor, which corresponds to -117 dBm. The corresponding phase noise power at the maximum output of +18 dBm is -69 dBm (-122 dBm/Hz + 10·log10(200 kHz)), which is within the LTE specification of -50 dBm for maximum emission from the UE transmitter in its own receive band.

Finally, to reach the -117 dBm target, we need to isolate the antennas by 48 dB. This is easily achieved with an external low-cost ceramic duplexer. For a setup with separate Tx and Rx antennas, however, it requires a horizontal spacing of about 68 cm or a vertical spacing of about 41 cm [1]. Keep in mind that these requirements only have to be met when the transmitter is set to maximum output power, in order not to affect the receiver sensitivity.
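The phase-noise budget above can likewise be verified numerically (a sketch using only the figures from the text):

```python
import math

# Phase-noise leakage budget, using the figures quoted in the text
pn_at_0dbm = -140.0                  # dBm/Hz at 20 MHz offset, 0 dBm output
tx_power_dbm = 18.0                  # maximum output power
pn_dbm_hz = pn_at_0dbm + tx_power_dbm            # -122 dBm/Hz in the Rx band

bw_hz = 200e3                        # sensitivity measurement bandwidth
pn_power_dbm = pn_dbm_hz + 10 * math.log10(bw_hz)    # about -69 dBm

sensitivity_dbm = -103.0
snr_db = 5.0
noise_floor_dbm = sensitivity_dbm - snr_db       # -108 dBm
target_dbm = noise_floor_dbm - 9.0               # -117 dBm for <= 0.5 dB impact

isolation_db = pn_power_dbm - target_dbm         # about 48 dB antenna isolation
print(round(pn_power_dbm), round(isolation_db))
```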


The worked-out example shows that the main concern regarding transmit signal leakage, in typical conditions, is the transmitter phase noise. The out-of-band noise power will enter into the receiver band and affect the whole Rx path, degrading its sensitivity. This demonstrates how different specifications can critically interact with each other. In order to meet today’s wireless communication standards, transceivers such as the Nutaq Radio420X must have flawless performance for each parameter.


[1] International Telecommunication Union. Isolation between antennas of IMT base stations in the land mobile service.



Link Budget for LTE

16 Dec

Wireless Downloads/_lte/RF Design And Link Budget/




Filename Filesize Filetime Hits
 05_RA41205EN10GLA0 LTE_Link_Budget_v02.pdf 580.41 KB 8/26/2013 10:55:22 PM 48
 124638546-LTE-Radio-Link-Budgeting.pdf 218.94 KB 8/26/2013 11:05:54 PM 43
 129059391-05-LTE-Link-Budget-GC.pdf 1.50 MB 8/26/2013 11:04:38 PM 48
 129064541-06-Cell-Range-GC.pdf 489.22 KB 9/29/2013 8:09:38 PM 35
 135096675-LTE-Link-Budget.pdf 105.58 KB 8/26/2013 10:59:58 PM 39
 162152848-LTE-Link-Budget.xlsx 15.83 KB 8/26/2013 11:07:50 PM 34
 64481984-11-LTE-Radio-Network-Planning-Introduction.pdf 1.01 MB 7/13/2012 10:51:00 PM 81
 72800816-LTE-Link-Budget-Introduction-V1-0.pdf 705.12 KB 8/26/2013 11:00:36 PM 33
 82733625-LTE-Link-Budget1.pdf 196.91 KB 8/26/2013 11:05:00 PM 61
 EDX_WP_DesigninganLTEnetworkusingEDXSignalPro_Sept2010.pdf 1.83 MB 9/30/2013 3:45:28 PM 28
 finalisa-120904174438-phpapp01.pptx 4.71 MB 9/11/2013 9:19:58 AM 37
 ISwirelessSysModelingTool.pdf 995.92 KB 8/26/2013 10:51:26 PM 27
 LinkBudget.xlsx 12.39 KB 8/29/2013 3:48:36 AM 31
 LinkBudgetRFDesign.xlsx 12.38 KB 8/28/2013 6:10:06 AM 36
 LTE in Bullets – Uplink Link Budgets.pdf 248.44 KB 8/26/2013 10:28:14 PM 30
 lte_peak_rates.xlsx 12.15 KB 9/29/2013 10:27:18 PM 35
 lte-link-budget-tool-parameter-.pdf 283.36 KB 8/26/2013 10:29:54 PM 33
 lte-radio-link-bu.pdf 143.41 KB 9/3/2013 12:26:22 PM 32
 lteradionetworkplanningtutorialfromis-wireless-120530073459-phpapp02.pdf 588.45 KB 9/29/2013 8:07:30 PM 41
 pxc3881263_July.pdf 2.01 MB 9/30/2013 1:53:06 PM 26
 RF Planning and Optimization for LTE Networks.pdf 772.54 KB 7/18/2012 8:20:42 PM 35
 urn100056[1].pdf 1.14 MB 9/3/2013 11:56:42 AM 26
 WiMAX_LKB.xls 42.50 KB 9/30/2013 9:25:48 PM 26

Source: Link Budget for LTE

3G UMTS Originating Call Flows

8 Oct

Setting up a 3G UMTS originating voice call involves complex signaling to establish and release the call.

  • RRC (Radio Resource Control) signaling between the UE and RAN sets up the radio link.
  • RANAP (Radio Access Network Application Part) signaling sets up the session between the RAN and the Core Network (MSC).

Click on the image to see the full call flow. You can click on most RANAP messages in the call flow to see field-level details of those messages.

3G UMTS Originating Call with RRC and RANAP signaling

Click here for the 3G UMTS originating voice call flow 



Synchronization Signals in LTE Downlink

23 Sep

What are synchronization signals? Why do we need them? How and where are they transmitted in LTE? This article explains. As the name suggests, the synchronization signals allow a UE that is trying to enter the network to synchronize to the eNodeB, and allow an already-synchronized UE to maintain that synchronization. There are two synchronization signals in the LTE downlink, the Primary Synchronization Signal (PSS) and the Secondary Synchronization Signal (SSS); more details on each are given below.

Primary Synchronization Signal (PSS)

PSS Generation

The PSS is a Zadoff-Chu sequence of length 62, whose root index is chosen based on the NID2 value derived from the physical cell ID. There are three possible NID2 values (0, 1, 2), hence three corresponding root indices (25, 29, 34). The PSS occupies 72 subcarriers (6 resource blocks), of which only 62 carry valid PSS data; the remaining 10 subcarriers (5 on each side) are zero-padded.
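The generation rule above can be sketched with NumPy, following the Zadoff-Chu definition in 36.211 (a minimal sketch, not production code):

```python
import numpy as np

PSS_ROOTS = {0: 25, 1: 29, 2: 34}  # NID2 -> Zadoff-Chu root index u

def pss_sequence(nid2):
    """Length-62 frequency-domain PSS for a given NID2 (36.211, clause 6.11.1)."""
    u = PSS_ROOTS[nid2]
    n = np.arange(62)
    # Two halves of the length-63 Zadoff-Chu sequence with the middle element punctured
    return np.where(
        n < 31,
        np.exp(-1j * np.pi * u * n * (n + 1) / 63),
        np.exp(-1j * np.pi * u * (n + 1) * (n + 2) / 63),
    )
```

Every sample has unit magnitude, which is why the constellation plot mentioned below appears as a circle.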

PSS Resource Mapping

The PSS is always mapped to the central 72 subcarriers, so that the UE can decode it without knowing the system bandwidth. These central 72 subcarriers do not always align with resource block boundaries and can straddle half RBs. For example, in the 5 MHz case the central 6 RBs do not align exactly with the centre of the bandwidth, so the PSS is mapped as follows: the first 6 subcarriers to the second half of RB9, the next 60 subcarriers to RB10 through RB14, and the remaining 6 subcarriers to the first half of RB15. The PSS is always mapped to the last symbol of the first slot of subframes 0 and 5 in an FDD system, and to the third symbol of the first slot of subframes 1 and 6 in a TDD system.
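The central-subcarrier mapping can be sketched as follows, assuming a 512-point FFT (typical for a 5 MHz carrier); `map_pss_to_symbol` is an illustrative name, not a standard API:

```python
import numpy as np

def map_pss_to_symbol(d, nfft=512):
    """Map the 62 PSS samples onto the central subcarriers of one OFDM symbol.

    d    : length-62 frequency-domain PSS sequence
    nfft : FFT size (512 is typical for a 5 MHz LTE carrier)
    """
    grid = np.zeros(nfft, dtype=complex)
    # d[0..30] -> subcarriers -31..-1, d[31..61] -> subcarriers +1..+31;
    # DC and the 5 guard subcarriers on each side stay zero
    grid[-31:] = d[:31]
    grid[1:32] = d[31:]
    # Time-domain OFDM symbol (cyclic prefix omitted for brevity)
    return np.fft.ifft(grid)
```

Because the mapping is always around the carrier centre, this symbol looks the same to the UE regardless of the configured system bandwidth.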

Since the PSS is a Zadoff-Chu sequence, its constellation diagram appears as a circle (all points have unit magnitude).

Secondary Synchronization Signal (SSS)

SSS Generation

The SSS is an interleaved combination of two length-31 binary sequences, which are functions of NID1. There are 168 possible NID1 values, hence 168 different sequence pairs. The sequences also differ between subframe 0 and subframe 5; in fact, this is how the UE determines the subframe number within the radio frame. The binary sequences are additionally scrambled with a scrambling sequence that is a function of NID2, creating a coupling between the PSS and SSS.
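The NID1-dependent indices that select the two length-31 sequences follow a closed-form rule in 36.211; the index derivation can be sketched as below (a sketch of just this step, not a full SSS generator):

```python
def sss_m0_m1(nid1):
    """Derive the (m0, m1) sequence indices from NID1 (36.211, clause 6.11.2)."""
    q_prime = nid1 // 30
    q = (nid1 + q_prime * (q_prime + 1) // 2) // 30
    m_prime = nid1 + q * (q + 1) // 2
    m0 = m_prime % 31                       # cyclic shift of the first sequence
    m1 = (m0 + m_prime // 31 + 1) % 31      # cyclic shift of the second sequence
    return m0, m1
```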

SSS Mapping

The SSS is mapped similarly to the PSS in the frequency domain, occupying the central 72 subcarriers with 62 valid SSS subcarriers. However, the SSS is mapped to the second-to-last symbol of the first slot of subframes 0 and 5 in an FDD system, and to the last symbol of the second slot of subframes 0 and 5 in a TDD system. Since the SSS is a binary sequence, its constellation plot shows two points on the real axis.

The time-domain symbol locations of the PSS/SSS differ between FDD and TDD systems, which helps the UE identify whether it is dealing with an FDD or a TDD system.

Since the location of the PSS/SSS is fixed in the frequency domain, the UE can simply correlate at the expected band to find them. From the PSS/SSS the UE can acquire many parameters: the physical cell ID (from NID1 and NID2), the duplexing mode (from the time-domain location of the PSS/SSS), the subframe number (from the SSS sequence), and the slot boundary as well.

For a new UE, the PSS/SSS enables initial synchronization to the eNodeB, and for an idle UE within the coverage of an eNodeB, they help maintain synchronization. These synchronization signals therefore play a very important role in LTE.

