Archive | Signaling

802.11ac Adjacent Channel Interference (ACI)

4 Aug
I was reading this article on the development of 5G cellular technologies when this bit on OFDM deficiencies and the need for new waveforms to support higher capacities and user densities caught my attention (emphasis added by me):

“4G and 4G+ networks employ a type of waveform called orthogonal frequency division multiplexing (OFDM) as the fundamental element in the physical layer (PHY). In fact, almost all modern communication networks are built on OFDM because OFDM improved data rates and network reliability significantly by taking advantage of multi-path, a common artifact of wireless transmissions. However, as time and demands progress, OFDM technology suffers from out-of-band spectrum regrowth resulting in high side lobes that limit spectral efficiency. In other words, network operators cannot efficiently use their available spectrum because two users on adjacent channels would interfere with one another. OFDM also suffers from a high peak-to-average power ratio at the power amplifier, resulting in lower battery life for the mobile device. To address OFDM deficiencies, researchers are investigating alternative methods including generalized frequency division multiplexing, filter bank multi-carrier, and universal filter multi-carrier. Researchers speculate that using one of these approaches over OFDM may improve network capacity by 30 percent or more while improving the battery life of all mobile devices.”

This aligns with most Wi-Fi professionals’ recommendations to deploy 5 GHz radios on non-adjacent channels to avoid that dreaded adjacent channel interference (ACI). 
 
And if you look at an OFDM Wi-Fi transmit spectral mask, whether the limits defined in the standard or a capture from a spectrum analyzer, you will see rather significant side lobes that can impact adjacent channels (and even channels further away, depending on proximity and power levels). I have even considered including a discussion of OFDM spectral masks in my 802.11ac presentations and writings, because as channel widths get wider, so too do their side lobes: the frequency distance from the main carrier at which the relative power level must be reduced for compliance increases as well. Here is an illustration that I put together over a year ago but never published, keeping it in the appendix of my 11ac presentation. It shows how ACI can increase due to spectral mask differences as channel widths get larger. I have inlaid two 20 MHz spectral masks inside the 40 MHz mask, and two 40 MHz masks inside the 80 MHz mask. Essentially, the side lobe power reduction requirements scale with the size of the main signal lobe; as the main lobe gets larger, so too does the allowed power in the side lobes.

 
Spectral Mask Comparison of 20, 40, and 80 MHz Wi-Fi Channels
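If you want to reproduce this comparison yourself, here is a minimal sketch (Python with matplotlib; the corner points are the 0 dBr / -20 dBr / -28 dBr / -40 dBr breakpoints from the 802.11 spectral mask definitions as I recall them, so verify them against the current standard before relying on them):

    # Sketch: IEEE 802.11 transmit spectral mask corner points for
    # 20/40/80 MHz channels, drawn with straight lines between the
    # (offset in MHz, dBr) breakpoints and mirrored about the center.
    import matplotlib.pyplot as plt

    MASKS = {
        "20 MHz": [(0, 0), (9, 0), (11, -20), (20, -28), (30, -40)],
        "40 MHz": [(0, 0), (19, 0), (21, -20), (40, -28), (60, -40)],
        "80 MHz": [(0, 0), (39, 0), (41, -20), (80, -28), (120, -40)],
    }

    for label, pts in MASKS.items():
        xs = [-f for f, _ in reversed(pts)] + [f for f, _ in pts]
        ys = [d for _, d in reversed(pts)] + [d for _, d in pts]
        plt.plot(xs, ys, label=label)

    plt.xlabel("Offset from channel center (MHz)")
    plt.ylabel("Relative power (dBr)")
    plt.title("802.11 transmit spectral masks")
    plt.legend()
    plt.grid(True)
    plt.show()

Note how every breakpoint scales with channel width, which is exactly why an 80 MHz transmission is allowed to put relatively more energy into its neighbors than a 20 MHz one.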
And below is a capture from a spectrum analyzer placed approximately 10 feet away from an 802.11ac AP operating in 80 MHz mode under heavy traffic. Notice the high signal level in the adjacent channels (52-64), which would likely also impact the as-yet-unapproved U-NII 2B band.

Spectrum Analysis Capture of an 802.11ac 80 MHz Waveform
This is why you need a minimum of 10 feet of separation between radios operating in the same frequency band (unless other shielding mechanisms are used, which increase cost), and why adjacent 5 GHz radios should operate on non-adjacent channels. This will become a bigger issue as we deploy more 5 GHz radios to handle capacity and user density demands. More manufacturers are considering software-defined radios (SDRs) as well as multi-radio APs with more than one radio operating in the 5 GHz band. You should carefully research and verify (through real-world testing) these solutions to ensure that interference within the AP is not an issue.

As always, the better you understand what’s going on at the physical layer, the better wireless engineer and architect you will be.

 

Signal Analyzers

4 Aug

 

This manual provides documentation for the following analyzer: the N9020A MXA Signal Analyzer.

 

Download: N9020-90113

 

Notice: This document contains references to Agilent. Please note that Agilent’s Test and Measurement business has become Keysight Technologies. For more information, go to http://www.keysight.com.

 

Source: http://literature.cdn.keysight.com/litweb/pdf/N9020-90113.pdf?cmpid=zzfindmxa_specifications

Are signaling bottlenecks making LTE a victim of its own success?

5 May

LTE networks are being overwhelmed by data traffic, and with signaling in the core taking the brunt of unprecedented demand, it looks as though Diameter, the IETF protocol adopted by ETSI and 3GPP, may not, on its own, be up to the task of managing a more complex and burgeoning network. So what’s the alternative? Is there a newer and better solution out there?

Welcome to Part 2 of this extended TelecomTV article. Yesterday, in Part 1, we reported that subscribers, sometimes a few, frequently a lot, are complaining that they are not getting what they were led to expect when signing up to expensive and lengthy ‘4G’ service contracts. They say they are paying premium prices for what can be less-than-robust, underperforming network services, apps that fall over and drop out, and data (and even voice) calls that are often actually better over much cheaper 3G networks.

Robin Kent, Director of European Operations at Adax, a UK-headquartered company providing packet processing, security, and network infrastructure for all-IP networks, says the problem can be laid directly at the feet of the Diameter signaling standard and protocol.

He told me, “While LTE is delivering on its promise of providing users with a quicker, more data-centric mobile broadband service, the backbone of the network is not being properly supported. Diameter signaling is now the bottleneck in LTE performance, because the transport that Diameter runs over is insufficient”. Trenchant stuff.

Long Term Evolution, which is of course what the acronym LTE stands for, is exactly that – a standard, based on GSM/EDGE and UMTS/HSPA network technologies, that uses a different radio interface in concert with core network improvements to permit evolutionary progression towards true 4G.

Thus network operators do not have to scrap their expensively constructed old networks and take another huge capex hit, but instead can keep them and tweak them to provide 4G-like services and apps. That is one of the reasons LTE is increasingly popular, and why the network signaling that manages data sessions and enforces traffic policies is having to accommodate volumes and complexities that are drastically degrading performance and limiting growth. It’s an untenable situation.

Some in the industry compare the complexities of Diameter signaling to the problems exhibited by Signaling System 7 (SS7) when the first mobile networks were introduced. SS7 was the source of many difficulties early on but matured to become robust and reliable. The question is whether Diameter signaling can do the same.

For SS7’s early problems pale into comparative insignificance when measured against the 77 distinct Diameter signaling interfaces that are assigned to specific IMS and LTE network elements.

And then there’s SCTP (the Stream Control Transmission Protocol), which operates on top of IP at the transport and session layers rather than the network layer. It is important and is supported in the Linux kernel, but, Robin Kent contends, SCTP “just can’t cope with the specific signaling needs that the LTE network demands.

Larger volumes of data transfer and consumption mean that the strain on the network is being felt at various levels, and critical, evolved signaling and transport solutions are needed to match the demands that are being placed on the network”.

He adds, “The emergence of new technology usually breeds other new technologies, largely by way of support, but with recent network developments and the uptake of LTE, current signaling transport technology is no longer sufficient because of the data pressures that are being placed on the network. Transport protocols such as SCTP, in their existing form, are insufficient, and operators have to address the problem this will pose for LTE performance”.

What is needed, then, is a strong, high-performing transport layer, and Robin Kent is, understandably, quick to point out that his company’s product, Adax SCTP for Telecom (or SCTP/T), was designed specifically to meet the demands of LTE and IMS wireless networks for thousands of simultaneous connections. That said, Adax is of course not alone in seeking answers to the overwhelmed Diameter signaling problem, and there are other products from other manufacturers out there on the market.

And, regardless of the manufacturer, the solution to be applied must be able to do much the same things: to perform vigilant and almost instant in-service quality monitoring of the signaling link and be able to detect degradation of link quality at very, very short, programmable intervals.

That’s because the flat all-IP architecture of the LTE network requires a very large number of signaling connections as well as signaling concentration for efficient routing. Diameter signaling is the choke-point in LTE signaling performance, and while the Linux-supplied SCTP for Diameter is a convenient, apparently economical, and freely available solution, it is simply not up to a task it was not designed to perform.
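For the curious, the kernel SCTP support mentioned above is usable directly from userspace. Here is a minimal sketch (Python on a Linux host with SCTP support; the host and payload are placeholders, and this is the plain kernel stack, not Adax’s SCTP/T) of opening a one-to-one SCTP association to the registered Diameter port:

    # Sketch: a one-to-one style SCTP connection using the Linux
    # kernel SCTP stack via Python's standard socket module.
    import socket

    HOST, PORT = "127.0.0.1", 3868  # 3868 is the registered Diameter port

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM, socket.IPPROTO_SCTP)
    sock.connect((HOST, PORT))
    sock.sendall(b"...")  # a real Diameter peer would send a CER message here
    sock.close()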

For LTE subscribers the technology is all about the perceived quality of experience they receive, and for the network operators it is about maximising that experience through pro-active, immediate and flexible management of network capacity, including the capability to conduct near-instantaneous data analysis, packet processing and DPI, and to free up host CPU resources to remove bottlenecks.

Diameter signaling cannot do this unaided, and LTE congestion, service attenuation and loss, and consumer frustration will inevitably increase unless something is done to address the matter now.

The philosopher and transcendentalist Henry David Thoreau, musing in his cabin on Walden Pond in Massachusetts in the mid-19th century, wrote, “We must heap up a great pile of doing, for a small diameter of being”. Now, that’s what I call foresight.

Source: http://telecomtv.com/comspace_newsDetail.aspx?n=50933&id=e9381817-0593-417a-8639-c4c53e2a2a10

LTE speeds dive in the US as users crank up the video

3 Mar

It’s not how fast your network is; it’s probably how many people are on it and what they’re doing that counts, as new figures show a dramatic fall-off in average LTE speeds in the US. By I.D. Scales.

 

Late last week OpenSignal (http://opensignal.com/) reported a dramatic decline in LTE speeds, as experienced by users, across all the LTE networks in the US. This news hasn’t exactly been top of the pile at Mobile World Congress for obvious reasons, and because I’ve mentioned it to a few people by way of opening a conversation and they hadn’t heard (especially WiFi people), I thought I’d retail the story now even though it’s gone a bit whiskery.

OpenSignal is a crowdsourced network metrics operation. It claims 6 million people have downloaded its app, which gathers performance stats from the handset’s point of view (throughput, geolocation, time of day/week/year, jitter, etc.) and sends the data back to OpenSignal, which then compiles meaningful statistics about comparative network performance, in this case for LTE.

We all know that any method of measuring network performance will draw fire from one quarter or another, usually because the results don’t support a particular world view. OpenSignal points out that the usual measuring method might involve testing the actual network by driving a test car about to record congestion, signal strength and so on.

 

So there was considerable surprise when OpenSignal’s latest figures summary (released in the run-up to Mobile World Congress) gave the US a disappointing LTE speed average when compared to other territories. Far from ‘leading the world’ in mobile broadband, as we’re often told, according to these figures the US is in reality a laggard in user experience. It came 15th out of 16 countries surveyed, with the average data rate dropping to 6.5 Mbit/s from 9.6 Mbit/s. This is LTE we’re talking about.

The reason for the drop, of course, is almost certainly that user numbers and video consumption are going up fast as users learn to flex the new speeds but end up Easter Islanding the network.

Verizon and Sprint were lowest with 7.6 Mbit/s and 4.2 Mbit/s respectively. T-Mobile was best at 11.2 Mbit/s and AT&T next at 8.9 Mbit/s. All these speeds were well below the other countries’ averages.

Leaders were the familiar line-up of South Korea, Hong Kong and Sweden.

Source: http://telecomtv.com/comspace_newsDetail.aspx?n=50810&id=e9381817-0593-417a-8639-c4c53e2a2a10&utm_campaign=Linkedin280214LTEspeedsDive&utm_medium=

LTE network speeds, according to the latest OpenSignal report

24 Feb

The United States trails 13 countries when it comes to LTE network speeds, according to the latest OpenSignal report. The report found that average LTE network speeds in this country have declined 32% this year. Australia posted the fastest LTE speeds, with an average download speed of 24.5 megabits per second. Other countries with faster LTE speeds than the 6.5 Mbps posted by the United States were (in order) Italy, Brazil, Hong Kong, Denmark, Canada, Sweden, South Korea, the United Kingdom, France, Germany, Mexico, Russia and Japan.

The United States suffered the biggest decline in network speeds of any country, as operators struggled to keep pace with increasing data downloads. Last year the U.S. ranked 8th in the OpenSignal study, with an average LTE network download speed of 9.6 Mbps.

Many of the nations with faster speeds than the United States do not have as much LTE coverage. Verizon Wireless and AT&T Mobility, which together have roughly 200 million subscribers, are both nearing completion of their LTE roll outs with more than 300 million potential customers covered. Sprint and T-Mobile US both have substantial footprints as well, having recently surpassed 200 million pops covered. “The [United States] performs well on our coverage metric, with the average user experiencing LTE coverage 67% of the time, with Australia, the fastest country, on 58%,” OpenSignal said in a press release.

When it comes to domestic network speeds, T-Mobile US had the best performance among the carriers. The carrier posted average download speeds of 11.21 Mbps, with AT&T Mobility No. 2 at 8.9 Mbps. Verizon Wireless clocked in at 7.8 Mbps and Sprint’s average download speed was 4.2 Mbps. Sprint currently has the least amount of spectrum dedicated to its network at just 10 megahertz in most markets, while the others provide at least double that amount.

The State of LTE

Network operators around the world are working hard to convince their users to make the jump to LTE. The term “4G” acts as a convenient label for marketers to emphasise the superiority of this new standard over its predecessors, but just how standard or consistent is the experience of users on LTE?

The OpenSignal app allows users to contribute to our impartial coverage maps of mobile networks. We took data from those of our 6 million users who have LTE and focussed on their experience of two key metrics: download speed and the proportion of time spent with LTE access. All data included in this report comes from the second half of 2013.

We found that not all LTE networks are created equal; indeed, there is an extremely broad range of experience across both metrics. Only about a quarter of the networks surveyed achieve both good coverage and fast speeds; clearly much work remains before LTE lives up to its full potential.

Transmit Signal Leakage in LTE Frequency-Division Duplexing Applications

13 Feb

In today’s high-speed wireless communication standards like LTE, the performance of both base transceiver stations (BTS) and user equipment (UE) transceivers is crucial. LTE supports time-division duplexing (TDD) as well as frequency-division duplexing (FDD). In this post, we look at transmit signal leakage problems that can occur in FDD applications. To do so, we show a numerical analysis using specifications from the Nutaq Radio420X and the standard LTE performance requirements.

Frequency-division duplexing and isolation

FDD implies that the transmitter and receiver operate at different carrier frequencies, allowing for constant and simultaneous transmission and reception. In full-duplex FDD mode, the transmitter signal leakage must be taken into account (this does not apply to TDD or half-duplex modes). The receiver is constantly exposed to this transmit signal leakage and its sensitivity can drop drastically if improper isolation is used. Most of the isolation is obtained with a good PCB layout and shielding, but one will always have to use effective filters/duplexers in order to achieve optimal isolation.

The Radio420X’s receiver has a software-selectable band-pass filter bank. Its filters typically have 40 dB of rejection on either side of the bandwidth. Figure 1 shows a simplified block diagram of the Radio420X transceiver section.

Figure 1 – Simplified Radio420X transceiver block diagram

Transmit signal leakage

Clearly, the fundamental components of the transmit signal can interfere with the received signal, but this is not the only concern. The transmit signal will also generate out-of-band phase noise that falls within the receiver band. This unwanted power affects the receiver sensitivity by raising its noise floor, as shown in Figure 2.

Figure 2 – Out-of-band phase noise effects on sensitivity

Example calculations

Let’s look at a numerical example using LTE Band 1. It operates within the following frequencies:

  • Uplink (UE transmit): 1920 – 1980 MHz
  • Downlink (UE receive): 2110 – 2170 MHz

Assume that we want to operate in full-duplex FDD using carrier frequencies 1920 and 2110 MHz for a UE transceiver. The Radio420X’s specifications will be used in the following calculations.

First, we determine how much power will leak into the Rx path when operating at the maximum output power. We know that the fundamental Tx component will be attenuated by 40 dB when it reaches the band-pass filter. However, the first variable amplifier in the Rx chain sits before the filter and is set to a maximum gain of +18 dB for best sensitivity. Its OP1dB is 20 dBm, so any input signal greater than 2 dBm will saturate this amplifier and block the whole receive chain. Thus, we need a minimum of 16 dB of Tx/Rx isolation to avoid this situation. Knowing that PCB trace isolation is better than 55 dB, the only worry is antenna isolation (the Radio420X uses two antennas instead of a duplexer). At 1960 MHz, 30 dB of antenna isolation is achieved with a horizontal separation of 12 cm (for a -5 dB gain in the direction of the other antenna) or a vertical separation of 17 cm [1], which is easily realized.

The second concern with transmit signal leakage is its out-of-band phase noise. This power can enter the Rx band and degrade its sensitivity. The Radio420X shows typical phase noise of -140 dBm/Hz at a 20 MHz offset from a 2000 MHz carrier, measured at 0 dBm of output power. Assuming the phase noise remains constant at greater offsets, -122 dBm/Hz of transmitted noise spectral density (for 18 dBm of output power) reaches the Rx band. The receiver sensitivity, -103 dBm, is measured within a 200 kHz bandwidth with a 5 dB signal-to-noise ratio (SNR). To allow the transmitter to degrade sensitivity by no more than 0.5 dB, the transmitter noise power must be 9 dB below the noise floor, which corresponds to -117 dBm. The corresponding phase noise power for a maximum output of +18 dBm is -69 dBm (-122 dBm/Hz + 10log(200 kHz)), which is within the LTE specification of -50 dBm for maximum emission from the UE transmitter in its own receive band.

Finally, to reach the -117 dBm target, we need to isolate the antennas by 48 dB. This can be achieved easily with an external low-cost ceramic duplexer. For a setup with separate Tx and Rx antennas, however, it requires a horizontal or vertical spacing of about 68 cm or 41 cm respectively [1]. Keep in mind that these requirements only have to be met when the transmitter is set to maximum output power, in order not to affect the receiver sensitivity.
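To make the arithmetic above easier to check, here is a short sketch (Python; every figure is taken from the Radio420X numbers quoted in this post) that reproduces both isolation budgets:

    # Sketch: the two Tx-leakage isolation budgets worked out above.
    import math

    def db10(x):
        return 10 * math.log10(x)

    # 1) Front-end saturation
    tx_power_dbm = 18.0                           # max Tx output power
    rx_gain_db = 18.0                             # first Rx amplifier gain
    rx_op1db_dbm = 20.0                           # its output 1 dB compression point
    rx_ip1db_dbm = rx_op1db_dbm - rx_gain_db      # input P1dB = 2 dBm
    iso_saturation = tx_power_dbm - rx_ip1db_dbm  # 16 dB

    # 2) Tx phase noise falling into the Rx band
    pn_dbm_hz = -140.0 + tx_power_dbm             # -122 dBm/Hz at 18 dBm output
    pn_power_dbm = pn_dbm_hz + db10(200e3)        # ~ -69 dBm in 200 kHz
    noise_floor_dbm = -103.0 - 5.0                # sensitivity minus 5 dB SNR
    target_dbm = noise_floor_dbm - 9.0            # -117 dBm for a 0.5 dB hit
    iso_phase_noise = pn_power_dbm - target_dbm   # ~ 48 dB

    print(f"Isolation to avoid saturation: {iso_saturation:.0f} dB")
    print(f"Isolation for phase noise:     {iso_phase_noise:.0f} dB")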

Conclusion

The worked example shows that the main concern regarding transmit signal leakage, in typical conditions, is the transmitter phase noise. This out-of-band noise power enters the receiver band and affects the whole Rx path, degrading its sensitivity. It demonstrates how different specifications can interact critically with each other. In order to meet today’s wireless communication standards, transceivers such as the Nutaq Radio420X must perform flawlessly on every parameter.

References

[1] International Telecommunication Union, “Isolation between antennas of IMT base stations in the land mobile service.” http://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-M.2244-2011-PDF-E.pdf

 

Source: http://nutaq.com/en/blog/transmit-signal-leakage-lte-frequency-division-duplexing-applications

Link Budget for LTE

16 Dec

Wireless Downloads/_lte/RF Design And Link Budget/

Filename Filesize Filetime Hits
 05_RA41205EN10GLA0 LTE_Link_Budget_v02.pdf 580.41 KB 8/26/2013 10:55:22 PM 48
 124638546-LTE-Radio-Link-Budgeting.pdf 218.94 KB 8/26/2013 11:05:54 PM 43
 129059391-05-LTE-Link-Budget-GC.pdf 1.50 MB 8/26/2013 11:04:38 PM 48
 129064541-06-Cell-Range-GC.pdf 489.22 KB 9/29/2013 8:09:38 PM 35
 135096675-LTE-Link-Budget.pdf 105.58 KB 8/26/2013 10:59:58 PM 39
 162152848-LTE-Link-Budget.xlsx 15.83 KB 8/26/2013 11:07:50 PM 34
 64481984-11-LTE-Radio-Network-Planning-Introduction.pdf 1.01 MB 7/13/2012 10:51:00 PM 81
 72800816-LTE-Link-Budget-Introduction-V1-0.pdf 705.12 KB 8/26/2013 11:00:36 PM 33
 82733625-LTE-Link-Budget1.pdf 196.91 KB 8/26/2013 11:05:00 PM 61
 EDX_WP_DesigninganLTEnetworkusingEDXSignalPro_Sept2010.pdf 1.83 MB 9/30/2013 3:45:28 PM 28
 finalisa-120904174438-phpapp01.pptx 4.71 MB 9/11/2013 9:19:58 AM 37
 ISwirelessSysModelingTool.pdf 995.92 KB 8/26/2013 10:51:26 PM 27
 LinkBudget.xlsx 12.39 KB 8/29/2013 3:48:36 AM 31
 LinkBudgetRFDesign.xlsx 12.38 KB 8/28/2013 6:10:06 AM 36
 LTE in Bullets – Uplink Link Budgets.pdf 248.44 KB 8/26/2013 10:28:14 PM 30
 lte_peak_rates.xlsx 12.15 KB 9/29/2013 10:27:18 PM 35
 lte-link-budget-tool-parameter-.pdf 283.36 KB 8/26/2013 10:29:54 PM 33
 lte-radio-link-bu.pdf 143.41 KB 9/3/2013 12:26:22 PM 32
 lteradionetworkplanningtutorialfromis-wireless-120530073459-phpapp02.pdf 588.45 KB 9/29/2013 8:07:30 PM 41
 pxc3881263_July.pdf 2.01 MB 9/30/2013 1:53:06 PM 26
 RF Planning and Optimization for LTE Networks.pdf 772.54 KB 7/18/2012 8:20:42 PM 35
 urn100056[1].pdf 1.14 MB 9/3/2013 11:56:42 AM 26
 WiMAX_LKB.xls 42.50 KB 9/30/2013 9:25:48 PM 26

Source: Link Budget for LTE

3G UMTS Originating Call Flows

8 Oct

3G UMTS originating voice call setup involves complex signaling to set up and release the call.

  • RRC (Radio Resource Control) signaling between the UE and RAN sets up the radio link.
  • RANAP (Radio Access Network Application Part) signaling sets up the session between the RAN and the Core Network (MSC).

Click on the image to see the full call flow. You can click on most RANAP messages in the call flow to see complete field-level details of those messages.

3G UMTS Originating Call with RRC and RANAP signaling

Click here for the 3G UMTS originating voice call flow 

 

Source: http://blog.eventhelix.com/2013/10/07/3g-umts-originating-call-flow/

Synchronization Signals in LTE Downlink

23 Sep

What are synchronization signals? Why do we need them? How, and where, are they transmitted in LTE? This article should explain all of that. Basically, as the name suggests, the synchronization signals are needed by a UE that is trying to enter the network so it can synchronize to the eNodeB, and by a UE that has already synchronized so it can stay synchronized. There are two synchronization signals in the LTE downlink, the Primary Synchronization Signal (PSS) and the Secondary Synchronization Signal (SSS); below you will find more details about each.

Primary Synchronization Signal (PSS)

PSS Generation

The PSS is a Zadoff-Chu sequence of length 62 whose root index is chosen based on the NID2 value, which is derived from the physical cell ID. There are three possible NID2 values (0, 1, 2), and hence three corresponding root indexes (25, 29, 34). The PSS occupies 72 subcarriers, or 6 resource blocks, of which only 62 subcarriers carry valid PSS data; the remaining 10 subcarriers (5 on each side) are zero-padded.
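As a concrete illustration, here is a minimal sketch (Python with NumPy) of the length-62 Zadoff-Chu generation just described, using the root indexes given above and the d_u(n) formula from 3GPP TS 36.211:

    # Sketch: 36.211 PSS generation - a length-62 Zadoff-Chu sequence
    # whose root u is selected by NID2.
    import numpy as np

    ROOTS = {0: 25, 1: 29, 2: 34}  # NID2 -> Zadoff-Chu root index u

    def pss_sequence(nid2):
        u = ROOTS[nid2]
        n = np.arange(62)
        return np.where(n <= 30,
                        np.exp(-1j * np.pi * u * n * (n + 1) / 63),
                        np.exp(-1j * np.pi * u * (n + 1) * (n + 2) / 63))

    d = pss_sequence(0)
    print(d.shape, np.allclose(np.abs(d), 1.0))  # (62,) True: unit magnitude

The constant unit magnitude is what produces the circular constellation mentioned below.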

PSS Resource Mapping

The PSS is always mapped to the central 72 subcarriers; this allows the UE to decode the PSS without knowing the system bandwidth. The central 72 subcarriers may not align with resource block boundaries and can straddle half RBs. For example, in the 5 MHz case the central 6 RBs do not exactly align with the center of the bandwidth, so the PSS is mapped as follows: the first 6 subcarriers into the second half of RB9, the next 60 subcarriers into RB10 through RB14, and the remaining 6 subcarriers into the first half of RB15. In the time domain, the PSS is mapped to the last symbol of the first slot of subframes 0 and 5 in an FDD system, and to the third symbol of the first slot of subframes 1 and 6 in a TDD system.
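A quick way to verify that RB arithmetic is to compute where the central 72 subcarriers land for a given bandwidth. A small sketch (Python, assuming the usual 12 subcarriers per resource block):

    # Sketch: locate the first and last of the central 72 subcarriers.
    def pss_span(n_rb):
        total = 12 * n_rb              # subcarriers across the bandwidth
        start = (total - 72) // 2      # first PSS subcarrier index
        for k in (start, start + 71):
            rb, sc = divmod(k, 12)
            print(f"subcarrier {k}: RB {rb}, position {sc}")

    pss_span(25)  # 5 MHz:  RB 9 pos 6 ... RB 15 pos 5 (straddles half RBs)
    pss_span(50)  # 10 MHz: RB 22 pos 0 ... RB 27 pos 11 (RB-aligned)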

Since the PSS is a Zadoff-Chu sequence, when plotted as a constellation diagram it should appear as a circle.

Secondary Synchronization Signal (SSS)

SSS Generation

The SSS is a combination of two length-31 binary sequences, which are functions of NID1; there are 168 possible NID1 values, and hence 168 different sequence combinations. The sequences also differ between subframe 0 and subframe 5; in fact, this is how the UE determines the subframe number within the radio frame. These binary sequences are additionally scrambled with a scrambling sequence that is a function of NID2, creating a coupling between the PSS and SSS.

SSS Mapping

The SSS is mapped similarly to the PSS in the frequency domain, occupying the central 72 subcarriers with 62 valid SSS subcarriers. In the time domain, however, the SSS is mapped to the second-to-last symbol of the first slot of subframes 0 and 5 in an FDD system, and to the last symbol of the second slot of subframes 0 and 5 in a TDD system. Since the SSS is a binary sequence, when plotted it should appear as two points on the real axis.

The symbol location of the PSS/SSS in the time domain differs between FDD and TDD systems, and this difference helps the UE identify whether it is attached to an FDD or a TDD system.

Since the location of the PSS/SSS is fixed in the frequency domain, the UE can simply correlate at the expected band to detect them. From these signals the UE can acquire many parameters: the physical cell ID (from NID1 and NID2), the duplexing mode (from the time-domain location of the PSS/SSS), the subframe number (from the SSS sequence), and the slot boundary as well.

For a new UE, the PSS/SSS helps it synchronize to the eNodeB, and for an idle UE within the service area of an eNodeB, the PSS/SSS helps it maintain synchronization. Hence these synchronization signals play a very important role in LTE.

Source: http://ltebasics.wordpress.com/2013/09/23/synchronization-signals-in-lte-downlink/

Reduction of Power Envelope Fluctuations in OFDM Signals

16 Sep

Orthogonal frequency-division multiplexing (OFDM) is one of the most widely used modulation techniques in telecommunication technologies. Its strong orthogonality properties make it resilient to inter-symbol interference. Nevertheless, OFDM signals suffer from a major problem known as a high peak-to-average power ratio (PAPR), which causes distortion in high-power amplifiers (HPA). Several studies have been conducted on reducing PAPR [1,2,3]. Active constellation extension (ACE-AGP) is one method that has achieved good results [4]. On the other hand, artificial intelligence based on neural networks (NN) has become very useful in approximating phenomena whose evolution is known.
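To make the PAPR metric concrete before describing our reduction scheme, here is a minimal sketch (Python with NumPy; the QPSK mapping and 4x oversampling are illustrative choices, not taken from [5]) that measures the PAPR of one random OFDM symbol:

    # Sketch: PAPR of a random OFDM symbol with N QPSK subcarriers,
    # oversampled 4x by zero-padding the middle of the spectrum.
    import numpy as np

    rng = np.random.default_rng(0)
    N, L = 1024, 4

    bits = rng.integers(0, 2, size=(N, 2))
    X = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

    Xp = np.concatenate([X[:N // 2], np.zeros((L - 1) * N), X[N // 2:]])
    x = np.fft.ifft(Xp)

    papr_db = 10 * np.log10(np.max(np.abs(x)**2) / np.mean(np.abs(x)**2))
    print(f"PAPR = {papr_db:.1f} dB")  # typically around 11 dB for N = 1024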

This allowed us to use neural networks to build a PAPR-reduction system for OFDM signals (a temporal architecture and a temporal-frequency architecture), relying on the ACE-AGP algorithm [5]. The inputs and outputs of that algorithm are used as the training data for the neural network.

Our first proposal for reducing OFDM signal envelope fluctuations operates in the time domain. We train our ANN using the signals with low envelope fluctuations obtained from the ACE-AGP algorithm. This way, the ANN learns the characteristics of a signal with low envelope fluctuations.

Unfortunately, the main problem with this time-domain training scheme is that the neural network is not able to learn which regions of the constellation are allowed and which are forbidden. Thus, a second neural network, working in the frequency domain, is concatenated to the time-domain scheme.

The performance of the developed system, in terms of PAPR reduction, constellations and BER, is significant, and its complexity is negligible in comparison with other methods. The results are presented in the following figures, which also show a comparison of the actual time complexity between the two approaches [5].

Results:

[Figures: cubic metric and BER comparisons (N = 1024) for the 1st case, QPSK, and the 2nd case, 16-QAM.]


Our development and prototyping platform

Software-defined radio (SDR) is becoming the standard rapid prototyping platform for scientists, researchers, and R&D engineers to use when performing proof-of-concepts for new algorithms. As such, we implemented our solution (ACE-AGP and neural networks) on a Nutaq SDR platform.

SDR platforms typically consist of an FPGA board on which the developed algorithm can be programmed, tested, and validated. Good software tools can drastically reduce the development time. The Nutaq SDR platform provided us with the ability to choose between two approaches when programming: VHDL coding or model-based design tools for automatic code generation. IP cores that manage the on-board peripherals like storage and communication links are provided for both approaches.

The Nutaq BSDK provides a VHDL development environment, while the Nutaq MBDK, fully integrated with Simulink and Xilinx System Generator, forms a model-based development environment. Together they allowed us to rapidly prototype the algorithm and target the hardware. Moreover, the Nutaq SDR platform is equipped with configurable state-of-the-art RF transceivers that up/down-convert baseband signals to/from the RF domain for transmission and reception. The developed algorithm could then be tested and its performance benchmarked in a real-world environment (rather than in a simulated environment with test data). The result is a tested and validated ACE-AGP algorithm in the form of an FPGA design and a synthesized netlist. It can then be reused on other FPGA hardware or even serve in the design of an integrated circuit (IC).

Author: Dr. Younes Jabrane, École Nationale des Sciences Appliquées (ENSA) de Marrakech

References:

[1] M. Breiling, S. H. Muller-Weinfurtner, and J. B. Huber, “SLM peak-power reduction without explicit side information,” IEEE Commun. Lett., vol. 5, no. 6, pp. 239–241, June 2001.

[2] J.-C. Chen, “Partial transmit sequences for peak-to-average power ratio reduction of OFDM signals with the cross-entropy method,” IEEE Signal Process. Lett., vol. 16, no. 6, pp. 545–548, June 2009.

[3] S. H. Han and J. H. Lee, “An overview of peak-to-average power ratio reduction techniques for multicarrier transmission,” IEEE Wireless Commun., vol. 12, no. 2, pp. 56-65, Apr. 2005.

[4] B. S. Krongold and D. L. Jones, “PAR reduction in OFDM via active constellation extension,” IEEE Trans. Broadcast., vol. 49, no. 3, pp. 258–268, Sep. 2003.

[5] Y. Jabrane, V. P. G. Jiménez, A. G. Armada, B. A. E. Said, and A. A. Ouahman, “Reduction of power envelope fluctuations in OFDM signals by using neural networks,” IEEE Commun. Lett., vol. 14, no. 7, pp. 599–601, Jul. 2010.

 

Source: http://blog.nutaq.com/blog/reduction-power-envelope-fluctuations-ofdm-signals
