
Security at the Speed of VoLTE

4 Apr

Infonetics white paper – The mobile broadband industry’s rapid migration to LTE has opened the door to malicious and non-malicious threats as a result of fundamental vulnerabilities in the all-IP LTE architecture. Consequently, security must be a foundational element of LTE network deployments. As the adoption of IPsec encryption for transporting LTE traffic continues to grow significantly, there is an increasing need for a security gateway. This white paper discusses the evolving threat landscape, the economics and performance requirements of security solutions, and the role of the security gateway in LTE networks as subscribers ramp up by the millions every quarter and VoLTE services start to emerge.


Source: http://www.slideshare.net/MaryMcEvoyCarroll/2014-infoneticswhitepapersecurityatthespeedofvo-lte


The 600 MHz Incentive Auction in US – What we know so far

4 Apr

Source: Gunjan –  http://wirelesstelecom.wordpress.com/2014/03/31/the-600-mhz-incentive-auction-in-us-what-we-know-so-far/

A major, high-stakes wireless industry event generating a lot of interest in the US nowadays is the 600 MHz incentive auction for broadcast spectrum, scheduled to take place in 2015. This auction assumes special significance since it would perhaps be the last set of airwaves under 1 GHz sold in America through a primary auction. Given the exploding demand for data on mobile devices and the superior propagation characteristics of wireless signals in this band, the four major US carriers (Verizon, AT&T, Sprint and T-Mobile) and many smaller regional service providers have exhibited deep interest in this spectrum. The proposition is considered a first of its kind in the world. To put it simply, broadcasters will voluntarily sell their spectrum to the US regulator, the FCC, through a reverse auction; subsequently, mobile operators will buy those airwaves through traditional bidding. But the reality is more complex than that, and this article attempts to address the related complexities.

The FCC first floated the idea of utilizing broadcast TV airwaves for mobile broadband access in the National Broadband Plan of 2010. Two years later, the US Congress authorized the Commission to conduct the incentive auction of the broadcast television spectrum. In the fall of 2012, the FCC issued a Notice of Proposed Rulemaking (NPRM) to officially kick off the process of developing rules and guidelines for the 600 MHz auction. The following diagrams illustrate the concept of this auction in terms of how the television broadcast spectrum looks currently and one of the several proposals for what it could look like after the auction is complete.

Pre and post auction 600 MHz band plan

The values of X and Y shown above are variable and obviously depend on the amount of spectrum that broadcasters are willing to sell. One of the ideas floated by the FCC is a provision to accommodate different amounts of relinquished TV spectrum in different markets: the downlink spectrum would be a fixed band nationwide, while the uplink band may vary by market. The FCC hopes that 120 MHz of total spectrum can be made available through this process, although the actual figure will be less and will be determined by the willingness of TV station owners to give up their usage rights.

The first stage of the incentive auction would be the reverse auction. Active over-the-air TV licensees holding 6 MHz of spectrum in various areas of the US will be eligible to participate. To ensure maximum participation, the NPRM gives such licensees three options: give up their Ultra High Frequency (UHF) channel and relocate to a channel in the Very High Frequency (VHF) range; give up their channel and share a broadcast channel with another licensee post-auction; or simply sell all rights to the channel and go off the air. In every case, the selling broadcaster could potentially earn tens, or in some cases hundreds, of millions of dollars in exchange for the spectrum rights in a region.

The second stage would be repacking those broadcast channels that did not participate in the auction and will remain on the air after the whole process ends. This ensures that such stations occupy one end of the spectrum, resulting in contiguous blocks that can be sold to wireless network operators. During the rebanding, channels would be reassigned, not geographically relocated, so there would be no negative impact on the coverage area or served population of a TV station.
The final piece of the puzzle would be the forward auction, the process generally followed to sell airwaves to the telcos. However, the regulator may take a new approach here, since different areas might open up different amounts of spectrum; selling spectrum in blocks and keeping the uplink spectrum flexible are two such approaches. The pricing of airwaves in a particular region would depend on the success of the reverse auction in that region. Another important aspect of the 600 MHz incentive auction is the integration of the reverse and forward auctions, which could run either sequentially or concurrently. The sequential path would show bidders the supply established by the reverse auction, but sellers would be unable to determine the right price, since they would not know the demand during the forward bidding. The concurrent path would reveal the supply-demand balance, but how would repacking fit into that plan?

It is quite obvious that many questions need to be answered before a date can be set for this auction. The biggest is whether broadcasters will volunteer to relinquish their spectrum rights. The National Association of Broadcasters (NAB) has shown cautious interest in the auction; its members are unsure how much money can be earned by either exiting the business or moving to a shared channel. In comments filed with the Commission last summer, the NAB also expressed deep concern over the co-existence of broadcast and mobile carriers on co-channels and adjacent channels in neighboring markets, and endorsed a nationwide standard band plan rather than an area-dependent approach that maximizes spectrum recovery. The NAB is definitely looking for more transparency in the rules.

The FCC itself is still not sure the auction will succeed. The procedure can go belly up right at the start if participation from the broadcasters is low; remember, if the target is to free up 120 MHz of frequencies, 20 stations will be required to exit the spectrum. Repacking presents another conundrum. Any move to reconfigure the TV stations would be complex and dependent on multiple factors, and apart from the time and cost of repacking, interference protection on the new channel would be a major concern.

There is no dearth of controversy on the wireless operator side either. Bigger carriers like AT&T and Verizon want an open and simple bidding mechanism with no spectrum caps, while Sprint, T-Mobile and other smaller rural carriers want an upper limit on the amount of airwaves a bidder can buy. They claim that AT&T and Verizon already control more than three-fourths of commercial wireless spectrum below 1 GHz, so such a limit would protect consumer interests and encourage competition. Nonetheless, restrictions on spectrum bidding would reduce the government’s revenue.
There are also divergent views on the channel block size and the size of economic areas (EAs), although the spectrum is likely to be auctioned off in 5 MHz blocks. Appropriate utilization of the guard band frequencies is one more contentious topic: Internet companies like Google and Microsoft want unlicensed operations in that band, while the mobile telcos support only limited unlicensed spectrum. Given so many unresolved problems, the FCC has delayed the 600 MHz incentive auction until the middle of 2015.

The regulator clearly needs to address the issues of all stakeholders, but to be fair, this is an unprecedented situation and it is important to get it right even if that requires more time and discussion. The original plan was to have the order for this unique auction out by this spring, but that looks improbable now. The regulator must assuage the concerns of broadcasters in the order. Rules and guidelines must be transparent, with a well-defined structure. The barriers to entry must be low, and TV spectrum owners should be educated about the approximate amount of money they can expect in exchange for their 6 MHz of spectrum. They must also be encouraged to explore the channel-sharing option; a recent pilot project conducted by two stations in Los Angeles concluded that sharing the same broadcast spectrum is technically feasible. As part of the order, the repacking methodology must be clearly laid out, with specific timelines and costs, and technical parameters like interference protection should also be outlined. Broadcasters must be assured that repacking will not affect their services in any manner, and to further convince them, they should be allowed to test the repacking model. Station owners should also be made to understand that, since less than 10% of US households rely entirely on over-the-air television, the spectrum they hold can be utilized more efficiently if allocated to wireless data services. The other key policy challenge is on the forward auction side: there are valid arguments both for and against restricting the amount of spectrum a bidder can buy, so a balance has to be struck to ensure maximum participation and a level playing field. Lastly, the software and systems have to be extensively tested before commencing this complicated process.

There is clearly a long road to travel before these auctions can be held. There have been some positive developments, like the formation of the Expanding Opportunities for Broadcasters Coalition (EOBC), which represents broadcasters that are interested in these auctions and want to be part of the rulemaking process in order to make this endeavor a success. But much more needs to happen. A well-designed, competitive sale process encompassing all three stages is what the industry needs; if executed well, it can bring rich benefits to consumers, promote competition and boost the economy. A successful auction would also influence other nations to follow suit. There is only one shot at getting this right. The FCC seems to be working hard at it and the basic idea sounds good, so let us hope for a result that is in the best interests of all the stakeholders.


Wi-Fi Offload Predictions

4 Apr

Which operators in the US are rolling out Wi-Fi?  It’s actually the cable TV operators rather than the mobile operators.  With all the commotion (news headlines, whitepapers, conference sessions, etc.) around Wi-Fi offload, why are the mobile operators not quickly rolling out Wi-Fi to manage network congestion and leverage free/unlicensed spectrum?  Why are the cable operators adopting the “Community Wi-Fi” concept instead of the mobile operators?  While there are several rather complex answers to these questions, I believe there are a few key reasons:

  • Mobile operators’ budgets were focused on LTE build-out for the last 18-24 months
  • Some mobile operators assumed (incorrectly, it turns out) that LTE would mean excess capacity for years to come
  • Cable operators have access to millions of “hotspots” via home routers
  • Increasing Wi-Fi coverage beyond home hotspots to public areas creates more valuable footprint for monetization

So what lies ahead for Wi-Fi offload in the US?  Several predictions in my humble opinion:

  1. Cable operators will learn fast how to monetize public Wi-Fi including the “low hanging fruit” of in-bound Wi-Fi roaming of international mobile subscribers.
  2. Community Wi-Fi will soon turn into a pseudo-ubiquitous wireless footprint for leveraging many interesting services (VoIP and IP-SMS anyone?)
  3. Mobile operators will allocate capital expenditures to small cells and public Wi-Fi in “strategic” locations.
  4. Other (non-cable) fixed broadband operators such as the local xDSL providers will closely evaluate the public Wi-Fi model now that the old scars of municipal Wi-Fi are healed.

These are interesting times in Wi-Fi offload, and the coming months are sure to bring more clarity to the offload monetization models.  Fortunately, Accuris Networks is closely involved in many of these models, giving us a front-row seat.


Source: http://info.accuris-networks.com/blog/wi-fi-offload-predictions

Challenges with massive MIMO: Data throughput

4 Apr

A lot of thought has been shared lately in the wireless telecommunication industry about multiple-input/multiple-output (MIMO) systems of massive scale (referred to as massive MIMO systems). The principle behind massive MIMO systems is that when you reach a critical number of antennas (around 60), the system’s energy and spectral efficiency increase significantly. For details about the theoretical principles behind massive MIMO systems, see Nutaq’s six-part blog series:

1. Massive MIMO – Part 1. Introduction: From theory to implementation
2. Massive MIMO – Part 2: A few lessons learned from asymptotic information theoretic analysis
3. Massive MIMO – Part 3: Capacity, coherence time and hardware capability
4. Massive MIMO – Part 4: Massive MIMO and small cells, the next generation network
5. Massive MIMO – Part 5: The need for practical prototyping and implementation
6. Massive MIMO – Part 6: Estimation and capacity limits due to transceiver impairments

Systems containing around 100 transceivers achieve significant gains in efficiency. As energy and spectral efficiency are top priorities in the development of next-generation wireless networks, the research community is looking to develop this technology. The hardware industry now needs to be able to provide systems that enable its physical implementation.

This blog post describes some of the hardware challenges behind massive MIMO systems and explains how Nutaq is responding to them.

Challenges

The development of future wireless technologies faces several challenges, one of which is providing a very efficient, low-latency, high-throughput data interface between the central processing unit and the multiple transceivers. To achieve such scale in a MIMO system, one must raise the frequency to very high values in order to reduce the size of the antennas. To increase the data throughput over the RF link, a wide bandwidth is also necessary. The frequency coverage is defined by the RF front-end; this challenge will be addressed by designing front-ends with the required band-pass frequency coverage. However, wide bandwidth coverage implies a very high data throughput within the system, representing a second challenge to be addressed. The following section explains this issue more thoroughly, as well as how Nutaq addresses it.

Rapid data throughput

A challenging issue within a massive MIMO system is that the data must be accessible by every processing unit in order to compute inverse matrices and other such algorithmic functions. If the system is equipped with a central processing unit, then all the data must be routed to it. If the configuration involves distributed processing, every processor must have access to all of the data all of the time, which involves the same requirements for data throughput as for the central processor.

To understand the throughput required, let’s calculate what would be the minimal data throughput to a central processing node for a 100×100 MIMO system with a 28 MHz bandwidth using a standard LTE sampling clock speed of 30.72 MHz and 12-bit samples.

The calculations are as follows: to cover 28 MHz, a sampling rate of 2 × 30.72 MHz = 61.44 MSPS is required according to the Nyquist theorem (when based on the standard LTE clock of 30.72 MHz). Each sample has 12 bits, so the data rate for one radio is 61.44 MSPS × 12 bits = 737.28 Mbit/s. Multiplying this by 100 to cover all the radios, the throughput to the central processor is 73.7 Gbps for only 28 MHz of bandwidth. A 100 MHz bandwidth target would require at least 320 Gbps of throughput.
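The arithmetic above can be checked with a short script (a minimal sketch; the function and variable names are illustrative, not part of any Nutaq API):

```python
# Back-of-the-envelope check of the aggregate throughput derived above.
# Figures from the text: a 2 x 30.72 MHz LTE-based clock and 12-bit samples.

def aggregate_throughput_gbps(num_radios, sample_rate_msps, bits_per_sample):
    """Raw data rate, in Gbit/s, that all radios deliver to the central node."""
    per_radio_mbps = sample_rate_msps * bits_per_sample  # Mbit/s for one radio
    return num_radios * per_radio_mbps / 1000.0

sample_rate = 2 * 30.72  # 61.44 MSPS, per the Nyquist theorem

total = aggregate_throughput_gbps(100, sample_rate, 12)
print(f"{total:.1f} Gbps")  # 100x100 system over 28 MHz -> 73.7 Gbps
```

The per-radio term evaluates to 737.28 Mbit/s, matching the figure in the text.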

Most telecommunication systems are based on FPGAs. With the exception of the Virtex-7, Xilinx’s FPGAs only support PCIe Gen 1. Nutaq’s RTDEx IP core provides support for PCIe Gen 1 x4 on the Virtex-6-based Perseus AMC. This interface allows a tested, sustained throughput of around 6 Gbps. If we built a system using a serial architecture like the one shown below, a bottleneck would appear.

Using Nutaq’s PicoSDR, with PCIe support and 4 transceivers, we see in Figure 1 that even with only 16 radio channels covering a 20 MHz bandwidth, a bottleneck arises because the required 7.7 Gbps of throughput isn’t met by the PCIe interface implemented on the Virtex-6 FPGA. You can imagine what happens with 100 channels covering a 100 MHz bandwidth!

Figure 1. Arrangement of PicoSDRs to cover 16×16

Nutaq’s solution

One possible solution is to parallelize the data routing. In other words, one could either aim for a mesh architecture between the distributed processing nodes or have multiple links routing data from subgroups of radios within the system to the processing unit in parallel. Figure 2 shows such a solution for the 16×16 system studied in the previous section.

Figure 2. Parallelizing the data links

Parallelizing the data links requires changing the interface from PCIe to another, better-adapted interface. In the case of Nutaq’s hardware, Aurora is used over miniSAS disk connectors, allowing up to 7× 16 Gbps for each subgroup of 4 radio channels. The miniSAS disk connectors are on a Rear Transition Module (RTM) installed in an MTCA.4 chassis. To raise the number of channels, Nutaq creates subgroups of radio channels resembling the one in Figure 2.

Figure 3. Many subgroups for a 96×96 system

The following table compares the throughput required for 28 MHz of bandwidth coverage against what a system based on Figure 3 provides.

Link                                   | 28 MHz requirement | Proposed system
Blue link necessary throughput         | 2,950 Mbps         | 16,000 Mbps
Red link necessary throughput          | 14,745 Mbps        | 16,000 Mbps
Full central Perseus input throughput  | 73,728 Mbps        | 112,000 Mbps
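The required-throughput column can be reproduced from the per-radio rate derived earlier (a sketch under the assumption, consistent with the table’s figures, that a blue link aggregates 4 radio channels, a red link 20, and the central Perseus all 100):

```python
# Each radio channel produces 61.44 MSPS x 12 bits = 737.28 Mbps (derived earlier).
PER_RADIO_MBPS = 61.44 * 12

def link_requirement_mbps(radios_on_link):
    """Aggregate throughput, in Mbit/s, a link must carry for the given radios."""
    return radios_on_link * PER_RADIO_MBPS

print(round(link_requirement_mbps(4)))    # blue link, 4 radios:    2949 Mbps
print(round(link_requirement_mbps(20)))   # red link, 20 radios:   14746 Mbps
print(round(link_requirement_mbps(100)))  # central, 100 radios:   73728 Mbps
```

The one-off differences from the table’s figures (2,950 vs. 2949 and 14,745 vs. 14746) are only rounding conventions; the underlying rates are 2949.12 and 14745.6 Mbps.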

Conclusion

This blog article introduced a data throughput challenge for next-generation massive MIMO prototyping platforms for wireless networks. The next blog post in this series will explain, in detail, Nutaq’s proposed system and cover the radio front-end and other system components.

Source: http://nutaq.com/en/blog/challenges-massive-mimo-data-throughput