Archive | Cellular

Cellular to Wi-Fi Data Offloading

10 Apr

It’s now 18 years since IEEE released the original 802.11 standard. Since then, Wi-Fi has developed rapidly, and over five billion devices around the world support it today [1]. Since the early 2000s, competition among notebook, laptop, and smartphone manufacturers made a WLAN card essential for wireless networking, and wireless LANs are everywhere today: in offices, hotels, homes, airports, and restaurants. The need for Wi-Fi is increasing day after day, and it has become the favorite way to get online. Wi-Fi has become a universal technology for the modern term “Connected Homes”, where TV and multimedia, home operations and automation, life management, and broadband connectivity are all managed over Wi-Fi.

The revolution in the smartphone industry that followed the Apple iPhone® spread Wi-Fi technology widely along with smartphones. The 3rd Generation Partnership Project (3GPP), a collaboration between groups of telecommunications associations that aims to produce globally applicable third-generation (3G) mobile phone system specifications, became aware of the increasing role of Wi-Fi and started writing standards for Wi-Fi and mobile network interoperability. Some mobile operators, Wireless Internet Service Providers (WISPs), and vendors saw the potential of Wi-Fi as a generic, low-cost wireless technology for delivering their data and Internet services to millions of users through the Wi-Fi already built into smartphones, tablets, laptops, and PDAs.

Global mobile data traffic grew by 81 percent in 2013 [2], to nearly 18 times the entire global Internet traffic of the year 2000. Mobile Network Operators (MNOs) face increasing deployment challenges in spreading their 3G and 4G sites to serve customers’ growing need for high-speed broadband connectivity. Building a new 3G/4G site costs about 100 to 150 times as much as building a Wi-Fi Access Point (AP) [3], and a study by Wireless 2020 shows that Wi-Fi offloading could save around 7% of total network deployment cost with 60% Wi-Fi coverage. Wi-Fi offloading has lately become a hotly debated business opportunity that offers MNOs solutions to many challenges, such as spectrum licensing, running costs, coverage gaps, deployment delays, and congestion. In addition, Wi-Fi can open new business opportunities for MNOs by reaching new types of users, such as laptop, tablet, and home users. The same Wireless 2020 study shows that 65% of traffic in the USA could be offloaded via already installed Wi-Fi networks. Some MNOs are even considering building their networks with Wi-Fi as the primary network and the mobile cellular network as the secondary one.


What is Mobile Data Offloading?

Mobile data offloading is the use of complementary network technologies to deliver data originally targeted for cellular networks. Examples of complementary networks are Wi-Fi and WiMAX. With Wi-Fi mobile data offloading, MNOs can deliver their data services to customers through a Wi-Fi AP connected to the core network, with seamless connectivity to the cellular network. Seamless connectivity means no user interaction is needed: there are many options to authenticate the user and handle data charging, and an automatic handover called a “vertical handover” moves the session from the cellular network to the Wi-Fi network and vice versa. With the increased CAPEX and OPEX of deploying new 3G and 4G technologies, MNOs found a great business opportunity in Wi-Fi to increase revenue per MB by deploying APs in hotspots and cellular coverage gaps.


How it can be done?

3GPP started defining interoperability between its system and wireless LANs in Release 6, where the cellular core network authenticates the user through a 3GPP AAA server; once authentication succeeds, the WLAN AP allows the user to access the Internet.

The authentication can be performed in multiple ways: SIM-based authentication, authentication through SMS, username and password authentication, or manual authentication.
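At its core, SIM-based authentication is a shared-secret challenge-response between the SIM and the operator’s AAA server. The sketch below is a toy Python model of that idea; the `sim_response` hash is a stand-in for the real SIM algorithms (A3/A8), which this article does not detail, and all names are illustrative.

```python
import hashlib
import os

def sim_response(secret_ki: bytes, rand: bytes) -> bytes:
    # Toy stand-in for the SIM's response algorithm: derive a short signed
    # response (SRES) from the shared secret Ki and the challenge RAND.
    return hashlib.sha256(secret_ki + rand).digest()[:4]

def authenticate(device_ki: bytes, aaa_ki: bytes) -> bool:
    # The AAA server issues a random challenge; access is granted only if
    # the SIM's response matches the one computed from the server's copy of Ki.
    rand = os.urandom(16)
    return sim_response(device_ki, rand) == sim_response(aaa_ki, rand)

ki = b"shared-secret-ki"
assert authenticate(ki, ki)            # matching keys: access granted
assert not authenticate(ki, b"wrong")  # mismatched key: rejected
```

The key property is that the secret Ki never crosses the air interface; only the challenge and the derived response do.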

Figure 1: 3GPP Release 6 WLAN Access Control


Smartphone manufacturers started adding an operating-system option to favor either Wi-Fi or the cellular network, as in Apple iOS and Android 4.0. 3GPP Release 6 was the first step toward allowing Wi-Fi users to access the cellular network, but radio access selection was still undefined: choosing between Wi-Fi and the cellular network still needed user interaction or management by an application. Another drawback is that when the user switches from cellular to Wi-Fi during a download, the download may stop, depending on the application. This application-dependent switching mechanism between the Wi-Fi network and the cellular network is called Application Based Switching, because the application is on its own to continue the data transfer after the Radio Access Technology changes.
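To see why Application Based Switching puts the burden on the application, here is a toy sketch of a download that must resume itself from a byte offset after the radio switch kills the session. All names are hypothetical; a real app would typically use something like HTTP Range requests.

```python
def fetch(source: bytes, start: int, chunk: int = 4) -> bytes:
    # Toy transfer primitive: returns up to `chunk` bytes from `start`
    # (stands in for an HTTP request carrying a Range header).
    return source[start:start + chunk]

def download_with_switch(source: bytes, switch_at: int) -> bytes:
    # The session dies at `switch_at` bytes when the device moves from
    # cellular to Wi-Fi; the application must resume from its own offset.
    received = b""
    while len(received) < switch_at:          # phase 1: over cellular
        received += fetch(source, len(received))
    # ...radio access technology changes here; the old session is gone...
    while len(received) < len(source):        # phase 2: app resumes over Wi-Fi
        received += fetch(source, len(received))
    return received

data = b"0123456789abcdefghij"
assert download_with_switch(data, 8) == data  # app stitches the halves together
```

An application that does not track its own offset simply loses the transfer at the switch, which is exactly the drawback described above.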

In 3GPP Release 8, a new approach was introduced to solve the latter problem: it defines a way for mobile users to automatically choose between cellular and Wi-Fi networks and allows the user to perform a vertical handover between the two technologies without any application or user interaction. With the introduction of Hotspot 2.0 and Mobile IP (MIP), 3GPP Release 8 allows Wi-Fi mobility with service continuity when moving between the two technologies.

Figure 2: 3GPP Release 8 WLAN Seamless Mobility

Hotspot 2.0, created by the Wi-Fi Alliance in 2012, is a technology intended to make Wi-Fi behave like cellular technology, with a suite of protocols for easy network selection and secure authentication. It lets mobile devices automatically select a Wi-Fi network based on its SSID, and it also lets them retrieve useful information such as the network and venue type, the list of roaming partners, and the types of authentication available.

Mobile IP allows the mobile device to hold two IP addresses, one for each access technology; over the H1 interface, the Home Agent (HA) manages mobility between the two access technologies.
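A minimal sketch of the Home Agent’s role, assuming a simple binding table; the class, method names, and addresses below are illustrative, not 3GPP-defined.

```python
class HomeAgent:
    # Minimal sketch of a Mobile IP Home Agent: it keeps a binding from
    # the device's stable home address to its current care-of address
    # and tunnels packets accordingly.
    def __init__(self):
        self.bindings = {}  # home address -> care-of address

    def register(self, home_addr: str, care_of_addr: str) -> None:
        # Binding update: the device reports its new point of attachment,
        # e.g. the Wi-Fi interface's address after a vertical handover.
        self.bindings[home_addr] = care_of_addr

    def forward(self, home_addr: str, payload: str) -> tuple:
        # Packets addressed to the home address are encapsulated and
        # delivered to wherever the device currently is.
        return (self.bindings[home_addr], payload)

ha = HomeAgent()
ha.register("10.0.0.5", "192.0.2.17")    # attached via cellular
assert ha.forward("10.0.0.5", "pkt") == ("192.0.2.17", "pkt")
ha.register("10.0.0.5", "198.51.100.4")  # handover to Wi-Fi; same home address
assert ha.forward("10.0.0.5", "pkt") == ("198.51.100.4", "pkt")
```

Because correspondents only ever see the stable home address, sessions survive the handover; only the binding behind the Home Agent changes.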

3GPP provided a further enhancement in Release 10: completely seamless Wi-Fi offloading, in which the mobile device can hold multiple simultaneous connections over each technology, managed by the 3GPP core network. Heavy traffic like video streaming and P2P downloads can be routed via Wi-Fi, while HTTP and VoIP traffic goes through the cellular network.
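This per-traffic-type steering can be pictured as a policy lookup. The sketch below is purely illustrative: the policy table and function names are assumptions for exposition, not the actual 3GPP policy mechanism.

```python
# Illustrative policy: heavy flows prefer Wi-Fi, the rest stay on cellular.
OFFLOAD_POLICY = {
    "video_streaming": "wifi",
    "p2p_download": "wifi",
    "http": "cellular",
    "voip": "cellular",
}

def select_access(traffic_type: str, wifi_available: bool) -> str:
    # Pick the access network for a flow, falling back to cellular
    # whenever no Wi-Fi AP is in range.
    preferred = OFFLOAD_POLICY.get(traffic_type, "cellular")
    return preferred if wifi_available else "cellular"

assert select_access("video_streaming", wifi_available=True) == "wifi"
assert select_access("voip", wifi_available=True) == "cellular"
assert select_access("p2p_download", wifi_available=False) == "cellular"
```

The point is that the routing decision lives in the network (or device policy engine), not in each application, which is what makes the offload seamless.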

Figure 3: 3GPP Release 10 WLAN Seamless Offload


Why Wi-Fi is the future potential for increased user demands?

With the increasing speed of technology development today, companies like Apple, Samsung, Google, and Microsoft are moving their project planning from a yearly to a quarterly basis. The market receives a new product every quarter, users demand ever higher data rates, and personal computing power doubles every 18 months. This challenges telecommunication vendors and cellular network operators to find fast, economic, and practical solutions that can turn today’s 10x cellular data rate evolution into 1000x. The evolution of Wi-Fi has produced five generations, reaching gigabit speeds today with IEEE 802.11ac.

Figure 4: The evolution of Wi-Fi

Mobile cellular technology has four generations today and has practically reached 100 Mbps with LTE; it is expected to reach 300 Mbps within the coming few years.

Figure 5: The evolution of Mobile Cellular Speeds

Since a Wi-Fi AP is designed to cover a range of 50 meters indoors and 100 meters outdoors, repeating this coverage across the footprint of a single cellular cell would let current data rates jump from 10x growth to 1000x growth, providing higher spectrum efficiency and enabling applications that consume high data rates, such as HD IPTV and the “Connected Home” approach.

Figure 6: Wi-Fi + Cellular Network Integration.

 Source: Qualcomm, Wi-Fi evolution


What is the Future?

Wi-Fi is a great opportunity for cellular mobile operators to deploy highly efficient, low-cost, high-speed, and robust networks. Many vendors and operators have already started deploying Wi-Fi offloading solutions around the world, and national regulators have started making new spectrum laws and regulations, increasing the Wi-Fi band available to mobile operators implementing this technology. It can be seen as a new track in mobile telecommunication evolution before the cellular 5th Generation standard arrives within the next 10 years, pushing the standard writers to consider Wi-Fi an essential part of the next global telecommunication standard.




2. Cisco Visual Networking Index: Global Mobile Data Traffic Forecast Update, 2013–2018.

3. ROI Analysis of Wi-Fi Offloading: A Study by Wireless 2020.



Intel Touts New Ultra-High-Speed Wireless Data Technology

27 Feb

Small base stations could achieve huge data capacity increases using Intel’s modular antenna arrays.

Intel says it has prototyped a chip-based antenna array that can sit in a milk-carton-sized cellular base station. The technology could turbocharge future wireless networks by using ultrahigh frequencies.

Intel’s technology, known as a millimeter wave modular antenna array, is expected to be demonstrated today at the Mobile World Congress conference in Barcelona, Spain, says Ali Sadri, director of the millimeter wave standards and advanced technology group at Intel.

Any one such cell could send and receive data at speeds of more than a gigabit per second over distances of up to a few hundred meters, and far more at shorter distances, compared with about 75 megabits per second for the latest standard, known as 4G LTE.

For mobile cellular communications, both the Intel and Samsung technologies could eventually use frequencies of 28 or 39 gigahertz or higher. These frequencies are known as millimeter wave and carry far more data than those used in cellular networks today. But they are easily blocked by objects in the environment—and even water droplets in the air. So they’ve traditionally been seen as impractical for mobile devices.

To get around the blockage problem, processors dynamically shape how a signal is combined across 64, 128, or even more antenna elements, controlling the direction in which a beam is sent from each antenna array and making changes on the fly in response to changing conditions.
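The phase shaping involved can be illustrated with standard uniform-linear-array math. This sketch (illustrative only, not Intel’s implementation) computes per-element phase weights for a chosen steering angle and shows that the elements add coherently only in that direction.

```python
import cmath
import math

def steering_weights(n_elements: int, spacing_wl: float, angle_deg: float):
    # Per-element phase weights that point a uniform linear array's main
    # beam at `angle_deg` from broadside; element spacing is in wavelengths.
    theta = math.radians(angle_deg)
    return [cmath.exp(-2j * math.pi * spacing_wl * n * math.sin(theta))
            for n in range(n_elements)]

def array_gain(weights, spacing_wl: float, angle_deg: float) -> float:
    # Magnitude of the array factor in a given direction.
    theta = math.radians(angle_deg)
    return abs(sum(w * cmath.exp(2j * math.pi * spacing_wl * n * math.sin(theta))
                   for n, w in enumerate(weights)))

w = steering_weights(64, 0.5, 30.0)
assert abs(array_gain(w, 0.5, 30.0) - 64.0) < 1e-6  # coherent sum toward 30 deg
assert array_gain(w, 0.5, -45.0) < 10               # little energy off-beam
```

Re-steering the beam is just recomputing the weight list, which is why an electronically shaped array can track changing conditions on the fly.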

Several groups are working on such antenna arrays, but Intel says its version is more efficient. “We can scale up the number of modular arrays as high as practical to increase transmission and reception sensitivity. The barrier is only regulatory issues, not technological ones,” Sadri says.

A major problem is finding a way to get so many antennas into a mobile device. The NYU technology used a benchtop gadget hauled around the sidewalks of Manhattan for testing. It steers beams mechanically toward intended users. The Intel chip does the same thing by shaping the direction of the signal electronically, and is now packaged in a gadget smaller than a shoebox.

A number of companies are betting next-generation wireless technologies will need to use millimeter wave links to deliver all the data people want. The European Commission, for example, last year launched a $1.8 billion 5G research effort to help develop this and other technologies.



What to expect from mobile networks in 2014: The 4G car, LTE Broadcast and small cells

30 Dec
SUMMARY: In 2014, carriers will tinker with some new network technologies. They’ll start broadcasting video, shrinking the size of their cells and moving voice calls onto LTE. They’ll even start connecting cars to the 4G network.

2013 was largely a year of entrenchment in the U.S. mobile industry. Carriers expanded their 4G footprints geographically and added new capacity to meet the demands of an increasing number of LTE devices. But in 2014, we’re going to see carriers get a bit more experimental with their networks and their services.

Verizon Wireless, AT&T, Sprint and T-Mobile all have varying projects in the works that will result in faster, better-performing networks, change the way services like voice and video are delivered to the handset, and bring vehicles into the 4G fold. Here’s a look at what’s in store for our mobile networks in 2014.

Verizon becomes a broadcaster for the Super Bowl

Verizon plans to make a big splash with a new technology called LTE Broadcast at the country’s biggest sporting event in February. At Super Bowl XLVIII at MetLife Stadium in East Rutherford, N.J., Verizon will convert some of its LTE systems from two-way to one-way streets.


I’ve explained how LTE Broadcast works in other posts, but in a nutshell it lets carriers transmit the same data stream to multiple devices simultaneously, much like a TV or radio station blankets a city with a single signal. If multiple people in the same cell are consuming the same content, a carrier using LTE Broadcast simply ships the same packets to every device rather than establishing a separate streaming session for each individual subscriber and eating up valuable capacity.
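The capacity saving is easy to put numbers on. This back-of-envelope sketch uses illustrative figures to compare unicast and broadcast load in a single cell:

```python
def cell_load_mbps(viewers: int, stream_mbps: float, broadcast: bool) -> float:
    # Unicast repeats the stream once per viewer; LTE Broadcast sends it
    # once regardless of how many devices in the cell are tuned in.
    return stream_mbps if broadcast else viewers * stream_mbps

# 500 fans in one cell watching the same 2 Mbit/s highlight reel:
assert cell_load_mbps(500, 2.0, broadcast=False) == 1000.0  # unicast load
assert cell_load_mbps(500, 2.0, broadcast=True) == 2.0      # broadcast load
```

The broadcast load is flat no matter how many viewers tune in, which is exactly why the technology shines at packed venues.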

As you might imagine, a technology like this becomes more valuable when you have a lot of people in the same place consuming the same real-time content. That makes Super Bowl Sunday at the stadium and its surroundings the ideal time and place to prove out the technology. Verizon hasn’t revealed the specific details of what kind of broadcast services it will offer at the Super Bowl, but we can take a guess at some possibilities.

For instance, the network could continually transmit a highlight reel of all of the game’s big plays that anyone with a Verizon LTE smartphone could tap into. It could also datacast a constantly updated ticker of player and game stats. If the NFL and Verizon get really ambitious, they could broadcast video feeds from every single one of the event’s 70 TV cameras to everyone in the stadium, letting people watch the live action or replays from any angle they choose.

Revving up the 4G car

We’re going to see the first LTE-connected vehicles in 2014, and right now it looks like a race between Audi, General Motors and Tesla to see who can get a 4G car on U.S. roads first.

Cellular connectivity in cars is certainly nothing new. Automakers have been using mobile networks to power their in-car roadside assistance and telematics services for the better part of a decade. But those connections have primarily been pokey 2G links, ideal for transmitting itty-bitty chunks of data – such as GPS coordinates or an “unlock vehicle” command – across the country.

But by embedding LTE in cars, automakers and carriers will be able to deliver the kinds of data services we’ve grown to expect on our smartphones and tablets. Those connections will power apps in the dashboard and redistribute mobile broadband throughout the car via Wi-Fi.

Consequently, network connectivity will cease being a behind-the-scenes technology and turn into a data service sold to drivers by a carrier. Unfortunately we won’t have much choice in which carrier connects our cars – in the U.S., automakers are locking down their vehicles to specific networks – but if the carrier connecting your phone and your car happens to be the same, you could attach your car to a shared data or family plan.

The incredible shrinking network

Operators spent the last few years building LTE coverage, but with their initial 4G rollouts complete or near completion they’re now starting to focus on capacity. We’re already seeing all four nationwide operators soup up their LTE speeds and capacity with new spectrum, but in 2014, they’ll start using a different tool: the small cell.

Carriers can only get so far with new airwaves. At a certain point they have to start reusing the spectrum they already have, which means breaking their big wide-reaching umbrella networks into ever-smaller partitions. As these “cells” get smaller, the overall capacity of the network grows, meaning more customers get faster, more resilient connections, especially in high-traffic locations like malls, parks and other public areas.
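The capacity arithmetic behind cell shrinking can be sketched as follows, with illustrative numbers (real deployments are of course limited by interference, backhaul and siting):

```python
import math

def area_capacity_mbps(area_km2: float, cell_radius_km: float,
                       per_cell_mbps: float) -> float:
    # With the same spectrum reused in every cell, total capacity over an
    # area scales with the number of cells that fit into it.
    cells = area_km2 / (math.pi * cell_radius_km ** 2)
    return cells * per_cell_mbps

macro = area_capacity_mbps(100.0, 5.0, 100.0)   # sparse macro grid
small = area_capacity_mbps(100.0, 0.5, 100.0)   # small cells, same spectrum
assert abs(small / macro - 100.0) < 1e-9  # 10x smaller radius -> 100x capacity
```

Shrinking the radius by a factor of ten packs in a hundred times as many cells, each reusing the full spectrum, which is why small cells are the tool of choice once new airwaves run out.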


Of the major U.S. operators, AT&T is being the most aggressive, promising a 40,000-small cell network before the end of 2015. Sprint is also planning to make extensive use of the tiny base stations in its network. Verizon has been experimenting with small cells as well, though it’s much less enthusiastic about the technology.

VoLTE: Better late than never

Carriers have been promising for two years that VoIP services running over their LTE networks are just around the corner, but voice-over-LTE is hardly progressing. In fact, it seems to be regressing. MetroPCS launched the first U.S. VoLTE service last year, but T-Mobile has been quietly shutting it down as it transfers Metro customers onto its own network.

Both Verizon and AT&T have pushed back their planned VoLTE launches, but it looks like next year they’ll be finally ready to make their first moves toward all-IP communications. AT&T has said it would seed the network with the first VoLTE-capable handsets this year and launch an IP-phone service in 2014. News reports show Verizon’s VoLTE trials in the wild, and it too has promised to make it available to the public in 2014.

Carriers already have perfectly good 2G voice networks, so they’re not in much of a hurry to move their core communications service over to LTE. And at least initially, consumers probably wouldn’t notice if they did – their calls would just run over a different network.

The real promise of VoLTE is its ability to integrate easily with other IP communications services. We’ll see VoLTE phones linked to enterprise PBX systems first, and then we’ll see it make its way to consumers in the form of richer communications apps, supporting features like HD voice, one-touch group conferencing, messaging and video chat all within the same communications session.

Feature image courtesy of Shutterstock user valdis torms; Man-with-phones photo courtesy of Shutterstock user Stanislav Komogorov



Advances in Communications: New FSO provides reliable 10 Gbit/s and beyond backhaul connections

13 Nov

With 4G cellular communications placing increasing demands on backhaul capacity between cell towers, a new free-space optical (FSO) technology uses ultrashort pulse lasers for reliable high-bandwidth wireless communications in all weather conditions for backhaul “sweet spot” distances of 2 to 3 km.

Cellular carriers face a growing challenge to increase the backhaul capacity between cell towers to keep up with the rapidly increasing number of mobile users employing 4G technology to access the Internet. The only viable backhaul options for a full 4G network include deploying fiber-optic cables (which can be very time consuming and expensive), or installing wireless free-space optical (FSO) technology between cell towers.

Unfortunately, FSO signals can degrade in the presence of fog and turbulence, which has prevented legacy FSO systems (those using continuous-wave, or CW, lasers) from reaching the 2 to 3 km backhaul link range.1,2 However, experiments by Attochron with new FSO systems incorporating ultrashort pulse (USP) lasers show much better performance; they can provide 1 Gbit/s backhaul capacity today and 10 Gbit/s and higher in the future without having to deploy fiber-optic cables.

FIGURE 1. Individual mobile Internet access and download speeds have progressed as mobile networks have evolved from 2G to 4G. Data rates displayed above and below show a range for that protocol. Individual mobile user download speeds will approach 100 Mbit/s in a full 4G network.

The backhaul bottleneck

As individual mobile Internet downloads increase from 2G speeds of 10 kbit/s to 4G speeds of 100 Mbit/s, the added outbound traffic places a tremendous strain on the cellular tower backhaul (see Fig. 1).3 Most cell towers depend on slower microwave backhaul connections that realistically top out at 100 Mbit/s, restricting how many users can be connected at 4G speeds (see Fig. 2). If a faster wireless technology can be deployed between cell towers, the number of 4G users can increase along with a corresponding increase in carrier revenues (see table).

FIGURE 2. Cell towers are presently connected to their core network by fiber-optic cables, copper wires, or wireless microwave links. Unfortunately, most cell towers depend on slow microwave connections (such as cell tower B). As more 4G mobile devices try to access the Internet from these microwave-connected cell towers, microwave speeds realistically max out at 100 Mbit/s and this capacity or pipeline must be shared with all of the mobile users connected to that cell tower.

A 20 Gbit/s fiber-optic connection to a cellular tower allows up to 200 mobile users to individually download from the Internet at full 4G, 100 Mbit/s speeds. However, former Verizon CEO Ivan Seidenberg claimed on the June 22, 2009 Charlie Rose Show that fiber optics will reach no more than 30% of a carrier's footprint. JDSU marketing experts estimate that of the one million cell towers built by the end of 2014, 50% will require more capacity than any non-fiber media can provide, and carriers will not be able to afford fiber optics—leaving 500,000 cell towers without a viable backhaul solution. The cost-effective wireless alternative to fiber-optic cables is FSO technology.4,5
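The user-count arithmetic above can be sketched directly, with an illustrative helper that ignores protocol overhead:

```python
def max_full_rate_users(backhaul_gbps: float, per_user_mbps: float) -> int:
    # Number of users who can simultaneously get their full 4G rate before
    # the tower's backhaul pipe, rather than the radio, becomes the limit.
    return int(backhaul_gbps * 1000 // per_user_mbps)

assert max_full_rate_users(20.0, 100.0) == 200  # fiber-fed tower, as above
assert max_full_rate_users(0.1, 100.0) == 1     # 100 Mbit/s microwave link
```

The contrast between 200 full-rate users on fiber and a single one on a saturated microwave link is the backhaul bottleneck in a nutshell.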

Table: Mobile users accommodated by various backhaul technologies

Free-space optics

Legacy FSO systems work well in clear or hazy weather at distances up to 1.5 km, but the presence of fog can reduce effective link distances to 200 m.5 And while one option to improve visibility is to make the light brighter, this is not possible because the light then becomes unsafe to the eyes.6 Using flashing or strobe lights is another option; USP lasers behave similarly, but the pulses of light flash on a much shorter time scale—as short as a few femtoseconds.7

FIGURE 3. Qualitative differences between a CW laser and a USP laser are shown by infrared photography. The false color photo is at the fiber end (start of transmission) and the black and white photo is at the receive end 1.25 km away. Notice the improvement to the pattern using the USP laser as the transmitter.

Infrared photography reveals qualitative differences between a CW laser and a USP laser both in the near-field and at 1.25 km (see Fig. 3). Experiments performed by Attochron (and supported in part by Lockheed Martin Corporation) at a 500 m wireless testing facility at the U.S. Army’s Picatinny Arsenal in Dover, NJ, demonstrated that USP laser-based FSO systems have an up to 25 dB increase in receive power over a legacy CW FSO system in fog (see Fig. 4). Other classified military research has observed 25 to 30 dB gains using an Attochron-specified USP laser in foggy conditions.
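For intuition, the reported dB gains translate into very large linear power ratios. A quick sketch of the standard conversion:

```python
def db_to_power_ratio(gain_db: float) -> float:
    # Convert a receive-power gain in dB to a linear power ratio.
    return 10 ** (gain_db / 10)

assert round(db_to_power_ratio(25.0)) == 316    # the reported fog advantage
assert round(db_to_power_ratio(3.0), 2) == 2.0  # +3 dB is roughly double
```

So a 25 dB margin means the USP receiver sees on the order of 300 times the power of the legacy CW system under the same fog conditions.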

FIGURE 4. Quantitative measurement of a 25 dB difference in receive power between a USP laser and a CW laser at 550 m range is shown during high-attenuation conditions (visibility less than 125 m).

These new USP FSO systems output a passively mode-locked 100 fs pulse at 1550 nm, with an average output power of 50 mW with a 1 Gbit/s repetition rate. The stream of ultrashort pulses is modulated externally to produce the gigabit Ethernet signal. In Picatinny Arsenal experiments, a single 3 in. telescope was used on the transmit side and a similar 3 in. telescope was used on the receiver side (see Fig. 5).

FIGURE 5. The hardware (a) of an Attochron USP laser-based FSO system is visualized in an artist’s concept (b).

FSO signals are also subject to atmospheric scintillation that can cause receive-power fluctuations and fading and burst errors for longer link distances (similar to a star twinkling at night).8 We measured an increase in receive power of up to 15 dB in clear-air turbulence using a USP laser over a CW laser. These empirical observations correlate well to theoretical work done by G. P. Berman and colleagues at Los Alamos National Laboratory.9

This product design will incorporate four transmit apertures that allow more transmit power while maintaining eye safety. Multiple transmit apertures further reduce atmospheric scintillation because the four paths will independently sample slightly different portions of the atmosphere. These four paths will have different fluctuation patterns, and the summation of these four signals into the receive aperture will have fewer overall fluctuations.10 The receive telescope will be 8 to 12 in. in diameter and incorporate fine-steering mirrors for tracking.
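The variance-reduction argument for multiple apertures can be sketched with a small Monte Carlo experiment. The lognormal fading model and its parameters below are assumptions for illustration, not the measured statistics.

```python
import random
import statistics

def mean_received_power(n_apertures: int, n_samples: int, seed: int = 42):
    # Each aperture sees an independent scintillation fade (modeled here
    # as lognormal); summing the apertures averages the fluctuations out.
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        total = sum(rng.lognormvariate(0.0, 0.5) for _ in range(n_apertures))
        samples.append(total / n_apertures)
    return samples

single = mean_received_power(1, 5000)
quad = mean_received_power(4, 5000)
# Four independent paths fluctuate roughly half as much as one:
assert statistics.stdev(quad) < statistics.stdev(single) / 1.5
```

Averaging four independent fades cuts the standard deviation roughly in half, which is the effect the four transmit apertures exploit.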

In preliminary testing of our prototype USP laser-based FSO systems, the 25 dB additional margin improves link availability at 1 Gbit/s to 99.5% at 3 km; pulse modulation techniques now in development will further increase the bandwidth to 10 Gbit/s and beyond.

Pulse-shaping efficiencies

Pulse‐modulation techniques manipulate the laser pulse shape before transmission to achieve greater transmission efficiencies and optimize various desirable propagation effects. Binary phase shift keying (BPSK) pulse modulation has been used in satellite laser communication to increase overall bandwidth to 6 Gbit/s.11 Fiber-optic systems use BPSK, quadrature phase shift keying (QPSK), and 16-quadrature amplitude modulation (16-QAM) to increase bandwidth from 10 to 400 Gbit/s.

Since the USP is much narrower in time (100 fs), it has much broader spectral content than CW laser pulses in conventional fiber-optic systems. Our “pulseshaper” technology decomposes the bandwidth of a single USP laser into many discrete spectral “bins” that can be independently modulated, and then recombined to produce a new single pulse with a modified temporal shape.12 For example, by using 10 spectral bins, each with its own signaling, a 1 Gbit/s signal can be increased to 10 Gbit/s. Modulating 100 spectral bins will result in 100 Gbit/s overall.
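The bin arithmetic is straightforward; a minimal sketch (illustrative helper name):

```python
def shaped_pulse_rate_gbps(per_bin_gbps: float, n_bins: int) -> float:
    # Aggregate line rate when each independently modulated spectral bin
    # of the ultrashort pulse carries its own signal.
    return per_bin_gbps * n_bins

assert shaped_pulse_rate_gbps(1.0, 10) == 10.0    # the 10-bin example above
assert shaped_pulse_rate_gbps(1.0, 100) == 100.0  # 100 bins -> 100 Gbit/s
```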

Of course, the modulation and demodulation of these shaped USPs will require very fast digital signal processing (DSP). Fortunately, these DSP capabilities—which have been used in 100 Gbit/s and higher fiber-optic systems—are now available on an optical chip and will be incorporated in the USP FSO systems.

These advanced modulation schemes will extend wireless backhaul capacity to 10 Gbit/s and even 100 Gbit/s between cell towers. This will greatly speed the deployment of full 4G networks by cellular carriers and even be sufficient for future 5G cellular networks.13


1. W. K. Pratt, Laser Communication Systems, J. Wiley & Sons, New York, NY (1969).

2. I. I. Kim et al., SPIE Opt. Eng., 37, 3143–3155 (1998).

3. NGMN Alliance, “Guidelines for LTE Backhaul Traffic Estimation” (2011).

4. T. H. Carbonneau and D. R. Wisely, “Opportunities and challenges for optical wireless; the competitive advantage of free-space telecommunications links in today’s crowded marketplace,” Proc. SPIE, 3232, 119–128 (1998).

5. I. I. Kim, Lightwave, 26, 19–21 (2009).

6. “American National Standard for Safe Use of Lasers (ANSI Z136.1‐1993),” the Laser Institute of America, Orlando, FL (1993).

7. J. M. Hopkins and W. Sibbett, Sci. Amer., 283, 72–79 (2000).

8. I. I. Kim et al., “Measurement of scintillation for free-space laser communication at 785 nm and 1550 nm,” Proc. SPIE, 3850, 49–62 (1999).

9. G. P. Berman et al., J. Phys. B: At. Mol. Opt. Phys., 44, 55402–55421 (2011).

10. I. I. Kim et al., “Scintillation reduction using multiple transmitters,” Proc. SPIE, 2990, 102–113 (1997).

11. B. Smutny et al., “5.625 Gbit/s optical inter-satellite communication link verified in-orbit,” Proc. Ka and Broadband Communications Conference, Matera, Italy (2008).

12. See

13. P.E. Mogensen et al., “LTE-Advanced: The path towards gigabit/s in wireless mobile communications,” Wireless VITAE 2009, 147–151, Aalborg, Denmark (May 2009).


Cellular Networks and Technologies for the Internet of Things

14 Oct

Machine to machine (M2M) communications infrastructure is evolving to support the rapid growth of the Internet of Things (IoT).  While it is agreed that wireless is essential for the IoT, there is debate on how the IoT will be best accommodated – whether this is best achieved by reusing existing wireless networks and technologies or by establishing new networks and technologies dedicated to the IoT.

Utilizing existing cellular networks and Wi-Fi platforms reduces the investment needed for the IoT.  One of the main drivers for M2M is the widely deployed wireless communications infrastructure, supported by declining data costs and the availability of low-cost connected devices.  Mobile operators are preparing for the IoT opportunity and the revenue derived from the growth in M2M communications. Expanding their approach to M2M, mobile operators are adding value by offering end-to-end solutions for key applications in target markets. Wi-Fi is also being used to support a range of M2M applications in the healthcare, logistics and manufacturing industries.

Revenue models for traditional cellular networks are based on the high bandwidth demands of many cellular data applications and real-time connectivity required for mobile voice and video services.  These revenue models will require radical revision or a new approach to accommodate the low bit rates and periodic short bursts of data adequate for many M2M uses.

New cellular networks are being promoted to support the IoT, offering low-cost entry with cheap data rates and low-cost hardware for high-volume, low-bandwidth data throughput.  Other capabilities, such as long range and long battery life, further improve the feasibility of M2M/IoT.  One example of a new network being promoted is SigFox (France), which offers a service dedicated to low-throughput M2M/IoT applications at a $3 yearly data rate.  Another example is the Weightless SIG, which is developing a wireless radio standard using TV white-space spectrum, well suited to the IoT with low-cost data and range capabilities of up to 10 km.

Helping the future development of the IoT, the use of harmonized, licence-free spectrum for short-range devices keeps down the costs of M2M communications. Use of the vacant portions of spectrum between TV channels meets the main spectrum requirements of the IoT: low cost, sub-1GHz frequencies, global harmonization, and enough bandwidth to support billions of machine devices.
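The sub-1GHz advantage can be quantified with the standard free-space path loss formula. This sketch (illustrative distances) shows the dB gap between a TV white-space frequency and the 2.4 GHz band:

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    # Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.45
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

loss_600 = fspl_db(10.0, 600.0)    # TV white-space band
loss_2400 = fspl_db(10.0, 2400.0)  # 2.4 GHz band
# Same distance, same link budget: the sub-1GHz link has ~12 dB in hand.
assert round(loss_2400 - loss_600, 1) == 12.0
```

That extra margin is what lets cheap, low-power IoT devices reach much farther at sub-1GHz than they could in the crowded 2.4 GHz band.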

The ultimate vision of the IoT is all-IP M2M networking, achievable using mesh networks.  Some RF mesh networks already take advantage of IPv6, the latest version of the IP protocol.  Ultimately, meshed M2M networks could become fully integrated into existing cellular networks, bringing the scalability of cellular to very cost-sensitive narrowband applications.

Even if the primarily cost-conscious requirements of much of the IoT’s potential are successfully addressed, it is debatable whether existing communication platforms can entirely meet the diverse demands of a myriad of IoT applications.  Many M2M applications have different needs, and new networks and technologies are being developed to meet those needs and to reap a share of the revenue that billions of connected devices are expected to generate.  The evolution of M2M communications infrastructure to support the IoT is in its infancy – stay tuned.


LTE and Cellular Routers Will Stimulate Mobile Broadband Device Rebound

20 Aug

Despite an 8 percent year-on-year decline in 2012, annual sales of mobile cellular broadband modems and embedded cellular PCs will rebound and grow to 250 million units by 2018, with LTE increasing by 187 million units between 2012 and 2018.

According to Strategy Analytics, the mobile broadband device recovery will begin in 2013, driven by the widespread emergence of LTE, cellular routers for USB modem replacements (for connecting multiple CE devices) and strong emerging market demand.

“The mobile broadband modem market struggled in 2012, with a decline in vendor shipments overall, as market leaders Huawei and ZTE were unable to maintain the continued growth of previous years as overall spending on modems decreased during the recession as users became more cautious over spending on data plans for modems. Discretionary spending tended to favor smartphones and bolt-on plans for tethering in a number of cases, or spending was withheld altogether,” commented Andrew Brown, Executive Director of Enterprise Research at Strategy Analytics and author of the report.

“Nevertheless, the increasing dependence on persistent and ubiquitous connectivity, coupled with the widespread emergence of high-speed LTE networks, the multi-purpose nature of cellular hotspot routers that can connect up to 8 consumer electronics devices via Wi-Fi and the opportunities in developed and emerging market means there is plenty of life left in the mobile broadband devices market,” Brown added.


Mobile Broadband Modem Annual Sales



M2M, Cellular and Small Cells

6 Aug

A few questions have been asked recently regarding the impact of M2M on small cells. With predictions of up to 50 billion devices by 2020–25, this is a very valid question.

While M2M applies to a broad range of devices, there are still people in the industry who tend to think of M2M devices as small, cheap devices, generally with little or no mobility, responsible for small amounts of data transfer over long durations. While it is true that a multitude of M2M devices may fall into this category, it is not true of all of them. For instance, the car industry is working hard on the design and development of various ‘connected car’ initiatives. These connected cars will require high-speed data transfer (e.g. SatNav maps, traffic-related information, video streaming, etc.) over a rapidly varying channel due to very high mobility.

To simplify this analysis, we will assume devices with little or no mobility that ‘may’ or ‘may not’ require a reasonable amount of data at regular time intervals. This assumption also fits the scenarios most likely to have small cells nearby. Such devices could be motion-sensor-based devices like security cameras, connected meters, or any other sensors that monitor real-time information. We can discount M2M devices like those in connected cars from our analysis because they would most likely rely on macro cellular coverage. Finally, we will consider only ‘open access’ small cells rather than ‘closed access’ (a.k.a. CSG or ‘Closed Subscriber Group’) small cells.


There are many factors that need to be considered in the design of an M2M device such as cost, form factor, power consumption, security, etc.

To keep the cost down while maintaining a small and simple form factor, it may be sensible to stick with one access technology rather than several. While power consumption is very low in certain technologies like Bluetooth, Bluetooth is not always practical for a large number of M2M devices: each device may need a Bluetooth access point, which may not always be feasible. On the personal front, many eHealth M2M devices do use Bluetooth as the access technology, where the user provides some sort of input and the M2M device connects via Bluetooth to the phone. Generally, though, most indoor low-mobility M2M devices would rather have wireless coverage that is either WiFi or cellular.

A point generally made in favor of WiFi is that it is cheap, but as most people know, ‘there is no such thing as a free lunch’, and there are other issues to consider. First and foremost, WiFi coverage is limited per access point. This may also be true of cellular small cells, but an M2M device on a small cell can always fall back on the macro cell if the small cell becomes unavailable for whatever reason. WiFi also makes use of unlicensed spectrum, which is prone to interference and jamming issues.

Another issue that needs consideration is whether the WiFi channel is security-protected. If unsecured, the data sent by the M2M devices may be visible to others unless the devices encrypt it themselves, thereby increasing complexity and possibly cost. Security of the devices may also be compromised if some kind of vulnerability is discovered after the devices are in the field. If the network is protected, each M2M device would need to hold the appropriate security credentials, and if the WiFi SSID or password changes, each of these devices would somehow have to update its credentials. With cellular, these issues are much reduced: the SIM provides an additional layer of security against potential hackers, user data is not visible to anyone interested in eavesdropping or sniffing, and the device relies on the SIM for authentication and security, so both the network and the device know that the other party is a trusted one.

With WiFi out of consideration, an obvious question is how small cells could do the job better than the macro cell. The reality is that macro cells are quite heavily used for most of the day; there is no longer a ‘peak period’, as usage is distributed fairly evenly across the day. The standardization bodies, along with the operators, are working on various ways to offload traffic from the macro cells to other forms of access network, to make sure cell capacity remains available for users who cannot be, or would rather not be, offloaded. Going back to the prediction of 50 billion devices, there is a limit on how many active users the network can simultaneously allow on a cell. The restriction can arise from ‘air interface’ bottlenecks or from ‘core network’ overload, and networks are also wary of the ‘signalling tsunamis’ that a multitude of M2M devices could easily trigger. Some of the newer features, especially in LTE, deal with access barring of M2M devices to avoid overload. Small cells, with their restricted coverage area, may be less prone to such ‘overload’ situations in the access and/or core network. There are also features intended for small cells that allow user-plane data to be offloaded while the signalling data, responsible for security, is still sent through the normal route. These and some additional features are still going through the 3GPP development and standardization process, and they will work to the benefit of small cells, making them the best solution for the types of M2M devices considered in this analysis.
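The access-barring idea can be illustrated with a toy simulation. This is only a sketch of the mechanism behind LTE Access Class Barring; the 5% barring factor and the device count below are illustrative assumptions, not values from the text. Each device draws a random number against a broadcast barring factor, so only a fraction of devices attempt access at once.

```python
import random

def attempts_after_barring(num_devices, barring_factor, seed=0):
    """Count how many devices pass a single access-barring check.

    Mimics the idea behind LTE Access Class Barring: each device draws
    a uniform random number and may attempt access only if the draw is
    below the broadcast barring factor; the rest must back off.
    """
    rng = random.Random(seed)
    return sum(1 for _ in range(num_devices) if rng.random() < barring_factor)

# 10,000 M2M devices waking at once; a barring factor of 0.05
# throttles this to roughly 500 simultaneous access attempts.
print(attempts_after_barring(10_000, 0.05))
```

Spreading the remaining attempts over randomized back-off timers is what turns a would-be signalling tsunami into a manageable trickle.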


LTE: Strong Starter

25 Jun

At what point does a cellular network technology become mainstream? LTE, as we are often told, has outstripped its predecessors in terms of deployment, making it the fastest mobile technology in history in terms of uptake as well as throughput. Research from the Global Mobile Suppliers Association published in April counted 175 commercial networks in 70 countries at the end of 1Q13. It has taken a little over three years to reach this point; an impressive achievement for the industry.

But LTE remains a minnow in relative terms. Third quarter 2013 data from Informa’s WCIS Plus put global LTE subscriptions at 88.48 million, which accounts for 1.35 per cent of the overall cellular market. At the end of the year, Informa forecasts, it will be the fifth largest network technology, behind GSM, WCDMA, CDMA and—by a whisker—TD-SCDMA. Four years on, at end 2017, Informa predicts that LTE will be nudging one billion subscriptions, having overtaken the dwindling CDMA and the slower growing TD-SCDMA.
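As a quick sanity check on the Informa figures quoted above, the 1.35 per cent share implies a global cellular base of roughly 6.5 billion subscriptions:

```python
# 88.48 million LTE subscriptions at a 1.35 per cent share of the
# overall cellular market implies the total market size below.
lte_subs_millions = 88.48
market_share = 0.0135
total_subs_billions = lte_subs_millions / market_share / 1000
print(f"{total_subs_billions:.2f}B")  # about 6.55B
```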

The maturity of LTE is also relative in geographical terms; in the leading markets of the world uptake is flying. The US accounts for over half of all current LTE subscriptions while, in South Korea, LTE has more than 40 per cent of the market. Japan is the third of the leading markets and, with 13.8 million LTE subscriptions, sits just above the considerable gap between the top three and the chasing pack; fourth-placed Australia has 2.05 million LTE subscriptions. Most of the leading LTE operators have been driven by necessity, with CDMA technologies nearing the end of their lifespan.

There is plenty of headroom remaining on WCDMA networks that have evolved towards HSPA, which is leading to a less aggressive approach from operators in other mature markets—particularly in Western Europe. So despite having less than a quarter of a million LTE subscribers, Austria still makes the top ten LTE operator listing.


Operators report that, technically, LTE deployments have not been unusually demanding. Michel Lenoir, programme manager for LTE at Vodafone Netherlands says that his firm’s rollout was “broadly comparable to the move from 2G to 3G” while Mock Pak Lum, chief technology officer at Singapore’s StarHub, says that, for his organisation, it was actually easier. “Because it is all IP,” he says, “it is not as painful as rolling out 3G.” But there has long been a sense that the technology element of LTE would be nowhere near as challenging as the business of making it pay. At the 2012 LTE World Summit Orange Spain CTO Eduardo Duato addressed this issue in the frankest terms, suggesting that many operators—particularly in the markets hit hardest by macroeconomic difficulties— would be incapable of deriving a return on LTE without taking network sharing to previously unexplored depths.

Despite Orange having calculated the total cost of ownership for LTE as 30 per cent lower than 3G, Duato’s warning was dire: “We are unable to really come up with a solid business case for LTE; we see only erosion in the years to come,” he told event delegates.

Unable to decommission their legacy networks “for a very long time”, established European operators are going to be hit hard by the cost of managing three generations of network technology concurrently, he said. Duato captured one of the key differences between GSM operators’ LTE deployments and those of CDMA leaders like Verizon and SK Telecom. Ex-CDMA operators are highly motivated to migrate users to LTE as fast as reasonably possible because there is no other option. Necessity will enable them to reap the cost efficiencies of network consolidation more quickly than their GSM-based peers.


GSM will remain in place for many years yet, not least to facilitate the roaming services that were so fundamental to its selection and success (CDMA operators, meanwhile, have been well used to workarounds for roaming). While GSM operators argue that they will harvest the benefits traditionally associated with following rather than leading in any technological evolution, they will not be able to exploit a single, or even dual network portfolio for some time to come.

Indeed in a recent survey of European operators conducted by Informa, integration problems with legacy networks were rated the second most serious challenge that operators face in deploying LTE. The first was an “unclear business model and lack of visibility for successful pricing models,” suggesting that—one year on—Eduardo Duato’s concerns remain valid. LTE has already influenced pricing behaviour from operators, with recent developments including shared or multi-device plans, contextual upgrades and moves to separate device cost from service cost, allowing operators to extricate themselves from the burden of device subsidy. Yet competition remains intense and not all operators are looking to LTE to provide premiums. Earlier this year the UK arm of Hutchison’s 3 announced that it will not price LTE services at a premium to its existing offers. In a nod to the higher charges levied by UK LTE market debutante EE, 3UK said: “Unlike some other UK mobile operators, [LTE] will be available across all existing and new price plans without customers needing to pay a premium fee to ‘upgrade’.”

Meanwhile EE cut its prices in January, well aware that its first-mover advantage is shortly to expire—and it is clearly trying to attract as many users onto long contracts as possible. But EE’s price cuts and 3UK’s announcement reflect the fact that users are not widely inclined to up their spend for faster network access.

“If the price of a service is well above a consumer’s income or disposable spending levels then they are simply not going to buy it,” says Jaco Fourie, senior BSS expert at Ericsson. “If you have more than 100 per cent penetration in a market then you might get a bump from the early adopters when you first launch [a new technology] but when you get to the mass market you will grow at GDP—end of story.”


Nonetheless there is optimism out there, if you know where to look. Vodafone Netherlands’ Michel Lenoir says that, while the firm’s initial deployment of LTE at 2600MHz in 2012 was a small-scale sop to licence obligations, its current move towards deployment at 800MHz and 1800MHz is “based on the commercial belief in the return on investment for LTE.” Nokia Siemens Networks’ head of portfolio management Thorsten Robrecht says that belief in LTE has improved enormously over the last year. “It is a really big surprise how much LTE has really boosted,” he says. “In terms of deployments [the industry] is so far above our own expectations and forecasts that RoI does not seem to be a problem. At the end of 2012 we had 52 LTE customers. By Mobile World Congress [in February] we had 78.”

Whatever positive shifts may be taking place in the collective industry mindset—and however comfortable CTOs feel with the deployment of LTE—significant challenges must still be met, not least at the very heart of the mobile proposition. LTE is a data technology designed to move rapidly increasing volumes of data traffic. But voice communication remains essential to the mobile proposition and solutions—both interim and permanent—are of critical importance.


Here again the split between operators of the GSM and CDMA bloodlines becomes apparent. GSM players have fallback options that offer a comfortable safety net for voice communications while the CDMA camp, once again, is motivated to migrate to new standards at more of a gallop. There are numerous ways to enable voice on an LTE device, however, and the familiar industry issues of fragmentation and interoperability are rearing their heads once more. For a detailed exploration of the VoLTE landscape, see our feature on page 12.

Related to voice and no less fundamental to what mobile has come to represent is roaming. In the past new mobile network technologies have enjoyed lengthy grace periods during which they could bed in before anything other than basic functionality was required of them. Today’s end users, enterprise and consumer alike, are quick to demand ubiquity for any service improvement bestowed upon them—and they expect cross-border support.

At Mobile World Congress this year Greg Dial, director for global roaming at Verizon Wireless, told MCI that his brief for the show was to identify potential roaming partners in popular overseas markets.

“We are surveying the market and looking at our travel centres to see where US customers are going,” said Dial. “It’s a wide variety of destinations but certainly the UK is a market we look at very closely; the Caribbean and Latin America too and Canada. We’re still in the planning stages, but we’re here at MWC building relationships for 4G LTE roaming, both outbound but we’re also open for business for inbound roaming as well.”

LTE roaming is somewhat obstructed by the variety of spectrum bands that are allocated to the technology in different markets—a sizeable caveat to the notion of LTE as a single global standard. To complicate matters, operators are so concerned about spectrum shortages that they are buying whatever they can get their hands on in the hope that future network technologies—carrier aggregation in particular—will magically link them together somewhere down the line.

Historically operators have looked to device vendors to address the problem of multiband compatibility. With LTE the device vendors find themselves in a far more powerful position than they did when operators were demanding tri-band GSM ‘worldphones’. Indeed Apple’s decisions on which bands to support with its first LTE capable iPhone were heralded by some as likely to drive operators’ spectrum acquisition strategies.

The device ecosystem has developed at a phenomenal pace compared to previous technologies, with the GSA reporting in March that 821 user devices, including frequency and carrier variants, had been announced by 97 manufacturers. The number more than doubled over the year to the end of March, with the number of vendors also growing by 54 per cent. And while routers and dongles account for a substantial share of these devices, there were 261 smartphones in the mix.

The debates surrounding LTE today are more about routes than destinations. We know that the technology is the future of the world’s mobile operators, we know that it is capable of achieving what it has been designed to achieve and we know that some operators, at least, will make it pay. But there are different ways to approach different problems that remain to be solved. Operators’ choice of paths will go a long way to determining their success.

WiFi vs. Cellular = Home Automation vs. Property Automation

5 Jun


Power of Cellular

The world around us is getting “smarter.” And we are becoming increasingly more “connected” to this improved intelligence.

First smart phones, now smart homes. Soon, the internet of everything. It is amazing how quickly technology is racing ahead.

The smart home – i.e. home automation – is getting a lot of attention these days.

Here’s a quick list of companies vying for a share of the home automation market. (Side note: I compiled this list for a January 2013 post, so it is by no means complete.)

That’s quite a list, right? It’s clear that home automation is getting a lot of investment interest and investment capital.

In fact, the smart home industry is expected to grow from $21B to $72B in the next 5 years! That’s a lot of dinero heading towards home automation.
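The growth rate implied by that jump can be worked out directly; a quick sketch, using only the $21B, $72B and five-year figures from the post:

```python
# Implied compound annual growth rate for a market growing
# from $21B to $72B over five years.
start_billions, end_billions, years = 21.0, 72.0, 5
cagr = (end_billions / start_billions) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 28% per year
```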

However, even with all of the big names and big money headed towards home automation, there is a key component that is not being discussed.

WiFi vs. Cellular

This WiFi vs. Cellular topic is the core of the Home Automation vs. Property Automation discussion. Are you managing one home or an inventory of properties?

Here’s how PC Magazine defines cellular vs. WiFi:

In a nutshell, cellular is everywhere[.] Wi-Fi is only within a confined area approximately 50 to 100 feet in diameter.

Home automation offers huge potential for us all to achieve more control of our homes, and in essence, our lives – increased security, energy savings, ease of living, peace of mind. All controlled from mobile devices.

If home automation offers that much potential for an individual house – i.e. the home you live in – imagine the potential it offers on a commercial level, where companies manage tens, hundreds, or thousands of properties.

Increased security, energy management, and operational efficiency across thousands of properties?! That is substantial.

Here’s the rub, though: WiFi does not offer the reliability needed to connect 1 to 1,000+ properties. Cellular does.

When the WiFi goes down at my house (a few times per week…grrrr!), I need to physically reboot the system (modem, router, etc.).

When the signal drops on your cell phone, do you even think about it? My guess is no. Cellular is self-managed (or at least not managed by you :) ). Signal drops and comes back without you worrying your pretty little head.

That’s a HUGE difference.

Imagine managing WiFi at 500 properties to make sure mission critical applications like access management, HVAC control, or motion detection are up and running.

In my humble opinion, there’s no way. WiFi-based home automation systems make sense for one or two homes that you can directly manage. Beyond that, WiFi requires too much direct management.

When I worked for the vacation rental manager on the Outer Banks, 50% of our maintenance calls were because WiFi was down!

Cellular on the other hand, set it and forget it. Let the cellular network manage itself while you stay focused on running your business.
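The scale argument is easy to put numbers on. As a rough sketch, assume some probability that any one property’s WiFi needs a manual reboot in a given week (the 10% figure below is purely illustrative, not a measured rate): the expected number of hands-on interventions grows linearly with the property count.

```python
def expected_weekly_resets(num_properties, p_outage_per_week):
    """Expected number of manual WiFi reboots per week across a portfolio.

    p_outage_per_week is an assumed, purely illustrative probability
    that any single property's WiFi needs hands-on attention in a week.
    """
    return num_properties * p_outage_per_week

# One home is an occasional annoyance; 500 properties at the same
# per-site failure rate is a full-time maintenance operation.
for n in (1, 50, 500):
    print(n, expected_weekly_resets(n, 0.10))
```

Self-managed cellular connectivity takes that per-site reboot burden off the property manager entirely.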

{Related Post: PointCentral is Uniquely Positioned to Provide Commercial Property Automation}

Below are a couple of quick videos to offer a visual explanation of the difference. I’ll admit, they are a bit dry, but they do a great job of depicting a major difference in the networks…

  • The WiFi video demonstrates a network within a single home.
  • The cellular video focuses on a geographic network…not just one home, but many under one geographic network!

How a Cellular Telephone Works 

Intro. to WiFi


A context-aware framework for the efficient integration of femtocells in IP and cellular infrastructures

5 Mar

Abstract (provisional)

In today’s heterogeneous networking (HetNet) environments, where end users are provided with universal connectivity opportunities, femtocell deployments can become key players in the enhancement of critical performance indicators such as capacity, coverage, QoS, etc. In order to confront the up-to-date LTE femtocell challenges, we propose a context-aware framework that provides a controlled environment from the femtocell point of view, which is required for applicable functionality. More specifically, we aim to (a) control the local environment where the femtocell is placed within, by efficiently managing the total incoming traffic load and by continuously adjusting the distribution of the backhaul capacity among the coexisting networks and (b) control the macro–femto interference caused by macrocell users transmitting close to the femtocell by investigating the “femtocell as a relay” concept. Finally, the performance of the proposed framework is evaluated via simulation results showing that the overall performance of a HetNet environment can be leveraged in terms of QoS requirements, energy saving and data rate enhancement.

The complete article is available as a PDF

