Archive | ARPU

6G does not exist, yet it is already here

7 Oct
The Average Revenue Per User (ARPU) in LATAM, as in other regions, has kept declining in spite of an increase in network use (data transfer) and in network performance. Image credit: Global Market Intelligence, 2019

I recently had an interesting conversation with some analysts looking at the implications of 6G. That in itself was surprising, since most of the time analysts are looking at the next quarter. Yet they were interested in what kind of impact 6G might have on telecom operators, telecom manufacturers and the semiconductor industry. Of course, looking that far down the road, they were also interested in understanding what type of services might require 6G.

I started the conversation by saying that 6G does not exist, but then I added that it is already here in terms of “prodromes”. In other words, looking at the past evolution and at the present situation, it may be possible to detect a few signs that can be used to make some predictions about 6G. Since this is more a crystal-ball exercise than applied science, I would very much appreciate your thoughts on this matter.

Lessons from “G” evolution

If you look back, starting from 1G, each subsequent “G”, up to the 4th, was the result of technology evolution on the one hand and, on the other, of the need of wireless telecom operators to meet a growing demand. The market was expanding (more users/cellphones) and more network equipment was needed. Having a new technology that could decrease the per-element cost (with respect to capacity) was a great incentive to move from one “G” to the next. Additionally, the expansion of the market resulted in an increase in revenues.

The CAPEX to pay for expanding the network (mostly base stations and antenna sites) could be recovered in a relatively short time thanks to an expanding market (not an expanding ARPU; the Average Revenue Per User was actually decreasing). Additionally, OPEX was also decreasing (again, measured against capacity).

The expanding market meant more handsets sold, with increasing production volumes leading to decreasing prices. More than that, the expanding market fuelled innovation in handsets, with new models stimulating the top buyers to upgrade and lower-cost models attracting new buyers. All in all, a virtuous spiral in which increased sales increased the attractiveness of wireless services (the “me too” effect).

It is in this “ensemble” that we can find the reason for the ten-year generation cycle. Every ten years a new G arrives on the market. New technology supports it, and economic reasons make the equipment manufacturers (network and device) and the telecom operators ride (and push) the wave.

How is it that an exponential technology evolution does not result in an exponential acceleration of the demise of the previous G in favour of the next one? Why does the ten-year cycle remain basically stable?

There are a few reasons why:

  • The exponential technology evolution does not result in an exponential market adoption
  • The market perception of “novelty” is logarithmic (you need something that is 10 times more performant to perceive it as 2 times better); since perceived value grows roughly as the logarithm of performance, an exponential evolution is perceived as a linear improvement, leading to a linear adoption
  • New technology flanks the existing one (we still have 2G around as 5G is starting to be deployed)

With the advent of 4G the landscape has changed. In many countries the market has saturated, the space for expansion has dwindled and there is only replacement. Also, the coverage provided by the network has reached 100% in most places (or at least 100% of the area that is of interest to users). A new generation will necessarily cover a smaller surface, expanding over time. Hence the market (that is, each of us) will stick to the previous generation, since it is available everywhere. This has the nasty (for the operators) implication that the new generation is rarely appealing enough to sustain a premium price.

The price of wireless services has declined everywhere in the last twenty years. The graphic shows the decline in the US over the last ten years. Image credit: Bureau of Labor Statistics

An operator will need to invest money to deploy the new “G”, but its revenues will not increase. Why, then, would an operator do that? Well, because it has no choice. The new generation has better performance and lower OPEX. If an operator does not deploy the new “G”, someone else will, attracting customers and running the network at lower cost, thus becoming able to offer lower prices that will undercut other operators’ offers.

5G is a clear example of this new situation and there is no reason to believe that 6G will be any different. Actually, the more capacity (performance) is available with a given G (and 4G provides plenty to most users in most situations), the less the market is willing to pay a premium for the new G. By 2030, 5G will be fully deployed and people will get capacity and performance that exceed their (wildest) needs.
A 6G providing 100 Gbps versus the 1 Gbps of 5G is unlikely to find a huge number of customers willing to pay a premium. What is likely to happen is that the “cost” of the new network will have to be “paid” by services, not by connectivity. This opens up a quite different scenario.

The Shannon theorem, expanded to take into account the use of several antennas. In the graphic, W stands for the spectrum band (B in the original Shannon theorem) and SNR for the signal-to-noise ratio. Image credit: Waveform

Spectrum efficiency

Over the last 40 years, since the very first analogue wireless systems, researchers have managed to increase the spectral efficiency, that is, to pack more and more information into the radio waves. Actually, with 4G they have reached the Shannon limit. Shannon (and Hartley) found a relation between the signal power and the noise on a channel that limits the capacity of that channel. Beyond that limit the errors will be such that the signal is no longer useful (you can no longer distinguish the signal from the noise):

C = B · log₂(1 + S/N)

where C is the theoretically available channel capacity (in bit/s), B is the spectrum bandwidth (in Hz), S is the signal power (in W) and N is the noise power (in W).
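To make the relation concrete, here is a minimal Python sketch (the bandwidth and SNR values are illustrative, not measurements from the article) that evaluates the Shannon–Hartley formula and the corresponding spectral efficiency:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity in bit/s: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative example: a 20 MHz channel at a few signal-to-noise ratios.
bandwidth = 20e6  # Hz
for snr_db in (0, 10, 20, 30):
    snr = 10 ** (snr_db / 10)                   # convert dB to a linear ratio
    capacity = shannon_capacity(bandwidth, snr)
    spectral_efficiency = capacity / bandwidth  # bit/s per Hz
    print(f"SNR {snr_db:>2} dB -> {capacity / 1e6:6.1f} Mbit/s, "
          f"{spectral_efficiency:.2f} bit/s/Hz")
```

At the 10 to 20 dB signal-to-noise ratios typical of mobile links, the result lands in the few-bits-per-Hz range discussed below.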

Since the spectral efficiency is a function of the signal power, you cannot give it an absolute number: by increasing the signal power you could overcome noise and hence pack more bits per Hz. In practice there are limits to the power, dictated by regulation (the maximum V per metre allowed), by the kind of average noise in the transmission channel (very, very low for optical fibre, much, much higher for wireless in an urban area, even higher in a factory…), as well as by the use of battery power.

Today, in normal usage conditions and with the best wireless systems, the Shannon limit for a wireless system is around 4 bits per Hz (that is, for every available Hz in the spectrum range allocated to that wireless transmission you can squeeze in 4 bits. Notice that, because of the complexity of environmental conditions, you can find spectral efficiency numbers ranging from 0.5 to 13; what I am indicating is a “compromise”, just to give an idea of where we are). A plain 3G system may have a spectral efficiency of 1 bit per Hz, a plain vanilla 4G reaches 2.5 and, with QAM 64, reaches 4.

This limit has already been overcome using “tricks” like higher-order modulation (QAM 256 reaches 6.3 bits per Hz) and, most importantly, using MIMO, Multiple Input Multiple Output.

This latter is really a nice way to circumvent the Shannon limit. The limit applies to the use of a single channel. Of course, if you use more channels you can increase the number of bits per Hz, as long as these channels do not interfere with one another. This is actually the key point! By using several antennas, in theory, I could create many channels, one for each antenna pair (transmitting and receiving). However, these parallel transmissions (using the same frequency and spectrum band) will interfere with one another.

Here comes the nice thing: “interference” does not exist! Interference is not a property of waves. Waves do not interfere. If a wave meets another wave, it does not stop to shake hands; each one continues undisturbed and unaffected on its way. What really happens is that an observer will no longer be able to distinguish one wave from the other at the point where they meet/overlap. So the interference is a problem in the detector, not of the waves. You can easily visualise this as you look at a calm sea. You will notice small waves and, in some areas, completely flat patches. These are areas where waves meet and overlap, annihilating one another (a crest of one adds to the trough of the other, resulting in a flat area). If you have “n” transmitting antennas and “n+1” receiving antennas (each separated from the others by at least half a wavelength), then you can sort out the interference and get the signal. This is basically the principle of MIMO. To exploit it you need sufficient processing power to manage all the signals received in parallel by the antennas, and this is something I will address in a future post. For now it is good to know that there is a way to circumvent the Shannon limit and expand the capacity of a wireless system.
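As a back-of-the-envelope illustration (my own simplification, not a formula from the article), under ideal conditions the capacity grows roughly with the number of separable spatial streams, i.e. with the smaller of the transmit and receive antenna counts; real systems fall short of this because of antenna correlation and imperfect channel knowledge:

```python
import math

def mimo_capacity(bandwidth_hz: float, snr_linear: float,
                  tx_antennas: int, rx_antennas: int) -> float:
    """Rough upper bound on MIMO capacity over ideal, uncorrelated channels:
    the single-channel Shannon capacity multiplied by the number of
    independent spatial streams, min(tx, rx)."""
    streams = min(tx_antennas, rx_antennas)
    return streams * bandwidth_hz * math.log2(1 + snr_linear)

bandwidth = 20e6           # Hz
snr = 10 ** (15 / 10)      # 15 dB expressed as a linear ratio
for n in (1, 2, 4, 8):
    capacity = mimo_capacity(bandwidth, snr, n, n)
    print(f"{n}x{n} MIMO: ~{capacity / 1e6:.0f} Mbit/s")
```

Doubling the antenna pairs roughly doubles the theoretical capacity without needing more spectrum, which is exactly what makes massive MIMO attractive.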

6G will not just exploit massive MIMO; it will be able to do something amazing: spread the signal processing across many devices, each one acting as an array of antennas. Rather than having a single access point, in 6G, in theory at least, you can have an unlimited number of access points, thus multiplying the overall capacity. It would be like sending mail to many receivers: you may have a bottleneck at one point, but the messages will get to other points that in turn will be able to relay them to the intended receiver once it becomes available.

Source: https://cmte.ieee.org/futuredirections/2020/10/07/6g-does-not-exist-yet-it-is-already-here-ii/ 07 10 20

Why Network Visibility is Crucial to 5G Success

9 Mar

In a recent Heavy Reading survey of more than 90 mobile network operators, network performance was cited as a key factor for ensuring a positive customer experience, on a relatively equal footing with network coverage and pricing. By a wide margin, these three outstripped other aspects that might drive a positive customer experience, such as service bundles or digital services.

Decent coverage, of course, is the bare minimum that operators need to run a network, and there isn’t a single subscriber who is not price-sensitive. As pricing and coverage become comparable between operators, though, performance stands out as the primary tool at the operator’s disposal to win market share. It is also the only way to grow subscribers while increasing ARPU: people will pay more for a better experience.

With 5G around the corner, it is clear that consumer expectations are going to put some serious demands on network capability, whether in the form of latency, capacity, availability, or throughput. And with many ways to implement 5G — different degrees of virtualization, software-defined networking (SDN) control, and instrumentation, to name a few — network performance will differ greatly from operator to operator.

So it makes sense that network quality will be the single biggest factor affecting customer quality of experience (QoE), ahead of price competition and coverage. But there will be some breathing room as 5G begins large scale rollout. Users won’t compare 5G networks based on performance to begin with, since any 5G will be astounding compared to what they had before. Initially, early adopters will use coverage and price to select their operator. Comparing options based on performance will kick in a bit later, as pricing settles and coverage becomes ubiquitous.

So how then, to deliver a “quality” customer experience?

5G networks, being highly virtualized, need to be continuously fine-tuned to reach their full potential and to avoid sudden outages. SDN permits this degree of dynamic control.

But with many moving parts and functions — physical and virtual, centralized and distributed — a new level of visibility into network behavior and performance is a necessary first step. This “nervous system” of sorts ubiquitously sees precisely what is happening, as it happens.

Solutions delivering that level of insight are now in use by leading providers, using the latest advances in virtualized instrumentation that can easily be deployed into existing infrastructure. Operators like Telefonica, Reliance Jio, and Softbank collect trillions of measurements each day to gain a complete picture of their network.

Of course, this scale of information is beyond human interpretation, never mind deciding how to optimize control of the network (slicing, traffic routes, prioritization, etc.) in response to events. This is where big data analytics and machine learning enter the picture. With a highly granular, precise view of the network state, each user’s quality of experience can be determined, and the network adjusted to better it.

The formula is straightforward, once known: (1) deploy a big data lake, (2) fill it with real-time, granular, precise measurements from all areas in the network, (3) use fast analytics and machine learning to determine the optimal configuration of the network to deliver the best user experience, then (4) implement this state, dynamically, using SDN.
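As an illustration of that four-step loop only, here is a minimal sketch; the probe readings are simulated, the “model” is a trivial latency threshold standing in for real analytics and machine learning, and the SDN step is a print stub rather than a real controller API:

```python
import random
import time

data_lake = []  # step 1: a stand-in for the big data lake

def collect_measurements(cells):
    """Step 2: gather granular, real-time measurements (here, simulated
    per-cell latency in ms) and append them to the data lake."""
    sample = {cell: random.uniform(10, 80) for cell in cells}
    data_lake.append(sample)
    return sample

def best_configuration(measurements, latency_target_ms=40):
    """Step 3: stand-in for analytics/ML - flag cells whose latency exceeds
    the target so traffic can be steered away from them."""
    return {cell: ("offload" if latency > latency_target_ms else "keep")
            for cell, latency in measurements.items()}

def apply_via_sdn(configuration):
    """Step 4: push the chosen state to the network. A real deployment would
    call an SDN controller's API instead of printing."""
    for cell, action in configuration.items():
        print(f"{cell}: {action}")

# The four steps repeated as a continuous optimisation loop.
cells = ["cell-A", "cell-B", "cell-C"]
for _ in range(3):
    measurements = collect_measurements(cells)
    apply_via_sdn(best_configuration(measurements))
    time.sleep(1)
```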

In many failed experiments, mobile network operators (MNOs) underestimated step 2: the need for precise, granular, real-time visibility. Yet many service providers have still to take notice. Heavy Reading’s report also alarmingly finds that most MNOs invest just 30 cents per subscriber each year on systems and tools to monitor network quality of service (QoS), QoE, and end-to-end performance.

If this is difficult to understand in the pre-5G world, where a Strategy Analytics white paper estimated that poor network performance is responsible for up to 40 percent of customer churn, it’s incomprehensible as we move towards 5G, where information is literally the power to differentiate.

The aforementioned Heavy Reading survey points out that the gap between operators is widening, with 28 percent having no plans to use machine learning, 14 percent of MNOs already using it, and the rest still on the fence. Being left behind is a real possibility. Are we looking at another wave of operator consolidation?

A successful transition to 5G is not just about new antennas that pump out more data. This detail is important: 5G represents the first major architectural shift since the move from 2G to 3G ten years ago, and the consumer experience expectation that operators have bred needs some serious network surgery to make it happen.

The survey highlights a profound schism between operators’ understanding of what will help them compete and succeed, and a willingness to embrace and adopt the technology that will enable it. With all the cards on the table, we’ll see a different competitive landscape emerge as leaders move ahead with intelligent networks.

Source: https://www.wirelessweek.com/article/2017/03/why-network-visibility-crucial-5g-success

How connected cars are turning into revenue-generating machines

29 Aug

 

At some point within the next two to three years, consumers will come to expect car connectivity to be standard, similar to the adoption curve for GPS navigation. As this new era begins, the telecom metric of ARPU will morph into ARPC (average revenue per car).

In that time frame, automotive OEMs will see a variety of revenue-generating touch points for connected vehicles at gas stations, electric charging stations and more. We also should expect progressive mobile carriers to gain prominence as essential links in the automotive value chain within those same two to three years.

Early in 2016, that transitional process began with the quiet but dramatic announcement of a statistic that few noted at the time. The industry crossed a critical threshold in the first quarter when net adds of connected cars (32 percent) rose above the net adds of smartphones (31 percent) for the very first time. At the top of the mobile carrier chain, AT&T led the world with around eight million connected cars already plugged into its network.

The next big event to watch for in the development of ARPC will be when connected cars trigger a significant redistribution of revenue among the value chain players. In this article, I will focus mostly on recurring connectivity-driven revenue. I will also explore why automakers must develop deep relationships with mobile carriers and Tier-1s to hold on to their pieces of the pie in the connected-car market by establishing control points.

It’s important to note here that my conclusions on the future of connected cars are not shared by everyone. One top industry executive at a large mobile carrier recently asked me, “Why do we need any other form of connectivity when we already have mobile phones?” Along the same lines, some connected-car analysts have suggested that eSIM technology will encourage consumers to simply add connectivity for their cars to their existing wireless plans.

Although there are differing points of view, it’s clear to me that built-in embedded-SIM for connectivity will prevail over tethering with smartphones. The role of Tier-1s will be decisive for both carriers and automakers as they build out the future of the in-car experience, including infotainment, telematics, safety, security and system integration services.

The sunset of smartphone growth

Consider the U.S. mobile market as a trendsetter for the developed world in terms of data-infused technology. You’ll notice that phone revenues are declining. Year-over-year sales of mobiles have registered a 6.5 percent drop in North America and an even more dramatic 10.8 percent drop in Europe. This is because of a combination of total market saturation and economic uncertainty, which encourages consumers to hold onto their phones longer.

While consumer phone upgrades have slowed, non-phone connected devices are becoming a significant portion of net-adds and new subscriptions. TBR analyst Chris Antlitz summed up the future mobile market: “What we are seeing is that the traditional market that both carriers [AT&T and Verizon] go after is saturated, since pretty much everyone who has wanted a cell phone already has one… Both companies are getting big into IoT and machine-to-machine and that’s a big growth engine.”

At the same time, AT&T and Verizon are both showing a significant uptick in IoT revenue, even though we are still in the early days of this industry. AT&T crossed the $1 billion mark and Verizon posted earnings of $690 million in the IoT category for last year, with 29 percent of that total in the fourth quarter alone.

Data and telematics

While ARPU is on the decline, data is consuming a larger portion of the pie. Just consider some astonishing facts about data usage growth from Cisco’s Visual Networking Index 2016. Global mobile data traffic grew 74 percent over the past year, to more than 3.7 exabytes per month. Over the past 10 years, we’ve seen a 4,000X growth in data usage. After phones, cars will be the biggest category for mobile-data consumption.

Most cars have around 150 different microprocessor-controlled sub-systems built by different functional units. The complexity of integrating these systems adds to the time and cost of manufacturing. Disruptive companies like Tesla are challenging that model with a holistic design of telematics. As eSIM becomes a standard part of the telematics control unit (TCU), it could create one of the biggest disruptive domino effects the industry has seen in recent years. That’s why automakers must develop deep relationships with mobile carriers and Tier-1s.

Virtualization of our cars is inevitable. It will have to involve separate but interconnected systems because the infrastructure is inherently different for control versus convenience networks. Specifically, instrument clusters, telematics and infotainment environments have very different requirements than those of computing, storage and networking. To create a high-quality experience, automakers will have to work through hardware and software issues holistically.

Already we see Apple’s two-year iPhone release schedule expanding to a three-year span because of gentler innovations and increasing complexity. The consumer life cycle for connected cars will initially have to be much longer than it is for smartphones because of this deep integration required for all the devices, instruments and functionalities that operate the vehicle.

Five factors unique to connected cars

Disruption is everywhere within the auto industry, similar to the disruption that shook out telecom. However, there are several critical differences:

  • Interactive/informative surface. The mobile phone has one small screen with all the technology packed in behind it. Inside a car, nearly every surface could be transformed into an interactive interface. Beyond the instrumentation panel, which has been gradually claiming more real estate on the steering wheel, there will be growth in backseat and rider-side infotainment screens. (Semi-) autonomous cars will present many more possibilities.
  • Processing power. The cloud turned mobile phones into smart clients with all the heavy processing elsewhere, but each car can contain a portable data center all its own. Right now, the NVIDIA Tegra X1 mobile processor for connected cars, used to demonstrate its Drive CX cockpit visualizations, can handle one trillion floating-point operations per second (flops). That’s roughly the same computing power as a 1,600-square-foot supercomputer from the year 2000.
  • Power management. The size and weight of phones were constrained for many years by the size of the battery required. The same is true of cars, but in terms of power and processing instead of the physical size and shape of the body frame. Consider apps like Pokémon Go, which are known as battery killers because of their extensive use of the camera for augmented reality and constant GPS usage. In the backseat of a car, Pokémon Go could run phenomenally with practically no effect on the car battery. Perhaps car windows could even serve as augmented reality screens.
  • Risk factors. This is the No. 1 roadblock to connected cars right now. The jump from consumer-grade to automotive-grade security is just too great for comfort. Normally, when somebody hacks a phone, nobody gets hurt physically. A cybersecurity report this year pointed out that connected cars average 100 million lines of code, compared to only 8 million for a Lockheed Martin F-35 Lightning II fighter jet. In other words, security experts have a great deal of work to do to protect connected cars from hackers and random computer errors.
  • Emotional affinity. Phones are accessories, but a car is really an extension of the driver. You can see this aspect in the pride people display when showing off their cars and their emotional attachment to their cars. This also explains why driverless cars and services like Uber are experiencing a hard limit on their market penetration. For the same reasons, companies that can’t provide flawless connectivity in cars could face long-lasting damage to their brand reputations.

Software over hardware

The value in connected cars will increasingly concentrate in software and applications over the hardware. The connected car will have a vertical hardware stack closely integrated with a horizontal software stack. To dominate the market, a player would need to decide where their niche lies within the solution matrix.

However, no matter how you view the hardware players and service stack, there is a critical role for mobility, software and services. These three will form the framework for experiences, powered by analytics, data and connectivity. Just as content delivered over the car radio grew to be an essential channel for ad revenue in the past, the same will be true in the future as newer forms of content consumption arise from innovative content delivery systems in the connected car.

As the second-most expensive lifetime purchase (after a home) for the majority of consumers, a car is an investment unlike any other. Like fuel and maintenance, consumers will fund connectivity as a recurring expense, which we could see through a variety of vehicle touch points. There’s the potential for carriers to partner with every vehicle interaction that’s currently on the market, as well as those that will be developed in the future.

When consumers are filling up at the gas pump, they could pay via their connected car wallet. In the instance of charging electric cars while inside a store, consumers could also make payments on the go using their vehicles. The possibilities for revenue generation through connected cars are endless. Some automakers may try the Kindle-like model to bundle the hardware cost into the price of the car, but most mobile carriers will prefer it to be spread out into a more familiar pricing model with a steady stream of income.

Monetization of the connected car

Once this happens and carriers start measuring ARPC, it will force other industry players to rethink their approach more strategically. For example, bundling of mobile, car and home connectivity will be inevitable for app, data and entertainment services as an integrated experience. In the big picture, though, connectivity is only part of the story. Innovative carriers will succeed by going further and perfecting an in-car user experience that will excite consumers in ways no one can predict right now. As electric vehicles (EVs), hydrogen-powered fuel cells and advances in solar gain market practicality, cars may run without gas, but they will not run without connectivity.

The first true killer app for connected cars is likely to be some form of new media, and the monetization potential will be vast. With Gartner forecasting a market of 250 million connected cars on the road by 2020, creative methods for generating revenue streams in connected cars won’t stop there. Over the next few years, we will see partnerships proliferate among industry players, particularly mobile carriers. The ones who act fast enough to assume a leadership role in the market now will drive away with an influential status and a long-term win — if history has anything to say about it.

Note: In this case, the term “connected” brings together related concepts, such as Wi-Fi, Bluetooth and evolving cellular networks, including 3G, 4G/LTE, 5G, etc.

Source: http://cooltechreview.net/startups/how-connected-cars-are-turning-into-revenue-generating-machines/

The Mobile Backhaul Evolution

2 Oct

As mobile data usage proliferates, so does the demand for capacity and coverage, particularly with the rise of connected devices, data-hungry mobile apps, video streaming, LTE roll-outs and the popularity of the smartphone and other smart devices. With mobile data traffic expected to double annually, existing mobile backhaul networks are being asked to handle more data than they were ever designed to cope with, and operators are being asked to deal with a level of capacity demand far greater than ever could have been imagined.

Breaking the backhaul bottleneck
The demand on operators to provide more, and faster, services for the same cost is putting mobile backhaul networks under intense pressure, and effectively means that operator ARPU (Average Revenue per User) is in decline. iGR Research has projected that demand on mobile backhaul networks in the US market will increase 9.7 times between 2011 and 2016, fueled by data consumption growing faster than operators can keep up with. Surging data traffic is stressing existing connections and forcing many operators to invest in their network infrastructures in order to remain competitive and minimize subscriber churn.
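For context, a 9.7x increase over those five years corresponds to roughly 57% compound annual growth; a quick check (my own arithmetic, not a figure from the article):

```python
growth_factor = 9.7   # total backhaul demand growth projected for 2011-2016
years = 5
cagr = growth_factor ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # roughly 57.5% per year
```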

Mobile operators realize that in order to meet capacity, coverage and performance demands, while raising their ARPU, they need to evolve their mobile backhaul networks to perform better and be more efficient. As the capacity and coverage demands accumulate, mobile backhaul evolution comes to the forefront as an area that operators must address and align with growing demand.

Evolution not revolution
As wireless technologies have developed over the years, a mixture of transmission technologies and interfaces to Radio Access Network (RAN) equipment has been used to support communications back to the mobile network operator, including 2G, 3G and now 4G LTE. Today, operators evolve their backhaul by converging multiple backhaul technologies into one unified technology and multiple parallel backhaul networks into a single all-IP network. Based on IP and MPLS, a single all-IP network makes more efficient use of network resources, reduces operational costs, and is cheaper to manage and maintain. IP gives operators the ability to converge RAN traffic, and MPLS technology addresses the challenge of

Source: A Knowledge Network Article by the Broadband Forum http://www.totaltele.com/view.aspx?C=1&ID=487671

The case for sponsored video

10 Feb

 

 
AT&T Sponsored Data 
We have all seen the announcement at CES this January. AT&T is to offer a new plan for its 4G customers, allowing companies to sponsor traffic from specific apps or content. The result would be that subscribers would not be charged for data traffic resulting from these interactions, the sponsoring company picking up the bill.
While there is not much detail available on how the offer works and what price the sponsor would be expected to pay for the sponsored content (after all, subscribers all have very different plans, with different charging/accrual models), there has already been much speculation and commentary in the press and among analysts about the idea.
I haven’t really read anything yet to convince me whether this is a good or bad idea, so I thought I would offer my 2 cents.

Costs are rising, ARPU is declining 

Ralph de la Vega, AT&T Mobility’s CEO, was quoted, commenting on the press release, as saying that AT&T has seen 30,000% growth in mobile data in the last 6 years. This growth in traffic resulted in an increase in costs, paving the way for the license bids and the roll-out of LTE. US ARPUs are declining for the first time in history, and with rising costs, network operators must find new revenue streams. Since video now accounts for over 50% of data traffic and is growing, it is a good place to start looking.

Mobile advertising is underutilized, but there is appetite

According to KPCB, about $41B was misspent by advertisers in the US alone on old media (print, radio) relative to the time consumers now spend on new media (internet, mobile). The Internet Advertising Bureau’s 2013 study (people were interviewed in Australia, China, Italy, South Korea, Brazil, the UK, India, Russia, Turkey and the US) shows that a large proportion of users are “ok with advertising if [they] can access content for free”. The same study shows that advertisers look at targeting (45%) and reach (30%) as the most important criteria when selecting a medium for advertisement. Lastly, video pre-roll seems to be the preferred format for advertising on tablets and smartphones.

Network operators are not (well) organized to sell advertising

Barring a few exceptions, network operators do not have the means to sell sponsored data efficiently. The technology aspect is sketchy: isolating specific data traffic from its context can be difficult (think of a Facebook app with an embedded YouTube video served by a CDN), and content/app providers do not design their services with network friendliness in mind. On the business front, the challenges are, I believe, bigger. Network operators have failed repeatedly with coopetition models. They do not have a wholesale division or mindset (everyone is scared of being only a pipe). On the bright side, Verizon, Vodafone and AT&T are putting forward APIs to start giving content providers more visibility and varying levels of control over the user experience.

Regulatory forces are not mature for this model

We have seen the latest net neutrality comments and fears flaring up in the media. Sponsored data and/or video is going to have to be managed properly if AT&T actually wants to make it a business. I am very skeptical of AT&T’s statement that “Sponsored Data will be delivered at the same speed and performance as any non-Sponsored Data content.” I doubt that best effort will be sufficient when / if advertisers are ready to put real money on the table. They will need guarantees, service level agreements and analytics to prove that the ad was served to completion in good enough quality.

In conclusion, sponsored data is going to be difficult to put in place, but there is an appetite for it. Technically, it would be easier and probably more beneficial to limit the experience to video only. Culturally and business-wise, operators need to move in this direction if they want to compete against companies for whom advertising is the dominant model (Google, Facebook, LinkedIn…). In order to do so, separating video from general data traffic and managing it as a separate service can go a long way. The biggest challenge will remain one of mindset and organization. I am not sure that sending an email to sponsoreddata@att.com is going to get McDonald’s to pay for my 30 minutes of YouTube if I buy a Big Mac combo.

 
Source: http://coreanalysis1.blogspot.ca/2014/02/the-case-for-sponsored-video.html

The 2020 network: How our communications infrastructure will evolve

8 Oct

Let’s start with two basic questions: In the year 2020, what will the network look like? What are the technology building blocks that will make up that network? In order to answer these questions, we need to examine some likely truths about the telecom industry, and understand what current realities are shaping decisions about the future.

The $1 ARPU economy

The first likely truth is the emergence of the $1 ARPU (average revenue per user) economy. The shift from mobile telephony to mobile compute has irreversibly shifted our attentions and our wallets away from undifferentiated voice, text and data services to the billions of individual apps encompassing all user needs.


This “few” to “many” application economy drives pricing pressure, resulting in a $1-ARPU-per-service revenue foundation. That service unit could take the form of a Nike wellness application or a production-line sensor connected to a General Electric industrial control system. Whether serving human or machine, the value of a service is being driven down to a dollar.

These forces are not binary. We are still stuck between the old and new realities. The telecom landscape has become a tug-of-war. On one end of the rope is increasing capital investment driven by growing data demand. On the other end is increasing price competition, causing diminishing margins. Profitable growth will require rethinking the network end to end.

Telecom data center


Let’s start with compute. The prevailing telecom services consist of voice and messaging. These applications are typically part of mobile data service subscription bundles. Data center equipment is designed to fit those few applications. Network sub-systems come with significant software built in. The resulting bespoke systems require significant operating expenses. The business model here is to minimize upfront costs and pay ongoing maintenance on that hardware and software. The distribution model is limited to the telecom provider and a specific mobile client.

There is a Henry Ford “any color you like as long as it’s black” philosophy to telecom architecture. It is eminently suitable for high-ARPU services like mobile telephony, but it’s simply not flexible or cost-efficient enough to give choice to the mobile compute consumer. To compete on cost, the data center must be more efficient.

The key metric here is the number of servers operated by a single system administrator. Today that ratio is around 40:1; this involves individual servers with unique installs, low levels of automation, compliance requirements and time-intensive support requests. To reduce the marginal cost of adding an application, carriers would need to migrate to cloud architectures. Cloud systems offer a unified platform for applications and allow for high levels of automation, with server-to-system-administrator ratios greater than 5000:1. The higher the ratio, the more the system administrator’s role becomes that of a high-level software developer: instead of hitting a reset switch, they’re finding bugs with the help of custom firmware. The consequence is a massive competency shift in the operations team.
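To get a feel for why the ratio matters, here is a rough calculation; the fully loaded administrator cost is an assumption made up for illustration, as the article gives no salary figures:

```python
def admin_cost_per_server(servers_per_admin: int,
                          admin_cost_per_year: float = 150_000.0) -> float:
    """Administration cost carried by each server per year, assuming one
    fully loaded administrator salary (the dollar figure is an assumption)."""
    return admin_cost_per_year / servers_per_admin

for ratio in (40, 5000):
    print(f"{ratio:>4} servers per admin -> "
          f"${admin_cost_per_server(ratio):,.0f} per server per year")
```

At 40:1 each server carries thousands of dollars of administration cost per year; at 5000:1 it carries tens of dollars, which is what makes the marginal cost of adding an application tolerable.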


These technologies are rooted in the Google and Facebook hyperscale models. The hyperscale approach is the polar opposite of the telecom model. The application is built using a scale-out commodity system design, where the objective is to minimize the total cost over the life of the rack and the most expensive component is the human system administrator. The operational pattern is the reverse of chasing hardware uptime: you simply switch off a shelf when it fails and fall back to another scaled-out instance. When all the shelves in a rack have failed, it is retired and replaced with the next generation of hardware. The net consequence is to swap long-term operational cost for capital cost depreciated over much shorter periods of time.

Software-defined network

Second, let’s double-click on the network. The network connects the compute with mobile endpoints through rigid overlays, such as multiprotocol label switching (MPLS) or virtual LANs, which force traffic through one-size-fits-all network services such as load balancers and firewalls.


To make the network more flexible, the mobile industry needs to embrace software-defined networking and network function virtualization. The central idea is to abstract the network such that the operator can program services instead of creating static network overlays for every new service. All network services are moved from the network into data centers as applications running on commodity or specialized hardware, depending on performance. The implication is that time to market can be reduced from years to hours.

How would this translate into a real world example? Consider writing a script that would map all video traffic onto a secondary path during peak hours. First we need to get the network topology, then allocate network resources across the secondary path, and finally create an ingress forwarding equivalence class to map the video traffic to that path. Today this would require touching every network element in the path to configure the network resources, resulting in a significant planning and provisioning cycle.

The benefit of software-defined networks is that the command sequences to configure the network resources would be automated through a logically centralized API. The result is an architecture that allows distributed network elements to be programmed for services through standard sequential methods. This effectively wrests control of the network away from IP engineers and puts it in the hands of IT software teams.
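The three manual steps in the example above collapse into a short script once a logically centralised controller exposes them through an API. The sketch below is illustrative only: the controller URL, endpoints and payload fields are invented for the example and do not correspond to any particular controller’s real API (OpenDaylight, ONOS and others each expose their own, different interfaces):

```python
import requests

# Hypothetical, logically centralised SDN controller (made-up endpoint names).
CONTROLLER = "https://sdn-controller.example.net/api"

def offload_video_during_peak(primary_path_id: str) -> None:
    # 1. Fetch the current network topology from the controller.
    #    (A fuller version would use it to choose candidate secondary links.)
    topology = requests.get(f"{CONTROLLER}/topology").json()

    # 2. Ask the controller to allocate resources along a secondary path.
    secondary = requests.post(
        f"{CONTROLLER}/paths",
        json={"avoid": [primary_path_id], "bandwidth_mbps": 500},
    ).json()

    # 3. Install an ingress rule (the forwarding equivalence class) mapping
    #    video traffic onto that path during peak hours.
    requests.post(
        f"{CONTROLLER}/flows",
        json={
            "match": {"traffic_class": "video"},
            "action": {"forward_to_path": secondary["id"]},
            "schedule": {"active": "peak-hours"},
        },
    ).raise_for_status()
```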

Internet of things math

What is the end game of unleashing these IT software teams on the network? The goal is to create a “network effect” which can fuel a transformation towards an internet of things. To achieve this, a critical requirement for the software abstractions in the data center and the network is RESTful APIs. Adopting web APIs across the network allows telecom services to be unlocked and combined with other internal or external assets. This transforms the network from a black box of static resources into a marketplace of services. A network marketplace will fuel the network effects required to serve the crush of connections anticipated by 2020. The choice of web interfaces is therefore critical for success.

Let’s look at the numbers to understand why. Today there are about half a million developers who can use proprietary telecom service creation environments (for example, IP Multimedia Subsystem). With modern RESTful methods, there is an addressable audience of about five million developers. The network vision of 2020 is unlike the current mobile broadband ecosystem, where 1 billion human-connected devices can be mediated by half a million telecom developers. In the $1 ARPU future, 50 billion connected devices will need to be mediated by 5 million developers. This reality compels a shift of several orders of magnitude in the requisite skills and number of developers. We’re simply going to need a bigger boat, and REST is the biggest boat on the dock.
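A quick check of those ratios (my arithmetic, using the figures quoted above) shows why the developer base has to broaden:

```python
today = 1e9 / 0.5e6     # 1 billion connected devices, 0.5 million developers
future = 50e9 / 5e6     # 50 billion devices, 5 million developers
print(f"Today: ~{today:,.0f} devices per developer")   # ~2,000
print(f"2020:  ~{future:,.0f} devices per developer")  # ~10,000
```

Even with ten times as many developers, each one would have to mediate five times as many endpoints, which is the argument for choosing the interface style with the largest possible developer audience.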

5G: Choice and flexibility

So, we’ve looked at data center and network, but we still need to address the last mile. This brings us to a second likely truth: 5G will not just be about speed.

I understand that the ITU has not yet qualified “5G” requirements; however, the future always experiments in the present.

The 2020 network will need to support traffic volumes more than 1,000x greater than what we see today. In addition, we’ll need connections supporting multi-gigabit throughputs as well as connections of only a few kilobits per second. Smart antennas, ultra-dense deployments, device-to-device communications, expanded spectrum (including higher frequencies) and improved coordination between base stations will be foundational elements of such networks. The explosion and diversity of machine-connected end points will define use cases for low-bandwidth, low-latency and energy-efficient connections.


Therefore, 5G will consist of a combination of radio access technologies, with multiple levels of network topologies and different device connectivity modes. It’s not just a single technology.

5G will likely require similar abstraction requirements as in software-defined networks to provide loosely coupled and coarsely grained integration with end-point and network-side services. The result will be applications aware of the underlying wireless network service, delivering rich new experiences to the end-user.

The research required for 5G is now well underway. Ericsson is a founding member of the recently formed METIS project, a community aimed at developing the fundamental concepts of 5G.

Conclusion

Harvard Business School professor Clayton Christensen recently said: “I think, as a general rule, most of us are in markets that are booming. They are not in decline. Even the newspaper business is in a growth industry. It is not in decline. It’s just their way of thinking about the industry that is in decline.”

The mobile industry is undergoing a dramatic rethinking of business foundations and supporting technologies. In many ways, technologies such as cloud, software-defined networking and 5G result in a “software is eating the network” end game.  This in turn will promote opportunities that are much larger than just selling voice and data access. There is a possibility of vibrant ecosystems of users and experiences that can match the strong network effects enjoyed by over-the-top providers. The 2020 telecom network will enable service providers to create a network marketplace of services, and deliver the vision of a networked society.

Source: http://gigaom.com/2013/10/07/the-2020-network-how-our-communications-infrastructure-will-evolve/

Telcos in hot pursuit of high-paying subscribers

28 Jan

New tariff schemes, loyalty plans to boost revenues. – Telecom operators are now on a hunt for high-paying subscribers, rather than mere SIM additions, and are keen to upgrade low tariff users to post-paid schemes.

This, apart from giving high margins, would also ensure ‘stickiness’ of the customer on the network.

However, the dependency on pre-paid plans has not eroded and still continues, as they are the preferred mode of payment, not to mention their other benefits. Operators want more pre-paid, but revenue-generating, customers on their network.

“The average revenues per user (ARPU) from post-paid subscribers are at least four times that from a pre-paid user as minutes of usage of the latter is much lower. Hence, it would be a good idea for an operator to convert pre-paid to post-paid,” said PricewaterhouseCoopers India Executive Director Sivarama Krishnan.

“However, it also has its repercussions. A pre-paid user foots the bill instantly, if not in advance, while in case of post-paid it comes in much later,” Krishnan added.

The country’s total wireless customer base stood at 890.60 million as of November 2012, according to Telecom Regulatory Authority of India data. Of this, about 97 per cent of users are on pre-paid connections.

“There is a process of up-sell to post-paid schemes, even though ideally it should be a mix of pre- and post-paid schemes. This is because a large part of pre-paid users are not serious customers, while post-paid also brings more loyal customers,” Jaideep Ghosh, Partner at KPMG India, said.

NEW SCHEMES

The majority of new schemes launched in 2012 were for post-paid users as pre-paid margins are lower. The operators were also looking at data-centric services such as 3G and 4G to improve revenues, an analyst with a Mumbai-based brokerage said.

Uninor, a company that only offers pre-paid services, has launched more than 100 different variants and products in the past year, officials said. Many of the telecom operators declined to divulge the number of schemes they launched last year, but admitted that a majority were launched for post-paid users. “New products and schemes serve to both attract customers and increase usage. This is a natural evolution once an operator has reached a critical mass in terms of subscriber share. Pre-paid markets, being largely driven by youth segment, seek a fair extent of excitement with customers constantly seeking the latest and newest best deal,” Uninor Chief Product Officer Amaresh Kumar said.

Voice and SMS continue to be revenue earners in the Indian market. Consequently, offers relating to talk-time on recharge vouchers traditionally generate the highest revenues for mobile operators, he added.

PREFERRED MODE

“It is important to mention that pre-paid is the preferred mode of payment for most mobility customers across the nation and also the faster-growing market segment. As a youth brand, Tata DoCoMo is continuing to chase aggressive growth targets on pre-paid,” Gurinder Singh Sandhu, Marketing Head at Tata DoCoMo, said.

Tata DoCoMo views both pre- and post-paid as mere modes of payment, not different services, Sandhu said. Telecom companies’ cutting down on freebies, discounts and validity periods, a move that would result in subscribers paying 20-30 per cent more, is a step towards retaining high-value customers. Post-paid or pre-paid doesn’t matter.

Source: The Hindu Business Line, via http://magteletalk.wordpress.com/2013/01/24/telcos-in-hot-pursuit-of-high-paying-subscribers/

Making More Money in M2M: The M2M Service Company

22 Jan

Machine-To-Machine (M2M) has many potential applications with big benefits to end-customers, but how do suppliers including telcos make enough money to make it worth investing in? Here’s a preview of a new M2M business model we’re working on, the ‘M2M Services Company’, based on an approach successfully used by energy and IT companies to share end-customer productivity gains. This preview includes an example use-case.

M2M – where’s the money?

For telcos, M2M ARPUs are low, and are also being eroded, like ARPUs across the industry. M2M used to be thought of as the next SMS, a business with potentially huge scale and high margins on relatively undemanding traffic volumes. But it’s turning out to be a rather disappointing generic ISP product. So, more recently, operators have been increasingly keen to substitute value from bulk telecoms products with value from software and managed services, providing service enablers that help the customer deploy and manage the device fleet.

However, few operators have any illusions about being providers of compelling software. In most cases, with the notable exception of Telenor, they have had to partner with third-party developers such as Jasper Wireless and Pachube. So that’s a partial solution to the problem, but at the cost of sharing some of the revenue and losing much of the potential for differentiation.

So what else can be done?

M2M: like selling cosmetics or selling hope?

Charles Revson, the founder of cosmetics giant, Revlon once famously said: “In the factory, we make cosmetics. In the drugstore, we sell hope.”

In M2M, operators have traditionally tried to sell the equivalent of ‘cosmetics’ – something that makes sense to operators, such as SIMs, gigabytes of data traffic, or bundled messaging tariffs. Whether it was a sheep, for Telenor, an Amazon Kindle for Sprint or Vodafone, or one of these baggage-tracking devices that SMS you their location, it worked a bit like that.

But here’s a question: how many M2M customers actually want “SIMs”, “modules”, “data”, or “connectivity”?

Consider this other famous remark by the US energy-efficiency pioneer Amory Lovins:

People don’t want energy. They want cold beer.

 

Energy, as such, isn’t particularly useful. It is only valuable because of the work that can be done with it – for example, cooling the beer. M2M is very much like that: its customers typically have business requirements that don’t have very much to do with telecommunications, but which do depend on telecoms. What if they were offered the solution to their business requirement, rather than just the minutes of use?

An Innovative Business Model in Energy

In the energy sector, people began doing just this at the end of the 1970s. A company called Time Energy, which makes timers, thermostats, and other controls, noticed that the biggest challenge for their sales force was convincing the customer that they really would save substantial amounts of money on their energy bills.

They came up with an elegant solution: offer the goods free, in exchange for a share of the savings. More precisely, Time Energy would offer 100% vendor financing and the customer would pay them back at a rate based on the reduction in their energy expenses. However you cut it, Time and the customer shared in the reduced energy costs, and the customer didn’t have to put any money down up front. It’s hard to say no to free.
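A minimal sketch of that shared-savings mechanic, with invented numbers purely for illustration (the article does not give Time Energy’s actual terms):

```python
def shared_savings_payment(baseline_cost: float, measured_cost: float,
                           vendor_share: float = 0.5) -> float:
    """Periodic payment to the vendor: a share of the verified reduction in
    the customer's energy bill. No measured savings means no payment."""
    savings = max(0.0, baseline_cost - measured_cost)
    return savings * vendor_share

# Example: a site that used to spend $10,000/month on energy and now,
# with the controls installed, spends $8,200/month.
payment = shared_savings_payment(baseline_cost=10_000, measured_cost=8_200)
print(f"Vendor receives ${payment:,.0f}/month; "
      f"customer keeps ${1_800 - payment:,.0f}/month of the savings")
```

The same split-the-gain logic is what the article goes on to apply to M2M productivity improvements.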

This business model was later developed into the energy service company or ESCO, which applies the same idea to a broader selection of energy-related products and services. Some projects even include deeper re-engineering of production processes, or include the option to take the customer’s share as a lump sum. IBM extended it into the IT sector, applying it to major enterprise computing projects.

What if the same approach was applied to M2M? Most M2M projects are all about some kind of cost reduction or productivity gain in a business process. Quite often, they involve energy saving, but that’s not necessary.

The approach could:

    • Use the power of free to sell the project
    • Share in the underlying productivity gain
    • Create sticky customers
    • Make M2M more of a “SMS-like business”
    • Make practical use of big data

Data is the Key to New Business Models in M2M

 

The ESCO experience shows that data is critical to the business model. Typically, it’s necessary to carry out a careful energy audit of the customer premises in advance in order to estimate the potential energy savings (and therefore revenue) and to cost out the job. Then, after closing the deal, it’s necessary to come back and measure the effects, in order to settle the bills and to guarantee quality. In an M2M service company project, data will be on the front line. Whatever the metrics are, it will be critical to master the data in order to make it all work.

Chart: project scale versus “data legibility”, with a “zone of gold” of large projects that have good data.

Analysing the success factors for such a business, we identified two axes, one of scale and one of “data legibility”. Some projects – the ones in the “zone of gold” on the chart – have good data and a big payoff. These are likely to get done in-house, and therefore they aren’t really available to operators. Some have terrible data and are tiny, and therefore can be safely ignored.

But there are two groups of opportunities we see as ideal for the M2M service company model: ones where the benefit is clear, but the data isn’t good enough, and ones where the data is there, but the individual project size is too small to be worth doing. We can get at the first group through better M2M data collection and analytics. We can get at the second through aggregation and financing.

Example Use Case: UK Rail

Here’s an example. In the UK, train leasing companies acquire and maintain trains for the UK railways’ train operating companies and the Department for Transport (Rail). Today, it is typical that the contract between DfT(R), the TOC, and the leasing company specifies a number of “diagrams” – roughly, a diagram is one train-movement – rather than a number of physical trains. This means that a major opportunity exists for the competitor who can maximise the amount of time the trains spend on the line, thus both making the service more efficient and reducing the number of trains they need to buy.

Collecting live data from instrumentation on the trains could make it possible to bring them in for maintenance on an as-needed or predictive basis, rather than on a schedule, and therefore improve their operational efficiency. (Similar projects exist for Rolls-Royce jet engines, for example.) An M2M service company could take on the project, supplying the hardware, the service enablement, the connectivity, and some or all of the data work, in exchange for a share in the gains from improved availability and on-time running. As there are numerous TOCs, leasing companies, and M2M applications in UK rail, such a company might even aggregate more projects and become a sector-wide platform.

That could, in turn, create opportunities for new and creative re-use of the data resource.

Source: http://www.telco2.net/blog/2013/01/the_m2m_service_company_a_new.html#more

Don’t go technical on telecom

21 Jan

With too many policy matters in flux, using technical analysis for telecom stocks may not throw up the right results

Several telecom policy issues will be resolved one way or another in the next couple of months. As of now, that’s all the market knows about the sector. The resolution could be relatively good or bad for specific companies.

It guarantees a shakeout of severe dimensions. The service providers left functional at the end of this chaos will face less competition, which means that they may be able to raise tariffs. India is also a market which still has plenty of growth opportunities.

More sophisticated consumers will move to heavier mobile data usage, raising average revenue per user (ARPU). Already, mobile internet traffic exceeds fixed-line internet traffic in India, and if 3G coverage were less spotty, usage would pick up faster. Plus, there’s under-penetration in rural and semi-urban geographies, so subscriber numbers should also grow. Both growth factors are long-term, of course.

Indian telecom service providers (telcos) will, however, have higher cost inputs. First, there will be the enhanced cost of various bands of spectrum. Whatever that cost finally turns out to be, it will be much more expensive than in a prior period. There could also be service disruptions and additional capex will be required if there is re-farming.

The price of diesel is also being hiked in stages. That is another key input. Telcos must use backup power and usually employ gensets for the purpose. The telecom industry is probably the biggest diesel consumer in India. The demand for backup power in telecom is not going to ease until the unrelated mess in the power sector is dealt with.

This may mean that rural and semi-urban networks will be loss-making, given low penetration and low ARPU. Financing for licenses and capex will either be high-cost (if rupee-denominated) or taken in forex at high exchange risk. International telecom majors are also wary of India exposure after multiple fiascos, which puts a brake on potential FDI inflows.

Making a judgement call involves balancing the prospects of long-term growth in a less competitive scenario against rising costs and policy uncertainty. Generating hard numbers with any accuracy is impossible even for insiders. One has to make assumption upon assumption regarding policy, and also assume long-term growth rates for traffic, ARPU and subscriber numbers.

PSUs MTNL and BSNL are both loss-makers and unlikely to turn around without a dramatic display of political will. RCom has too much debt on the balance sheet. Airtel has seen declining quarterly margins for three years now, and this has led to a change in top management.

The market seems selectively optimistic, however. Idea, RCom and Airtel have all seen sustained uptrends in the past 30 days. Airtel is up 12 per cent even as the Nifty was up 3 per cent. Idea is up 23 per cent and RCom up 10 per cent, while the Nifty Junior (to which both stocks belong) is up 2 per cent. Even Tata Communications, a specialised player, has risen 5 per cent. MTNL has stagnated meanwhile. In technical terms, the patterns generated by Airtel, Idea and RCom are all bullish. Newsflow pertaining to spectrum auctions at the end of the month could affect this. But the effect may be positive as well.

The normal way to handle this sort of trending pattern is to set a trailing stop loss and go long. If the stock moves up, raise the stop loss. Hold until such time as the stop is hit. In this case, the trader must be prepared for extraordinary volatility every time pertinent information hits the market. If you’re using leveraged instruments like stock futures, single session gains or losses could be really huge, given the big (4000 share) lots of Idea and RCom.

The niggling worry is that information is not symmetrically available to all players. There are too many policy matters in flux and too many strategic management decisions on the table. Technical analysis works best when the market has symmetric information flow and knows what to discount. This is not such a situation.

Source: http://www.business-standard.com/india/news/dont-go-technicaltelecom-/499418/

KPMG punts customer-centricity

21 Aug

Telecom operators worldwide are facing tough challenges as voice market growth saturates or slows down. Strong competition from adjacent industries (over-the-top brands, device companies and social media) shows clear signs of limiting telecom operators’ role in the sector.
Revenue growth from voice-based services is tapering off, with more and more customers demanding data services. Margins are under threat.
While in most African states voice is, and will continue to be, the dominant business for telecom service providers, average revenue per user (ARPU) is declining because of these challenges and enhanced competition.
KPMG has built a significant practice in the telecoms industry in Africa. Its Africa Telecom Group (ATG) is a joint venture between key KPMG member firms from Africa, Europe, the Middle East and India. Each brings specific knowledge and expertise in the telecommunications industry to the table, creating a centre of excellence that can help clients succeed in Africa.
In response to the sector’s challenges, KPMG believes customer focus is essential.
“Increasingly, global organisations from services sectors such as telecoms are incorporating customer-centricity in their core business strategy,” says Johan Smith, director and head of KPMG ATG.
It is imperative for telecom operators in Africa, and elsewhere, to focus on the customer in order to sustain and grow their business. They also need to leverage innovation and introduce new products and value-added services to manage the ARPU decline and retain a prominent position in the value chain.
During the Fourth Annual Customer Retention and Profitability Summit held on 3 and 4 July in Johannesburg, KPMG experts presented their thoughts on customer-centricity, innovation and enhancing profitability.
A panel discussion with African telecom service providers, led by Jaideep Ghosh, partner in business effectiveness and strategy, KPMG in India, focused on innovation and value-added services to enhance ARPUs.
The panellists agreed that value-added services are now practically part of the core proposition. Further discussion led to an agreement that telecom operators need to collaborate significantly more with ecosystem players over and above usage-based and bolt-on value-added services.
The panel also recognised the impact of social media and over-the-top contenders, and the need for telecom operators to innovate continuously to retain value.
“Innovation needs to be institutionalised in the telecom service provider’s DNA, and has to be centred on the customer, taking into account the importance of social media,” says Ghosh.
Joaquim Ribeiro, senior manager in Management Consulting, KPMG in Portugal, also presented at the conference. He focused on the cost side of the profitability equation.
“I think we should discuss the philosophy of improved customer service but not at any cost. Cost to serve has been increasing constantly with no direct link to increasing ARPU.”
The presentation sparked an interesting debate about how best to serve and retain customers while controlling the cost at a time when the cost of capital is high and shareholders are increasingly worried about the earnings before interest, taxes, depreciation and amortisation (EBITDA) margin.
KPMG has been very active in the operating expense (OPEX) and capital expenditure (CAPEX) optimisation space without losing focus on customer-centricity and core strategy.
“Customer-centricity is no longer a mere buzzword in the telecoms industry. It has become a business necessity to stay ahead of competition,” concludes Smith.

Source: http://www.it-online.co.za/2012/08/20/kpmg-punts-customer-centricity/ Posted by IT-Online on Aug 20, 2012