
Data monetization and customer experience optimization using Telco data assets

27 Jan

To overcome the risk of being relegated to a utility or dumb pipe, telecom service providers (TSPs) today are looking to diversify, adopting alternative business models to generate new revenue streams.

In recent times, adopting customer experience (CX) and data monetization initiatives has been a key theme across all industries. Although many Tier-1 TSPs are leading this transformation by using new technologies to improve CX and profitability, many TSPs have yet to embark on this challenging but rewarding journey.

Building and implementing a CX management and data monetization strategy

Data monetization is often misunderstood as making dollars by selling data, but what it really means is using data assets to drive revenue, whether by increasing the top line or improving the bottom line. The benefits can be tangible or intangible, and the monetization internal or external.

According to Gartner's study "The 2019 CIO Agenda: Securing a New Foundation for Digital Business" (published October 15, 2018), most data and analytics leaders are looking to increase investments in business intelligence (BI) and analytics.

Although external monetization opportunities are limited due to strict regulations, a plethora of opportunities exists for TSPs to monetize data both internally (regulated, but much less so than external use) and externally via a marketplace (highly regulated). If TSPs can shift their mindset from selling data to using data insights for monetization and improving CX, they can adopt a significant number of use cases to realize an immediate positive impact.

Tapping into and utilizing insights around customer behavior acts like a Swiss Army knife for businesses. You can use these insights to drive CX, hyper-personalization and localization, micro-segmentation, subscriber retention, loyalty and rewards programs, network planning and optimization, internal and external data monetization, and more. The following are some use cases that can be driven using CX and data monetization strategies:

  • Segmentation/micro-segmentation (cross-sell, up-sell, targeted advertising, enhanced market locator); for example:
    • Identify targets for consuming baby products or up-selling a kids-related TV channel
    • Identify females in the age range of 18-35 to target for high-end beauty products or apparel

You can build hundreds of such segments.

  • Personalized loyalty and reward programs (incentivize customers with what they like). For example, movie tickets or discounts for a movie lover, or food coupons and deals for a food lover.
  • CX-driven network optimization (allocate more resources to streaming hotspots with high-value customers).
  • Identifying potential partners for joint promotions. For example, bundling device offers with a music app subscription.
  • Hyper-personalization. For example, personalized recommendations for on-portal apps and websites.
  • Next best action and next best offer. For example, intelligent bundling and packaging of offerings.

Challenges with driving CX and data monetization

In this digital era, TSPs consider data analytics a strategic pillar in their quest to evolve into a true data-driven organization. Although many TSPs are harnessing the power of data to drive and improve CX, there are technological gaps and challenges to baseline and formulate internal and external data monetization strategies. Some of these challenges include:

  • Non-overlapping technology investments for CX and data monetization due to misaligned business and IT initiatives
  • Huge CAPEX requirements to process massive volumes of data
  • Inability to unearth hidden insights due to siloed data initiatives
  • Inability to marry various datasets together due to missing pieces around data standardization techniques
  • Lack of user-friendly tools and techniques to discover, ingest, process, correlate, analyze, and consume the data
  • Inability to experiment and innovate with agility and low cost

In this two-part series, I demonstrate a working solution with an AWS CloudFormation template for how a TSP can use existing data assets to generate new revenue streams and improve and personalize CX using AWS services. I also include key pieces of information around data standardization, baselining an analytics data model to marry different datasets in the data warehouse, self-service analytics, metadata search, and media dictionary framework.

In this post, you deploy the stack using a CloudFormation template and follow simple steps to transform, enrich, and bring multiple datasets together so that they can be correlated and queried.

In part 2, you learn how advanced business users can query enriched data and derive meaningful insights using Amazon Redshift and Amazon Redshift Spectrum or Amazon Athena, enable self-service analytics for business users, and publish ready-made dashboards via Amazon QuickSight.

Solution overview

The main ingredient of this solution is Packet Switch (PS) probe data embedded with a deep packet inspection (DPI) engine, which can reveal a lot of information about user interests and usage behavior. This data is transformed and enriched with DPI media and device dictionaries, along with other standard telco transformations, to deduce insights and to profile and micro-segment subscribers. Enriched data is made available along with other transformed dimensional attributes (CRM, subscriptions, media, carrier, device, and network configuration management) for rich slicing and dicing.

For example, the following QuickSight visualizations depict a use case to identify music lovers ages 18-55 with Apple devices. You can also generate micro-segments by capturing the top X subscribers by consumption or adding KPIs like recency and frequency.

The following diagram illustrates the workflow of the solution.

For this post, AWS CloudFormation sets up the required folder structure in Amazon Simple Storage Service (Amazon S3) and provides sample data and dictionary files. Most of the data included as part of the CloudFormation template is dummy data, as follows:

  • CRM
  • Subscription and subscription mapping
  • Network 3G & 4G configuration management
  • Operator PLMN
  • DPI and device dictionary
  • PS probe data

Descriptions of all the input datasets and attributes are available in the AWS Glue Data Catalog tables and as part of the table metadata in Amazon Redshift.

The workflow for this post includes the following steps:

  1. Catalog all the files in the Data Catalog using the following data crawlers:
    1. DPI data crawler (to crawl incoming PS probe DPI data)
    2. Dimension data crawler (to crawl all dimension data)
  2. Update attribute descriptions in the Data Catalog (this step is optional).
  3. Create Amazon Redshift schema, tables, procedures, and metadata using an AWS Lambda function.
  4. Process each data source file using separate AWS Glue Spark jobs. These jobs enrich, transform, and apply business filtering rules before ingesting data into an Amazon Redshift cluster.
  5. Trigger Amazon Redshift hourly and daily aggregation procedures using Lambda functions to aggregate data from the raw table into hourly and daily tables.

Part 2 includes the following steps:

  1. Catalog the processed raw, aggregate, and dimension data in the Data Catalog using the DPI processed data crawler.
  2. Interactively query data directly from Amazon S3 using Amazon Athena.
  3. Enable self-service analytics using QuickSight to prepare and publish insights based on data residing in the Amazon Redshift cluster.

The workflow can change depending on the complexity of the environment and your use case, but the fundamental idea remains the same. For example, your use case could be processing PS probe DPI data in real time rather than in batch mode, keeping hot data in Amazon Redshift, storing cold and historical data on Amazon S3, or archiving data in Amazon S3 Glacier for regulatory compliance. Amazon S3 offers several storage classes designed for different use cases. You can move the data among these different classes based on Amazon S3 lifecycle properties. For more information, see Amazon S3 Storage Classes.
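
For instance, a minimal boto3 sketch of such a lifecycle rule follows; the bucket name and the dpi/processed/ prefix are hypothetical placeholders, not objects this stack creates:

    import boto3

    s3 = boto3.client("s3")

    # Hypothetical bucket and prefix; substitute your own S3BucketNameParameter value.
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-telco-data-bucket",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "tier-dpi-data",
                    "Filter": {"Prefix": "dpi/processed/"},
                    "Status": "Enabled",
                    "Transitions": [
                        # Cold data to Standard-IA after 30 days,
                        # historical data to Glacier after 90 days.
                        {"Days": 30, "StorageClass": "STANDARD_IA"},
                        {"Days": 90, "StorageClass": "GLACIER"},
                    ],
                }
            ]
        },
    )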

Prerequisites

For this walkthrough, you should have the following prerequisites:

For more information about AWS Regions and where AWS services are available, see Region Table.

Creating your resources with AWS CloudFormation

To get started, create your resources with the following CloudFormation stack.

  1. Click the Launch Stack button.
  2. Leave the parameters at their default, with the following exceptions:
    1. Enter RedshiftPassword and S3BucketNameParameter parameters, which aren’t populated by default.
    2. An Amazon S3 bucket name is globally unique, so enter a unique bucket name for S3BucketNameParameter.

The following screenshot shows the parameters for our use case.

  3. Choose Next.
  4. Select I acknowledge that AWS CloudFormation might create IAM resources with custom names.
  5. Choose Create stack.

It takes approximately 10 minutes to deploy the stack. For more information about the key resources deployed through the stack, see Data Monetization and Customer Experience (CX) Optimization using telco data assets: Amazon CloudFormation stack details. You can view all the resources on the AWS CloudFormation console. For instructions, see Viewing AWS CloudFormation stack data and resources on the AWS Management Console.

The CloudFormation stack we provide in this post serves as a baseline and is not a production-grade solution.
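
If you prefer deploying from code rather than the console, the following boto3 sketch shows the equivalent create-stack call. The template URL is a placeholder (use the template behind the Launch Stack button), the password is obviously illustrative, and the two parameter names are the ones the stack exposes:

    import boto3

    cfn = boto3.client("cloudformation")

    # Placeholder template URL; point this at the template from the Launch Stack button.
    cfn.create_stack(
        StackName="telco-cx-data-monetization",
        TemplateURL="https://s3.amazonaws.com/example-bucket/telco-cx-template.yaml",
        Parameters=[
            {"ParameterKey": "RedshiftPassword", "ParameterValue": "ChangeMe123!"},
            {"ParameterKey": "S3BucketNameParameter", "ParameterValue": "my-unique-bucket-name"},
        ],
        # The stack creates named IAM resources, so this capability is required.
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )

    # Block until the stack finishes deploying (roughly 10 minutes).
    cfn.get_waiter("stack_create_complete").wait(StackName="telco-cx-data-monetization")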

Building a Data Catalog using AWS Glue

You start by discovering sample data stored on Amazon S3 through an AWS Glue crawler. For more information, see Populating the AWS Glue Data Catalog. To catalog data, complete the following steps:

  1. On the AWS Glue console, in the navigation pane, choose Crawlers.
  2. Select DPIRawDataCrawler and choose Run crawler.
  3. Select DimensionDataCrawler and choose Run crawler.
  4. Wait for the crawlers to show the status Stopping.

The Tables added count for the DimensionDataCrawler and DPIRawDataCrawler crawlers should show 9 and 1, respectively.

  5. In the navigation pane, choose Tables.
  6. Verify the following 10 tables are created under the cemdm database:
    • d_crm_demographics
    • d_device
    • d_dpi_dictionary
    • d_network_cm_3g
    • d_network_cm_4g
    • d_operator_plmn
    • d_tac
    • d_tariff_plan
    • d_tariff_plan_desc
    • raw_dpi_incoming
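
If you prefer scripting to the console, the following boto3 sketch runs both crawlers and verifies the same 10 tables. The crawler and database names are as deployed by the stack; the polling logic is illustrative:

    import time
    import boto3

    glue = boto3.client("glue")
    crawlers = ["DimensionDataCrawler", "DPIRawDataCrawler"]

    for crawler in crawlers:
        glue.start_crawler(Name=crawler)

    # Poll until both crawlers return to the READY state.
    while any(glue.get_crawler(Name=c)["Crawler"]["State"] != "READY" for c in crawlers):
        time.sleep(30)

    # List the tables the crawlers created in the cemdm database.
    tables = glue.get_tables(DatabaseName="cemdm")["TableList"]
    print(sorted(t["Name"] for t in tables))  # expect the 10 tables above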

Updating attribute descriptions in the Data Catalog

The AWS Glue Data Catalog has a comment field to store the metadata under each table in the AWS Glue database. Anybody who has access to this database can easily understand attributes coming from different data sources through metadata provided in the comment field. The CloudFormation stack includes a CSV file that contains a description of all the attributes from the source files. This file is used to update the comment field for all the Data Catalog tables this stack deployed. This step is not mandatory to proceed with the workflow. However, if you want to update the comment field against each table, complete the following steps:

  1. On the Lambda console, in the navigation pane, choose Functions.
  2. Choose the GlueCatalogUpdate function.
  3. Configure a test event by choosing Configure test events.
  4. For Event name, enter Test.
  5. Choose Create.
  6. Choose Test.

You should see a message that the test succeeded, which implies that the Data Catalog attribute description is complete.

Attributes of the table under the Data Catalog database should now have descriptions in the Comment column. For example, the following screenshot shows the d_operator_plmn table.
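
Under the hood, updating a comment amounts to rewriting the table definition with new column comments. The following boto3 sketch illustrates the idea against the d_operator_plmn table; the description mapping is illustrative, and the stack's Lambda function reads the real descriptions from the bundled CSV instead:

    import boto3

    glue = boto3.client("glue")

    # Illustrative descriptions; the stack's Lambda reads these from a CSV file.
    descriptions = {"plmn_id": "Public land mobile network identifier"}

    table = glue.get_table(DatabaseName="cemdm", Name="d_operator_plmn")["Table"]

    # Set the Comment field on every column we have a description for.
    for col in table["StorageDescriptor"]["Columns"]:
        if col["Name"] in descriptions:
            col["Comment"] = descriptions[col["Name"]]

    # TableInput accepts only writable fields, so strip the read-only ones.
    read_only = {"DatabaseName", "CreateTime", "UpdateTime", "CreatedBy",
                 "IsRegisteredWithLakeFormation", "CatalogId", "VersionId"}
    table_input = {k: v for k, v in table.items() if k not in read_only}

    glue.update_table(DatabaseName="cemdm", TableInput=table_input)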

Creating Amazon Redshift schema, tables, procedures, and metadata

To create schema, tables, procedures, and metadata in Amazon Redshift, complete the following steps:

  1. On the Lambda console, in the navigation pane, choose Functions.
  2. Choose the RedshiftDDLCreation function.
  3. Choose Configure test events.
  4. For Event name, enter Test.
  5. Choose Create.
  6. Choose Test.

You should see a message that the test succeeded, which means that the schema, table, procedures, and metadata generation is complete.
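
The console test event is just a convenient way to invoke the function; you can do the same programmatically. A minimal sketch, assuming the function keeps the RedshiftDDLCreation name it shows in the console (deployed names may carry a stack prefix), and the same pattern applies to GlueCatalogUpdate and the aggregation functions:

    import json
    import boto3

    lam = boto3.client("lambda")

    # Synchronously invoke the DDL-creation function with an empty test event.
    response = lam.invoke(
        FunctionName="RedshiftDDLCreation",
        InvocationType="RequestResponse",
        Payload=json.dumps({}),
    )
    print(json.loads(response["Payload"].read()))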

Running AWS Glue ETL jobs

AWS Glue provides the serverless, scalable, and distributed processing capability to transform and enrich your datasets. To run AWS Glue extract, transform, and load (ETL) jobs, complete the following steps:

  1. On the AWS Glue console, in the navigation pane, choose Jobs.
  2. Select the following jobs (one at a time) and choose Run job from the Action menu:
    • d_customer_demographics
    • d_device
    • d_dpi_dictionary
    • d_location
    • d_operator_plmn
    • d_tac
    • d_tariff_plan
    • d_tariff_plan_desc
    • f_dpi_enrichment

You can run all these jobs in parallel.

All dimension data jobs should finish successfully within 3 minutes, and the fact data enrichment job should finish within 5 minutes.

  3. Verify the jobs are complete by selecting each job and checking Run status on the History tab.
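
To script this step instead, you can start all nine jobs in parallel with boto3 and poll their run states, as in the following sketch (the polling interval is arbitrary):

    import time
    import boto3

    glue = boto3.client("glue")

    jobs = ["d_customer_demographics", "d_device", "d_dpi_dictionary", "d_location",
            "d_operator_plmn", "d_tac", "d_tariff_plan", "d_tariff_plan_desc",
            "f_dpi_enrichment"]

    # Kick off every job in parallel and remember each run ID.
    runs = {job: glue.start_job_run(JobName=job)["JobRunId"] for job in jobs}

    # Poll until every run reaches a terminal state.
    pending = dict(runs)
    while pending:
        for job, run_id in list(pending.items()):
            state = glue.get_job_run(JobName=job, RunId=run_id)["JobRun"]["JobRunState"]
            if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
                print(job, state)
                del pending[job]
        time.sleep(30)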

Aggregating hourly and daily DPI data in Amazon Redshift

To aggregate hourly and daily sample data in Amazon Redshift using Lambda functions, complete the following steps:

  1. On the Lambda console, in the navigation pane, choose Functions.
  2. Choose the RedshiftDPIHourlyAgg function.
  3. Choose Configure test events.
  4. For Event name, enter Test.
  5. Choose Create.
  6. Choose Test.

You should see a message that the test succeeded, which means that hourly aggregation is complete.

  7. In the navigation pane, choose Functions.
  8. Choose the RedshiftDPIDailyAgg function.
  9. Choose Configure test events.
  10. For Event name, enter Test.
  11. Choose Create.
  12. Choose Test.

You should see a message that the test succeeded, which means that daily aggregation is complete.

Both hourly and daily Lambda functions are hardcoded with the date and hour to aggregate the sample data. To make them generic, you need to uncomment a few lines of code and comment out a few others. Both functions are also equipped with offset parameters that let you decide how far back in time you want to run the aggregations. However, this isn't required for this walkthrough.

You can schedule these functions with CloudWatch Events. However, this is not required for this walkthrough.
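
As a sketch of what such a schedule could look like (assuming the function keeps the RedshiftDPIHourlyAgg name shown in the console; the rule and statement names are hypothetical):

    import boto3

    events = boto3.client("events")
    lam = boto3.client("lambda")

    # Fire once an hour; the daily function would use rate(1 day) instead.
    rule = events.put_rule(Name="dpi-hourly-agg", ScheduleExpression="rate(1 hour)")

    # Allow CloudWatch Events to invoke the function, then wire it up as the target.
    fn = lam.get_function(FunctionName="RedshiftDPIHourlyAgg")["Configuration"]
    lam.add_permission(
        FunctionName="RedshiftDPIHourlyAgg",
        StatementId="allow-dpi-hourly-rule",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule["RuleArn"],
    )
    events.put_targets(
        Rule="dpi-hourly-agg",
        Targets=[{"Id": "hourly-agg", "Arn": fn["FunctionArn"]}],
    )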

So far, we have completed the following:

  1. Deployed the CloudFormation stack.
  2. Cataloged sample raw data by running DimensionDataCrawler and DPIRawDataCrawler AWS Glue crawlers.
  3. Updated attribute descriptions in the AWS Glue Data Catalog by running the GlueCatalogUpdate Lambda function.
  4. Created Amazon Redshift schema, tables, stored procedures, and metadata through the RedshiftDDLCreation Lambda function.
  5. Ran all AWS Glue ETL jobs to transform raw data and load it into their respective Amazon Redshift tables.
  6. Aggregated hourly and daily data from enriched raw data into hourly and daily Amazon Redshift tables by running the RedshiftDPIHourlyAgg and RedshiftDPIDailyAgg Lambda functions.

Cleaning up

If you don’t plan to proceed to part 2 of this series and want to avoid incurring future charges, delete the resources you created by deleting the CloudFormation stack.

Conclusion

In this post, I demonstrated how you can easily transform, enrich, and bring multiple telco datasets together in an Amazon Redshift data warehouse cluster. You can correlate these datasets to produce multi-dimensional insights from several angles, like subscriber, network, device, subscription, roaming, and more.

In part 2 of this series, I demonstrate how you can enable data analysts, scientists, and advanced business users to query data from Amazon Redshift or Amazon S3 directly.

Post syndicated from Vikas Omer. Original: https://aws.amazon.com/blogs/big-data/part-1-data-monetization-and-customer-experience-optimization-using-telco-data-assets/

Source: https://noise.getoto.net/2021/01/26/data-monetization-and-customer-experience-optimization-using-telco-data-assets-part-1/ – 27 01 21

Open Data vs. Web Content: Why the distinction?

14 Feb

For those who are unfamiliar with our line of work, the difference between open data vs. web content may be confusing. In fact, even for those of us who are familiar with Deep Web data extraction, it's a question that doesn't have a clear answer.

One of our best practices as a company is reaching out to other companies and firms in the data community. To stay at the top of our game, we benefit from picking the brains of those with industry perspectives of their own.

To find out the best way to get more insight on this particular topic, our Vice President of Business Development, Tyson Johnson, had a discussion with some of the team members at Gartner. As a world-renowned research and advisory firm, Gartner has provided technological insight for businesses all around the globe.

Open Data vs. Web Content

According to his conversation with Gartner, their company perspective is that open data is information online that is readily findable and meant to be consumed or read by a person looking for that information (i.e., a news article or blog post). Web content, conversely, is content that wasn't necessarily meant to be consumed by individuals in the same way, but is available even though people likely don't know it exists or how to get to it (i.e., any information on the Deep Web).

In a lot of the work we do, whether all of this data is material that most people are aware of and consuming is up for debate.

For example, we’ve been issuing queries in the insurance space for commercial truck driving. This is definitely information that people are aware of, but the Deep Web data extraction that comes back isn’t necessarily easily consumed or accessed. So is it open data or web content?

It’s information that a random person surfing the Internet can find if they want to look for it. However, many aren’t aware that the Deep Web exists. They also don’t know that they have the ability to pull back even more relevant information.

So why is this distinction even being discussed? The data industry has struggled with what to call things so people can actually wrap their head around what’s out there.

The industry is realizing we need to distinguish between what most Internet users know they can consume, such as news articles, information on their favorite sports team, or the weather of the day (open data), and the Deep Web, where they can issue queries into other websites and pull back even more information relevant to what they're looking for (web content).

Making as many people as possible aware of the data available to them is at the core of the distinction. As long as you understand the difference, we think it's okay to call it and explain it however you want.

Web Data and How We Use It

BrightPlanet works with all types of web data. Our true strength is automating the harvesting of information that you didn’t know existed.

How this works is that you may know of ten websites that have information relevant to your challenge.

We then harvest the data we are allowed to from those sites through Deep Web data extraction. We’ll more than likely find many additional sources that will be of use to you as well.

The best part is that as our definitions of data expand, so do our capabilities.

Future Data Distinctions and Trends

It was thought that there were three levels of data we worked with: Surface Web, Deep Web, and Dark Web. According to Tyson, the industry is discovering that there may be additional levels to these categories that even go beyond open data and web content.

On top of all of this is the relatively new concept of the industrial Internet. The industrial Internet is the mass of data generated by industrial items like jet engines and wind turbines. Tyson points out that the industrial Internet may be three times the size of the consumer Internet we're familiar with. So when the industrial Internet becomes more mainstream, will it be web content while everything on the consumer Internet is open data? We'll have to wait and see.

These future trends put us in a good position to help tackle your challenges and find creative solutions. We harvest all types of data. If you’re curious about how BrightPlanet can help you and your business, tell us what you’re working on. We’re always more than happy to help give you insight on what our Data-as-a-Service can do for you.

Source: https://brightplanet.com/2017/02/tyson-gartner-open-data-vs-web-content/

The biggest story to watch in high-tech: Shift to services

2 May

The biggest high-tech story to watch is the shift by traditional hardware, software, and platform companies to deliver their offerings as services to business customers and consumers.

But there are big challenges.

Many high-tech companies have yet to make this change, and doing so will be a huge strategic and financial undertaking. If well executed, however, they will be empowered to provide service offerings faster, deliver continuous and more personalised services, disrupt and enter new markets, and generate a continuous, consistent stream of higher revenues and profits.

As-a-Service business model

‘As-a-Service’ is the term often used to describe this profound industry transition. It means delivering value through on-demand, highly scalable, plug-and-play services. In the Software-as-a-Service business, for example, instead of manufacturing software on disks and shipping them to brick-and-mortar stores, that same software can be placed in the cloud. Customers can subscribe to the service by paying a subscription fee. By doing so, customers no longer have to maintain costly onsite computer servers or grapple with multiple software versions.

The race towards this new business model is well underway. Software and content industries have already moved in this direction. But many hardware companies, such as manufacturers of smartphones, PCs, servers, networking equipment and any products using Internet of Things (IoT) technology, are just getting started.

This is a problem. It’s also an opportunity.

Most of these products' features and functionalities are defined by the software embedded on top of the hardware. Using this new model, companies can roll out new functions after the product has gone into operation.

As-a-Service will become a required business model for any company selling ‘intelligent’ or ‘smart’ products whose value lies in the features and functionalities defined by software.

Shifting to this business model quickly is especially critical for these manufacturers. With the high-tech device market sluggish, providing value-added services offers new opportunities and revenue streams critical to their near- and long-term success. Accenture has released a new report about this transition.

Underestimating costs

The path to delivering as a service will be neither easy nor inexpensive. Accenture has learned that many underestimate how much this will cost and how vital it is to begin this transition now. Embracing the As-a-Service model will impact most corporate functions including sales and marketing, research and development, finance and administration, customer support, and logistics.

Many hardware companies are late to this shift and will be challenged to catch formidable companies already entrenched in this business model, such as Amazon, Facebook, and Google. These trailblazing companies are now scaling their service offerings, entering new markets, and attracting huge numbers of customers. Many hardware companies have not made this shift quickly enough to offset declines in market growth and compete in this expanding and lucrative market.

Three disruptions driving move to services

The underlying trend making it necessary for these companies to adopt a services model swiftly is widespread and multi-dimensional disruption. Accenture identifies three disruptions of paramount importance:

Disruption #1: Internet and social media

The Internet and social media disruptions have reshaped the electronics and high-tech market. Amazon, Facebook, Google, as well as other high-tech companies that operate on pure digital platforms such as the Internet and mobile, have transformed their markets and are continually entering and disrupting adjacent markets. Providing news-as-a-service, music-as-a-service, and video-as-a-service are prime examples. Operating as all-digital businesses, they gain unlimited scale and versatility to launch new offerings quickly at minimal costs. As such, they thrive in established markets.

Leveraging digital technologies, they are adept at using analytics to gain more valuable customer insights, innovate more, and respond faster and more intelligently to market changes. Their broad influence is changing how consumers and companies buy technology, view content, and communicate socially.

To compete, the only viable option for a traditional high-tech player is to embrace the As-a-Service model.

Shifting to services is a major challenge for many of them. Traditional operating models cannot match the speed and capabilities of their formidable competitors. These hardware companies need to embrace new ways of providing services and make digital the central strategic focus of their companies.

Disruption #2: Consumer technology products become platforms

Stand-alone consumer technology products are becoming obsolete. IoT and cloud computing are converting everyday products into interconnected, multi-faceted platforms. Functioning as multi-purpose, interactive platforms, smartphones, tablets, gaming consoles, TVs, and security cameras transcend their original purpose. They provide their suppliers with consistent pipelines for reaching consumers, and generate valuable analytics about how the platforms are used.

Disruption #3: New business models for new industry

As promising as this is, serious and systemic challenges associated with security, privacy and logistics of launching a services model need to be overcome. In too many cases traditional high-tech companies have been slow to address these issues. As a result, user confidence and the inherent value of the service has been undermined. For many, the main challenge is figuring out when and where to begin.

So where do they begin?

High-tech hardware manufacturers need a structured approach to the problem that starts with forming a strategy.

As part of this, a key component is recognising that As-a-Service permeates virtually all aspects of a business. As such, every contingency must be planned for. Research and development functions need to be equipped to handle swift and continuous product evolution. Real-time insights into customer behaviors and product performance must be gathered to change the way sales and marketing teams operate.

Pricing and billing needs to be more responsive to fluctuating demand. New products and platforms require supply chains in multiple cloud modes with comprehensive connected tools and processes.

For each company, a separate As-a-Service business unit should operate alongside the company’s traditional business. To minimise corporate disruption while the new business grows, the older business can be ramped down over time. This dual and coordinated approach controls disruption and eases the transition.

Final thoughts

This business model change is undeniable. It is a complex and expensive challenge. To succeed, companies must be realistic about, and ready to embrace, the scale and investment costs.

On multiple levels transitioning to a service business model is a major strategic undertaking. Yet as daunting as this is, the industry is accelerating full throttle in this direction. In 2016 the As-a-Service transition will accelerate and expand. High-tech companies, particularly traditional hardware manufacturers, must embrace this fundamental change.

Whether they do this, at what speed, and to what level of effectiveness will be the most compelling and important story to track this year.

 

Source: http://www.itproportal.com/2016/05/01/the-biggest-story-to-watch-in-high-tech-shift-to-services/

U.S. seeks trials to test transition to digital phone networks

31 Jan

 

U.S. wireless providers like AT&T Inc and Verizon Communications Inc on Thursday received a nod from regulators to test a transition of the telephone industry away from traditional analog networks to digital ones.

The Federal Communications Commission unanimously voted in favor of trials, in which telecommunications companies would test switching telephone services from existing circuit-switched technology to an alternative Internet protocol-based one to see how the change may affect consumers.

The experiments approved by the FCC would not test the new technology – it is already being used – and would not determine law and policy regulating it, FCC staff said. The trials would seek to establish, among other things, how consumers welcome the change and how new technology performs in emergency situations, including in remote locations.

“What we’re doing here is a big deal. This is an important moment,” FCC Chairman Tom Wheeler said. “We today invite service providers to propose voluntary experiments for all-IP networks.”

The move in part grants the application by AT&T to conduct IP transition tests as companies that offer landline phone services seek to ultimately replace their old copper wires with newer technology like fiber or wireless.

“We cannot continue requiring service providers to invest in both old networks and new networks forever,” Commissioner Ajit Pai, a Republican, said.

Some consumers, particularly in rural or hard-to-reach areas, have complained about poor connectivity of their IP-based services. Advocates have also expressed concerns about the impact of the transition on consumers with disabilities.

“I think we must be mindful of the impact this transition has on consumers — their needs, their expectations and their willingness to embrace network change,” said Commissioner Jessica Rosenworcel, a Democrat.

The trials will be voluntary, and regulators require that the experiments “cover areas with different population densities and demographics, different topologies, and/or different seasonal and meteorological conditions.” They also require that no consumers be left disconnected.

Source: http://www.reuters.com/article/2014/01/30/usa-fcc-iptransition-idUSL2N0L414G20140130

Mobile Trends: Vision for 2014

27 Dec

Recently, we knew that the future of Mobile technology would contain all the same things, but vastly accelerated. Today, we realize that 2014 holds a huge possibility for new and different. This article on the key Mobile trends for 2014 will focus on Mobile First, S+S or Client-Cloud, Wearables and BYOD, BYOA & BYOT.

Have you just got used to iPhones and Droids? Have you started to feel comfortable with the new mobile world order? Then prepare for a disruption as 2014 is going to be a year of changes for mobile trends and beyond…

The more precise term is mobile and wearable technology trends, as it better reflects the overwhelming integration of machines into our everyday life and business. Mobiles and Wearables are already changing lifestyles and industries. Recently, we knew that the future would contain all the same things, but vastly accelerated. Today, we realize that 2014 holds a huge possibility for new and different. In this two-part series on the key Mobile trends for 2014 I'll focus on:

  • MOBILE FIRST
  • S+S or CLIENT-CLOUD
  • WEARABLES
  • BYOD, BYOA, BYOT
  • PERSONAL EXPERIENCE
  • UBIQUITOUS UI
  • PERSONALIZED HEALTHCARE.

Mobile First

In 2013, the retail industry had to face the fact that the majority of time spent online now comes from smartphones and tablets rather than PCs, with a ratio of 55% vs. 45% in favor of mobiles, according to comScore stats.

It is a clear sign that enterprises should (and will) sit up and take notice. Their time-to-market strategies will most likely be built on top of the Mobile First initiative, which is a proof of concept for new business strategies and mobilized enterprises. Mobile First could transform into an Android First for enterprises with a field workforce, as the Android Launcher allows full smartphone customization for exclusive business needs.

S+S or Client-Cloud

The need for native apps will undoubtedly prevail. While SaaS, PaaS and IaaS are continuing to mature, we are seeing the strengthening of a new trend of Software+Services, aka S+S. The occasionally connected scenario will remain the preferred paradigm for app design. Another point in favor of native apps is hardware, especially the presence of new sensors. Lengthy standardization procedures leave no chance of creating HTML "silver bullet" code that will run everywhere and use all the novel hardware. The native approach, on the other hand, allows instant access to new sensors and is more likely to ensure a better user experience.

While the native code on the mobile devices is Software/Client, and the back-end is Services, together they form a Software+Services model. With Services running on the Cloud, it can be considered a remake of the old Client-Server, transformed into the Client-Cloud.

Wearables

Wearable devices clearly deserve a separate paragraph. Wearables signify the beginning of a new massive wave in computing. These are devices for humans, machinery and movable machinery. Let’s describe the three groups of wearables:

  • Humans will have universal wrist band gadgets and glasses, as well as medical body-friendly devices, capable of tracking the body's vital signs and other body parameters.
  • Homes, offices, and stores will soon be packed with sensors and connected thinking machines, running real-time analytics.
  • Cars, cargo and goods will be continuously tracked and managed.

It is a domain of embedded programming, therefore it is only logical to predict the increasing popularity of embedded programming platforms and tools. By connecting everything to the Internet we are going beyond the Internet of Things (IoT), into the realm of the Internet of Everything (IoE).

And last but not least, the Wearables will become a huge data source for Big Data and analytics (Machine Data).

BYOD, BYOA, BYOT

As a reflection of a much wider Do It Yourself (DIY) trend, enterprises will experience further strengthening of Bring Your Own Device (BYOD), Bring Your Own Application (BYOA), and Bring Your Own Technology (BYOT).

As the Cornerstone productivity study proves, Millennials are ready (and quite enthusiastic about it) to spend their own money on work-related mobile devices and gadgets, mobile apps and technologies.

Small and medium businesses will have to establish BYOD/BYOA/BYOT policies rather than trying to prohibit these initiatives. Of course, enterprise security is a serious issue, but far from being a road block for establishing such policies. We've already seen a similar process with Enterprise 2.0, when people wanted to bring Web 2.0 technologies and tools to the enterprise. Today, employees will start bringing Web of Apps to the enterprises.

 

This is the second part of my overview of the key Mobile Trends to prevail in 2014, based on the tendencies we've noticed in SoftServe's mobility projects this year. In the previous part I discussed Mobile First, Software+Services, Wearables, and BYOD, BYOA and BYOT. This article focuses on the next three important 2014 Mobile trends: Personal Experience, Ubiquitous UI and Personalized Healthcare.

Personal Experience

Judging from the consumers' behavior and today's marketplace situation, the strengthening of a Consumerism trend is a given. The consumers' interaction with the marketplace is further evolving.

We've already witnessed four eras of economy: extraction of commodities, making goods, delivering services, and staging experiences.

What a contemporary consumer wants is personal experience, authenticity, and individuality. It’s expected that the providers will meet these challenges by utilizing personal devices – mobile phones, wrist gadgets, glasses, tablets, home TV panels, car boards, etc. And although the role of speech interface will increase, I believe in the near future, visuals will prevail.

Ubiquitous User Interface

Mobile User Interface is getting ubiquitous. With smartphones omnipresent and smartwatches on the rise, people are used to always being "on" – wherever they are, be it in the office, driving in a car, or sitting at home in front of a Smart TV.

Obviously, the users want to have the same features (and have them working exactly the same way) on wrist gadgets, car head units and Smart TVs. That's probably why iOS 7 has been redesigned, shifting to a "flat" style. While skeuomorphism is less efficient for cars, the flat design is a strategic step for the gadgets of tomorrow. It's no longer a question of a single device, where form follows the function. It's a set of connected services and products that are aware both of context and of each other, staging a special personal experience for a user. The goal is to ensure a continuous and consistent experience across all devices and channels, so it is the cross-channel UX that will become the basis for the rising demand of personal UX.

Personalized Healthcare

The impact of mobile and wearable devices is also transforming the healthcare industry, so I will mention a couple of mobile healthcare trends in this post.

Mobile and wearables are blurring the borders between treatment procedures (especially in the aftercare and preventive care) and lifestyle choices. They continuously track your behavior, nutrition, sleep, calories burned, vital signs and other health aspects and suggest the optimal behavior to prevent diseases. It's a huge achievement for both preventive and treatment healthcare, and an important benefit is that it's done remotely, outside of the hospital.

Here are two more technological opportunities that would have been considered a miracle just a couple of years ago:

  • It is now possible to conduct a sanitary check via spectral analysis using your smartphone only
  • Your smartphone can recognize the food on a supermarket shelf even without the bar code scanning – just from a picture of it.

Machine learning does it all; the devices we will use in 2014 are indeed smart devices.

Source: http://united.softserveinc.com/blogs/mobility/december-2013/mobile-trends-2014/

LTE Asia: transition from technology to value… or die

27 Sep

 

I am just back from LTE Asia in Singapore, where I chaired the track on Network Optimization. The show was well attended with over 900 people by Informa’s estimate.

Once again, I am a bit surprised and disappointed by the gap between operators' and vendors' discourse.

By and large, operators who came (SK, KDDI, KT, Chungwha, HKCSL, Telkomsel, Indosat to name but a few) had excellent presentations on their past successes and current challenges, highlighting the need for new revenue models, a new content (particularly video) value chain and better customer engagement.

Vendors of all stripes seem to consistently miss the message and try to push technology when their customers need value. I appreciate that the transition is difficult and, as I was reflecting with a vendor's executive at the show, selling technology feels somewhat safer and easier than selling value.
But, as many operators are finding out on their home turf, their consumers do not care much about technology any more. It is on brand, service, image and value that OTT service providers are winning consumers' mind share. Here lies the risk and opportunity. Operators need help to evolve and reinvent the mobile value chain.

The value proposition of vendors must evolve towards solutions such as intelligent roaming, 2-way business models with content providers, service type prioritization (messaging, social, video, entertainment, sports…), bundling and charging…

At the heart of this necessary revolution is something that makes many uneasy. DPI and traffic classification relying on ports and protocols is the basis of today's traffic management and is becoming rapidly obsolete. A new generation of traffic management engines is needed. The ability to recognize content and service types at a granular level is key. How can the mobile industry evolve in the OTT world if operators are not able to recognize whether content is user-generated or Hollywood-produced? How can operators monetize video if they cannot detect, recognize, prioritize, and assure advertising content?

Operators have some key assets, though. Last mile delivery, accurate customer demographics, billing relationships and location must be leveraged. YouTube knows whether you are on an iPad or a laptop, but not necessarily whether your cellular interface is 3G, HSPA, LTE… they certainly can't see whether a user's poor connection is the result of network congestion, spectrum interference, distance from the cell tower or throttling because the user exceeded their data allowance… There is value there, if operators are ready to transform themselves and their organizations to harvest and sell value, not access…

Opportunities are many. Vendors who continue to sell SIP, IMS, VoLTE, Diameter and their next-generation hip equivalents LTE Advanced, 5G, cloud, NFV… will miss the point. None of these are of interest to the consumer. Even if operators insist on buying or talking about technology, services and value will be key to success… unless you are planning to be an M2M operator, but that is a story for another time.

Watch TV anywhere with the new Bell TV app – at home or on the go, over Wi-Fi or high-speed wireless networks

26 Aug

Bell announced the launch of the new Bell TV app that lets customers watch popular channels they receive in their programming package at home on tablets or smartphones at no extra charge.

“Bell has invested billions in building next-generation Canadian wireless and fibre networks to execute our commitment to deliver the best TV experience across every screen,” said Wade Oosterman, President of Bell Mobility and Residential Services, and Chief Brand Officer at Bell. “The Bell TV app gives customers the flexibility to watch TV everywhere, at home or on the go. It’s the latest proof that with Bell, TV just got better.”

Navigating the Bell TV app is easy thanks to an innovative programming guide and interface that supports searching by program name, description, channel, live or on-demand. Bell Fibe TV customers can also stop and resume on-demand programming between their TV at home and their mobile device.

The Bell TV app includes the industry-leading Bell Mobile TV service that already enables Bell Mobility customers to watch TV anywhere over Bell’s high-speed wireless network or Wi-Fi. With Bell Mobile TV, Bell TV customers now have access to more than 100 unique live and on-demand channels.

10 hours of Mobile TV connectivity is provided with many popular Bell Mobility rate plans or as a $5 a month add-on that does not affect data usage in a customer’s plan. Current Bell Mobile TV customers are automatically upgraded to the Bell TV app (some customers may need to do so manually depending on their individual device settings).

In addition to conventional networks CTV and CTV Two, the Bell TV app provides subscribers access to specialty channels Bravo, A&E and History; premium channels TMN, SuperChannel and Movie Central; children's programming on Teletoon, Treehouse and YTV; sports on TSN, Leafs TV and NBA TV Canada; 24-hour news on CTV News Channel, CBC News Network and BBC World News; and a wide variety of French-language channels including Super Écran, SRC, V and Canal D, RDS, RDI and LCN.

The Bell TV app is available for Android devices at Google Play, BB10 devices in BlackBerry App World and iOS in the Apple and iTunes App Store. For more information, please visit Bell.ca/TVAnywhere.

About Bell
Bell is Canada’s largest communications company, providing consumers and business customers with leading TV, Internet, wireless, home phone and business communications solutions. Bell Media is Canada’s premier multimedia company with leading assets in television, radio and digital media. Bell is wholly owned by Montréal’s BCE Inc. (TSX, NYSE: BCE). For more information, please visit Bell.ca.

The Bell Let's Talk mental health initiative is a national charitable program that promotes Canadian mental health across Canada with the Bell Let's Talk Day anti-stigma campaign and support for community care, research and workplace best practices. To learn more, please visit Bell.ca/LetsTalk.

Image with caption: “Bell offers TV anywhere with new Bell TV app for smartphones and tablets (CNW Group/Bell Canada)”. Image available at: http://photos.newswire.ca/images/download/20130819_C6714_PHOTO_EN_29950.jp

Source: http://www.fierceiptv.com/press-releases/watch-tv-anywhere-new-bell-tv-app-home-or-go-over-wi-fi-or-high-speed-wirel

Four Steps to Monetizing 4G LTE Services

15 Jan

A 2012 survey by research firm Informa finds that 70% of global operators believe 4G services should be launched now, meaning Mobile Network Operators (MNOs) have a small, but closing, window of opportunity to be the provider of choice for communication services, including mobile video calling, instant messaging and presence, and Web collaboration. Mobile operators must start the process of monetizing Long Term Evolution (LTE) today.

Recently, Forbes published my outline of four key ways to monetize the LTE opportunity:

1.     Speed is sexy, but applications drive revenue

Mobile operators need to extend beyond marketing campaigns that focus exclusively on faster speeds in order to generate the business case for higher monthly charges. Read more.

2.    Own the user experience

Compared to Over-the-Top (OTT) providers tied to specific applications (i.e., Skype calling) and devices, mobile operators, with ownership of the network, are well positioned to provide end-users with one identity and mobility across the devices. Read more.

3.    Meet exploding bandwidth requirements

LTE delivers the cost-effective bandwidth that enables mobile operators to realize economies of scale required to drive down the rising costs associated with delivering data over existing networks. Read more.

4.   Look beyond tried-and-true consumer segments

Compared to OTT providers, MNOs can more effectively ensure a carrier-grade quality of experience for end users of services suited for consumers but inadequate for the rigors of enterprise use. Read more.

By leveraging ownership of the network and the quality of service LTE enables, MNOs are well positioned to capitalize on a lucrative market. However, it is essential for MNOs to begin the LTE monetization process today in order to fully maximize the benefits and ensure emerging competitor segments do not strike first to unravel these efforts.

Read my full article here: http://bit.ly/MonetizeLTENow.

Source: http://broadbandignite.com/2013/01/14/four-steps-to-monetizing-4g-lte-services/

Usage-Based Service Management for Fixed-Line Broadband

7 Jan

Fixed-line broadband providers need to constantly invest in increased network capacity in order to keep pace with unrelenting growth in subscriber Internet usage, driven largely by bandwidth-intensive consumption of rich media content such as OTT video. Trouble is, while per-subscriber usage continues to increase, ARPU has remained flat, resulting in margins getting squeezed because there’s no incremental revenue growth to offset the capital expenditures in expanded capacity required to ensure satisfactory quality of experience for subscribers.

Network operators have two ways to overcome this challenge. They can adopt a usage-based pricing model that generates additional revenue, especially from the subscribers who consistently consume the most bandwidth. While this is certainly an important aspect of addressing the problem, it’s even more critical that operators have the ability to manage subscriber demand in order to alleviate congestion during periods of peak traffic load – typically the evening hours. Without this ability, the network has to be engineered for peak demand – a costly proposition that results in operators over-investing in network capacity.

The real key to solving the problem is harnessing subscriber usage and network utilization data. First, operators need to gain visibility into subscriber behavior and network performance so they can analyze demand for network capacity management and planning. They can invest capital more efficiently by knowing exactly where capacity needs to be expanded. Second, by leveraging new streaming data collection protocols and a high-performance data mediation and storage management system architecture, operators can use this same data for near real-time service and traffic management applications.

High-Performance Data Mediation

The network elements already deployed in the existing broadband network infrastructure are the source of a wealth of subscriber usage and network telemetry data that can be retrieved using highly efficient streaming data collection protocols such as IP Detail Record (IPDR), RADIUS and IPFIX/NetFlow. When enabled in a network element, these protocols operate by periodically taking a snapshot of a set of statistics and parameters and packaging the values into a single record that is sent to a centralized collector. If the collection interval is set short enough – 10 to 15 minutes – then it is possible to use the data collected for near real-time service and traffic management applications.

However, doing this effectively in a large network with hundreds of thousands or millions of subscribers requires a service management system capable of collecting, processing and storing a large number of stream data records within each specified collection interval. This involves decoding protocol records and performing a series of checks and cross-checks to ensure the integrity of the data. It also involves generating mediated subscriber usage records that are time normalized relative to a fixed reference for the service management system. These mediated records then need to be stored in an in-memory cache for rapid access by service and traffic management applications as well as written to disk for archival storage and historical trend analysis. The massive volume of usage data collected requires Big Data storage technology in order to meet the stringent performance and scalability requirements.
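
The core of the mediation step is time normalization: snapping each record onto a fixed interval boundary and accumulating usage per subscriber. A toy Python sketch of that logic, with a hypothetical record layout:

    from collections import defaultdict

    INTERVAL = 15 * 60  # 15-minute collection interval, in seconds

    # usage[(subscriber_id, interval_start)] -> accumulated byte count
    usage = defaultdict(int)

    def mediate(record):
        """Snap a raw usage record onto a fixed interval boundary and accumulate it."""
        # Hypothetical record fields: subscriber_id, timestamp (epoch seconds), bytes.
        interval_start = record["timestamp"] - (record["timestamp"] % INTERVAL)
        usage[(record["subscriber_id"], interval_start)] += record["bytes"]

    # Example: accumulate one 52 kB record for subscriber sub-001.
    mediate({"subscriber_id": "sub-001", "timestamp": 1_389_100_500, "bytes": 52_000})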

Applying Usage Data for Policy-Based Traffic Management

Mediated subscriber usage data can serve as the foundation for service and traffic management applications that measure and monitor subscriber usage as well as network utilization and apply policies to network elements to actively manage subscriber traffic in near real-time. Network operators can choose to implement policy-based proactive and reactive traffic management schemes to avoid network congestion and alleviate it when it occurs, improving overall utilization while ensuring subscriber quality of experience.

Proactive traffic management ensures that usage conforms to a subscriber's service tier by continuously monitoring a subscriber's usage over a sliding time window and triggering the application of policies to manage a subscriber's service when specific usage thresholds defined in the subscriber's service profile are exceeded. This approach simplifies network capacity management by ensuring subscriber usage will conform to a set of service tiers that can be modeled, with the network engineered accordingly.

However, even a well-engineered network with all subscriber traffic in conformance can experience congestion during peak busy hours, resulting in the need for reactive traffic management. This approach detects congestion by continuously monitoring network utilization and automatically taking action to manage subscriber traffic when a specified utilization threshold is exceeded. A reasonable and fair way to alleviate congestion is to identify the subscribers with the most usage during the recent time window and apply policies to manage their traffic. Managing the traffic of just the top subscribers will free up bandwidth for the rest of the subscribers.
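
A minimal Python sketch of this reactive scheme; the 85% utilization threshold and top-5% fraction are illustrative, and the returned subscriber list would be handed to a policy engine in practice:

    def reactive_manage(utilization, recent_usage, threshold=0.85, top_fraction=0.05):
        """Pick the heaviest users to manage when the link is congested.

        utilization: current link utilization as a fraction (0.0-1.0)
        recent_usage: dict of subscriber_id -> bytes used over the sliding window
        """
        if utilization <= threshold:
            return []  # no congestion, nothing to manage
        ranked = sorted(recent_usage, key=recent_usage.get, reverse=True)
        top_n = max(1, int(len(ranked) * top_fraction))
        return ranked[:top_n]  # hand these off to the policy engine

    # Example: at 92% utilization, the top consumer(s) get flagged for management.
    print(reactive_manage(0.92, {"sub-a": 9_000_000, "sub-b": 2_000_000, "sub-c": 150_000}))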

The Business Value of Usage-Based Broadband Service Management

Broadband providers can realize significant business value by leveraging subscriber usage data for broadband service management. Usage data is critical for network capacity management, allowing operators to analyze network utilization for more efficient capital expenditures when expanding capacity. More importantly, it can serve as the foundation for policy-based service and traffic management applications that ensure more efficient network utilization while improving subscriber quality of experience, enabling network operators to better amortize investments in network capacity.

SMS – Assimilation is inevitable, Resistance is Futile!

2 Jan

Short Message Service, or SMS for short, one of the cornerstones of mobile services, turned 20 years old in 2012.

Talk about “Live Fast, Die Young” and the chances are that you are talking about SMS!

The demise of SMS has already been heralded … Mobile operators rightfully are shedding tears over the (taken-for-granted?) decline of the most profitable 140 bytes there ever was and possibly ever will be.

Before we completely kill off SMS, let’s have a brief look at

SMS2012

The average SMS user (across the world) consumed 136 SMS (ca. 19 kByte) per month and paid 4.6 US$-cents per SMS and 2.6 US$ per month. Of course this is a worldwide average and should not be over-interpreted. For example, in the Philippines an average SMS user consumes 650+ SMS per month and pays 0.258 US$-cents per SMS, or 1.17 $ per month. At the other extreme end of the SMS usage distribution we find Cameroon, with 4.6 SMS per month at 8.19 US$-cents per SMS.

We have all seen the headlines throughout 2012 (and the better part of 2011) of SMS Dying, SMS Disaster, SMS usage dropping and revenues being annihilated by OTT applications offering messaging for free, etcetcetc… & blablabla … "Mobile Operators almost clueless and definitely blameless of the SMS challenges" … Right? … hmmmm, maybe not so fast!

All major market regions (i.e., WEU, CEE, NA, MEA, APAC, LA) have experienced a substantial slowdown of SMS revenues in 2011 and 2012. A trend that is expected to continue and accelerate with mobile operators' push for mobile broadband. Last but not least, SMS volumes have slowed down as well (though less severely than the revenue slowdown) as the signalling-based short messaging service assimilates to IP-based messaging via mobile applications.

Irrespective of all the drama, SMS phase-out is obvious (and has been for many years) … with the introduction of LTE, SMS will be retired.

Resistance is (as the Borg would say) Futile!

It should be clear that the phase-out of SMS does Absolutely Not mean that messaging is dead or in decline. Far, far from it!

Messaging is Stronger than Ever and just got so many more communication channels beyond the signalling network of our legacy 2G & 3G networks.

It's however important to understand how long the assimilation of SMS will take and what drivers impact the speed of the SMS assimilation. From an operator's strategic perspective, such considerations provide insights into how quickly they will need to replace legacy SMS revenues with proportional Data Revenues or suffer increasingly on both the Top and Bottom line.

SMS2012 AND ITS GROWTH DYNAMICS

So let's just have a look at the numbers (with the cautionary note that some care needs to be taken with exchange rate effects between US Dollar and Local Currencies across the various markets being wrapped up in a regional and a world view. Further, due to the structure of bundling propositions, product-based revenues such as SMS revenues can be, and often are, somewhat uncertain depending on the sophistication of a given market):

2012 is expected worldwide to deliver more than 100 billion US Dollars in SMS revenues on more than 7 trillion revenue-generating SMS.

The 100 Billion US Dollars is ca. 10% of total worldwide mobile turnover. This is not much different from the 3 years prior, and 1+ percentage point up compared to 2008. Data revenues excluding SMS are expected in 2012 to be beyond 350 Billion US Dollars, or 3.5 times SMS revenues and 30+% of total worldwide mobile turnover (5 years ago this was 20% and ca. 2+ times SMS revenues).

SMS growth has slowed down over the last 5 years. The last-5-years SMS revenue CAGR was ca. 7% (worldwide), but between 2011 and 2012 SMS revenue growth is expected to be no more than 3%. Western Europe and Central Eastern Europe are both expected to generate less SMS revenue in 2012 than in 2011. SMS Volume grew by more than 20% per annum over the last 5 years, but the number of SMS generated in 2012 is not expected to be more than 10% higher than in 2011.
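
For reference, the CAGR quoted here is the standard compound annual growth rate. A one-line sketch, with illustrative revenue figures (not actual market data) chosen to be consistent with the ~7% figure:

```python
# Compound annual growth rate, as used for the ~7% worldwide figure above.
# The revenue figures below are illustrative only, not actual market data.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

print(f"{cagr(71.0, 100.0, 5):.1%}")  # ≈ 7.1% per annum over five years
```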

For the ones who like to compare SMS to Data Consumption (and please save us from ludicrous claims of the benefits of satellites and other ideas born of too many visits to Dutch Coffee shops):

2012 SMS Volume corresponds to 2.7 Terabytes of daily data (not a lot! Really, it is not!)

Don’t be terribly excited about this number! It is like Nano-Dust compared to the total mobile data volume generated worldwide.

The monthly Byte equivalent of SMS consumption is no more than 20 kiloBytes per individual mobile user in Western Europe.
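
These byte equivalents follow directly from the figures already quoted (140 Bytes per SMS, ~7 trillion SMS in 2012, 136 SMS per user per month); a quick back-of-the-envelope check:

```python
# Sanity check of the byte equivalents quoted above, using the figures
# already given in the text: 140 Bytes per SMS, ~7 trillion SMS in 2012,
# and 136 SMS per user per month.
SMS_BYTES = 140

daily_tb = 7e12 * SMS_BYTES / 365 / 1e12
print(f"{daily_tb:.1f} TB of SMS per day")        # ~2.7 TB per day

monthly_kb = 136 * SMS_BYTES / 1024
print(f"{monthly_kb:.1f} kB per user per month")  # ~19 kB, i.e. ca. 20 kB
```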

Let us have a look at how this distributes across the world, broken down into Western Europe (WEU), Central Eastern Europe (CEE), North America (NA), Asia Pacific (APAC), Latin America (LA) and Middle East & Africa (MEA):

[Charts: 2012 SMS Revenues and 2012 SMS Volume by region]

From the above charts we see that

Western Europe takes almost 30% of total worldwide SMS revenues but its share of total SMS generated is less than 10%.

This, to some extent, also explains why Western Europe might be more exposed to the SMS phase-out than some other markets. We have already seen evidence of Western Europe’s sensitivity to SMS revenues back in 2011, a trend that will spread to many more markets in 2012 and lead to an overall negative SMS revenue story for Western Europe in 2012. We will see that within some of the other regions there are countries that are substantially more exposed to the SMS phase-out than others in terms of SMS share of total mobile turnover.

[Charts: SMS pricing and SMS per individual by region]

In Western Europe a consumer would pay more than 7 times the price for an SMS compared to a consumer in North America (i.e., Canada or USA). It is quite clear that Western Europe has been very successful in charging for SMS compared to any other market in the World. And consumers have gladly paid the price (well, I assume so ;-).

SMS Revenues are proportionally much more important in Western Europe than in other regions (maybe with the exception of Latin America).

In 2012, 17% of Total Western Europe Mobile Turnover is expected to come from SMS Revenues (up from ca. 13% in 2008).

WHAT DRIVES SMS GROWTH?

It is interesting to ask what drives SMS behaviour across various markets and countries.

Prior to reasonably good-quality 3G networks, and as importantly prior to the emergence of the Smartphone, the SMS usage dynamics between different markets could easily be explained by relatively few drivers, such as:

(1) Price decline year on year (the higher the decline, the faster SMS per user grows, though rate and impact will depend on Smartphone penetration & 3G quality of coverage).

(2) Price of an SMS relative to the price of a Minute (the lower it is, the more SMS per User; in many countries there is a clear arbitrage in sending an SMS versus making a call, which on average lasts between 60 – 120 seconds).

(3) Prepaid to Contract ratios (higher prepaid ratios tend to result in fewer SMS, though this relationship is not per se very strong).

(4) SMS ARPU to GDP (or average income, if available) (the lower this ratio, the higher the usage tends to be).

(5) 2G penetration/adoption, and

(6) Literacy ratios (particularly important in emerging markets; the lower the literacy rate, the lower the number of SMS per user tends to be).

More finely detailed models can be built with many more parameters. However, the 6 given here will provide a very decent worldview of SMS dynamics (i.e., amount and growth) across countries and cultures. For mature markets we really talk about a time before 2009 – 2010, when Smartphone penetration started to approach or exceed 20% – 30% (beyond which the model becomes a bit more complex).

In markets where the Smartphone penetration is beyond 30% and 3G networks have reached a certain coverage quality level, the models describing SMS usage and growth change to include Smartphone Penetration and, to a lesser degree, 3G Uptake (note that Smartphone penetration and 3G uptake are not independent parameters, and as such one or the other often suffices from a modelling perspective).

Looking at SMS usage and growth dynamics after 2008, I have found high-quality statistical and descriptive models for SMS growth using the following parameters:

(a) SMS Price Decline.

(b) SMS price to MoU Price.

(c) Prepaid percentage.

(d) Smartphone penetration (Smartphone penetration has a negative impact on SMS growth and usage – unsurprisingly!).

(e) SMS ARPU to GDP.

(f) 3G penetration/uptake (the higher the 3G penetration, combined with very good coverage, the greater the negative impact on SMS growth and usage; less important, though, than Smartphone penetration).

It should be noted that each of these parameters varies with time; therefore, in extracting them from a comprehensive dataset, the time variation should be considered in order to produce a high-quality descriptive model for SMS usage and growth.
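
To make the modelling approach concrete, below is a minimal sketch of fitting such a descriptive model with ordinary least squares on drivers (a) – (f). It is not the actual model behind this analysis: all observations are hypothetical, and a serious model would, as just noted, also account for the time variation of each parameter.

```python
# A minimal descriptive-model sketch (NOT the actual models referenced in
# the text): ordinary least squares of SMS-per-user growth on the post-2008
# drivers (a)-(f). All rows are hypothetical country-year observations.
import numpy as np

drivers = ["price_decline", "sms_to_mou_price", "prepaid_share",
           "smartphone_pen", "sms_arpu_to_gdp", "threeg_pen"]

# One row per (hypothetical) country-year: values for drivers (a)-(f).
X = np.array([
    [0.05, 1.10, 0.85, 0.10, 0.006, 0.10],
    [0.08, 0.95, 0.80, 0.15, 0.005, 0.15],
    [0.10, 0.80, 0.70, 0.20, 0.004, 0.25],
    [0.12, 0.70, 0.60, 0.30, 0.003, 0.35],
    [0.15, 0.60, 0.55, 0.35, 0.002, 0.45],
    [0.18, 0.55, 0.50, 0.45, 0.002, 0.55],
    [0.20, 0.50, 0.40, 0.50, 0.001, 0.60],
    [0.22, 0.45, 0.35, 0.60, 0.001, 0.70],
])
# Observed year-on-year SMS-per-user growth for each row (hypothetical).
y = np.array([0.20, 0.15, 0.10, 0.04, -0.02, -0.05, -0.08, -0.12])

# Add an intercept column and solve the least-squares problem.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
for name, c in zip(["intercept"] + drivers, coef):
    print(f"{name:>18}: {c:+.3f}")
```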

If a Market and its Mobile Operators would like to protect their SMS revenues, or at least slow down the assimilation of SMS, the operators clearly need to understand whether pushing Smartphones and Mobile Data can make up for the decline in SMS revenues that is bound to happen with the hard push of mobile broadband devices and services.

EXPOSURE TO LOSS OF SMS REVENUE – A MARKET BY MARKET VIEW!

As we have already seen and discussed, it is not surprising that SMS is declining or stagnating, at least in its present form and business model. Mobile Broadband, the Smartphone and its many applications have created a multi-verse of alternatives to the SMS. Where in the past SMS was a clear convenience and often a much cheaper alternative to an equivalent voice call, today SMS has become inconvenient and not per se a cost-efficient alternative to Voice, and certainly not when compared with IP-based messaging via a given data plan.

[Chart: market-by-market exposure to SMS decline]

74 countries (or markets) have been analysed for their exposure to SMS decline in terms of the share of SMS Revenues out of the Total Mobile Turnover. 4 categories have been identified: (1) Very high risk, >20%; (2) High risk, 10% – 20%; (3) Medium risk, 5% – 10%; and (4) Lower risk, when the SMS Revenues are below 5% of total mobile turnover.
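
Expressed as a small illustrative helper (the function name and example value are mine, not part of the original analysis), the four buckets look like this:

```python
# Illustrative helper (not part of the original analysis): bucket a market
# into the four exposure categories defined above, given the SMS revenue
# share of total mobile turnover.
def sms_exposure_category(sms_revenue_share: float) -> str:
    if sms_revenue_share > 0.20:
        return "Very high risk (>20%)"
    if sms_revenue_share > 0.10:
        return "High risk (10%-20%)"
    if sms_revenue_share > 0.05:
        return "Medium risk (5%-10%)"
    return "Lower risk (<5%)"

# Example: the UK, with more than 30% of mobile turnover from SMS (see
# below), lands in the very-high-risk bucket.
print(sms_exposure_category(0.30))  # Very high risk (>20%)
```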

As Mobile operators push hard for mobile broadband, inevitably and rapidly increasing Smartphone penetration, SMS will decline. In the “end-game” of LTE, SMS will have been phased out altogether.

Based on 2012 expectations, let’s look at the risk exposure that the SMS phase-out brings in a market-by-market outlook:

We see from the above analysis that 9 markets (out of a total of 74 analyzed), with the Philippines taking the pole position, have what could be characterized as a very high exposure to SMS Decline. The UK market, with more than 30% of revenues tied up in SMS, has aggressively pushed for mobile broadband and LTE. It will be very interesting to follow how UK operators mitigate the exposure to SMS decline as LTE penetrates the market. We will see whether LTE (and other mobile broadband propositions) can make up for the SMS decline.

More than 40 markets have an SMS revenue dependency of more than 10% of total mobile turnover, and thus have a substantial exposure to SMS decline that needs to be mitigated by changes to the messaging business model.

Mobile operators around the world still need to crack this SMS assimilation challenge … a good starting point would be to stop blaming OTT for all the evils and instead manage their mobile broadband push and/or start changing their SMS business model to an IP-messaging business model.

IS THERE A MARGIN EXPOSURE BEYOND LOSS OF SMS REVENUES?

There is no doubt that SMS is a high-margin service, if not the highest, for the Mobile Industry.

A small detour into the price of SMS and the comparison with the price of mobile data!

The Basics: an SMS is 140 Bytes and max 160 characters.

On average (worldwide) an SMS user pays (i.e., in 2012) ca. 4.615 US$-cent per short message.

A Megabyte of data is equivalent to ca. 7,490 SMS, which would have a “value” of ca. 345 US Dollars.

Expensive?

Yes! It would be, if that were the price a user would pay for mobile broadband data (particularly for an average Smartphone consumption of 100 Megabytes per month) …

However, remember that an average user (worldwide) consumes no more than 20 kiloBytes of SMS per Month.

One Megabyte of SMS would supposedly last for more than 50 months, or more than 4 years.

This is just to illustrate the silliness of getting into SMS value comparison with mobile data.

A Byte is not just a Byte; it depends on what that Byte carries!
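
For completeness, the arithmetic behind this comparison reproduces directly from the stated averages (140 Bytes and ca. 4.615 US$-cent per SMS, 136 SMS per user per month):

```python
# Reproducing the "value of a Megabyte of SMS" arithmetic from the stated
# averages: 140 Bytes and ca. 4.615 US$-cent per SMS, 136 SMS per month.
MB = 1024 * 1024

sms_per_mb = round(MB / 140)        # ≈ 7,490 SMS in one Megabyte
value_usd = sms_per_mb * 0.04615    # ≈ 345 US$ of SMS "value" per Megabyte
months = MB / (136 * 140)           # ≈ 55 months at 136 SMS per month

print(f"{sms_per_mb} SMS/MB, ≈{value_usd:.2f} US$/MB, ≈{months:.0f} months")
```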

It’s quite clear that SMS-equivalent IP-based messaging does not pose much of a challenge to a mobile broadband network, be it HSPA-based or LTE-based. To some extent, IP-based messaging (as long as it is equivalent to 140 Bytes) should be deliverable at a better or similar margin as in a legacy 2G mobile network.

Thus, in my opinion a 140-Byte message should not cost more to deliver in an LTE- or HSPA-based network. In fact, due to better spectral efficiency and at equivalent service levels, the cost of delivering 140 Bytes in LTE or HSPA should be a lot less than in GSM (or CS-3G).

However, if the mobile operators are not able to adapt their messaging business models to recover the SMS revenues at risk of being lost to the assimilation process of pushing mobile data (which, given the margin argument above, might not need to be a dollar-for-dollar recovery, but could be less) … well, then substantial margin decline will be experienced.

Operators in the danger zone of SMS revenue exposure, and thus with the SMS revenue share exceeding 10% of the total mobile turnover, should urgently start strategizing on how they can control the SMS assimilation process without substantial financial loss to their operations.

ACKNOWLEDGEMENT

I have made extensive use of historical and actual data from Pyramid Research country databases. Wherever possible this data has been cross-checked with other sources. Pyramid Research has some of the best and most detailed mobile technology projections, which would satisfy most data-savvy analysts. The very extensive data analysis on the Pyramid Research data sets is my own, and any shortfalls in the analysis should clearly be attributed only to myself.

Source: http://techneconomyblog.com/2013/01/01/sms-assimilation-is-inevitable-and-resistance-is-futile/