
Building an infrastructure backbone for IoT utilization

14 Nov

Cloud computing, automated technologies and the emergence of 5G networks are coming together to help to connect devices across diverse global networks.

The age of the Internet of Things (IoT) is upon us, and it’s little surprise that a number of organisations are jumping on the bandwagon. IoT helps to connect us to the digital world, paving the way for enhanced customer experiences, improved processes and better operational efficiencies. Its potential is reflected in the fact that the global IoT market is set to be worth a staggering $1.5 trillion by 2030.

However, the ability to leverage the benefits of IoT implementation may not be as straightforward as initially thought for a number of organisations. Businesses must take a plethora of considerations into account, including the storage needed to power applications, the restrictions posed by legacy systems and the need to mitigate threats to sensitive data. Firstly, however, they must ensure that the right infrastructure foundations are in place.

It’s all about data

Modernising cumbersome IT infrastructures is key, as is the need to migrate systems to the cloud to be able to fully utilise connected devices. Low latency and low cost are imperatives for businesses looking to fully embrace IoT, but both can prove difficult to achieve. A solid infrastructural foundation is needed for huge volumes of data to be ingested in real time. In addition, bandwidth must be sufficient to enable big data analysis and drive decision-making, with this capability gaining new significance as IoT data processing moves to the edge.
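To make those requirements concrete, here is a minimal back-of-envelope sketch of how quickly a sensor fleet turns into sustained bandwidth and storage demands. Every figure below is hypothetical and should be replaced with real fleet numbers:

```python
# Back-of-envelope sizing for an IoT ingest pipeline (all figures hypothetical).
devices = 500_000        # connected sensors
msg_bytes = 1_024        # average payload per reading
msgs_per_minute = 4      # readings per device per minute

ingest_bytes_per_sec = devices * msg_bytes * msgs_per_minute / 60
print(f"Sustained ingest: {ingest_bytes_per_sec * 8 / 1e6:,.0f} Mbit/s")
print(f"Raw storage growth: {ingest_bytes_per_sec * 86_400 / 1e12:.1f} TB/day")
# ~273 Mbit/s sustained and ~2.9 TB/day of raw data for this modest fleet
```

Numbers at this scale help explain why legacy systems and thin network links quickly become the bottleneck.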

Legacy systems frequently prove to be a blocker to increased scalability and flexibility, as many products integrated over ten years ago are unlikely to possess the agility required to process, store and analyse significantly higher volumes of unstructured data. Simultaneously, understanding of IoT technology is still limited in a number of businesses, leading to hesitation and hindrances in digital transformation progress.

To ensure best utilisation of IoT, today’s data centre colocation providers are leading the way in delivering the right solutions. For example, they can provide methods to organise big data and enable low-cost connectivity, as well as share knowledge with businesses that may be unsure of the best strategy, helping them to navigate implementation successfully.

Moving forward with colocation

Colocation data centres are ideally suited to bringing the benefits of IoT to businesses. For example, colocation can both enable and facilitate the connections needed to support IoT use cases, while also ensuring that sensitive data is protected. This is due to optimum levels of protection against the growing threat of cyber attacks by sophisticated malicious actors.

The benefits of colocation are evolving. For factories, supply chains, power grids, distributed products and even entire cities, it is now becoming the most efficient and flexible way to both manage and analyse significant amounts of IoT sensor data. No longer a hope for the future, smart cities are now very much real, with IoT bringing utilities, services, security and transportation together in a number of locations. Colocation providers are some of the organisations helping to make them a reality.

As businesses embrace IoT, their network connectivity will need to grow in tandem. This means that an interconnected mesh of international and regional access hubs will be needed to enable hybrid cloud benefits through colocation networking. The ultimate intention is to ensure that data can move from each location to the next with limited costs involved for connectivity charges.

Unlocking IoT value

Opting for the right colocation data centre provider will enable organisations to make best use of the edge and enable their customers to benefit through use of IoT and cloud solutions. With so many of the IoT platforms and applications today being ‘as-a-Service’ and ‘cloud first’ by nature, moving data to the cloud will be a crucial first step to access the benefits, particularly as the number of IoT platforms and application providers continues to expand.

Organisations are then able to leverage the capability provided by colocation providers to utilise ‘anytime, anywhere’ interconnectivity alongside cloud-based storage and compute technologies. It’s this comprehensive infrastructure that will prove to be the key in being able to combine the digital and physical worlds and make best use of IoT devices.

By: Bo Ribbing
Source: https://technative.io/infrastructure-backbone-iot-utilization/ 14 11 22

Tech in 2023: Here’s what is going to really matter

14 Nov

These are the technologies and issues that will matter to you in 2023.

As the turn of the year approaches, the annual ritual of identifying tech trends and making predictions for the next solar orbit swings into action. Such exercises are always interesting but also inherently risky – particularly in uncertain times such as those we have recently experienced and continue to live through.

For example, few at the end of 2019 would have forecast that the world of work would be turned upside down during the following year by a global pandemic, leading to an unprecedented focus on devices and services that facilitated remote working, and ushering in a likely permanent shift to a hybrid model.

Then, just as economies were adjusting to and recovering from the pandemic, Russia’s invasion of Ukraine in February 2022 caused a sharp rise in energy prices, increased inflation, supply chain issues and fears of widespread recession. This series of shocks has profound implications for the IT industry that look set to continue through 2023 and beyond.

So perhaps the biggest benefit of the annual round of tech soothsaying is not so much the fine-grained detail – which is often derailed by contact with unexpected events – as the chance to take stock of the general direction of travel.

Let’s look at the latest crop of trends and predictions.

What the analysts say

Gartner 

Gartner’s top 10 strategic technology trends for 2023 are grouped under three themes:

  • Optimizing IT systems for greater reliability, improving data-driven decision making and maintaining the value integrity of production AI systems.
  • Scaling vertical offerings, increasing the pace of product delivery and enabling connectivity everywhere.
  • Pioneering business model change, reinventing engagement with employees and customers, and accelerating strategies to tap new virtual markets.

Overarching all these trends is sustainability: “Every technology investment will need to be set off against its impact on the environment, keeping future generations in mind. ‘Sustainable by default’ as an objective requires sustainable technology,” says Gartner’s David Groombridge. Support for this view comes from Gartner’s 2022 CEO and Senior Business Executive Survey, which found that environmental sustainability was the third largest driver, behind performance and quality, among the 80% of CEOs planning to invest in new products in 2022/23.

Gartner: Top strategic technology trends 2023

Image: Gartner

A digital immune system (DIS) uses a variety of techniques – including AI-augmented testing and software supply chain security – to improve the quality and resilience of business-critical systems. Applied observability takes raw data, adds context and analytics, and generates data-driven business and IT decisions. AI trust, risk and security management (AI TRiSM) covers AI model governance, trustworthiness, fairness, reliability, robustness, efficacy and privacy.

Industry cloud platforms combine software, platform and infrastructure as a service with tailored, industry-specific functionality. Platform engineering provides the tooling, capabilities and processes required to optimise developer experience and accelerate digital delivery. Wireless-value realization covers the delivery of business value via end-user computing, edge devices and digital tagging (RFID).

Superapps are an amalgam of an app, a platform and an ecosystem, where third parties can develop and publish their own mini-apps. Adaptive AI systems use real-time feedback to adjust their learning and adapt to changing circumstances in the real world. Metaverse is a ‘collective virtual 3D shared space’ with an economy based on cryptocurrencies and NFTs.

IDC 

IDC’s worldwide IT industry predictions for 2023 cover the key drivers for the next five years – which, as we have outlined above, are likely to see significant economic turmoil. Or, as analyst Rick Villars puts it: “For the next several years, leading technology providers must play a leading role in helping enterprises use innovative technologies to slide through the current storms of disruption.”

However, the wider message is more optimistic: “Technology and IT is not about where we cut – technology is where we invest to help the rest of the business navigate the recession,” says Villars. “This is the continued transition, which we started to see two years ago, towards digital business.”

For IDC, ‘digital business’ covers everything from processes, through products and services, to experiences, and will be delivered by a ‘dream team’ combination of IT and business units.

Here are some of IDC’s key predictions:

  • In 2025, 60% of infrastructure, security, data, and network offerings will require cloud-based control platforms that enable extensive automation and promise major reductions in ongoing operating costs.
  • Through 2024, shortcomings in critical skills creation and training efforts by IT industry leaders will prevent 65% of businesses from achieving full value from cloud, data, and automation investments.
  • Sovereign assertions in sustainability, resiliency, and asset residency through 2025 will force CIOs to shift staff, budgets, and operating processes for more than 35% of IT and data assets.
  • In 2023, 70% of enterprises’ adoption of as-a-service infrastructure/software will be curbed more by an inability to assess promises of faster innovation and operational gains than by cost concerns.
  • By the end of 2024, 60% of platinum-level aaS offerings in security, business operations, and DaaS will include bundled access to specialized SME teams to help reduce the impact of skills shortages.
  • In 2026, 45% of G2000 enterprises will continue to face material risks due to frontline workers’ and business leaders’ unwillingness to trust actions initiated by vetted autonomous tech systems.

For the remainder, see here.

IDC: IT industry predictions for 2023

Image: IDC

IDC’s final thought for 2024 and beyond is that automation – of IT operations, business processes and application development – will be critical for scaling digital business.

Forrester 

Organisations need to concentrate on their core missions and strengths in difficult times, according to Forrester: “In 2023, smart business leaders will get focused – pruning efforts that aren’t bearing fruit and prioritizing long-term growth. Economic and geopolitical turmoil will sow fear and disruption, yet panic, short-sighted revenue grabs, and poorly planned returns to the office will only make things worse.”

Here are some of Forrester’s key predictions:

  • Consumers are cautious but don’t pull back on spending.
  • A skills shortage challenges CX teams’ ability to thrive.
  • Monitoring technology riles employees.
  • Greenwashing becomes a serious business risk.
  • Leaders try to force employees back into the office, with disastrous results.
  • Metaverse experiments fall short of capturing the public’s imagination.
  • The talent crunch pushes tech executives to seek new sources of candidates.
  • Scandals at tech companies burn through consumer trust.

A key business/tech priority for 2023, Forrester says, is trust – in various guises. “Customers are increasingly weary of organizations playing fast and loose with their personal data, and regulators aren’t far behind. And it won’t stop there – fueled by the ire of fed-up customers and employees, regulators will scrutinize greenwashing, misinformation, and employee surveillance.”

CCS Insight 

One research firm that is very much in the business of fine-grained prognostication is CCS Insight, which recently held a three-day online event featuring analyst presentations, interviews with executives from Samsung, EE and Qualcomm, and no fewer than 100 predictions.

Rather than list all 100 predictions (you can find them here), I’ve selected a couple from each category that caught my eye:

Silicon Foundations
  • Attempts to rebalance the geographic diversity of semiconductor manufacturing fail, and Taiwan maintains its lead
  • Quantum computers achieve 60-second coherence time by 2025

Sustainability
  • By 2027, mobile apps enable users to track their carbon footprint in real time
  • By 2024, an international standard develops for calculating the carbon emissions of cloud services

Infrastructure Advances
  • China’s early development of 6G results in at least two competing standards in the East and West
  • By 2024 a major mobile network suffers an outage as multiple DNS gateways fail simultaneously

Connectivity Providers
  • By 2025 satellite broadband provider Starlink is spun off from SpaceX as a publicly listed company
  • By 2025, at least 25 telecom operators offer connected car services as an add-on to consumer 5G plans

Regulation
  • The EU’s Digital Markets Act ends preferred product placement in search results by 2024
  • Clear signs of a Russian–Chinese “splinternet” emerge by 2023

New Opportunities
  • Apple enters the US health insurance business in 2024
  • By 2026, more companies follow Tesla in investing in mineral supply chains

Personal Futures
  • By 2028, a “blockchain of you” lets developers build viable digital twins of people to support personalized services
  • By 2030, intelligent wireless body monitoring leads to pervasive and personalized healthcare

Changing Workplace
  • By 2024, enterprise collaboration tools add immersive spaces to help replicate the in-office experience
  • Demand for software that measures and tracks the link between employee experience and customer experience accelerates in 2023

Virtual Worlds
  • Apple’s venture into spatial computing entirely avoids using established terminology such as virtual reality, augmented reality or the metaverse
  • By 2025, Meta launches a virtual reality headset controlled by an inbuilt neural sensor

Connected Devices
  • By the end of 2026, more than 25 million people access the Internet through a mobile phone connected directly to satellites
  • Apple launches a foldable iPad in 2024

In a panel discussion at the event, CCS Insight analysts discussed how satellite connectivity is likely to complement rather than usurp terrestrial fixed and mobile networks, while noting the recent flurry of satellite service announcements from smartphone manufacturers. Also covered by the panellists was the state of play in the metaverse, sustainable technology and the smart home.

A roundup of 2023 predictions

There’s clearly a lot to digest in the analyst predictions described above. To try and get an overall picture, we’ve taken these and predictions from a range of other recent forward-looking surveys and articles, and assigned them to emergent categories. Here’s what the resulting chart looks like:

ZDNET: roundup of 2023 tech trends

Data & chart: ZDNET

It’s no surprise to find connectivity – wi-fi, fixed and mobile broadband, and satellite – heading the rankings, as this basically underpins the entire digital world. Similarly, prominent showings for artificial intelligence (AI) and machine learning (ML), cybersecurity, cloud computing and software development are not unexpected. The metaverse’s runner-up spot, along with VR/AR/MR/XR, confirms its status as the most-hyped technology of the moment, while the presence of ‘Geopolitics & tech supply chains’ in sixth place is a sign of the times.

By: Charles McLellan, Senior Editor
Source: https://www.zdnet.com/article/tech-in-2023-weve-analysed-the-data-and-heres-whats-really-going-to-matter/  14 11 22

2022 technology industry outlook

19 Feb

An analysis of the four biggest industry trends

The technology industry has largely thrived over the past two years, but is that momentum sustainable? To enable the next wave of growth, technology companies should rededicate their efforts to improving transparency, agility, collaboration, sustainability, and digital innovation. Learn more about these key focus areas in our 2022 technology industry outlook.

Laying the foundation for future growth

When the pandemic began two years ago, it catapulted many organizations into the future, rapidly accelerating digital transformation. Work environments changed overnight as remote work became commonplace and market demands evolved. Deloitte urged technology organizations to upgrade their supply chains for greater transparency and resiliency and to embrace cloud, everything-as-a-service (XaaS), and edge intelligence to ramp up their transformation efforts.

As 2021 began and many supply networks struggled, we advised technology industry leaders to reexamine where and how manufacturing happens and to focus on improving transparency, flexibility, and resiliency. We also suggested that organizations reorient and reskill their workforces in order to optimize remote work capabilities and take full advantage of advanced technologies such as AI.

At the start of 2022, many of these issues remain front and center for technology companies, with one important difference: Leaders now have an opportunity to address these challenges more deliberately and purposefully. Instead of managing an immediate crisis, they can lay solid foundations for future innovation and growth.

Some of the specific themes we see playing a foundational role in 2022 and beyond include:

  • Taking cloud and everything-as-a-service to the next level. As more companies embrace cloud and service-based IT to drive innovation and transformation, and as XaaS providers multiply, more work will be needed to manage the technical and operational complexities of hybrid, multi-cloud approaches.
  • Creating the supply chains of the future. As technology companies continue to recover from pandemic-induced supply chain disruptions, they will start proactively preparing for future uncertainty and other systemic risks. To do it, they’ll build systems with better visibility and resiliency.
  • Building the next iteration of the hybrid workforce. With more experience utilizing a hybrid workforce under their collective belts, tech companies will evolve their cultures, accelerate experimentation with collaboration solutions, and develop better approaches to managing tax implications.
  • Leading the charge to create a sustainable future. Although the tech industry is working to address critical sustainability issues, growing pressure from stakeholders and potential changes to environmental, social, and governance (ESG) reporting rules will incite tech companies to heighten their focus on reducing and reversing environmental impact.

Read the full 2022 technology industry outlook report to learn more about the impacts of technology industry trends, key actions to take, and critical questions to ask.

Source: https://www2.deloitte.com/us/en/pages/technology-media-and-telecommunications/articles/technology-industry-outlook.html 19 02 22

Want to get ahead in service management and orchestration?

12 Nov

Service management and orchestration (SMO) is an exciting and innovative development in our industry. A new concept defined within the O-RAN Alliance architecture for RAN management, SMO provides automation and orchestration of the Open RAN domain and is critical for driving automation within the operator network. We explore how to extend it across different RAN architectures, and why now’s the time for CSPs to ‘get their skates on’ to stay ahead of the competition.

The capability to drive automation within the operator network makes SMO immediately attractive to every mobile operator who wants to increase automation to drive down operational costs and, ultimately, improve profitability.

However, the fundamental problem with SMO is that it’s designed to automate Open RAN networks which only make up one to two percent of deployed networks today.

To put that in context, it’s like inventing a perpetual motion machine to replace the internal combustion engine and then, in the small print, explaining it only works in Porsches or Ferraris. Great news in principle, but not actually that helpful for 98 percent of motorists.

So, the key question is: “Is there a way to make the SMO concept applicable to 100 percent of radio access networks deployed globally?”

Thankfully, we think that there is an answer – a simple answer – to this ultimate question. And that answer is to extend SMO to address the existing, purpose-built RAN.

But where are we with this now?

Open RAN is an exciting and disruptive development in our industry, but the technology is still at the early stage. We see a couple of new, greenfield entrants, looking to deploy new 100 percent Open RAN networks. We also see lots of leading communication service providers (CSPs) testing the technology, often in remote, rural areas or controlled environments such as university or business campuses.

Industry analyst firm Gartner developed an interesting model for looking at the introduction of new technologies, called the ‘Gartner Hype Cycle.’ The hype cycle outlines a number of stages between the ‘innovation trigger,’ where the new technology is conceived, and the ‘plateau of productivity,’ where the technology is widely adopted and starts to deliver results. It’s difficult to estimate where Open RAN is on the cycle, but widespread media coverage and speculation make it feel like Open RAN is just passing the ‘peak of inflated expectations:’ essentially, it’s still in a relatively early stage.

 

Open RAN and the O-RAN Alliance architecture

Let’s briefly look at the O-RAN Alliance. The O-RAN Alliance is a service provider-led consortium of over 300 companies working on the Open RAN concept. They have defined an Open RAN architecture which, as the name suggests, has openness at its heart.


Figure 1. The O-RAN Alliance – Open RAN architecture

The Open RAN architecture defines, or is in the process of defining, a number of open interfaces imaginatively named the O1, O2, A1 and R1 interfaces. These interfaces are not yet standardized in the same way that, for example, 3GPP Release 17 is, but there are strong aspirations for openness and standardization within the O-RAN Alliance.

Ericsson is a major contributor to the O-RAN Alliance service management and orchestration working groups, and we have a number of ideas about how the SMO concept should be implemented.

 

Expanding SMO to support multi-technology

At the start of this blog, I gave the analogy that O-RAN Alliance SMO is a little bit like building a perpetual motion engine to replace the internal combustion engine, but having it only work with one brand of high-performance car. Brilliant, innovative, but not actually useful for 98 percent of motorists.

But what would happen if you could take the SMO concept and extend the automation capabilities brought about by SMO – non-real-time RAN intelligent controllers (non-RT-RIC) and their automation rApps – into the domain of the existing, purpose-built RAN networks? In effect, building a perpetual motion engine for every motorist.

At Ericsson, we believe this is very much possible. To achieve this, we will start with the existing centralized self-optimizing networks (C-SON) and network design and optimization (NDO) applications already deployed in networks. Today, these applications are often deployed on tightly integrated application platforms to perform specific automation operations. The SMO architecture effectively allows you to recreate these applications as RAN automation applications or rApps, as well as standardizing the underlying application environment: the SMO’s non-RT-RIC, which uses an R1 interface that enables interoperability between platforms and vendors.
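To make the rApp idea more tangible, here is a minimal sketch of what a trivial coverage-tuning rApp might look like. The R1 interface is still being defined, so the endpoint names, paths and payloads below are invented stand-ins rather than the real O-RAN APIs:

```python
import requests

SMO_BASE = "https://smo.example.net/r1/v1"  # hypothetical R1-style endpoint

def run_coverage_rapp(cell_ids):
    """Toy rApp: propose a power increase for cells reporting weak coverage."""
    for cell in cell_ids:
        # Read KPIs the non-RT-RIC has collected from the RAN (e.g. over O1)
        kpis = requests.get(f"{SMO_BASE}/metrics/{cell}", timeout=10).json()
        if kpis["avg_rsrp_dbm"] < -110:  # illustrative weak-signal threshold
            # Propose a policy change; the SMO decides whether to enact it
            requests.post(f"{SMO_BASE}/policies", timeout=10, json={
                "cell": cell,
                "action": "increase_tx_power",
                "delta_db": 1,
            })

run_coverage_rapp(["cell-001", "cell-002"])
```

The important architectural point is that the rApp contains only the optimization logic; data collection and enforcement of changes are delegated to the platform underneath it.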

“This report…analyzes the evolution of C-SON modules and use cases becoming apps in the RAN Intelligent Controller (RIC) needed in open RAN, open virtual RAN, and software defined RAN (SD-RAN) architectures”
– Stéphane Téral, Chief Analyst

A recent report from analyst firm Light Counting highlights the expectation that existing C-SON applications will become applications in the RAN intelligent controller (RIC).

The ability to run rApps to optimize both Open RAN and existing purpose-built 4G and 5G networks is profound. If this can be achieved, one of the major barriers to SMO adoption – scalability – will be overcome.

Obviously, it’s hard to justify investment in an automation platform for a tiny percentage of your current network. This is why we are expanding the scope of the SMO to cover both Open RAN and purpose-built RAN: the investment then covers the entire network, and the benefits of automation apply to the entire network. At Ericsson, we call this approach multi-technology: the ability to automate and orchestrate Open RAN and purpose-built RAN.

 

Avoiding SMO siloes

Another major barrier to adoption is the management of multiple vendors’ RAN. For Open RAN networks the open A1 and O1 interfaces enable a common approach to managing the new Open RAN technologies, but the challenge is how to manage multiple vendors’ purpose-built 4G and 5G RAN networks.


Figure 2. Extending SMO to address purpose-built RAN

These networks have a vendor-specific equipment management system, or EMS, such as Ericsson’s own Ericsson Network Manager (ENM) or Nokia’s NetAct. There have been successful approaches, such as the Operational Support Systems interworking initiative (OSSii), to encourage interworking between equipment vendors, but not every vendor is actively participating. However, what OSSii proves is that there are ways to manage multiple vendors’ EMS and RAN.

The reason this is important is that without effective interworking in the existing purpose-built RAN networks, there will be a tendency to deploy vendor specific SMOs. Having one SMO per equipment vendor is counter-intuitive and would appear to add to complexity rather than reduce it.

Our belief is that an operator network should have a single SMO: deployed on-premises, in the cloud, or as-a-Service (aaS), depending on the wishes of the service provider. This single SMO will handle the new Open RAN technologies via the open A1 and O1 interfaces and purpose-built networks from multiple vendors through their own proprietary interfaces.

This approach gives a single ‘pane of glass’ operational overview of the network, which reduces complexity and ultimately drives down operational costs.

At Ericsson, we call this approach multi-vendor.
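One way to picture the multi-vendor approach is as a thin adapter layer: the SMO works against one common model, and per-domain adapters translate to whatever each underlying system exposes. The sketch below is purely illustrative; the class and method names are invented, not any vendor’s actual northbound API:

```python
from abc import ABC, abstractmethod

class RanAdapter(ABC):
    """Common model the SMO uses for every RAN domain, open or purpose-built."""
    @abstractmethod
    def list_cells(self): ...
    @abstractmethod
    def set_parameter(self, cell, name, value): ...

class OranAdapter(RanAdapter):
    """Would talk to Open RAN nodes over the open O1/A1 interfaces (stubbed here)."""
    def list_cells(self):
        return ["o-cell-1", "o-cell-2"]
    def set_parameter(self, cell, name, value):
        print(f"[O1]  {cell}: set {name}={value}")

class ProprietaryEmsAdapter(RanAdapter):
    """Would wrap a vendor EMS interface, e.g. ENM- or NetAct-style (stubbed here)."""
    def list_cells(self):
        return ["v-cell-9"]
    def set_parameter(self, cell, name, value):
        print(f"[EMS] {cell}: set {name}={value}")

def apply_everywhere(adapters, name, value):
    """The single 'pane of glass': one operation fans out across all domains."""
    for adapter in adapters:
        for cell in adapter.list_cells():
            adapter.set_parameter(cell, name, value)

apply_everywhere([OranAdapter(), ProprietaryEmsAdapter()], "downtilt_deg", 4)
```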

 

Recommendations to accelerate SMO adoption

In Ericsson’s new SMO white paper, An intelligent platform: The use of O-RAN’s SMO as the enabler for openness and innovation in the RAN domain, we outline five recommendations to accelerate the adoption of SMO and shorten the hype cycle:

  1. Deploy in complex RAN environments: we recommend deploying SMO into complex multi-vendor, multi-technology networks where the automation platform can create the greatest operational impact.
  2. Target addressable areas of high operational costs: network deployment and network operation are two of the largest addressable areas of OPEX; we recommend that these areas are early targets for new automation rApps and xApps.
  3. Leverage proven use cases: there is an opportunity to take proven capabilities and deploy them at scale by taking the existing centralized SON and network design and optimization applications and using them to create new automation applications running on a common platform.
  4. Use best-in-class applications: use automation applications from a wide selection of developers, including network equipment vendors, CSPs and third-party specialists. Be prepared to test similar applications and select those that perform best, and utilize any tools that accelerate open innovation, such as software development toolkits, partner development ecosystems and even marketplaces, to maximize choice.
  5. Deploy across multi-technology networks: SMO provides maximum return on investment (ROI) where it enables automation for 100 percent of the network. Having high levels of automation in an Open RAN network is a way of ensuring that the new technology is future-proof. However, adding high levels of automation to the existing purpose-built RAN that makes up more than 98 percent of CSP networks today is also highly desirable.


Source: https://www.ericsson.com/en/blog/2021/11/want-to-get-ahead-in-service-management-and-orchestration 12 11 21
By: Santiago Rodriguez – Head of Strategic Projects / Justin Paul – Senior Solutions Marketing Manager OSS

Five Ways Cloud Platforms Need To Be More Secure In 2021

8 Nov

Bottom Line: Closing cloud security gaps starting with Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) needs to happen now as cloud services-based cyberattacks have grown 630% since January alone, creating a digital pandemic just as insidious as Covid-19.

Cyberattacks are going through a digital transformation of their own this year, with their targets more frequently being cloud services and the workloads running on them. The McAfee Labs Covid-19 Threats Report from July found a 630% increase in cloud services cyberattacks between January and April of this year alone.

The cyberattacks look to capitalize on opportunistic gaps in cloud platforms’ security structures. The 2020 Oracle KPMG Cloud Threat Report provides insights into how cloud migration is outpacing security readiness: 92% of security leaders admitted that their organization has a gap between current and planned cloud usage and their program’s maturity. Of the 92%, 48% say they have a moderate public cloud security readiness gap and 44% say the gap is wider than average. The research team also looked at which organizations believe they can secure their use of public cloud services, and found that just 44% do so today across both categories.

Image: Oracle KPMG Cloud Threat Report, 2020

The urgency to close security gaps is amplified by the increasing adoption rate of IaaS and PaaS, with the last three years shown below.

Image: Oracle KPMG Cloud Threat Report, 2020


The Oracle KPMG team also found that nearly a third of enterprise applications are either going through or are planned for a lift-and-shift to the cloud, further accelerating the need for better cloud security.


The majority of IT and cybersecurity teams I talk with today are overwhelmed. From troubleshooting remote access for legacy on-premise applications to keeping the queues in their ITSM systems under control, there’s not much time left. When it comes to cybersecurity, the more practical the advice, the more valuable it is.

This week I read Gartner’s recent report, 5 Things You Must Absolutely Get Right for Secure IaaS and PaaS, available as a complimentary read by Centrify. Based on insights gained from the report and ongoing discussions with IT and cybersecurity teams, the following are five ways cloud platforms need to be made more secure in 2021:

  1. Prioritize Privileged Access Management (PAM) and Identity & Access Management (IAM) using cloud-native controls to maintain least privilege access to sensitive data, starting at the PaaS level. By getting access controls in place first, the entire cloud platform is innately more secure. To save time and secure cloud platforms as thoroughly as possible, it’s advisable to utilize cloud-native PAM solutions that enforce Multifactor Authentication (MFA) and create specific admin roles that have a time limit associated with them. Leading vendors offering cloud-ready PAM capabilities include Centrify, which has proven its ability to deploy cloud-based PAM systems optimized to the specific challenges of organizations digitally transforming themselves today.
  2. Start using customer-controlled keys to encrypt all data, migrating off legacy operating systems and controls that rely on trusted and untrusted domains across all IaaS instances. IT teams say getting consistent encryption across each cloud provider is proving elusive as each interprets the Cloud Shared Responsibility Model differently, given their respective product and services mix. Cloud platform providers offer IAM and PAM tools fine-tuned to their specific platforms but can’t control access across multi-cloud environments. Securing and encrypting data across multiple cloud platforms takes a more centralized approach, providing full logging and audit capabilities that secure audit data from Cloud Service Providers (CSPs).
  3. Before implementing any cloud infrastructure project, design in Zero Trust Security (ZTS) and micro-segmentation first and have IaaS and PaaS structure follow. Both ZTS and micro-segmentation are pivotal to securing cloud infrastructure today. Think of IAM, ZTS, MFA and PAM as the foundation of a secure cloud infrastructure strategy. Having these core foundational elements in place assures the PaaS layer of cloud infrastructure is secure. As traditional IT network perimeters dissolve, enterprises need to replace the “trust but verify” adage with a Zero Trust-based framework. Zero Trust Privilege mandates a “never trust, always verify, enforce least privilege” approach to privileged access, from inside or outside the network. Centrify is a leader in this area, combining password vaulting with brokering of identities, multifactor authentication enforcement and “just enough” privilege while securing remote access and monitoring all privileged sessions.
  4. Before implementing any PaaS or IaaS infrastructure, define the best possible approach to identifying, isolating and correcting configuration mistakes or errors in infrastructure. From the basics of scanning unsecured configurations to auditing unsecured ports, every organization can take steps to better identify, isolate and correct infrastructure configuration errors. The fast-growing area of Cloud Security Posture Management (CSPM) is purpose-built to identify misconfigured cloud components across an entire infrastructure. Many IT teams get started with an initial strategy of monitoring and progress to more proactive tools that provide real-time alerts of any anomalous errors. CSPM tools in the most advanced IT organizations are part of a broader cloud infrastructure security strategy that also encompasses web application and API protection (WAAP) applications that ensure external and internal API security and stability.
  5. Standardize on a unified log monitoring system, ideally one with AI and machine learning built in to identify cloud infrastructure configuration and performance anomalies in real time (a minimal sketch of this idea follows this list). CIOs are also saying that the confusing array of legacy monitoring tools makes it especially challenging to find gaps in cloud infrastructure performance. As a result, CIOs’ teams are on their own to interpret often-conflicting data sets that may signal risks to business continuity that could be easily overlooked. Making sense of potentially conflicting data triggers false positives of infrastructure gaps, leading to wasted time by IT Operations teams troubleshooting them. Most organizations have SIEM capabilities for on-premises infrastructures, such as desktops, file servers and hosted applications. However, these are frequently unsuitable and cost-prohibitive for managing the exponential growth of cloud logs. AIOps is proving effective in identifying anomalies and performance event correlations in real time, contributing to greater business continuity. One of the leaders in this area is LogicMonitor, whose AIOps-enabled infrastructure monitoring and observability platform has proven successful in troubleshooting infrastructure problems and ensuring business continuity. LogicMonitor’s AIOps capabilities are powered by LM Intelligence, a series of AI-based algorithms that provide customer businesses with real-time warnings about potential trouble spots that could impact business continuity.
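To ground what the ML-based log monitoring in point 5 means at its simplest, here is a minimal anomaly-detection sketch using a rolling z-score test. A production AIOps platform does far more (event correlation, seasonality, topology awareness), and every threshold here is illustrative:

```python
from collections import deque
from statistics import mean, stdev

class ErrorRateMonitor:
    """Flag log-error counts that deviate sharply from the recent baseline."""
    def __init__(self, window=60, threshold=3.0):
        self.history = deque(maxlen=window)  # errors-per-minute samples
        self.threshold = threshold           # z-score that triggers an alert

    def observe(self, errors_per_minute):
        anomalous = False
        if len(self.history) >= 10:          # wait for a minimal baseline
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(errors_per_minute - mu) / sigma > self.threshold:
                anomalous = True             # hook a real-time alert in here
        self.history.append(errors_per_minute)
        return anomalous

monitor = ErrorRateMonitor()
for sample in [4, 5, 3, 6, 4, 5, 4, 3, 5, 4, 5, 48]:  # spike at the end
    if monitor.observe(sample):
        print(f"Anomaly: {sample} errors/min")
```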

Conclusion

Protecting cloud infrastructures against cyberattacks needs to be an urgent priority for every organization going into 2021. IT and cybersecurity teams need practical, pragmatic strategies that deliver long-term results. Starting with Privileged Access Management (PAM) and Identity & Access Management (IAM), organizations need to design in a least privilege access framework that can scale across multi-cloud infrastructure.

Adopting customer-controlled keys to encrypt all data and designing in Zero Trust Security (ZTS) and micro-segmentation need to be part of the built core cloud infrastructure. Identifying, isolating and correcting configuration mistakes or errors in infrastructure helps to protect cloud infrastructures further. Keeping cloud infrastructure secure long-term needs to include a unified log monitoring system too. Selecting one that is AI- or machine learning-based helps automate log analysis and can quickly identify cloud infrastructure configuration and performance anomalies in real-time.

All of these strategies taken together are essential for improving cloud platform security in 2021.

Source: https://www.forbes.com/sites/louiscolumbus/2020/11/08/five-ways-cloud-platforms-need-to-be-more-secure-in-2021/?sh=a250c1c32960 08 11 20

Europe Focuses on 6GHz Regulation, While Wi-Fi 7 Looms Beyond

4 Nov

European regulators have been coming under increasing pressure regarding the slow pace at which the necessary lower 6-GHz band for Wi-Fi 6 and its successor, Wi-Fi 6E, has been made available in the region.

At the same time, standards setters at the IEEE 802.11 committee have been making good progress in finalizing specifications for the next stage in wireless LANs. A working group has recently released detailed technical criteria for what is now referred to as 802.11be, but which is widely expected to be designated Wi-Fi 7 when the technology becomes a reality, now expected to be late 2024.

Let’s focus on the positive first. The technologists and standards setters working to define 802.11be [or Extremely High Throughput (EHT)] have set themselves hugely ambitious goals so as to meet the ever-increasing connectivity demands as well as ensuring the sector makes even more efficient use of the spectrum.

The Wi-Fi standards roadmap.

The developing standard targets higher data rates, lower latency, better power (and cost) efficiency, improved interference mitigation and higher capacity density — achieving all of these incremental improvements together is going to be tough. As is the ability to meet the mid-2024 target date set by the committee for publishing the 802.11be amendment, so that certification and interoperability tests can commence under the auspices of the Wi-Fi Alliance by the end of that year.

Of course, as with previous iterations of the WLAN, pre-certified end-user gear is likely to appear before late 2024, as is happening now with Wi-Fi 6 and will soon happen with the next generation to follow — Wi-Fi 6E.

And backward compatibility with previous generations of the WLAN will need to be ensured for a smooth transition to the next generation.

The standards for 802.11be (let’s call it Wi-Fi 7) will still be based on OFDMA, but some key advances are expected that should allow the option to deploy 4096-QAM.

An improved MU-MIMO is being specified (to date referred to as “cooperative” CMU-MIMO), designed to support the defined 16 spatial streams, double the number used in Wi-Fi 6. This is expected to increase throughput by 20%, but as noted, this will be offered as an option, and lower modulation schemes will continue to be supported.


The standards setters suggest that making this work could turn out to be perhaps the biggest design challenge for Wi-Fi 7.

The maximum channel size being targeted is 320MHz, also double that used in Wi-Fi 6, such that Wi-Fi 7 will be targeted for deployment in the 6GHz band, the most recent part of the spectrum added for unlicensed use (at least in some countries — see later) and supported by Wi-Fi 6E. Doubling the maximum channel should also double the throughput for Wi-Fi 7. In addition, the specifications will also support 160+160 MHz, 240+80 MHz and 160+80 MHz channels so as to combine non-adjacent spectrum blocks.

Multi-link operation is also expected to be mandated for Wi-Fi 7. This will allow devices to simultaneously receive and/or transmit data across different channels or bands, with separation of data and control planes. This is what will give Wi-Fi 7 the ability to significantly increase the throughput to multiple devices, lower the latency and thus offer higher reliability.

These advances are expected to push maximum data rates to a theoretical 46 Gbps. A more realistic figure anticipated by the standards setters is about 30 Gbps for real-world deployments, where capacity is shared across numerous devices.
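The 46-Gbps headline figure follows directly from the draft PHY parameters. Here is a back-of-envelope check using the commonly cited 802.11be numbers (3,920 data subcarriers in a 320-MHz channel, 12 bits per symbol for 4096-QAM, a 5/6 coding rate and a 13.6-µs OFDM symbol including a 0.8-µs guard interval):

```python
# Theoretical 802.11be (Wi-Fi 7) peak PHY rate, back-of-envelope.
data_subcarriers = 3920   # 320-MHz channel: 4 x 980 per 80 MHz
bits_per_symbol = 12      # 4096-QAM
coding_rate = 5 / 6       # highest standard code rate
symbol_time_s = 13.6e-6   # 12.8-us OFDM symbol + 0.8-us guard interval
spatial_streams = 16

per_stream = data_subcarriers * bits_per_symbol * coding_rate / symbol_time_s
print(f"{per_stream * spatial_streams / 1e9:.1f} Gbps")  # -> 46.1 Gbps
```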

Of course, by the time all this is commercialized, the 6 GHz band will already be widely used for other wireless services, not least 5G cellular. It is anticipated that the Automated Frequency Coordination (AFC) system under development will be a work-around for this and will ensure efficient spectrum sharing.

And a recent technology brief from Monica Paolini, founder of networking consultancy Senza Fili and supported by Intel, noted that “Wi-Fi 7 brings more flexibility and capabilities to enterprises,” extending the reach of wireless LANs.

She stresses the two networking technologies will need to work together “to introduce edge computing, distributed and cloud architectures, virtualization and digitalization in the emerging private wireless networks” as well.

Paolini notes that Wi-Fi 7 will also play a major role in supporting applications that require deterministic latency, high reliability and improved QoS.

Still on the impact on enterprises, the improved Wi-Fi should also offer even greater opportunities in IoT and IIoT applications such as industrial automation, surveillance, remote control, AR/VR and other video-based applications.

Paolini also organized and moderated a webinar late last week, in collaboration with the Wi-Fi Alliance, focusing on progress of allocating Wi-Fi spectrum in Europe.

Andreas Geiss, Head of Unit for Radio Spectrum Policy in GD CONNECT of the European Commission, a “special guest” during the webinar, was put on the spot. Host Alex Roytblat, VP of Worldwide Regulatory Affairs at the Wi-Fi Alliance, referenced the landmark ruling of the US FCC to release about 1200 MHz of the 6GHz bands for Wi-Fi 6 and the follow-up Wi-Fi 6E, as well as other more recent moves in the same direction by authorities in South Korea and the UK, and urged Geiss to clarify progress in Europe in this area.

Geiss noted that the process was “very convoluted” as the discussions involve not just the 27 countries in the European Union but all 48 countries within the CEPT (Conference of European Posts and Telecommunications), the regulatory body representing all European countries in all matters relating to telecommunications.

Geiss also stressed that “being limited to on-line meetings over the past months rather than the much easier route of face-to-face meetings has not helped in trying to reach consensus.”

But he then revealed that European regulators are targeting April 2021 as the date for releasing 500 MHz (between 5945 MHz and 6425 MHz) for Wi-Fi use. “We hope to finalize our proposals by the end of November,” said Geiss.

He stressed that after this these proposals “will need to be looked at by other harmonization bodies, including, importantly, the European Radio Spectrum Committee (RSC) that I also chair. It is very important that we get this harmonization effort right.”

Here, it might be apposite to precis the continent-wide rules for making such important decisions as the release of the 6GHz band.

The CEPT’s Electronic Communications Committee (ECC) is, according to Geiss, expected to approve the draft of the Working Group’s proposals for 6GHz regulation very shortly. This will then have to be rubber-stamped by the wider ECC plenary planned for mid-November.

The proposals are then passed up to the European Commission to organize approval by all the countries of the European Union, with input from specialist groups such as the RSC.

If there is broad agreement, the group should then review and adopt the proposals by December, which will then have to be approved by all member states by, as Geiss suggests, next April.

One of the most contentious issues during the discussions within the ECC apparently relates to protection of the so-called CBTC (Communications Based Train Control) signalling systems that are used by many European train operators.

But stay with us as there is yet another twist in the process. Current rules in the EU oblige all member states to transfer the approved rules into national regulation within six months after adoption. But the wider CEPT rules state countries can take up to two years to fully implement the rules. The expectation, though, is that all countries will in fact nod the new rules through within the six months.

And both groups are expected to adopt harmonized Low Power Indoor (LPI) and Very Low Power (VLP) versions of the 6GHz regulations.

The differences between the two categories of equipment will be transmission power and portability. LPI gear will only be allowed to be deployed inside buildings and have access to the full 480 MHz, while VLP equipment will be sanctioned for both indoor and outdoor use. The spectrum for that will be divided into two categories — 400 MHz and 80 MHz.

Most installations are expected to come under the LPI umbrella, while the newly devised VLP version will focus on consumer applications such as VR/AR glasses and other applications that can be connected to smartphones.

Another of the panel speakers, Chris Szymanski, director of product marketing and government affairs at Broadcom, was keen to get an insight into when the upper part of the 6 GHz band would become available for use.

“We are open to studying this aspect next, but we do need to make more studies into this area, notably regarding connectivity interference issues — notably with 5G operations,” noted Geiss.

“Sharing that spectrum in the correct way is one of the key issues for achieving the European target for a ‘gigabit society’,” said Geiss. “But for now, member states want to focus on ensuring they can make best use of the lower 6GHz region, and ensuring there are no mitigation issues.”

Szymanski welcomed the progress in regulation, which he said “is set to offer a huge opportunity for companies like ours who are readying the components and end-user equipment at pace for Wi-Fi 6 and soon Wi-Fi 6E.

“Yes, at times it has been challenging and frustrating, but we are getting there.”

Source: https://www.eetimes.com/europe-focuses-on-6ghz-regulation-while-wi-fi-7-looms-beyond/# 04 11 20

5G Evolution Supports a New Wave of Wireless Services

27 Sep

Expanding 5G/6G connectivity to include network-to-smart device communications, combined with AI and IoT, will usher in a new industrial wave and offer greater business value for both industry and society.

The release of each new generation of wireless technology every decade or so has enabled mobile communication to progress considerably since the first portable phones appeared in the 1980s. Technical advances have created new services and business opportunities, driving what’s being referred to as the third wave of communications. The evolution made possible through 5G and future 6G technology will support even more new services for industry and society, well into the 2030s and beyond (Fig. 1).

1. There have now been five generations of mobile communications technology and services, with a sixth on the way.

5G represents the first step toward this next wave of services with expanded connectivity and a significant upgrade in multimedia capabilities combined with artificial intelligence (AI), machine learning (ML), and the Internet of Things (IoT). 5G will be the first generation of mobile communications to utilize millimeter-wave (mmWave) band frequencies, supporting bandwidths of several hundred megahertz and actualizing ultra-high-speed wireless data communications of many gigabits per second.

This article examines the expansion of 5G/6G wireless communications to new areas of service that will drive another industrial wave and offer greater business value for industry and society.

The Third Wave of Wireless Communications

5G and future systems will close the gap between the physical and cyber worlds. Today, mobile consumers use wireless connectivity to access the web from almost any location. In the future, high-speed coverage will be more widespread and faster, with greater emphasis on uplinking information from real-world events, whether human or IoT, to the internet.

Once this information is in the cloud, AI could reproduce the real world in cyberspace and emulate it beyond physical, economical, and time constraints, so that “future prediction” and “new knowledge” can be discovered and shared. The role of wireless communications in this cyber-physical fusion is assumed to include high-capacity and low-latency transmission of real-world images and sensing information, and feedback to the real world through high-reliability and low-latency control signaling.

Radio communications in this cyber-physical fusion scenario correspond to the role of the nervous system transmitting information between the brain and the body. Communications convert real-world events to the cyber world through enhanced uplink capabilities and provide feedback information to humans and devices through low-latency downlink functionality.

2. These are some of the new business services that will be enabled with the third wave of communications, which has been initiated with the advent of 5G.

The next wave of communications focuses on three areas of service (Fig. 2) including:

  • Enhanced mobile broadband (eMBB), which extends the current mobile experience with high data throughput on the order of more than 10 Gb/s, high system capacity on the order of more than 1000X that of LTE, and a much better spectral efficiency (3-4X) than LTE. Its use cases are high-speed mobile broadband, virtual reality, augmented reality, gaming, and more.
  • Ultra-reliable, low-latency communications (URLLC), which focuses on achieving low latency, high reliability, and high availability. The expectation is for latencies of less than 1 ms. This is basically for mission-critical use cases and applications.
  • Massive machine-type communications (mMTC), which provides connectivity to a huge number of devices whose traffic profile is typically a small amount of data (spread) sporadically. Consequently, latency and throughput aren’t a big concern. The main concern is the optimal power utilization of those devices because they’re battery-powered and the expectation of battery life is around 10 years or so.
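Read together, the three categories carve up a requirements space, and which one an application lands in follows almost mechanically from its latency, throughput and device-count needs. The toy classifier below makes that explicit; the latency bound comes from the list above, while the other thresholds are purely illustrative:

```python
def classify_service(throughput_gbps, latency_ms, device_count):
    """Toy mapping from application requirements to a 5G service category."""
    if latency_ms <= 1:           # mission-critical, ultra-low latency
        return "URLLC"
    if device_count >= 100_000:   # massive fleets of low-rate, battery-powered sensors
        return "mMTC"
    if throughput_gbps >= 1:      # bandwidth-hungry media, VR/AR, gaming
        return "eMBB"
    return "conventional mobile broadband"

print(classify_service(4, 20, 10))               # eMBB: AR/VR streaming
print(classify_service(0.001, 0.5, 1))           # URLLC: industrial control
print(classify_service(0.0001, 500, 1_000_000))  # mMTC: metering fleet
```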

Current activity in mmWave front-end design, including antenna-in-package (AiP) phased arrays (Fig. 3), large-scale beamforming RF integrated circuits (RFICs), multi-technology integration, and system-level electromagnetic (EM) analysis will all contribute to realizing New Radio (NR) access technology that can be cost effective and easy to install. This will support the small-cell networks that achieve 5G/6G performance.

3. The images depict dual-polarized AiP phased-array technology designed in Cadence AWR Design Environment software.

Early 5G deployment and related trials have shown room for improvement in coverage and uplink performance in non-line-of-sight (NLOS) environments and heavy traffic use cases. While future system performance goals are still in the early phases of consideration, full realization of the promise of this next wave of communications creates a need for continued enhancements. Extreme performance is necessary to provide the high reliability and low latency called for by mission-critical (time-sensitive) applications such as self-driving vehicles.

6G will implement many different technologies to achieve its performance goals. Among them will be new topologies of overlapping cells with distributed networks of beamforming antennas controlled by AI and ML to dynamically select optimum transmission paths. Previous cellular communications were based on networks of hexagonal cells spaced far enough apart to avoid signal interference with neighboring cells. In the future, there will be overlapping cells that can be dynamically reconfigured. In addition, coverage will expand through space, sea, and high-altitude drones.

New functionality in spatial multiplexing and massive multiple-input multiple-output (mMIMO) are under investigation, including the use of reflective surfaces and metamaterials to manage signal propagation in crowded urban environments with limited LOS. 6G may employ a spatially non-orthogonal, overlapped, and dynamic topology to increase path selection. Beam control through AI and ML will help reduce intercell interference at a cost of complexity.
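As a cartoon of what dynamic path selection means, the sketch below greedily assigns each user the candidate beam with the best effective signal quality while penalizing beams that are already loaded, one crude way to trade throughput against intercell interference. A real 6G scheduler would learn such decisions with AI/ML rather than a fixed heuristic, and every number here is invented:

```python
def select_beams(users, sinr_db, penalty_db=3.0):
    """Greedy toy scheduler: best effective SINR, penalizing loaded beams."""
    load, assignment = {}, {}
    for user in users:
        best_beam, best_score = None, float("-inf")
        for beam, sinr in sinr_db[user].items():
            score = sinr - penalty_db * load.get(beam, 0)  # crowding penalty
            if score > best_score:
                best_beam, best_score = beam, score
        assignment[user] = best_beam
        load[best_beam] = load.get(best_beam, 0) + 1
    return assignment

sinr = {"ue1": {"b1": 18.0, "b2": 12.0},
        "ue2": {"b1": 17.0, "b2": 15.0}}
print(select_beams(["ue1", "ue2"], sinr))  # ue2 is steered to b2 once b1 is loaded
```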

Besides reconfiguring networks, much of the 5G/6G focus will be on physical design of radio access front ends. Strategic design partitioning, leveraging of optimal semiconductor processes, and multi-fabric assemblies will undoubtedly be utilized, calling for a range of simulation technologies, design and manufacturing flows, and tool interoperability. These trends will require software design support for co-design and co-optimization of next-generation wireless electronic systems across multiple domains—including RF, analog, and digital simulation—aided by large-scale EM and thermal analysis, and robust design verification and signoff.

A common RFIC and system-in-package (SiP) design challenge is concerned with how RF intellectual-property (IP) blocks are subject to EM, capacitive, supply, and substrate coupling. These potentially harmful interactions result from effects that occur with high-frequency signaling. Noise, either into or out of a block, can travel through the substrate or through power bussing. Placement (floorplanning) of noise-generating and noise-sensitive blocks is design critical. Parasitic coupling capacitance within/between RF blocks, and signals routed nearby (100s to 1000s of microns away) can cause disrupted performance. Self-inductance of any nets connected to the RF block, and mutual inductance between any net within an RF block to nets in neighboring blocks, is another concern.

Using higher-frequency bands in 6G (94 GHz to 3 THz) will help reduce the size of these antennas, making efforts to shrink component footprints easier. However, the antennas, feed networks, and package interconnects will all be more susceptible to parasitics and unintended coupling (crosstalk), requiring rigorous EM analysis and design verification. This move to higher spectrum will lead to a wide range of design and integration challenges.

Need for Wireless Technology Improvements

5G deployment and related trials have shown there’s room and need for improvements to the coverage and uplink performance of NLOS environments and heavy-traffic use cases. To fully achieve the promise of this next wave of communications, we’ll need continued enhancements to guarantee the high reliability and low latency necessary to close the gap between the cyber and physical worlds. 5G evolution will concentrate on better uplinking and toward more delivery guarantees, as opposed to “best effort,” with a focus on URLLC (Fig. 4).

4. Integration of beamsteering RF front-end module technologies will improve uplink data rates for 5G/6G devices.

Achieving URLLC performance with <1-ms latency and up to "nine nines" (99.9999999%) mission-critical reliability (for factory operational technology) in networks based on edge cloud computing will require a massive number of small-cell radio access points and distributed antennas with AI/ML-controlled beamsteering. Implicit in this vast deployment of non-orthogonal networks is the need to drive down the cost of beamforming antenna arrays and complex receivers, especially if faster-than-Nyquist signaling techniques are applied, while improving uplink technology well beyond today's best effort.
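
As a back-of-the-envelope illustration of what "nine nines" implies, the sketch below converts an availability figure into permissible downtime per year (plain arithmetic, not from the article):

# Allowed downtime per year for a given availability target.
SECONDS_PER_YEAR = 365 * 24 * 3600

def downtime_s_per_year(availability: float) -> float:
    return (1.0 - availability) * SECONDS_PER_YEAR

for label, a in [("five nines", 0.99999), ("nine nines", 0.999999999)]:
    print(f"{label}: {downtime_s_per_year(a):.3f} s/year")
# five nines: ~315 s/year (about 5 minutes); nine nines: ~0.03 s/year (about 32 ms)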

Conclusion

A key goal of 5G has been to expand the reach and value of wireless technology beyond the individual mobile subscriber in support of mMTC and URLLC. Expanding connectivity to include network-to-smart device communications, combined with AI and IoT, will usher in a new industrial wave and offer greater business value for both industry and society.

Source: https://www.mwrf.com/technologies/software/article/21142848/5g-evolution-supports-a-new-wave-of-wireless-services 27 09 20

Drones are transforming how Telcos inspect towers

27 Jul

How telcos are increasingly using drones and what this could mean for edge computing.

Drone technology offers a powerful way for enterprises to conduct remote asset inspections during and after the Covid-19 pandemic. The demand for drones among enterprises is forecast to continue to grow over the next few years, with Gartner predicting that shipments of enterprise drones will reach 1.3 million by 2023.

Telcos and MNOs are already leveraging drone technology to automate cell tower inspections, boost operational efficiency, and accelerate the rollout of 5G infrastructure. As connectivity improves and automation increases, we can expect to see drones at the edge, completing autonomous missions, and uploading data directly to the cloud, bringing substantial business benefit to telcos and other enterprises.

Edge computing is a distributed computing framework that makes it possible to process data close to where drone sensors capture it. Sitting at the intersection of computing and connectivity, the edge lets drones process data far more efficiently and reliably than legacy, cloud-only connectivity allows. It provides lower latency and better use of computing and network resources to support applications in the drone-based Internet of Things (IoT) ecosystem.

With edge computing capabilities, drones can automate data management and stream data directly to the cloud for processing and analytics. Edge computing also makes it possible to upload data directly from the drone’s location, for example at a cell tower or local edge data center.
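
As a hedged illustration of that pattern (filter and summarise at the edge, ship only the useful payload to the cloud), here is a minimal Python sketch; the thresholds, field names, and on-board scoring model it assumes are hypothetical:

from dataclasses import dataclass
from typing import Optional

@dataclass
class Frame:
    tower_id: str
    rust_score: float  # assumed output of an on-board defect model
    tilt_deg: float    # measured antenna tilt deviation

RUST_THRESHOLD = 0.8  # illustrative threshold only
TILT_TOLERANCE = 2.0  # degrees of acceptable tilt deviation

def triage(frame: Frame) -> Optional[dict]:
    """Return an alert payload worth uploading, or None to keep data at the edge."""
    issues = []
    if frame.rust_score > RUST_THRESHOLD:
        issues.append("possible corrosion")
    if abs(frame.tilt_deg) > TILT_TOLERANCE:
        issues.append(f"antenna tilt off by {frame.tilt_deg:.1f} deg")
    return {"tower": frame.tower_id, "issues": issues} if issues else None

frames = [Frame("T-101", 0.92, 0.4), Frame("T-102", 0.10, 3.1), Frame("T-103", 0.05, 0.2)]
alerts = [a for f in frames if (a := triage(f)) is not None]
print(alerts)  # only T-101 and T-102 reach the cloud; raw video stays local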

These developments have major implications for drone delivery and inspection, giving enterprises and public-safety officials the ability to deploy more drones on complex use cases that generate vast amounts of data and depend on edge connectivity.

The distributed edge is fast becoming a reality as Microsoft Azure and other cloud service providers push native clouds to more granular edge positions. Meanwhile, private LTE is accelerating globally, enabling enterprises and other non-traditional Mobile Network Operators (MNOs) to utilise edge-connected clouds to support their drone operations.

We are seeing global MNOs and native cloud vendors intersecting with full-stack, fully standards-compliant enterprise LTE networks that leverage identical native cloud platforms. This combination of industrial MNO edge and DIY enterprise edge will drive new workflows and business models for broad drone adoption.

Tower inspections: a first use case

Today’s drone inspections of cell towers use automation to great effect: flight planning, data capture, and data processing are all automated. Drone automation software enables telcos and MNOs to create high-precision digital twins of their infrastructure, establish precise inventories of their base station equipment, measure their antenna tilts, spot rust, and plan tower maintenance remotely at scale.

That portfolio intelligence helps towercos and MNOs make smart business decisions, including accelerating the physical rollout of 5G and other new networks. For example, Rakuten Mobile is using drone operations management & visualisation software to inspect base station sites with drones as it rolls out the world’s first fully virtualised mobile network.

Edge computing is necessary for drone enablement at scale. There is no 5G without edge, but edge can be deployed as a precursor to broad-based 5G in conjunction with 4G connectivity and private LTE (P-LTE). Towercos and MNOs enjoy an inside-out ROI equation: drone operations on their own infrastructure can drive enormous savings and create new revenue.

That starts with creating the automation for 5G deployments and will ultimately result in a full Enterprise Resource Planning (ERP) system for the broader mobile industry. With configuration for low altitude connectivity, 4G and 5G networks will enable a high volume of drone operations, facilitate greater data exchange, and enable edge cloud computing at very low latency.

Mobile edge will make beyond visual line of sight (BVLOS) operations, dynamic deconfliction, and ultimately urban air mobility (UAM) possible. With 5G, drone inspections at the edge will become the norm, unlocking true automation, generating instant insights, and allowing for automated tasking based upon independent systems and sensors. These operational benefits will eventually equate to significant cost savings for MNOs and tower inspection companies.

The next generation: drone-in-a-box

Drone-in-a-box represents the next generation of drone technology. Towercos will lease prime edge real estate to data center operators, enabling edge cloud and real time management of drones-in-a-box. Those drones will be deployed at local edge positions around the world and autonomously conduct missions in their vicinity, including tower inspections and many other shared commercial missions. They’ll then return to their towers, where they can recharge and continue to send data to the cloud. It’s a fully automated solution that makes the most of edge computing and mobile connectivity and will give enterprises access to low-cost drone and edge computing resources.

Long-term edge applications

Drones at the edge will become a key part of our infrastructure and will be used to complete missions in the public interest, such as infrastructure inspections or public-safety missions. We live in an increasingly shared-resource economy. Drone services are valuable shared resources that can be used by multiple stakeholders, including public safety, emergency services and enterprises across industries such as construction, utilities, insurance, real estate, and agriculture. These stakeholders will leverage the same physical connectivity, edge cloud data centers, and drone equipment to bring the benefits of automation to their workflows.

In many cases this full stack is justified by the initial use case, like cellular tower inspection, with additional utility and use cases coming for free. Drones at the edge represent a new frontier of automation technology that will revolutionize telcos, transform industrial inspections, and create exciting new use cases that serve society and industry alike.

Source: https://telecoms.com/opinion/drones-are-transforming-how-telcos-inspect-towers/ 27 07 20

The “Next Big Thing” in Technology: 20 Inventions That Will Change the World

23 Jul

I wrote an article titled “I Compiled a List of Tech’s “Next Big Things” So You Wouldn’t Have to” in 2018. As anyone reading it in the 2020s will notice, a lot of what was written then is now obsolete. It is thus necessary to write an update, highlighting the key technologies emerging today that will be all the rage in 2022, 2025 and the 2030s.

Obviously, these dates should be taken with a grain of salt: predictions are wrong more often than not. They are often wrong because we tend to use history, which is at heart the study of surprises and changes, as a guide to the future. This should however not stop us from aiming to better understand the future of technology: the knowledge gained through planning is crucial to the selection of appropriate actions as future events unfold. We don’t know the answer, but we can at least ask useful questions and catalyse the conversation.

The boring, expected stuff (2022 technologies)

1. Blockchain

By now, we’ve all heard about blockchain revolutionising just about every industry imaginable. Banking, politics, healthcare… all could technically benefit from the creation of a decentralised digital ledger which tracks and stores information across many places at once, making forgery practically impossible. Identification is provided through complex calculations, making identity theft virtually impossible, too.
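
The core mechanism is easy to sketch: each block commits to the hash of its predecessor, so altering any past record breaks every later link. A minimal toy version in Python (illustrative only; a real blockchain adds consensus, signatures and a peer-to-peer network):

import hashlib, json

def make_block(data: dict, prev_hash: str) -> dict:
    body = {"data": data, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

chain = [make_block({"genesis": True}, "0" * 64)]
chain.append(make_block({"from": "alice", "to": "bob", "amount": 5}, chain[-1]["hash"]))

# Tamper with the first block and the chain no longer verifies:
chain[0]["data"]["genesis"] = False
recomputed = hashlib.sha256(json.dumps(
    {"data": chain[0]["data"], "prev_hash": chain[0]["prev_hash"]},
    sort_keys=True).encode()).hexdigest()
print(recomputed == chain[1]["prev_hash"])  # False: the forgery is detectable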

There is however one word which stands out in the description above. Decentralised. Banks, governments, hospitals… these institutions don’t want to see their power curtailed (unless on their own terms). As such, it is likely that we will see some advances in the blockchain space, but it will remain on the fringes of technology, missing the revolution predicted by its (many) fans.

More on blockchain here [Investopedia].

2. Cryptocurrency

Often mentioned in the same breath as blockchain, cryptocurrencies use the principles explained above to facilitate the exchange of goods and services online (again in a decentralised fashion, which is one of its main appeals).

Sounds fantastic, but there are two big issues with cryptocurrencies:

  • Its key appeal (excluding illegal dealings) is that it’s cool and trendy. It was never meant to sustain the attention it got in 2017, and will never recover from the crypto-bros’ unrelenting idiocy. The technology works; there’s just no mass market.

  • Secondly, its value is VERY subjective (unlike gold, don’t @ me). Cryptocurrencies are always either in pre-bubble or bubble territory. Add to that the decentralised aspect that governments and banks will seek to discredit, and you can be sure that it will continue to be a mere toy Chad keeps bringing up at frat parties (there’s always a Chad).

More on cryptocurrency here [Investopedia].

3. Affective AI / Affective computing

Artificial Intelligence is already everywhere in 2020; it’s just not as fun as we thought it’d be. If you’ve missed the AI train, it can be described as follows: the increase in storage space (cloud), calculation capabilities (chips) and access to massive datasets (e-commerce, social media…) has allowed companies to create statistical models on steroids which can evolve when fed new information.

Affective AI would take this process one step further and apply it to emotions. Effectively, an algorithm could tell your mood from the way you look (by training a deep learning algorithm on facial data), the way you write, or the way you speak, and offer a product or service accordingly. Feeling happy? How about a Starbucks advert for a frappuccino to keep the good times coming? Feeling down? How about a Starbucks advert for a frozen coffee to turn that frown upside down?
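
Under the hood this is ordinary supervised learning with emotion labels. A toy text-mood classifier with scikit-learn (the training phrases and labels below are invented for illustration):

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["what a great day", "i love this", "feeling awful today",
         "this is terrible", "so happy right now", "everything went wrong"]
moods = ["happy", "happy", "sad", "sad", "happy", "sad"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, moods)  # same statistical machinery, applied to emotions

print(model.predict(["today was wonderful"]))  # likely ['happy']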

More on Artificial Intelligence here [The Pourquoi Pas], and more on affective computing here [MIT].

4. AI Cloud Services / Data-as-a-service / AI PaaS

Most great technologies aren’t considered to be revolutionary until they reach the public en masse (forgive my French). This may be one of the reasons there’s been so much disappointment in AI of late. Indeed, only major companies have been able to benefit from automating tasks that once required human input, while the general public is forced to keep using comparatively medieval algorithms. This can partly be explained by a lack of computing power within individual households, but it is mostly a data problem.

This may not be the case for long. Companies are realising that renting out an algorithm brings the double benefit of generating extra revenue from an existing asset and extracting more data from customers to feed the beast. As such, get ready to witness the rise of AI platforms and marketplaces, which will promise algorithms that specifically match unique customer pain points (chatbots and digital assistants are only the beginning). As devs get automated and join the gig economy, this movement is likely to expand exponentially. This would allow smaller companies, and even individuals, to optimise their day-to-day processes. If that seems harmful to our collective mental health, follow your instincts.
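
Mechanically, “renting an algorithm” is just a metered HTTP call: you send data to a hosted model and pay per prediction. A hedged sketch; the endpoint, key, and response shape below are entirely hypothetical:

import requests

API_URL = "https://api.example-ai-platform.com/v1/predict"  # hypothetical endpoint
API_KEY = "sk-..."  # placeholder credential

def predict(features: dict) -> dict:
    """Call a hosted model; the provider meters and bills each request."""
    resp = requests.post(API_URL,
                         json={"inputs": features},
                         headers={"Authorization": f"Bearer {API_KEY}"},
                         timeout=10)
    resp.raise_for_status()
    return resp.json()

# e.g. predict({"monthly_spend": 42.0, "visits": 7}) might return {"churn_risk": 0.18}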

More on AI as a service here [Towards Data Science].

5. Connected Homes / Smart Homes

The trend of artificial intelligence in our homes is already ongoing, and will only accelerate over the next few years. In fact, we’ve already become accustomed to Google’s Nest and Amazon’s Alexa being able to adjust the settings of smart objects within our houses to fit pre-set parameters.

But these two use cases are just the beginning: as with most things internet-related, these services benefit from network effects, and will exponentially gain customer value as functionalities are added. An algorithm that can make a cup of coffee while opening the blinds and increasing the bathroom temperature when it senses someone waking up is a lot more valuable than the sum of three different algorithms doing these tasks.
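
The orchestration itself is little more than an event bus: one “wake-up” event fans out to several device actions. A minimal Python sketch (the devices and actions are invented for illustration):

from collections import defaultdict
from typing import Callable

handlers: dict = defaultdict(list)

def on(event: str):
    def register(fn: Callable[[], None]):
        handlers[event].append(fn)
        return fn
    return register

@on("wake_up")
def start_coffee(): print("coffee machine: brewing")

@on("wake_up")
def open_blinds(): print("blinds: opening")

@on("wake_up")
def warm_bathroom(): print("thermostat: bathroom set to 22 C")

def emit(event: str):
    for fn in handlers[event]:
        fn()

emit("wake_up")  # one sensed event, three coordinated actions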

More on connected homes here [McKinsey]

6. 5G

Of course, connected objects cannot afford to be as laggy as the original iPhone (shots fired): they must transmit massive amounts of data quickly and reliably. That’s where 5G comes in.

5G is the logical successor to 4G, and achieves much greater speeds thanks to higher-frequency radio waves. Though this seems simple enough, a few terms have to be understood to fully capture the difficulty of implementing 5G throughout the world.

  • Millimeter waves: this refers to a specific part of the radio-frequency spectrum between 24 GHz and 100 GHz, which has a very short wavelength. Not only is this section of the spectrum largely unused, it can also transfer data incredibly fast, though over shorter distances.

  • Microcells, femtocells, picocells: small cell towers that act as relays within comparatively small areas such as large buildings. This infrastructure is necessary: as highlighted above, 5G’s transfer distance is much shorter than 4G’s (and struggles to go through thick walls).

  • Massive MIMO: the ability to transfer and receive much more data than when using 4G, from a wider variety of sources.

  • Beamforming: all these transfers need to be organised and choreographed. Beamforming does just that. Also, it sounds cool.

  • Full Duplex: the ability to send and receive data at the same time, on the same wavelength.

The technology will have a huge effect on most industries, as it delivers order-of-magnitude gains in the speed and quantity of data transmitted, as well as in the quality of the connection. It will, among other things, connect autonomous vehicles and drones to the internet, and enable major advances in virtual reality and IoT. 5G is therefore not a technology to be taken lightly.
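
Most of the headline speed gain falls out of the Shannon capacity formula, C = B·log2(1 + SNR): millimeter-wave bands simply offer far more bandwidth B. A quick comparison (the 20-dB SNR is an illustrative assumption):

import math

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon limit C = B * log2(1 + SNR), in Gb/s."""
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr) / 1e9

print(shannon_capacity_gbps(20e6, 20))   # ~0.13 Gb/s for a 20-MHz 4G channel
print(shannon_capacity_gbps(400e6, 20))  # ~2.66 Gb/s for a 400-MHz mmWave channel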

More on 5G here [CNN]

7. Mega-constellations of satellites / Low-earth orbit satellite systems

Speaking of the Internet… Over the next few years, SpaceX plans to deploy up to 42,000 satellites to provide an Internet connection anywhere on the planet. The company isn’t alone in this niche: the OneWeb constellation aims to include 600 satellites by 2022, and Amazon has announced plans to launch 3,236 low-orbit satellites to cover dead zones (areas without coverage).

All this is made possible by the falling cost of launching these small satellites, which are far lighter and cheaper than traditional geostationary craft. A lower altitude also makes managing, and eventually deorbiting, the fleets a lot easier and cleaner.
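
A hedged aside on why the altitude matters: orbital period (and round-trip latency) falls with altitude, per the textbook relation T = 2π·√(a³/μ). A quick check in Python:

import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000        # mean Earth radius, m

def orbital_period_min(altitude_km: float) -> float:
    a = R_EARTH + altitude_km * 1000  # circular orbit radius
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

print(orbital_period_min(550))     # ~95.6 min (a typical Starlink shell)
print(orbital_period_min(35_786))  # ~1436 min (geostationary, for contrast)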

The deployment in space of so many objects, however, poses problems in terms of interference with other satellite services, increasing the risk of collision and disturbing astronomical observation.

More on mega-constellations here [The Verge]

The pretty OK stuff (2025 technologies)

8. Autonomous Vehicles

2020 was supposed to be the year of the autonomous car. That’s not worked out quite as expected. The “coronavirus setback” will not, however, dampen large companies’ spirits: they will continue to update their algorithms to create cars that do away with drivers entirely.

As a quick reminder, it is generally agreed that there are six levels of autonomous driving, from level 0 (“no automation”) to level 5 (“full automation”). Levels 0 to 2 require extensive human monitoring, while levels 3 to 5 rely on algorithms to monitor the driving environment. The most advanced consumer systems on the market (such as Tesla’s) are generally classed at level 2, with level-3 features only beginning to appear. It is hoped that we can make the jump to level 5 (and full driving automation) by 2025, if not earlier. But the road ahead is long, as issues ranging from ethical dilemmas to statistical headaches still plague the industry.

Even if level 5 is reached, it’s likely that we will never truly replace the car as we know it, but instead create special roads and spaces for autonomous cars, so that the two don’t mix. Indeed, the car as we know it is so central to our daily lives that changing it may mean rebuilding most of our daily world: parking would become less important, charging stations would change, and the way pedestrians interact with safer roads would be forever altered…

More on Autonomous vehicles here [Spectrum].

9. Quantum computing

First things first: scientists have been announcing the arrival of the quantum computer for over 50 years. But this time might be it. In October 2019, Google announced that it had achieved quantum supremacy (superiority of a quantum computer compared to a conventional computer on a particular task) by performing in three minutes a calculation which would require approximately 10,000 years on a conventional supercomputer. These figures were challenged by IBM, which estimates that a conventional computer program could have solved it in just 2.5 days.

Quantum computers, where bits are replaced by qubits whose states can exist in superposition (e.g., a 0 can also be a 1 at the same time), are in theory much faster and more efficient than their classical counterparts, but tend to suffer from decoherence issues (loss of information). Nevertheless, developing them for pharmaceutical companies, for example, could theoretically lead to major breakthroughs in drug discovery.
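
Concretely, a qubit’s state is a two-component complex vector whose squared amplitudes give measurement probabilities; “a 0 that is also a 1” is just the superposition (|0⟩ + |1⟩)/√2. A tiny numpy sketch:

import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)  # equal superposition of |0> and |1>
print(np.abs(plus) ** 2)              # [0.5 0.5]: a 50/50 measurement outcome

# The Hadamard gate maps |0> into that superposition, and is its own inverse:
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1, 0])
print(np.allclose(H @ zero, plus))        # True
print(np.allclose(H @ (H @ zero), zero))  # True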

More interestingly, quantum computers could eventually break the cryptographic keys securing blockchains, making the whole thing irrelevant (did I not say earlier that Bitcoin was doomed?).

More on Quantum computing here [MIT Technology Review]. You can also take a Quantum computing class here [Qmunity].

10. Genetic predictions

The raw computing power highlighted above can be used to analyse one’s genome and predict one’s chances of getting conditions such as heart disease or breast cancer. If that sounds exactly like the plot of Gattaca, trust your instincts.

Regardless of the risks of genetic discrimination, DNA-based “predictions” could be the next great public-health leap. For example, if women at high risk of breast cancer got more mammograms and those at low risk got fewer, those exams might catch more real cancers and set off fewer false alarms, leading to better treatment rates and lower insurance premiums.
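
The arithmetic behind that claim is Bayes’ rule: at low prevalence even a decent test produces mostly false alarms, so concentrating screening on high-risk women raises the share of true positives. A worked sketch (the sensitivity, specificity, and prevalence figures are illustrative assumptions, not from the article):

def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' rule."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Assumed test: 87% sensitivity, 89% specificity (illustrative only).
for label, prev in [("average risk", 0.005), ("high genetic risk", 0.05)]:
    ppv = positive_predictive_value(prev, 0.87, 0.89)
    print(f"{label}: {ppv:.1%} of positive results are real cancers")
# average risk: ~3.8%; high genetic risk: ~29.4%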

It could also lead to the rise of personalised medicine, though delivering such care at scale would likely be a financial and logistical nightmare given the current political climate (at least in the US).

More on Genetic predictions here [MIT Technology Review].

11. CRISPR

Even if a Gattaca-like future doesn’t come about through genetic predictions, we might still create a similar situation through straight-up genetic engineering. CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) allows researchers to easily alter DNA sequences and modify gene function. Its many potential applications include correcting genetic defects, treating and preventing the spread of diseases, and improving crops.

Editing germlines to breed a “master race”, or engineering new viruses, is however a less fun prospect, should this technology fall into unethical hands. Either way, I look forward to a time when every man looks like a mix of Tom Hiddleston and Idris Elba.

More on CRISPR here [Live Science].

12. Human Augmentations / enhancements

Thankfully, going genetic is not the answer to everything. Sometimes, some good old ingenuity and robotics is enough to solve our issues.

Slowly but surely, we are seeing more natural, artificial, or technological alteration of the human body to enhance physical or mental capabilities, often in the form of bionic limbs. As we begin to better understand how the brain transmits information to the body, more and more companies will see the value of improving people’s lives (for a steep fee) and descend upon this space.

It’s very likely that beyond the arm and leg augmentations we’re already starting to see, there will be a point at which the back and the eyes are also augmented. Then, slowly but surely, augmentations will become elective, with interesting ethical implications.

More on human enhancements here [International Journal of Human-Computer Studies]

The Very Exciting Stuff (2030 Technologies)

13. Graphene

Though graphene has been over-hyped for many years, we’re finally seeing something good come out of it. If you haven’t paid attention to the hype: graphene is a single atomic layer of graphite, which is itself a crystalline form of carbon. It is extremely strong, yet extremely thin, light and flexible (stronger than steel, thinner than paper). Oh, and it also conducts electricity really well.

The applications are numerous, particularly in wearable electronics and space travel, where strength and weight are key considerations. Nevertheless, it will take many years to reach a wide array of use cases: we’ve built the world around silicon, and it’s very hard to displace that kind of well-established, mature technology.

More on graphene here [Digital Trends]

14. Edge Computing / Intelligent Edge

While the vast majority of data processing for connected devices now happens in the cloud, constantly sending data back and forth can take far too long (as much as a few seconds, sometimes). 5G is a partial answer, as mentioned above, but there is a simpler option: allowing objects to process data on their own, without relying on the cloud (at the “edge” of the ecosystem). This would resolve a wide variety of issues in manufacturing, transport, and healthcare, where split-second decisions are key to a variety of processes. Even fashion could benefit, by creating self-sufficient smart wearables.

As intelligent “things” proliferate, expect a shift from stand-alone intelligent objects to swarms of collaborative intelligent things that pool their computing power, working together either independently or with human input. The leading edge of this field is military: the use of drone swarms to attack or defend targets is under study. But the model could go much further, with hundreds of potential civilian uses.
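
A toy illustration of the swarm idea: each “device” takes a shard of a task and the partial results are merged, so no single node needs the full dataset or all the compute (a simple map-reduce pattern; the workload is invented):

import concurrent.futures

def device_process(shard):
    """Stand-in for on-device work, e.g. counting detections in local frames."""
    return sum(1 for x in shard if x > 50)

readings = list(range(100))                  # pretend pooled sensor workload
shards = [readings[i::4] for i in range(4)]  # split across 4 cooperating devices

with concurrent.futures.ThreadPoolExecutor() as pool:
    partials = list(pool.map(device_process, shards))

print(sum(partials))  # 49: identical to centralised processing of the full list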

The technology is nearly available, but as with other developments both above and below, we must first let the hardware capabilities catch up before implementing these ideas.

More on Edge computing here [The Verge]

15. Micro-chips / Bio-chips

The main idea behind today’s micro-chips (which are built from an array of molecular sensors on the chip surface that can analyze biological elements and chemicals) is tracking biometrics in a medical context. Use cases have also emerged within the smart-workspace technology ecosystem. The appeal could however be much wider if customers decide to put their trust in the technology (banking, for instance: imagine never having to bring your wallet anywhere ever again).

Unless everyone suddenly agrees to let their blood pressure be monitored daily at work, this type of tracking is likely to remain marginal in the near future. One might nevertheless imagine such chips becoming fairly common in hospitals.

More on micro-chips here [The Guardian].

16. Nanorobotics

For those wanting to go even smaller than micro-chips, allow me to introduce nanorobots. Currently in R&D phases in labs throughout the world, nanorobots are essentially very, very tiny sensors with very limited processing power.

The first useful applications of these nanomachines may very well be in nanomedicine. For example, biological machines could be used to identify and destroy cancer cells or deliver drugs. Another potential application is the detection of toxic chemicals, and the measurement of their concentrations, in the environment.

More on Nanorobotics here [Nature].

17. Smart tattoos

Tattoos that can send signals via touch to interact with the world around us make a lot of sense:

  • It’s wearable which allows for a greater freedom of movement

  • It tackles the issue of waste, which is seldom discussed when imagining the future of technology

  • It can be personalised, a trend towards which we’ve been moving for 15 years now.

In their current form, they would sit temporarily on the skin. They can however last much longer on prosthetics, and have the benefit of being cheap compared to a lot of the hardware out there.

More on Smart Tattoos here [Microsoft]

18. Green Tech

Do you want your great-grandkids to know what it’s like not to despise the sun? Then forget about all the above and concentrate on Green Tech: the science of making the world liveable. Because so much is being done in this space, we will skip the details and refer you to better sources below.

The issue with most of the above is that they tend to work well in theory, but their adoption cost is incredibly high, as they often struggle to scale. As much as we’d like to see all of them being implemented yesterday, the road ahead is still long.

More on Green Tech here [CB Insights]

19. Hydrogen fuel cells 

In a fuel cell, hydrogen combines with oxygen from the air to produce electricity, releasing only water. This in itself isn’t new: the principle was discovered in 1839. Up until a few years ago, however, the idea was not profitable enough to allow for large-scale use.
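
The electrochemistry pins down the ideal cell voltage: for the reaction 2H₂ + O₂ → 2H₂O, dividing the Gibbs free energy released per mole of H₂ (about 237 kJ) by the charge transferred (two electrons per H₂ molecule) gives roughly 1.23 V. A quick check with textbook values:

# Ideal (reversible) hydrogen fuel-cell voltage: E = -dG / (n * F).
DELTA_G = -237_130   # Gibbs free energy, J per mol H2 (liquid water, 25 C)
N_ELECTRONS = 2      # electrons transferred per H2 molecule
FARADAY = 96_485     # coulombs per mole of electrons

print(f"{-DELTA_G / (N_ELECTRONS * FARADAY):.3f} V")  # ~1.229 V; real cells run lower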

In fact, there are still some issues with the technology, as it’s easy to store a small amount of energy (hence its use in the space exploration industry), but incredibly hard to do at a larger scale.

See you in 2030 to see if we’ve solved these issues.

More on Hydrogen fuel cells here [FCHEA].

20. Meatless Meat 

I’ve tried it: lab-made meat smells, looks and tastes just like meat (beyond the odd uncanny-valley-like aftertaste). The only things that change: healthier food, no antibiotics, no growth hormones, no greenhouse-gas emissions and no animal suffering.

Above all, this is a gigantic market that whets the appetite of food industrialists. After targeting vegetarians, they’ve realised that it’s much easier and more rewarding to market these products to flexitarians (back in my day we called them omnivores).

By 2030, 10% of the meat eaten in the world will allegedly no longer come from an animal. The principle is there, the technology works… all that’s left to see is whether it will be widely adopted.

More on Meatless meat here [Vox].

Conclusion

Technology has a tendency to hold a dark mirror to society, reflecting both what’s great and evil about its makers. It’s important to remember that technology is often value-neutral: it’s what we do with it day in, day out that defines whether or not we are dealing with the “next big thing”.

Good luck out there.

Source: https://www.thepourquoipas.com/post/the-next-big-thing-in-technology-20-inventions 23 07 20

Multicloud deployments become go-to strategy as AWS, Microsoft Azure, Google Cloud grab wallet share

28 Apr

How multicloud is becoming a default strategy for many enterprises, and how the big three providers are grabbing more of your cloud budget.


Enterprises are all-in on multicloud: 93% of companies have a strategy to use multiple providers, while Microsoft Azure, Amazon Web Services and Google Cloud all vie for customers and grab more wallet share, according to Flexera’s State of the Cloud report.

The findings highlight how multicloud is becoming the primary architecture choice as companies mix clouds to avoid lock-in. Flexera surveyed 750 respondents: 554 enterprises and 196 small and medium-sized businesses. Of the respondents, 53% were advanced cloud users.

Flexera found that the top three public cloud providers remain AWS, Azure and Google, with Google showing the fastest adoption growth compared to a year ago. Flexera also found that Azure is narrowing the gap with AWS, both in the percentage of enterprises using it and in the number of virtual machines run on it.

Forty percent of AWS users spend at least $1.2 million annually with 36% spending that amount on Azure, according to Flexera.


To move their multicloud strategies along, Flexera found enterprises are betting heavily on containers. According to Flexera:

  • 65% of organizations are using Docker for containers with 58% using Kubernetes.
  • Container-as-a-service offerings from AWS, Azure and Google are all seeing strong growth.
  • Enterprises say they are lacking in resources and expertise to deal with container challenges.
  • And 33% of organizations use multicloud management tools.

COVID-19 may also accelerate shifts to the cloud. Flexera found at least half of companies are accelerating their cloud plans amid the COVID-19 pandemic and move to remote work. Indeed, 59% of enterprises say cloud usage will exceed prior plans due to the pandemic, according to Flexera.


Among other key findings:

  • Respondents said more than 50% of enterprise workloads and data are expected to be in the public cloud within 12 months.
  • SMBs are moving to the cloud at a faster rate with 70% of smaller enterprises saying data and workloads will be in the cloud in the next 12 months.
  • 87% have a hybrid cloud strategy.
  • 79% of respondents plan to optimize existing use of cloud for cost savings, with 61% focused on cloud migration.
  • IoT is the top growing cloud platform as a service offering followed by container as a service, machine learning and AI, data warehouse and serverless.
  • 73% of enterprises have a cloud team that is centralized and 51% use managed service providers to manage usage.
  • 83% of respondents said security is their top challenge with 82% citing costs.
  • 56% of organizations said understanding the cost implications of software licenses is a big cloud challenge. Respondents said 30% of cloud spend is wasted.
  • Understanding app dependencies was the top challenge for cloud migrations followed by technical feasibility and assessing on-prem vs. cloud costs.
  • Ansible and Terraform were the two most widely adopted configuration tools.
  • 63% of enterprises are adopting relational DBaaS.
  • VMware vSphere leads in private cloud adoption, with Azure Stack and AWS Outposts showing the strongest growth.
  • 22% of enterprises spend more than $12 million a year on public cloud.


Source: https://www.zdnet.com/article/multicloud-deployments-become-go-to-strategy-as-aws-microsoft-azure-google-cloud-grab-wallet-share/ 28 04 20