Archive | Internet of Things (IoT)

5G mobile networks: A cheat sheet

17 Aug

As LTE networks become increasingly saturated, mobile network operators are planning for the 5G future. Here is what business professionals and mobile users need to know about 5G networks.

What is 5G?

5G refers to the fifth generation of mobile phone networks. Since the introduction of the first standardized mobile phone network in 1982, succeeding standards have been adopted and deployed approximately every nine years. GSM, the 2nd generation wireless network, was first deployed in 1992, while a variety of competing 3G standards began deployment in 2001. The 4G LTE wireless technology standard was deployed by service providers in 2010. Now, technology companies and mobile network operators are actively deploying 5G cellular networks around the world for new mobile devices. These 5G deployments accompany transitional LTE technologies such as LTE Advanced and LTE Advanced Pro, which are used by network operators to provide faster speeds on mobile devices.

Principally, 5G refers to “5G NR (New Radio),” which is the standard adopted by 3GPP, an international cooperative responsible for the development of the 3G UMTS and 4G LTE standards. Other 5G technologies do exist. Verizon’s 5G TF network operates on 28 and 39 GHz frequencies, and is used only for fixed wireless broadband services, not in smartphones. Verizon’s 5G TF deployments were halted in December 2018, and will be transitioned to 5G NR in the future. Additionally, 5G SIG was used by KT for a demonstration deployment during the 2018 Winter Olympics in Pyeongchang.

5G NR allows for networks to operate on a wide variety of frequencies, including the frequencies vacated by decommissioning previous wireless communications networks. The 2G DCS frequency bands, the 3G E-GSM and PCS frequency bands, and the digital dividend of spectrum vacated by the transition to digital TV broadcasts are some of the bands available for use in 5G NR.

5G standards divide frequencies into two groups: FR1 (450 MHz – 6 GHz) and FR2 (24 GHz – 52 GHz). Most early deployments will be in the FR1 space. Research is ongoing into using FR2 frequencies, which are also known as extremely high frequency (EHF) or millimeter wave (mmWave) frequencies. Discussions of the suitability of millimeter wave frequencies have been published in IEEE journals as far back as 2013.
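As a rough sketch, the two frequency ranges above can be encoded as a simple lookup. The boundaries are the ones quoted in this section (450 MHz – 6 GHz for FR1, 24 – 52 GHz for FR2); the function name and return strings are our own illustration, not part of any 3GPP specification.

```python
def classify_5g_band(freq_ghz: float) -> str:
    """Classify a carrier frequency into a 5G NR frequency range.

    Boundaries follow the ranges quoted above:
    FR1 (sub-6 GHz): 0.45-6 GHz; FR2 (mmWave): 24-52 GHz.
    """
    if 0.45 <= freq_ghz <= 6.0:
        return "FR1"
    if 24.0 <= freq_ghz <= 52.0:
        return "FR2"
    return "outside 5G NR frequency ranges"

print(classify_5g_band(3.5))   # a typical mid-band carrier -> FR1
print(classify_5g_band(28.0))  # a mmWave carrier -> FR2
```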

Millimeter wave frequencies allow for faster data speeds, though they come with disadvantages. Because millimeter wave signals attenuate over much shorter distances, these networks have a much shorter range; serving densely populated areas requires deploying more base stations (conversely, this makes the technology well suited to dense venues such as arenas and stadiums). While this would be advantageous in certain use cases, it would be a poor fit for rural areas. Additionally, millimeter wave communication is susceptible to atmospheric interference: effects such as rain fade make it problematic for outdoor use, and even nearby foliage can disrupt a signal.
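The range penalty of millimeter wave can be illustrated with the standard free-space path loss formula, 20·log10(4πdf/c). This is a deliberate simplification: it ignores rain fade, foliage, and building penetration, which make real-world mmWave losses even worse. The 700 MHz and 28 GHz carriers below are representative examples chosen for the comparison, not figures from this article.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Compare a low-band carrier with a mmWave carrier at the same 100 m distance.
low_band = free_space_path_loss_db(100, 700e6)   # 700 MHz
mmwave = free_space_path_loss_db(100, 28e9)      # 28 GHz
print(f"700 MHz: {low_band:.1f} dB, 28 GHz: {mmwave:.1f} dB, "
      f"gap: {mmwave - low_band:.1f} dB")
```

The 28 GHz carrier loses roughly 32 dB more than the 700 MHz carrier over the same distance, which is why mmWave deployments need many more base stations.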

Tests of early 5G mmWave networks by sister site CNET surfaced a number of performance problems, with the Moto Z3, Samsung Galaxy S10 5G, and LG V50 depleting their batteries faster than on 4G networks. In the case of the Moto Z3—which uses a pogo-pin connected Moto Mod add-on to deliver 5G—four hours of testing completely drained the battery in the attachment; the use of sub-6 GHz 5G networks is expected to lessen this effect. Likewise, increased efficiency in Qualcomm’s upcoming Snapdragon X55 modem should alleviate some performance issues.

It is vital to remember that 5G is not an incremental or backward-compatible update to existing mobile communications standards. It does not overlap with 4G standards like LTE or WiMAX, and it cannot be delivered to existing phones, tablets, or wireless modems by means of tower upgrades or software updates, despite AT&T’s attempts to brand LTE Advanced as “5G E.” While upgrades to existing LTE infrastructure are worthwhile and welcome advances, these are ultimately transitional 4G technologies and do not provide the full range of benefits of 5G NR.

For an overview of when 5G smartphones are being released, as well as the benefits and drawbacks of 5G smartphones, check out TechRepublic’s cheat sheet about 5G smartphones.

What constitutes 5G technology?

For mobile network operators, the 3GPP has identified three aspects for which 5G should provide meaningful advantages over existing wireless mobile networks. These three heterogeneous service types will coexist on the same infrastructure using network slicing, allowing network operators to create multiple virtual networks with differing performance profiles for differing service needs.

eMBB (Enhanced Mobile Broadband)

Initial deployments of 5G NR focused on eMBB, which provides greater bandwidth, enabling improved download and upload speeds, as well as moderately lower latency compared to 4G LTE. eMBB will be instrumental in enabling rich media applications such as mobile AR and VR, 4K and 360° video streaming, and edge computing.

URLLC (Ultra Reliable Low-Latency Communications)

URLLC is targeted toward extremely latency-sensitive or mission-critical use cases, such as factory automation, robot-enabled remote surgery, and driverless cars. According to a white paper by Mehdi Bennis, Mérouane Debbah, and H. Vincent Poor published by the IEEE, URLLC should target 1 ms latency and a block error rate (BLER) of 10⁻⁹ to 10⁻⁵, although attaining this “represents one of the major challenges facing 5G networks,” as it “introduces a plethora of challenges in terms of system design.”

Technologies that enable URLLC are still being standardized; these will be published in 3GPP Release 16, scheduled for mid-2020.

mMTC (Massive Machine Type Communications)

mMTC is a narrowband access type for sensing, metering, and monitoring use cases. Some mMTC standards that leverage LTE networks were developed as part of 3GPP Release 13, including eMTC (Enhanced Machine-Type Communication) and NB-IoT (Narrowband IoT). These standards will be used in conjunction with 5G networks, and extended to support the demands of URLLC use cases on 5G networks and frequencies in the future.
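One way to picture network slicing across these three service types is as a set of per-slice performance profiles. The sketch below is illustrative: the URLLC latency and BLER targets come from the IEEE white paper cited above, while the eMBB and mMTC figures and the toy selection logic are assumptions of ours, not 3GPP-defined values.

```python
from dataclasses import dataclass

@dataclass
class SliceProfile:
    """Illustrative performance profile for one 5G network slice."""
    name: str
    optimizes_for: str
    target_latency_ms: float
    target_bler: float  # block error rate

# URLLC targets from the IEEE white paper cited above; the eMBB and
# mMTC figures are illustrative assumptions for the comparison.
SLICES = {
    "eMBB": SliceProfile("eMBB", "bandwidth", 10.0, 1e-3),
    "URLLC": SliceProfile("URLLC", "reliability and latency", 1.0, 1e-9),
    "mMTC": SliceProfile("mMTC", "connection density", 100.0, 1e-2),
}

def pick_slice(needs_low_latency: bool, needs_high_bandwidth: bool) -> SliceProfile:
    """Toy slice selection: latency-critical traffic wins, then bandwidth."""
    if needs_low_latency:
        return SLICES["URLLC"]
    if needs_high_bandwidth:
        return SLICES["eMBB"]
    return SLICES["mMTC"]

print(pick_slice(needs_low_latency=True, needs_high_bandwidth=False).name)
```

In a real network, a slice is far more than a latency/BLER pair—it is a partition of radio, transport, and core resources—but the table captures why the three service types cannot simply share one undifferentiated network.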

The ways in which 5G technologies will be commercialized are still being debated and planned among mobile network operators and communications hardware vendors. As different groups have differing priorities, interests, and biases, including spectrum license purchases made with the intent of deploying 5G networks, the advantages of 5G will vary between different geographical markets and between consumer and enterprise market segments. While many different attributes are under discussion, 5G technology may consist of the following (the attributes are listed in no particular order).

Proactive content caching

Millimeter wave 5G networks require deploying more base stations than LTE and previous communications standards, and those base stations in turn require connections to wired backhauls to transmit data across the network. By providing a cache at the base station, access delays can be minimized and backhaul load reduced, with the added benefit of lower end-to-end delay. As 4K video streaming services and smartphones with 4K screens become more widespread, this caching capability will be important to improve quality of service.
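A minimal sketch of the caching idea: a least-recently-used (LRU) content cache colocated with the base station, so repeated requests for popular content skip the backhaul. The class and its interface are our own illustration; real proactive caching schemes also predict demand ahead of time rather than only reacting to requests.

```python
from collections import OrderedDict

class EdgeCache:
    """Toy LRU content cache, standing in for a cache at a base station.

    Hits are served locally (no backhaul trip); misses are fetched over
    the wired backhaul and stored, evicting the least recently used entry.
    """
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.store: OrderedDict = OrderedDict()
        self.hits = 0
        self.misses = 0

    def get(self, key: str, fetch_from_backhaul) -> bytes:
        if key in self.store:
            self.hits += 1
            self.store.move_to_end(key)      # mark as recently used
            return self.store[key]
        self.misses += 1
        content = fetch_from_backhaul(key)   # slow path over the backhaul
        self.store[key] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used
        return content

cache = EdgeCache(capacity=2)
origin = lambda key: f"<video segment {key}>".encode()
cache.get("seg-1", origin)  # miss: fetched over the backhaul
cache.get("seg-1", origin)  # hit: served from the base station
print(cache.hits, cache.misses)  # 1 1
```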

Multiple-hop networks and device-to-device communication

In LTE networks, cellular repeaters and femtocells bridge gaps in areas where signal strength from traditional base stations is inadequate to serve the needs of customers. These can be in semi-rural areas where population density complicates serving customers from one base station, as well as in urban areas where architectural design obstructs signal strength. Using multiple-hop networks in 5G extends the cooperative relay concept by leveraging device-to-device communication to increase signal strength and availability.

Seamless vertical handover

Proposals for 5G position it as the “one global standard” for mobile communications; accordingly, allowing devices to seamlessly switch to a Wi-Fi connection, or fall back to LTE networks, without delay, dropped calls, or other interruptions is a priority for 5G.

Who does 5G benefit?

Remote workers / off-site job locations

One of the major focuses of 5G is the ability to use wireless networks to supplant traditional wireline connections by increasing the data bandwidth available to devices and minimizing latency. For telecommuters, this greatly increases flexibility in work locations, allowing cost-effective communication with the office without being tied to a desk and a wireline connection at home.

For situations that involve frequently changing off-site job locations, such as location movie shoots or construction sites, the lower technical requirements of 5G deployment make it easy to set up a 5G connection, with existing devices connecting to a 5G router via Wi-Fi. For live breaking news, 5G technologies can supplant the traditional satellite truck used to transmit audio and video back to the newsroom. Spectrum formerly allocated to high-speed microwave links has been repurposed for 5G NR communication.

Internet of Things (IoT) devices

One priority for the design of 5G networks is to lower barriers to network connectivity for IoT devices. While some IoT devices (e.g., smartwatches) have LTE capabilities, the practical limitations of battery sizes that can be included in wearable devices and the comparatively high power requirements of LTE limit the usefulness of mobile network connectivity in these situations. Proposals for 5G networks focusing on reducing power requirements, and the use of lower-power frequencies such as 600 MHz, will make connecting IoT devices more feasible.

Smart cities, office buildings, arenas, and stadiums

The same properties that make 5G technologies a good fit for IoT devices can also improve the quality of service in situations where large numbers of connected devices make extensive use of the mobile network in densely populated areas. These benefits can be realized easily in situations with variable traffic: arenas and stadiums, for instance, are generally only populated during sporting events, music concerts, and conventions. Large office towers, such as the 54-story Mori Tower in Tokyo’s Roppongi Hills district, house thousands of employees during the week. Additionally, densely populated city centers can benefit from the ability of 5G networks to provide service to more devices in physically smaller spaces.

When and where are 5G rollouts happening?

Early technical demonstrations

The first high-profile 5G rollout was at the 2018 Winter Olympic Games in Pyeongchang, South Korea. KT (a major mobile network operator), Samsung, and Intel collaborated to deliver gigabit-speed wireless broadband and low-latency live streaming video content. During the games, 100 cameras were positioned inside the Olympic Ice Arena, which transmitted video to edge servers, then to KT’s data center to be processed into “time-sliced views of the athletes in motion,” and then back to 5G-connected tablets for viewing. This demonstration used prototype 5G SIG equipment, which is distinct from the standardized 5G NR hardware and networks being commercialized worldwide.

Similarly, Intel and NTT Docomo have announced a partnership to demonstrate 5G technology at the 2020 Tokyo Olympic Games. The companies will use 5G networks for 360-degree, 8K-video streaming, drones with HD cameras, and smart city applications, including “pervasive facial recognition, useful for everything from stadium access to threat reduction.”

Other 5G tests and rollouts have occurred worldwide. Ericsson and Intel deployed a 5G connection linking Tallink cruise ships to the Port of Tallinn in Estonia. Huawei and Intel demonstrated 5G interoperability tests at Mobile World Congress 2018. In China, ZTE conducted tests in which the company achieved speeds in excess of 19 Gbps on a 3.5 GHz base station. Additionally, in tests of high-frequency communications, ZTE exceeded 13 Gbps using a 26 GHz base station, and achieved a latency of 0.416 ms in a third test for URLLC.

Where is 5G available in the US?

Verizon Wireless deployed mmWave-powered 5G, marketed as “Ultra Wideband (UWB),” in Chicago, IL and Minneapolis, MN on April 3, 2019; in Denver, CO on June 27, 2019; in Providence, RI on July 1, 2019; in St. Paul, MN on July 18, 2019; and in Atlanta, GA, Detroit, MI, Indianapolis, IN, and Washington, DC on July 31, 2019.

Future deployments of Verizon’s 5G services have been announced for Boston, MA, Charlotte, NC, Cincinnati, Cleveland, and Columbus, OH, Dallas, TX, Des Moines, IA, Houston, TX, Little Rock, AR, Memphis, TN, Phoenix, AZ, Providence, RI, San Diego, CA, and Salt Lake City, UT, as well as Kansas City, by the end of 2019.

Verizon Wireless started deployments of its 5G fixed wireless internet service on October 1, 2018 in Los Angeles and Sacramento, CA, Houston, TX, and Indianapolis, IN. Verizon’s initial 5G network deployments use its proprietary 5G TF hardware, though the company plans to transition these networks to 5G NR in the future. Verizon’s 5G TF network is only used for home internet service, not in smartphones.

AT&T has active 5G deployments in Atlanta, GA, Austin, Dallas, Houston, San Antonio, and Waco, TX, Charlotte, NC, Indianapolis, IN, Jacksonville and Orlando, FL, Las Vegas, NV, Los Angeles, San Diego, San Francisco, and San Jose, CA, Louisville, KY, Nashville, TN, New Orleans, LA, New York City, NY, Oklahoma City, OK, and Raleigh, NC. Deployments have also been announced for Chicago, IL, Cleveland, OH, and Minneapolis, MN.

AT&T has deployed LTE Advanced nationwide; the company is marketing LTE Advanced as a “5G Evolution” network, though LTE-Advanced is not a 5G technology. AT&T has a history of mislabeling network technologies; the company previously advertised the transitional HSDPA network as 4G, though this is commonly considered to be an “enhanced 3G” or “3.5G” standard.

Sprint started deployments of 5G on May 30, 2019 in the Dallas / Ft. Worth and Houston, TX, Kansas City / Overland Park, KS, and Atlanta, GA metro areas. Sprint’s 5G networks run on 2.5 GHz, providing more widespread coverage throughout a region than is possible on line-of-sight mmWave connections, though with a modest decrease in speed compared to mmWave networks. Sprint activated 5G service in Chicago on July 11, 2019. The company has also announced plans to deploy 5G in Los Angeles, CA, New York, NY, Phoenix, AZ, and Washington, DC.

T-Mobile USA has active 5G services in Atlanta, GA and Cleveland, OH, with future plans to bring 5G services to Dallas, TX, Los Angeles, CA, Las Vegas, NV, and New York, NY. T-Mobile’s deployment is powered by Ericsson AIR 3246 radios, which support both 4G LTE and 5G NR, allowing 5G and LTE networks to be operated from the same equipment.

The purchase of Sprint by T-Mobile has been approved by the Justice Department, though a multi-state lawsuit is aiming to prevent the deal from proceeding. If the merger goes forward, “only the New T-Mobile will be able to deliver… real, game-changing 5G,” according to T-Mobile CEO John Legere in a June 2019 blog post. Following a merger, the New T-Mobile will have 600 MHz low-band, 2.5 GHz mid-band, and mmWave spectrum holdings, putting it at an advantage relative to AT&T and Verizon.

Where is 5G available in the UK?

EE debuted 5G services in Belfast, Birmingham, Cardiff, Edinburgh, London, and Manchester on May 30, 2019. Availability of 5G by the end of 2019 is planned for Bristol, Coventry, Glasgow, Hull, Leeds, Leicester, Liverpool, Newcastle, Nottingham, and Sheffield. Availability of 5G in 2020 is planned for Aberdeen, Cambridge, Derby, Gloucester, Peterborough, Plymouth, Portsmouth, Southampton, Wolverhampton, and Worcester.

BT, which owns EE, is anticipated to deploy separate BT-branded 5G services in London, Manchester, Edinburgh, Birmingham, Cardiff, and Belfast in autumn 2019.

Vodafone provides 5G services in Birkenhead, Birmingham, Bolton, Bristol, Cardiff, Gatwick, Glasgow, Lancaster, Liverpool, London, Manchester, Newbury, Plymouth, Stoke-on-Trent, and Wolverhampton at present, with deployments planned for Blackpool, Bournemouth, Guildford, Portsmouth, Reading, Southampton, and Warrington by the end of 2019.

Three will begin rollout of 5G services in London in August 2019, with services for Birmingham, Bolton, Bradford, Brighton, Bristol, Cardiff, Coventry, Derby, Edinburgh, Glasgow, Hull, Leeds, Leicester, Liverpool, Manchester, Middlesbrough, Milton Keynes, Nottingham, Reading, Rotherham, Sheffield, Slough, Sunderland, and Wolverhampton expected before the end of the year.

Three and Vodafone do not charge a premium for 5G network services in the UK, compared to their rate plans for 4G.

O₂ announced availability of 5G services for Belfast, Cardiff, Edinburgh, London, Slough, and Leeds “from October 2019,” with plans to expand 5G services to “parts of 20 towns and cities, before rolling out to a total of 50 by summer 2020.”

Where is 5G available in Australia?

Optus has 100 5G-capable sites in service, and has pledged to build 1,200 by March 2020.

Telstra commenced rollout of 5G networks, starting with the Gold Coast in August 2018. Telstra services select neighborhoods in Adelaide, Brisbane, Canberra, Gold Coast, Hobart, Launceston, Melbourne, Perth, Sydney, and Toowoomba.

Australia’s National Broadband Network (NBN) operator has declared its intent to provide 5G fixed wireless internet access in a statement to ZDNet.

Chinese vendors Huawei and ZTE have been banned by the Australian government from providing 5G networking equipment to mobile network operators due to national security concerns.

Where else in the world is 5G available?

South Korea was the first country to have a commercially available 5G network, with SK Telecom, KT, and LG Uplus activating 5G networks on April 3, 2019, two hours before Verizon Wireless activated 5G in the US, according to ZDNet’s Cho Mu-Hyun. By April 30, 2019, 260,000 subscribers in South Korea were using 5G networks. KT, the country’s second-largest mobile carrier, is working on deployments of in-building repeaters for use in crowded buildings such as airports and train stations.

5G is also seen as vital for economic development among Gulf states, with Saudi Arabia including 5G as part of the Vision 2030 economic development plan, and Qatari network operator Ooredoo claiming “the first commercially available 5G network in the world” on May 14, 2018, prior to the availability of smartphones that can use 5G.

Ookla maintains a map of 5G network services worldwide, with networks categorized into Commercial Availability, Limited Availability, and Pre-Release to demonstrate the extent of availability for each observed deployment.

How does a 5G future affect enterprises and mobile users?

As technology advances, older devices will inevitably reach end-of-life; in the mobile space, this is an outsized concern, as wireless spectrum is a finite resource. Much in the same way that the digital switchover occurred for over-the-air TV broadcasts, older mobile networks are actively being dismantled to free spectrum for next-generation networks, including transitional LTE Advanced, LTE Advanced Pro, and “true” 5G networks.

In the US, AT&T disabled its 2G network on January 1, 2017, rendering countless feature phones—as well as the original iPhone—unusable. Verizon plans to disable its legacy 2G and 3G networks by the end of 2019, which will render most feature phones and older smartphones unusable, as well as IoT devices such as water meters. Verizon stopped activations of 3G-only phones in July 2018. End-of-life plans for the 2G networks of Sprint and T-Mobile have not been publicly disclosed.

Additionally, as 5G is increasingly used to deliver wireless broadband, wireline broadband providers will face competition as the two services approach feature parity. With many people using smartphones both as their primary computing device and for tethering a traditional computer to the internet, the extra cost of a traditional wireline connection may become unnecessary for some people, and 5G may give those outside the reach of traditional wireline connections affordable access to high-speed internet for the first time.

Business customers may also integrate 5G technology in proximity-targeted marketing. 5G’s reliance on microcells can be used as a secondary means of verification to protect against GPS spoofing, making proximity-targeted marketing resistant to abuse.

As 5G specifications are designed around the needs of businesses, the low-power and low-latency attributes are expected to spark a revolution in IoT deployments. According to Verizon Wireless President Ronan Dunne, 5G will enable the deployment of 20 billion IoT devices by 2020, leading to the creation of the “industrial internet,” affecting supply chain management, as well as agriculture and manufacturing industries. These same attributes also make 5G well suited to use cases that require continuous response and data analysis, such as autonomous vehicles, traffic control, and other edge computing use cases.



How will 5G, IoT and small cells support the enterprise?

7 Aug

As early 5G deployments begin amid expectations that the next generation of wireless technology will be enterprise-led, Small Cell Forum has been exploring the intersection of 5G, small cell infrastructure and the internet of things.

“It is widely accepted in the industry that small cells, based on 5G radio technologies (New Radio) and 5G-network technologies … along with auxiliary technologies such as Edge Computing and Network Slicing are essential components in realization of … 5G-Era use case categories,” the organization said in a newly published technical paper on enterprise 5G IoT use cases. Small Cell Forum said that it considers “5G-Era” to be a time when the three major use-case categories for 5G — enhanced mobile broadband, ultra-reliable and low-latency communications and massive and critical IoT — are commercially deployed.

While 4G IoT technologies are being deployed to serve IoT needs, Small Cell Forum said that those offerings suit “a particular set of applications with limited bandwidth, latency and reliability requirements,” and that current performance and scalability are lower than the targets for 5G, but sufficient to establish new connectivity business models. It also considers vehicle-to-everything technologies an “application-focused building block” for 5G IoT.

The forum categorizes new 5G IoT use cases in three broad categories:

-Massive IoT, requiring very high connection densities of up to 1 million devices per square kilometer, with low per-device data rates.

-Critical IoT, where performance and reliability are the key requirements.

-An emerging category of high-bandwidth IoT devices, such as security cameras and some healthcare applications.
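To make the massive IoT figure concrete, here is the arithmetic for how many devices a single cell would have to serve at that density. The 1 million devices per square kilometer target comes from the list above; the 100 m small-cell radius is an illustrative assumption.

```python
import math

def devices_per_cell(density_per_km2: float, cell_radius_m: float) -> int:
    """Devices a single cell must serve, given an area density target."""
    area_km2 = math.pi * (cell_radius_m / 1000) ** 2
    return round(density_per_km2 * area_km2)

# Massive IoT target density: 1 million devices per square kilometer.
# The 100 m small-cell radius is an illustrative assumption.
print(devices_per_cell(1_000_000, 100))  # ~31,416 devices per small cell
```

Even a small cell covering only a 100 m radius would need to handle tens of thousands of simultaneous device connections at the target density, which is why mMTC emphasizes connection count over per-device data rate.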

Small cells will be necessary to support 5G IoT use cases because their small physical footprint means they can be deployed in high densities, the forum said. They are also the obvious choice to support the short propagation range of millimeter-wave-based 5G networks.

Small Cell Forum cited estimates from McKinsey that the overall IoT market opportunity could be as high as $11 trillion by 2025, and that the segments served best by indoor small cells or small cell networks (homes, retail, offices, factories and worksites) will make up an estimated $6.29 trillion by 2025.

The technical paper made note of several trends in specific enterprise segments that are suited for 5G IoT support via small cells, including:

-Modern office spaces are already becoming more mobile, the forum noted, depending on wireless technology — primarily enterprise-grade Wi-Fi — for collaboration and fluid environments, but “5G wireless broadband technologies can provide more reliable solutions, with intrinsically built-in mobility and security aspects of cellular systems.”

-Industrial IoT applications are particularly suited to benefit from auxiliary technologies, including virtualized, distributed network assets and edge computing, which help reduce latency for applications such as robotic control. 5G network technologies such as Control and User Plane Separation (CUPS) offer the possibility of bringing user plane functions closer to end devices to benefit IoT implementations, while network slicing will allow network resources to be partitioned and allocated to specific IoT applications, either statically or potentially dynamically. Meanwhile, virtualization offers flexibility.

“Retailers with distributed locations, banks with branch models, and restaurant chains are some of the most likely targets for [enterprise small cell network] solutions,” the report said. “These IoT-rich environments have broad requirements for unencumbered coverage coupled with novel efficiency and isolation needs that drive specific optimizations and slicing needs. Similarly, worksites, cities, and transportation/logistics hubs also require novel security solutions best met with closed or private access and small cells.”


Why 5G, a battleground for US and China, is also a fight for military supremacy

4 Aug
  • Next-generation networks will be vital to future military operations, raising the stakes between those developing the technology
  • It may be easier to hack 5G, but strategic motivations are also behind concerns of the United States, experts say
Illustration: Brian Wang

Apart from its tremendous commercial benefits, 5G – the fifth generation of mobile communication – is revolutionising military and security technology, which is partly why it has become a focal point in the United States’ efforts to contain China’s rise as a tech power and its allegations against Chinese companies.

The future landscape of warfare and cybersecurity could be fundamentally changed by 5G. But experts say 5G is more susceptible to hacking than previous networks, at a time of rising security concerns and US-China tensions on various interconnected fronts that include trade, influence in the Asia-Pacific region and technological rivalry.

These tensions provide the backdrop to controversy surrounding Huawei, the world’s largest telecoms equipment supplier.

Long before the Chinese company was indicted in the US this week on multiple charges including stealing trade secrets and violating US sanctions – charges it denies – US intelligence voiced concerns that Huawei’s telecommunications equipment could contain “back doors” for Chinese espionage.

Huawei has repeatedly denied these allegations, but the controversies have underlined 5G’s growing importance and stepped up the technological arms race between China and the US.

To most people, the next-generation networks, which will be at least 20 times faster than the most advanced networks today, may just mean faster downloads of movies or smoother streaming. But they have much bigger potential than that.

Whereas existing networks connect people to people, the next generation will connect a vast network of sensors, robots and autonomous vehicles through sophisticated artificial intelligence.

The so-called internet of things will allow objects to “communicate” with each other by exchanging vast volumes of data in real time, and without human intervention.


Autonomous factories, long-distance surgery or robots preparing your breakfast – things that previously existed only in science fiction – will be made possible.

Meanwhile, though, it is being identified by many military experts as the cornerstone of future military technology.

Imagine a group of skirmishers in a jungle. They are moving forward speedily with a distance from one another of a few hundred metres. Each of them wears a wristwatch that displays fellow members’ positions. This is not satellite positioning, because reception in the tropical forest is unstable; it’s machine-to-machine communication.


Suddenly one soldier, ambushed by an enemy combatant, is shot and loses consciousness. His smart wearable device detects his condition via sensors, immediately tightens a belt around his wounded thigh, injects an adrenaline shot and sends an emergency alert to the field hospital as well as the entire team.

Having received the signal on their wristwatches, the team switch to a coordinated combat formation and encircle the enemy. An ambulance helicopter arrives to evacuate the injured soldier while auto-driven armoured vehicles come to reinforce – guided by devices on each soldier and antenna arrays nearby.

Or, imagine a street battle with a group of terrorists in a city. There is a power blackout and terrorists hide in an empty office building. A counterterrorism technician hacks into the building’s audio control system and collects high-sensitivity soundwaves using the microphones on surveillance cameras – the system is still running thanks to the devices’ low power consumption and long endurance.


After the acoustic data is sent back, artificial intelligence (AI) analysis determines the locations of the terrorists. A drone is called from nearby, enters through a window and fires a mini-gun at them.

These are not movie plots, but technologies already or about to be developed, as the internet of things – built on 5G and AI technologies – reshapes warfare.

“The 5G network and the internet of things enlarge and deepen the cognition of situations in the battlefield by several orders of magnitude and produce gigantic amounts of data, requiring AI to analyse and even issue commands,” said Dr Clark Shu, an AI and telecommunication researcher at the University of Electronic Science and Technology of China.

With the ability to carry much more data, along with much lower network latency (response time), lower energy consumption, and much better stability than the previous generation of technologies, 5G is expected to transform digital communication.

Using 5G, data can be transmitted at up to 10 gigabits per second, much faster than over a 4G network, and latency is reduced to under a millisecond, or 1 per cent that of 4G.
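Those headline figures translate into download times as follows. The 10 Gbit/s peak rate is the figure quoted above; the 100 Mbit/s rate used for 4G is an illustrative assumption for a good LTE connection, and both are idealized peaks rather than typical real-world speeds.

```python
def download_seconds(file_size_gb: float, link_gbps: float) -> float:
    """Seconds to transfer a file at a given (idealized) link rate."""
    file_size_gbit = file_size_gb * 8  # gigabytes -> gigabits
    return file_size_gbit / link_gbps

movie_gb = 5.0  # a 5 GB movie file
# 100 Mbit/s for 4G is an illustrative assumption; 10 Gbit/s is the
# peak 5G rate quoted in the text.
print(f"4G at 0.1 Gbit/s: {download_seconds(movie_gb, 0.1):.0f} s")
print(f"5G at 10 Gbit/s:  {download_seconds(movie_gb, 10.0):.0f} s")
```

At those rates the same 5 GB file drops from several minutes to a few seconds, which is the gap behind the "20 times faster" framing.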

Such features enhance connectivity in remote locations, connect sensors and robots, and will enable vehicles, traffic control, factories and construction to become more autonomous. In particular, 5G will enhance the connectivity of the internet of things (IoT).


“Internet of things involves close-range telecommunications technology to connect and exchange information between two devices, and 5G is the fastest data transmission method to realise it,” said Zhou Zhaoxiong, a senior engineer at China Mobile IoT Company, a subsidiary of China Mobile.

Military equipment embedded with communication devices can also form the internet of things, he added. The communication can take place from device to device, without satellites or early-warning planes, saving those limited resources for other uses and significantly lowering the cost of a military operation, according to a 2017 report in China Defence News, a mouthpiece of China’s People’s Liberation Army (PLA).

Huawei’s new Balong 5000 chip for 5G devices was unveiled last week. Photo: Bloomberg

China has been one of the powerhouses in research and development of 5G technologies. Its telecoms operators have said they will begin to introduce commercial 5G networks from 2020, although Zhou said this would involve only regional pilot schemes because 5G devices are still quite expensive for mass commercial use.

Last week, Huawei launched a chip that it claimed to be the world’s most powerful 5G modem.

Then came the US Justice Department’s 13-count indictment on Monday against the Chinese company, its affiliates and its chief financial officer Sabrina Meng Wanzhou, following the arrest of Meng in Canada on December 1 at the US’ request.

But questions over the 5G technology made by Huawei and other Chinese firms date back further. In 2012, the US House Intelligence Committee released a report alleging Chinese telecoms equipment makers posed a threat to national security because of their relationship with the Chinese government. China’s 2017 National Intelligence Law requires Chinese companies to cooperate with national intelligence work when necessary.

The US has lobbied its allies to ban Huawei from building their next generation of mobile phone networks, and countries such as Britain, Germany, Australia, New Zealand and Canada have either banned Huawei or are reviewing whether to do so.

Huawei’s founder Ren Zhengfei – Meng’s father – is a former PLA engineer, which has further fuelled questions in the West about Huawei’s ties to the Chinese army and government.

But Huawei executives have asked repeatedly, in vain, for evidence of “back doors” in its equipment. The BSI, Germany’s federal information security agency, inspected Huawei labs in Germany and found no evidence, and The New York Times last week quoted American officials as saying that the case against the company had “no smoking gun – just a heightened concern about the firm’s rising technological dominance”.

Moves by the US and its allies to block Huawei from 5G networks on national security grounds were last month described as a “concerted strategy” by Kevin Allison, of US-headquartered political risk consultancy Eurasia Group, talking to US broadcaster CNBC.

A report last year to the White House by the US’ National Security Council called for action and strategy to “protect US technology leadership” and prevent China challenging US dominance in tech.

Song Zhongping, a Hong Kong-based military commentator, said China has commissioned research institutions and state-owned companies, not Huawei, for its military 5G development.

“For example, branches of the China Electronics Technology Group Corporation, which makes military radars and other electronic systems, are focused in this area,” said Song.

The US, too, has been investing in military use of 5G, while prototype 5G networks for civilian use have been launched in some cities.

In a recent interview with military technologies publication C4ISRNET, Brent Upson, a director at American aerospace and security company Lockheed Martin, predicted machine-to-machine communication, using information from several sources to form a unified picture of battlespaces, and AI-assisted decision-making would be among the trends in 2019.

Todd Wieser, chief technology officer of the US Air Force’s Special Operations Command, has said 5G tech will enhance his forces’ mobile communications and geospatial functionality.

But commercial 5G networks are regarded by the US government as easy prey for foreign intelligence agents and hackers, and such concerns are heightened where a military network is subject to hacking and intrusion attempts by adversaries.

“The biggest disadvantage of a 5G network in the battlefield is the vulnerability to electromagnetic interference – and hacking and intrusion,” said Shu.

“The significant increase in sensors and data nodes means an increase of exposure, and an increased risk of being attacked.”


5G: Use it to leapfrog, others will be left behind

4 Aug

The most innovative sector, usually non-carrier related, must be supported by government. Cross-discipline talent is crucial…

The Philippines launched Southeast Asia’s first commercial 5G service in June 2019, through a home broadband wireless connection. The new service gives internet users connection speeds of 20 to 100 megabits per second (Mbps) over the 5G wireless network, without the time-consuming work of laying physical fiber optic connections underground. The new technology will speed up the adoption of broadband internet in the country and expedite the introduction of 5G wireless mobile communication in the 2020s.

Like the earlier generations of mobile communication, from 1G to 4G, the new 5G system has the potential to drastically affect society through the new application areas developed along with the technology.

The table below shows how each successive generation of mobile communication has changed the world.

From the perspective of national development, the current transition from 4G to 5G is more significant than the earlier generation shifts. While the technology behind 5G is essentially an engineering improvement over 4G, the application arena represents a revolution. The earlier 1G to 4G generations worked mainly on changing and improving communications, and confined themselves mostly to the consumer space of the economy. The critical application area of 5G will likely move to the business and government space, and holds significant promise to improve any country’s productivity, whether developed or developing.

New use cases of 5G make it different 

Each generation of mobile communication encompasses all technologies of the previous generations and expands its economic footprint by embracing new activities as well as improving the old ones. The 5G network introduces three new use cases:

1. Enhanced mobile broadband (eMBB): High-bandwidth internet access suitable for web browsing, video streaming and virtual reality. eMBB is the service just introduced in the Philippines via 5G wireless broadband, and its full functionality will be utilized when mobile 5G smartphone service is introduced.

2. Massive machine-type communication (mMTC): This feature means we can install as many as a million monitoring devices in 1 square kilometer without physical wiring, and collect real-time data for analysis and action. In other words, sensors can monitor everything, anywhere, in real time.

3. Ultra-reliable low latency communication (URLLC): A 5G system can receive and send back a signal from a faraway place in less than 10 milliseconds with a reliability of 99.999 percent. This is faster than human response times, which run from 50 to 100 milliseconds. URLLC allows remote control of many time-sensitive operations, such as remote surgery and autonomous vehicles.
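The device-density figure in use case 2 can be sanity-checked with a line of arithmetic: a million devices spread over one square kilometer works out to roughly one device per square meter.

```python
import math

# A million devices in 1 square kilometer (1,000 m x 1,000 m):
devices = 1_000_000
area_m2 = 1_000 * 1_000

# Average grid spacing between neighboring devices, in meters:
spacing_m = math.sqrt(area_m2 / devices)
print(spacing_m)  # 1.0
```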

5G business model development and the critical role of government regulation

The three different use cases of 5G mean any operation that would benefit from more accurate real-time data collection, analysis and response is a candidate for productivity improvement using the technical capability of 5G. There are many current government activities and business applications in developing countries that can tap the 5G platform and significantly promote economic development.

One prime candidate for the 5G platform is real-time traffic management. Such a system can use the Internet of Things (IoT) and low latency to build a connected traffic management infrastructure, linking all the data collected by the various traffic monitoring sensors at appropriate control junctions. The data can then be processed by artificial intelligence-based traffic management algorithms on the platform, which issue real-time instructions to direct and modify traffic at a particular choke point.
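As a sketch of the control loop described in that paragraph, the toy example below re-splits green time at one junction in proportion to queue lengths reported by sensors. Every name and number here is hypothetical, and a real deployment would use far more sophisticated AI-based optimization:

```python
# Toy control rule: allocate green time in proportion to sensed queues.
def split_green_time(queue_lengths: dict[str, int], cycle_s: int = 60) -> dict[str, int]:
    """Allocate green seconds to each approach in proportion to its queue."""
    total = sum(queue_lengths.values())
    if total == 0:  # no demand: share the cycle evenly
        share = cycle_s // len(queue_lengths)
        return {approach: share for approach in queue_lengths}
    return {approach: round(cycle_s * q / total)
            for approach, q in queue_lengths.items()}

# Readings from hypothetical loop detectors at one choke point:
readings = {"northbound": 18, "southbound": 6, "eastbound": 12, "westbound": 0}
print(split_green_time(readings))  # northbound gets half the 60 s cycle
```

In practice the rule would run continuously, with URLLC keeping the sensor-to-controller round trip well inside one signal cycle.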

The 5G technology opens the door to using the new communication technology to promote the growth of developing countries by boosting the productivity of existing economic activities. The opportunity can only be exploited by entrepreneurs who are also well versed in the details of the particular application domain. Countries should take the initiative to foster such entrepreneurship and help entrepreneurs become the drivers in developing relevant business usage models for 5G.

One of the critical regulatory frameworks the government should provide is helping non-carrier entrepreneurs tap the 5G network. For example, 5G network slicing allows operators to divide a single physical network — everything from the radio to the core network — into multiple virtual networks. Each network slice can have different speed limits, different latencies and different quality-of-service configurations. The charges levied by the carrier will materially affect the development of use cases by entrepreneurs. How to partition the revenue of different stakeholders through different fee structures is going to be a social issue for the government to resolve.
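The slicing idea can be pictured as a small data structure: each virtual slice carries its own bandwidth cap, latency budget and scheduling priority, and an application is matched to the slice that meets its needs. The slice names and figures below are illustrative assumptions, not 3GPP-defined values:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class NetworkSlice:
    name: str
    max_bandwidth_mbps: int   # throughput cap for this slice
    target_latency_ms: float  # latency budget for this slice
    priority: int             # scheduling weight; lower = more urgent

SLICES = [
    NetworkSlice("urllc-traffic-control", max_bandwidth_mbps=50, target_latency_ms=5.0, priority=0),
    NetworkSlice("embb-broadband", max_bandwidth_mbps=1000, target_latency_ms=50.0, priority=2),
    NetworkSlice("mmtc-metering", max_bandwidth_mbps=10, target_latency_ms=500.0, priority=1),
]

def pick_slice(latency_need_ms: float) -> NetworkSlice:
    """Choose the lowest-bandwidth slice that still meets an app's latency need."""
    viable = [s for s in SLICES if s.target_latency_ms <= latency_need_ms]
    return min(viable, key=lambda s: s.max_bandwidth_mbps)

print(pick_slice(600.0).name)  # a meter-reading app lands on the cheap mMTC slice
```

How carriers price access to each slice is exactly the fee-structure question the paragraph above raises.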

Another critical challenge for the government in using 5G to improve the economy is the reorientation of the country’s education setup. As noted in the case of traffic management, AI ability is a complementary competency for anyone who wants to tap the potential of 5G. Hence the government should look at its education setup to develop more cross-discipline talent who can integrate the new technology with the requirements in the field.


Artificial intelligence in America’s digital city

31 Jul

Cities are an engine for human prosperity. By putting people and businesses in close proximity, cities serve as the vital hubs to exchange goods, services, and even ideas. Each year, more and more people move to cities and their surrounding metropolitan areas to take advantage of the opportunities available in these denser spaces.

Technology is essential to make cities work. While putting people in close proximity has certain advantages, there are also costs associated with fitting so many people and related activities into the same place. Whether it’s multistory buildings, aqueducts and water pipes, or lattice-like road networks, cities inspire people to develop new technologies that respond to the urban challenges of their day.

Today, we can see the responses made possible by the advances of the second industrial revolution, namely steel and electricity. Multistory buildings and skyscrapers responded to our demand for proximity to do business in the same locations. Electrified and subterranean railways offered faster travel for more people in tight, urban quarters. The elevator, escalator, and advanced construction equipment allowed our buildings to grow taller and our subways to burrow deeper. Electric lighting turned our cities, suburbs, and even small towns into 24-hour activity centers. Air conditioning greatly improved livability in warmer locations, unlocking a population boom. Radios and television extended how far we can communicate and the fidelity of the messages we sent.

We are now in the midst of a new industrial era: the digital age. And like the industrial revolutions that preceded it, the digital age doesn’t represent a single set of new products. Instead, it represents an entirely new platform on top of which many everyday activities operate. Making all this possible are rapid advances in the power, portability, and price of computing and the emergence of reliable, high-volume digital telecommunications.

Some of the most important developments are taking place in the area of artificial intelligence (AI). At its most essential level, AI is a collection of programmed algorithms to mimic human decisionmaking. Definitions can vary widely on exactly what constitutes AI, what its applications will look like in the real world, the solutions AI applications will provide, and the new challenges those same applications will introduce. What is not in question is the heightened curiosity and eagerness to better understand AI to maximize its value to humanity and our planet.

As with every form of technology that preceded it, society must be intentional about the exact challenges we want AI to solve and considerate of the social groups and industries that stand to benefit from the applications we deliver.

How AI will function in the built environment certainly fits into that category—and for good reason. Even though AI is still in its infant stages, we already encounter it on a daily basis. When your video conference shifts the microphone to pick up the speaker’s voice, when your smartphone automatically reroutes you around traffic, when your thermostat automatically lowers the air conditioning on a cool day—that’s all AI in action.

This brief explores how AI and related applications can address some of the most pressing challenges facing cities and metropolitan areas. While AI is just in its early development, now is the ideal time to bring that intentionality to urban applications.


Data has always been central to how practitioners plan, construct, and operate built environment systems. At its core, constructing those physical systems requires extensive knowledge of various engineering, geographic, and design principles, all of which are powered by mathematics. Quantitative information and mathematical principles are essential to successfully bring large-scale projects from their blueprints to physical reality, and that was as true in the ancient world as it is today.

The digital age only intensifies the need to use data to manage the built environment. Seemingly every human activity in the 21st century creates a data trail: business transactions, phone calls and text messages, turn-by-turn navigation. If you own a cellphone, simply moving from neighborhood to neighborhood creates a data trail as you jump from one cell tower to the next. Meanwhile, the equipment that constructs our buildings and infrastructure is now digitized, much of it able to export data wirelessly. The computing industry also continues to innovate, creating ever-more processing power, storage capacity, and analytical software. We’re simply awash in data and processing power.

The question is how to maximize data’s value. As the production cost of environmental sensors and network devices continues to drop, the ability to use reliable mobile telecommunications and cloud computing is bringing the concept of the Internet of Things (IoT) to life. Effectively, IoT represents the systems that will enable sensors deployed across various built environment systems and equipment to speak to one another, increasing both the volume and velocity of data movement and creating new opportunities to interconnect physical operations.

The emerging result is a new kind of data-driven approach to urban management, what many communities commonly refer to as smart city programs. While there is no single definition of a smart city program—and, online listicles aside, there’s really no way to judge whether an entire municipality or metropolitan area is “smart”—the common element is the use of interconnected sensors, data management, and analytical platforms to enhance the quality and operation of built environment systems.

This is where artificial intelligence and machine learning come into play. My Brookings colleague Chris Meserole authored a piece that explains machine learning in greater detail, including how statistics inform algorithms’ estimates of probability. The goal of machine learning is to replicate how humans would assess a given problem set using the best available data, primarily by building a layered network of small, discrete steps into a larger whole known as a neural network. As the algorithms continue to process more and more data, they learn which data better suits a given task. It’s beyond the scope of this brief to describe machine learning in greater detail, but you can learn more through Brookings’s Blueprint for the Future of AI.
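The “layered network of small, discrete steps” can be illustrated with the smallest possible example: a single logistic unit — the building block of a neural network — repeatedly nudging its weights as it processes data. The toy dataset and learning rate below are invented for illustration:

```python
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# (feature1, feature2) -> label; e.g. hypothetical sensor readings
data = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.8, 0.9), 1)]

w1 = w2 = b = 0.0
lr = 0.5
for _ in range(2000):               # many repeated small corrective steps
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)
        err = p - y                 # how wrong the current guess is
        w1 -= lr * err * x1         # nudge each weight against the error
        w2 -= lr * err * x2
        b -= lr * err

def predict(x1: float, x2: float) -> float:
    return sigmoid(w1 * x1 + w2 * x2 + b)

print(predict(0.15, 0.15), predict(0.85, 0.85))  # below 0.5 and above 0.5
```

Real systems stack thousands of such units into layers, but the learn-by-small-corrections loop is the same.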

In conjunction with machine learning, AI is well-suited to form the analytical foundation of smart city programs. Machine learning can process the enormous data volumes spit off by built environment systems, creating automated, real-time reactions where appropriate and delivering manageable analytics for humans to consider. And since data volumes will continue to grow exponentially, local governments and their partners will be able to use AI to maximize opportunities from the data deluge. For these reasons, Gartner expects AI to become a critical feature of 30% of smart city applications by 2020, up from just 5% a few years prior.

But AI is relatively worthless without a set of intentional goals to complement it. Organizing, processing, analyzing, and even automatically acting on data is only a secondary set of actions. Instead, the initial task facing the individuals who plan, build, and manage physical systems is to determine the kind of outcomes they want machine-learning algorithms to pursue.


No city is the same. Across the United States, some places face the strain of swelling populations, often due to a mix of new job opportunities or attractive weather. Many older cities face the dim prospect of little to negative population growth. The majority of cities find themselves somewhere in the middle. Yet no matter the growth trajectory, local leadership must design interventions that increase the quality of life for those who do live there, help local businesses grow and attract new ones, and promote environmental resilience.

AI can help achieve those shared outcomes. But to do so, AI must put shared challenges at the core of each intervention’s design. The following categories delineate some of the most pressing challenges facing cities of all kinds.

Climate change and urban resilience

There is no greater existential threat to our communities—from the smallest farming villages to megacities—than climate-related impacts. As the natural environment continues to transform, every place must prepare for the impacts of climate insecurity. That includes managing the most extreme events, including the devastating flooding, property destruction, and human misery delivered by Hurricanes Katrina, Sandy, and Harvey. Places must also prepare for more consistent climate patterns that bring more sustained threats, whether they be rising sea levels in Florida, flooding in the Midwest, or extreme heat and water scarcity in the Mountain West. Communities simply did not design their decades-old built environment systems, from wastewater infrastructure to land use controls, to manage these kinds of climate realities.

Communities will need a new agenda to prioritize environmental resilience across multiple dimensions. Physical designs will need to consider a broader range of climate scenarios. Financing models will need to explicitly recognize the costs climate change could inflict and the benefits of delivering long-term environmental resilience. Land use policies will need to be more forceful around what land is suitable for human development and what land should be left undisturbed. Communities will even need a modernized workforce to undertake resilience-focused activities.

Growth and attraction of tradable industries

Trade is the lifeblood of urban economies. Selling goods and services beyond a city and metropolitan area’s borders brings fresh income to a community, allowing new income to cycle through the rest of the economy—whether it be local restaurants or local schools. Business profits are also essential to reinvest in new products and people. If done successfully, communities build an industrial ecosystem that creates long-term viability; if trade dries up, entire communities can disappear.

To stay competitive in today’s global marketplace, American businesses must be able to develop products that leverage the capabilities of the newest technological platforms—and that includes a prominent role for local governments. Public infrastructure networks should promote efficient and equitable movement of goods, data, and people. Education and workforce systems should support a pipeline of talent, including the promotion of non-routine skills that can help manage the rise of automation. Laws should help investment capital flow into a community to invest in entrepreneurs and fixed assets. Likewise, laws should promote free-flowing data while protecting consumer privacy.

Rising income and wealth inequality

While many United States macroeconomic indicators point to strong long-term growth—including GDP levels, total household wealth, even average incomes—the effects are not equally felt among households. In inflation-adjusted terms, median household income in the U.S. barely grew between 1999 and 2017. The Federal Reserve’s research team found that only 40% of households have enough money saved to manage an unexpected cost of $400 or more. There are persistent gaps in wage levels by race. Even intergenerational mobility is down, including alarming limitations related to the neighborhood where someone grows up. Urban economies that do not work for all people—that do not create truly shared pathways to prosperity—are not places reaching their full economic potential.

Cities and their public, private, and civic leadership must address economic inequality head-on. Beyond facing earnings issues related to automation, this also includes a significant set of targets related to the built environment. Housing should be affordable for all people. The same applies to essential infrastructure services like local transportation, water, energy, and broadband. Government services should promote access to public services, including digital skills training, digital financial services, and auto-enrolled programming tied to identification cards. And since many built environment projects can take years if not decades to reach full maturity—think large housing efforts or a new energy grid—it’s essential to codify these shared values early.

Outdated governance models

Political and economic geography do not align in the United States. We may colloquially use the term “city” to reference local economies, but those economies now extend far beyond the municipal borders of central cities and counties. Instead, local economies touch an expansive set of cities, towns, villages, counties, and regional governments to manage the built environment. With such a fragmented governance design, it can be difficult to set common objectives across an entire metropolitan area. For example, American metro areas have struggled to implement road pricing policies due to tension between suburban and central city interests. Similarly, certain government units tend to have more preparedness for a digital future than their metropolitan peers, whether it’s the budget to hire data scientists or a willingness to experiment with new products and services.

Addressing climate instability, industrial competitiveness, and household inequality requires coordinated action, much of it multidisciplinary in nature. Metropolitan areas need a governance platform that promotes collaboration between different local governments and reduces the friction caused by parochialism.

Fiscal constraint and risk tolerance

Every local government confronts fiscal capacity issues. No matter local population and economic growth rates, local governments must be responsive to current revenues, future revenue projections, state and federal support levels, and what private capital markets will bear in terms of borrowing. As a result, limited fiscal resources can reduce local leadership’s tolerance to invest in future technologies, many of which are unproven and may not deliver positive results. All told, this creates friction around investing in future technology, which typically requires higher up-front spending to generate long-term operational savings.

Local governments need ways to generate confidence in digital technology services, including AI. This can include new financing models that spread risk among technology developers, private equity, and government purchasers. Civic programs to support information sharing among local governments, some of which already exist, are essential.


While AI and machine learning are uniquely well-suited to help manage the challenges facing cities and metropolitan areas, AI is not a panacea. There is a unique set of challenges related to the design and deployment of AI systems, many of which already appear in cities across the United States. To ensure smart city programs and their related AI interventions deliver economic, social, and environmental value while protecting individual privacy, these challenges must be faced head-on.

What ties each of these AI-related challenges together is the idea of urban ethics. Developing AI services and their related algorithms will require local governments—as well as their peers in state and federal government—to codify a set of shared moral principles. Sometimes those will be specific to a given place, sometimes they should be national standards. But in every instance, we as a society must be explicit and purposeful about our morals and use them to inform both AI algorithms themselves and the management principles that govern the algorithms.

Redundancy and security

Today, a city power outage effectively means modern life grinds to a halt. Buildings without backup generators will see their HVAC systems shut down, lights can’t turn on, computers turn off, elevators won’t work, even security systems could become inoperable. The same applies to telecommunications networks if they don’t have backup generators. But much continues working. Cars, bikes, and non-electrified transit can still operate—and humans can navigate streets without traffic lights. If you have a key to a house or building, it opens.

This will not be the same situation in a city governed by AI. Autonomous vehicles will switch into manual mode if there’s no centralized computing to govern their actions, but some fleet-based vehicles may not allow a passenger to take over (to say nothing of all the empty vehicles that will quickly fill the side of roads). AI-informed water infrastructure would also switch into manual mode, potentially requiring extra workers to manage systems. Other essential services, like health care, could face the same challenges in a power outage. As AI continues to grow in importance, electricity and staffing redundancy becomes even more important.

But it’s the very threat of outright service failure that makes security especially important in a digitalized city. Recent stories of cyberattacks impacting entire municipal operations, including Baltimore and Atlanta, show how information security is essential to keeping cities operational in a digital, connected era. Moreover, it reveals a new kind of global security threat from global adversaries.

Privacy issues

The emergence of digitally connected technologies has invigorated a global debate around information privacy. As it becomes possible to know every single physical movement a person makes, to know every website they visit and every web service they use, to monitor the inner-workings of their homes and workplaces, enormous questions emerge around who should own the data, how the government should regulate data collection and use, and what are the accepted standards to anonymize and encrypt the data.

These tensions are already playing out in public. Location-tracking systems via our smartphones and vehicles make it possible to know frighteningly personal information—including the ability to triangulate a person’s identity with relatively little data. But it’s also impossible to enable location-specific services, from cellular calls to ride-sharing services, without the data trail. Likewise, accurate movement data can enable local governments to make better informed urban planning decisions, from where to put a ride-share pickup spot to where to promote taller buildings.

With industry power closely tied to controlling personal information, and with even more opportunities for personal information to leak, we must strike the right balance between making data and algorithms open to the public and enforcing personal protections. Democratic societies may initially reject surveillance state applications like those found in China, but one only has to look to London to find a city awash in AI-assisted video monitoring. Codifying legal ethics is the only way to protect the right amount of privacy in the digital age.

Algorithmic bias

All AI systems rely on algorithms, which are effectively a set of instructions on how to organize and manage data. The issue is that algorithms themselves can formalize biases, whether via the individuals who write the algorithms or biased data the algorithms compute against. And once biases are written into code, the use of layered code within algorithms can make them even harder to locate over time. As a result, it’s essential that cities have a set of bias detection strategies to protect against AI-created inequities.

We can already see algorithmic bias playing-out in public view. Academic research by Inioluwa Deborah Raji and Joy Buolamwini found Amazon’s facial recognition software biased against individuals with darker skin tones, leading to protests from other researchers. In Chicago, a policing “heat list” system for identifying at-risk individuals failed to significantly reduce violent crime and also increased police harassment complaints by the very populations it was meant to protect.

These instances are only likely to increase as more AI systems come online and more skilled onlookers develop ways to measure for systemic bias. For example, concerned residents could check whether urban services like snow removal are more responsive to complaints from advantaged communities. Such criticism is another reason to promote open algorithms. Allowing public access to an algorithm’s underlying code makes it easier to review for bias, whether one can read the code itself or must rely on an intermediary to explain how it works. This is a core argument within the Obama administration’s National Artificial Intelligence Research and Development Strategic Plan.
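The snow-removal check mentioned above can be prototyped in a few lines: group complaint records by neighborhood and compare average resolution times. The records and neighborhood names below are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

complaints = [  # (neighborhood, hours_to_resolution)
    ("riverside", 6), ("riverside", 8), ("riverside", 7),
    ("hillcrest", 18), ("hillcrest", 22), ("hillcrest", 20),
]

by_area: dict[str, list[int]] = defaultdict(list)
for area, hours in complaints:
    by_area[area].append(hours)

averages = {area: mean(hours) for area, hours in by_area.items()}
ratio = max(averages.values()) / min(averages.values())
print(averages, f"disparity ratio: {ratio:.1f}x")  # a large ratio flags possible bias
```

A disparity ratio alone doesn't prove bias (demand, terrain, and staffing differ by area), but it tells auditors where to look.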


We don’t need to guess when AI systems will appear in our cities—they’re already here and growing in number. In Montreal, the regional public transportation agency and Transit, the maker of a well-subscribed smartphone application, are using machine learning to better predict future bus arrivals. In New Orleans, the city’s Office of Performance and Accountability used machine learning and public data to predict where fire-related deaths were most likely to occur, helping the fire department better target operations. New York City and Washington both use a system called ShotSpotter, along with public data, to better locate and assess gunfire. Some cities are even creating exact digital replicas of themselves—known as digital twins—to give AI an environment in which to model future interventions.

As AI services continue to grow in number, it’s also clear that complementary policies will need to develop in tandem. The open-source movement will continue to promote open data availability and shared standards for organizing and analyzing data, but debates will continue over which data should remain in private hands. Cities, states, and national governments will continue to debate the appropriate amount of personal privacy in a digitized world, as is the ongoing case with the Sidewalk Toronto project. We’re likely to see more cyberattacks against public infrastructure systems as cities continue their digital security build-out.

Continued experimentation with pilot AI projects and complementary policies are essential to build digital cities that benefit all people. But to deliver such shared prosperity, AI is only a secondary intervention. The first step is the same as it always was, no matter the technological era: Local leadership, from civic groups to elected officials to the business community, must collaborate to codify the shared challenges cities want technology to address. It’s only with a common sense of purpose that cities can tap AI’s full promise.


Executive Insights on IoT Today

28 May

Looking to implement an IoT solution? Here’s some advice from those who have come before: start small, have a strategy, and focus on a problem to solve, not the tech.

Having a Strategy

Respondents recommended several keys to an effective IoT strategy. The most frequently mentioned tips focused on having a strategy and use case in mind before starting a project. Understand what you want to accomplish, what problem you are trying to solve, and which customer needs you are going to fulfill to make their lives simpler and easier. Drive business value by articulating the business challenge you are trying to solve – regardless of the vertical in which you are working.

Architecture and data were the second most frequently mentioned keys to a successful IoT strategy. You must think about the architecture for a Big Data system to be able to collect and ingest data in real-time. Consider the complexity of the IoT ecosystem, which includes back-ends, devices, and mobile apps for your configuration and hardware design. Start with pre-built, pre-defined services and grow your IoT business to a point where you can confidently identify whether building an internal infrastructure is a better long-term investment.

Problem Solving

Companies can leverage IoT by focusing on the problem they are trying to solve, including how to improve the customer experience. Answer the question, “What will IoT help us do differently to generate action, revenue, and profitability?” Successful IoT companies are solving real business problems, getting better results, and finding more problems to solve with IoT.

Companies should also start small and scale over time as they find success. One successful project begets another. Put together a journey map and incrementally apply IoT technologies and processes. Remember that the ability to scale wins.

Data collection is important, but you need to know what you’re going to do with the data. A lot of people collect data and never get back to it, so it becomes expensive to store and goes to waste. You must apply machine learning and analytics to massage and manipulate the data in order to make better-informed business decisions more quickly. Sensors will collect more data, and more sophisticated software will perform better data analysis to understand trends, anomalies, and benchmarks, generate a variety of alerts, and identify previously unnoticed patterns.
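As a concrete illustration of that analytics step, here is a minimal sketch of flagging anomalous sensor values against a rolling baseline. The readings, window size, and threshold below are all invented for illustration; a real deployment would tune them to the sensor and the business question:

```python
from statistics import mean, stdev

def find_anomalies(readings, window=5, threshold=3.0):
    """Flag readings that deviate sharply from the recent rolling average.

    Returns a list of (index, value) pairs for suspect readings.
    """
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        # Guard against a flat window (zero variance).
        if sigma == 0:
            continue
        if abs(readings[i] - mu) > threshold * sigma:
            anomalies.append((i, readings[i]))
    return anomalies

# Hypothetical temperature readings with one obvious spike.
temps = [21.0, 21.2, 20.9, 21.1, 21.0, 35.5, 21.1, 21.0]
print(find_anomalies(temps))  # → [(5, 35.5)]
```

The point is not the statistics but the loop closure: data that is collected, baselined, and acted on pays for its storage; data that is merely collected does not.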

A Core Component

IoT has made significant advancements in the adoption curve over the past year. Companies are realizing the value IoT data brings for them, and their end-user customers, to solve real business problems. IoT has moved from being a separate initiative to an integral part of business decision-making to improve efficiency and yield.

There’s also more data, more sources of data, more applications, and more connected devices. This generates more opportunities for businesses to make and save money, as well as provide an improved customer experience. The smart home is evolving into a consolidated service, as opposed to a collection of siloed connected devices with separate controls and apps.

Data Storage

There is not a single set of technical solutions being used to execute an IoT strategy, since IoT is being used in a variety of vertical markets with different problems to solve. Each of these verticals and solutions uses different architectures, platforms, and languages based on its needs. However, everyone is in the cloud, be it public or private, and needs a data storage solution.

All the Verticals

The real-world problems being solved with IoT are expanding exponentially into multiple verticals. The verticals most frequently cited by respondents include transportation and logistics, self-driving cars, and energy and utilities. Following are three examples:

  • A shipping company is getting visibility into delays in shipping, customs, unloading, and delivery by leveraging open source technologies for smarter contacts (sensors) on both the ship and the 3,500 containers on the ship.
  • Renault self-driving cars are sending all data back to a corporate scalable data repository so Renault can see everything the car did in every situation to build a smarter and safer driverless car that will result in greater adoption and acceptance.
  • A semiconductor chip manufacturer is using yield analytics to identify quality issues and root causes of failure, adding tens of millions of dollars to their bottom line every month.

Start Small

The most common issues preventing companies from realizing the benefits of IoT are the lack of a strategy, an unwillingness to “start small,” and concerns with security.

Some companies pursue IoT as a novelty rather than as a strategic decision. Everyone should be required to answer four questions: 1) What do we need to know? 2) From whom? 3) How often? 4) Is it being pushed to me? Companies need to identify the data that’s needed to drive their business.

Expectations are not realistic and there’s a huge capital expenditure. Companies cannot buy large-scale M2M solutions off the shelf. As such, they need to break opportunities into winnable parts. Put a strategy in place. Identify a problem to solve and get started. Crawl, walk, then run.

There’s concern around security frameworks in both industrial and consumer settings. Companies need to think through security strategies and practices. Everyone needs to be concerned with security and the value of personally identifiable information (PII).

Deciding which devices or frameworks to use (Apple, Intel, Google, Samsung, etc.) is a daunting task, even for sophisticated engineers. Companies cannot be expected to figure it out on their own. All the major players are using different communication protocols and trying to do their own thing rather than collaborating to ensure an interoperable IoT infrastructure.

Edge Computing and PII

The continued evolution and growth of IoT, to 8.4 billion connected devices by the end of 2017, will be driven by edge computing, which will handle more data to provide more real-time, actionable insights. Ultimately, everything will be connected as intelligent computing evolves. This is the information revolution: it will reduce defects, improve the quality of products and the customer experience, and reveal what customers want so you know what to work on next. Smarter edge event-driven microservices will be tied to blockchain and machine learning platforms; however, blockchain cannot scale to meet the needs of IoT right now.

For IoT to achieve its projected growth, everyone in the space will need to balance security with the user experience and the sanctity of PII. By putting the end-user customer at the center of the use case, companies will have greater success and ROI with their IoT initiatives.


All but a couple of respondents mentioned security as the biggest concern regarding the state of IoT today. We need to understand the security component of IoT with more devices collecting more data. As more systems communicate with each other and expose data outside, security becomes more important. The DDoS attack against Dyn last year shows that security is an issue bigger than IoT – it encompasses all aspects of IT, including development, hardware engineering, networking, and data science.

Every level of the organization is responsible for security. There’s a due diligence responsibility on the providers. Everywhere data is exposed is the responsibility of engineers and systems integrators. Data privacy is an issue for the owner of the data, who needs to know what data is being used and what can be deprecated, and who needs a complete feedback loop to make improvements.

If we don’t address the security of IoT devices, we can expect the government to come in and regulate them, as it did when it required cars to include seatbelts and airbags.


The key skills developers need to be successful on IoT projects are understanding the impact of data, how databases work, and how data applies to the real world to help solve business problems or improve the customer experience. Developers need to understand how to collect data and obtain insights from it, and be mindful of the challenges of managing and visualizing data.

In addition, stay flexible and keep your mind open since platforms, architectures, and languages are evolving quickly. Collaborate within your organization, with resource providers, and with clients. Be a full-stack developer that knows how to connect APIs. Stay abreast of changes in the industry.

And here’s who we spoke with:

  • Scott Hanson, Founder and CTO, Ambiq Micro
  • Adam Wray, CEO, Basho
  • Peter Coppola, SVP, Product Marketing, Basho
  • Farnaz Erfan, Senior Director, Product Marketing, Birst
  • Shahin Pirooz, CTO, Data Endure
  • Anders Wallgren, CTO, Electric Cloud
  • Eric Free, S.V.P. Strategic Growth, Flexera
  • Brad Bush, Partner, Fortium Partners
  • Marisa Sires Wang, Vice President of Product, Gigya
  • Tony Paine, Kepware Platform President, PTC
  • Eric Mizell, Vice President Global Engineering, Kinetica
  • Crystal Valentine, PhD, V.P. Technology Strategy, MapR
  • Jack Norris, S.V.P., Database Strategy and Applications, MapR
  • Pratibha Salwan, S.V.P. Digital Services Americas, NIIT Technologies
  • Guy Yehaiv, CEO, Profitect
  • Cees Links, General Manager Wireless Connectivity, Qorvo
  • Paul Turner, CMO, Scality
  • Harsh Upreti, Product Marketing Manager, API, SmartBear
  • Rajeev Kozhikkuttuthodi, Vice President of Product Management, TIBCO


The Four Internet of Things Connectivity Models Explained

21 May

At its most basic level, the Internet of Things is all about connecting various devices and sensors to the Internet, but it’s not always obvious how to connect them.

1. Device-to-Device

Device-to-device communication involves two or more devices that connect and communicate directly with one another. They can communicate over many types of networks, including IP networks or the Internet, but most often use protocols like Bluetooth, Z-Wave, and ZigBee.


This model is commonly used in home automation systems to transfer small packets of information between devices at a relatively low data rate. This could be light bulbs, thermostats, and door locks sending small amounts of information to each other.

Each connectivity model has different characteristics, Tschofenig said. With Device-to-Device, he said “security is specifically simplified because you have these short-range radio technology [and a] one-to-one relationship between these two devices.”

Device-to-device is popular among wearable IoT devices like a heart monitor paired to a smartwatch, where data doesn’t necessarily have to be shared with multiple people.

Several standards are being developed around Device-to-Device, including Bluetooth Low Energy (also known as Bluetooth Smart or Bluetooth 4.0+), which is popular among portable and wearable devices because its low power requirements mean devices can operate for months or years on one battery. Its lower complexity can also reduce device size and cost.

2. Device-to-Cloud

Device-to-cloud communication involves an IoT device connecting directly to an Internet cloud service like an application service provider to exchange data and control message traffic. It often uses traditional wired Ethernet or Wi-Fi connections, but can also use cellular technology.

Cloud connectivity lets the user (and an application) obtain remote access to a device. It also potentially supports pushing software updates to the device.

A use case for cellular-based Device-to-Cloud would be a smart tag that tracks your dog while you’re not around, which would need wide-area cellular communication because you wouldn’t know where the dog might be.

Another scenario, Tschofenig said, would be remote monitoring with a product like the Dropcam, where you need the bandwidth provided by Wi-Fi or Ethernet. It also makes sense to push data into the cloud in this scenario because doing so provides access to the user while they’re away. “Specifically, if you’re away and you want to see what’s on your webcam at home. You contact the cloud infrastructure and then the cloud infrastructure relays to your IoT device.”

From a security perspective, this gets more complicated than Device-to-Device because it involves two different types of credentials – the network access credentials (such as the mobile device’s SIM card) and then the credentials for cloud access.
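A minimal sketch of the device side illustrates the split: the payload below carries a hypothetical cloud credential, kept separate from whatever network credential (such as a SIM) got the device online. All field names, identifiers, and values here are invented for illustration:

```python
import json
import time

def build_telemetry(device_id, readings, api_key):
    """Assemble a telemetry payload a device might send to a cloud service.

    The cloud credential (api_key) travels with the message; the network
    credential (e.g. the SIM) lives a layer below and never appears here.
    """
    return json.dumps({
        "device_id": device_id,
        "timestamp": int(time.time()),
        "readings": readings,
        "api_key": api_key,
    })

msg = build_telemetry("thermostat-42", {"temp_c": 21.5}, api_key="demo-key")
# An HTTP client or MQTT library would then deliver `msg` to the service.
print(msg)
```

Keeping the two credentials distinct matters operationally: rotating a compromised cloud key should not require touching the cellular subscription, and vice versa.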

The IAB’s report also mentioned that interoperability is a factor with Device-to-Cloud when attempting to integrate devices made by different manufacturers, given that the device and cloud service are typically from the same vendor. An example is the Nest Labs Learning Thermostat, which can only work with Nest’s cloud service.

Tschofenig said there’s work going into making devices that make cloud connections while consuming less power, with standards such as LoRa, Sigfox, and narrowband IoT.

3. Device-to-Gateway


In the Device-to-Gateway model, IoT devices basically connect to an intermediary device to access a cloud service. This model often involves application software operating on a local gateway device (like a smartphone or a “hub”) that acts as an intermediary between an IoT device and a cloud service.

This gateway could provide security and other functionality such as data or protocol translation. If the application-layer gateway is a smartphone, this application software might take the form of an app that pairs with the IoT device and communicates with a cloud service.

This might be a fitness device that connects to the cloud through a smartphone app like Nike+, or home automation applications that involve devices that connect to a hub like Samsung’s SmartThings ecosystem.

“Today, you more or less have to buy a gateway from a dedicated vendor or use one of these multi-purpose gateways,” Tschofenig said. “You connect all your devices up to that gateway and it does something like data aggregation or transcoding, and it either hands [off the data] locally to the home or shuffles it off to the cloud, depending on the use case.”

Gateway devices can also potentially bridge the interoperability gap between devices that communicate on different standards. For instance, SmartThings’ Z-Wave and Zigbee transceivers can communicate with both families of devices.
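The gateway’s translation-and-aggregation role described above can be sketched in a few lines. The terse device-native field names and the cloud-side schema below are hypothetical stand-ins, not any vendor’s actual format:

```python
def translate(reading):
    """Translate a terse, device-native reading into the JSON-style schema
    a hypothetical cloud service expects (field names are illustrative)."""
    protocol_map = {"zb": "zigbee", "zw": "z-wave"}
    return {
        "protocol": protocol_map.get(reading["p"], "unknown"),
        "sensor": reading["id"],
        "value_c": reading["t"] / 10.0,  # device reports tenths of a degree
    }

def aggregate(readings):
    """A gateway batches many local readings into one cloud upload."""
    return {"gateway": "home-hub-1", "events": [translate(r) for r in readings]}

batch = aggregate([
    {"p": "zb", "id": "kitchen", "t": 215},
    {"p": "zw", "id": "bedroom", "t": 198},
])
print(batch)
```

Because both families of devices end up in one normalized batch, the cloud service never needs to know which radio protocol each reading arrived on; that is precisely the interoperability bridge the gateway provides.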

4. Backend Data Sharing


Back-End Data-Sharing essentially extends the single device-to-cloud communication model so that IoT devices and sensor data can be accessed by authorized third parties. Under this model, users can export and analyze smart object data from a cloud service in combination with data from other sources, and send it to other services for aggregation and analysis.

Tschofenig said the app Map My Fitness is a good example of this because it compiles fitness data from various devices ranging from the Fitbit to the Adidas miCoach to the Wahoo Bike Cadence Sensor. “They provide hooks, REST APIs to allow security and privacy-friendly data sharing to Map My Fitness.” This means an exercise can be analyzed from the viewpoint of various sensors.

“This [model] runs contrary to the concern that everything just ends up in a silo,” he said.
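The aggregation idea behind back-end data sharing can be sketched simply: each provider’s export is mapped onto a shared schema before analysis. The provider names and field names below are invented stand-ins for what a vendor’s REST API might return:

```python
def normalize(source, records):
    """Map one provider's export (hypothetical field names) onto a shared schema."""
    key_map = {
        "fit_tracker": ("hr", "heart_rate"),
        "bike_sensor": ("cadence_rpm", "cadence"),
    }
    src_key, shared_key = key_map[source]
    return [{"source": source, shared_key: r[src_key], "t": r["t"]}
            for r in records]

# Two hypothetical device feeds, as if fetched via each vendor's REST API.
combined = (normalize("fit_tracker", [{"t": 0, "hr": 142}]) +
            normalize("bike_sensor", [{"t": 0, "cadence_rpm": 88}]))
print(combined)
```

Once both feeds share timestamps and a common schema, a single workout can be analyzed across sensors, which is exactly the anti-silo property the model is meant to deliver.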

There’s No Clear IoT Deployment Model; It All Depends on the Use Case

Tschofenig said the decision process for IoT developers is quite complicated when considering how a device will be integrated and how it will get working connectivity to the internet.

To further complicate things, newer technologies with lower power consumption, size, and cost often lack the maturity of traditional Ethernet or Wi-Fi.

“The equation is not just what is most convenient for me, but what are the limitations of those radio technologies and how do I deal with factors like the size limitations, energy consumption, the cost – these aspects play a big role.”


IoT: New Paradigm for Connected Government

9 May

The Internet of Things (IoT) is an uninterrupted, connected network of embedded objects/devices, each with a unique identifier, communicating over standard protocols without any human intervention. It provides encryption, authorization, and identification with device protocols like MQTT, STOMP, or AMQP to securely move data from one network to another. In connected government, IoT helps deliver better citizen services, provides transparency, improves employee productivity, and yields cost savings. It helps deliver contextual, personalized service to citizens, enhances security, and improves quality of life. With secure and accessible information, government business becomes more efficient and data driven, changing the lives of citizens for the better. An IoT-focused Connected Government solution helps in rapidly developing preventive and predictive analytics, in optimizing business processes, and in prebuilt integrations across multiple departmental applications. In summary, this opens up new opportunities for government to share information, innovate, make more informed decisions, and extend the scope of machine and human interaction.

The Internet of Things (IoT) is a seamless, connected system of embedded sensors/devices that communicate using standard, interoperable protocols without human intervention.

The vision of any Connected Government in the digital era is “To develop connected and intelligent IoT based systems to contribute to government’s economy, improving citizen satisfaction, safe society, environment sustainability, city management and global need.”

IoT has data feeds from various sources like cameras, weather and environmental sensors, traffic signals, parking zones, and shared video surveillance services. Processing this data leads to better coordination between government and IoT agencies and to the development of better services for citizens.

Market research predicts that, by 2020, up to 30 billion devices with unique IP addresses will be connected to the Internet [1]. The “Internet of Everything” will have an economic impact of more than $14 trillion by 2020 [2]. By 2020, the “Internet of Things” will be powered by a trillion sensors [3]. In 2019, the “Internet of Things” device market will be double the size of the smartphone, PC, tablet, connected car, and wearable markets combined [4]. By 2020, component costs will have come down to the point that connectivity becomes a standard feature, even for processors costing less than $1 [5].

This article articulates the drivers for connected government using IoT and its objectives. It also describes various scenarios in which IoT is used across departments in connected government.

IoT Challenges Today
The trend in government seems to be IoT adoption on an agency-by-agency basis, leading to different policies, strategies, and standards, and to divergent analysis and use of data. There are a number of challenges preventing the adoption of IoT in governments. The main challenges are:

  • Complexity: Lack of funding, digital skills, supportive culture, and strategic leadership commitment are the challenges today.
  • Data Management: Government needs to manage huge volumes of data related to departments, citizens, land, and GIS. This data needs to be encrypted and secured; maintaining data privacy and integrity is a big challenge.
  • Connectivity: IoT devices require good network connectivity to deliver their data payloads and continuous streams of unstructured data. Examples include patient medical records, rainfall reports, and disaster information. Maintaining continuous network connectivity is a challenge.
  • Security: Moving the information back and forth between departments, citizens and third parties in a secure mode is the basic requirement in Government as IoT introduces new risks and vulnerabilities. This leaves users exposed to various kinds of threats.
  • Interoperability: This requires not only that systems be networked together, but also that data from each system be interoperable. In the majority of cases, IoT is fragmented and lacks interoperability due to different OEMs, operating systems, versions, connectors, and protocols.
  • Risk and Privacy: Devices sometimes gather and provide personal data without the user’s active participation or approval, and sometimes gather very private information about individuals through indirect interactions, violating privacy policies.
  • Integration: Governments need to design an integration platform that can connect any application, service, data source, or device with the government ecosystem. Building a solution that comprises an integrated “all-in-one” platform providing device connectivity, event analytics, and enterprise connectivity capabilities is a big challenge.
  • Regulatory and Compliance: Adoption of regulations by IoT agencies is a challenge.
  • Governance: One of the major concerns across government agencies is the lack of a big picture or an integrated view of IoT implementation, which has been pushed by various departments in a siloed fashion. Also, government leaders lack a complete understanding of IoT technology and its potential benefits.

IoT: Drivers for Connected Government
IoT can increase value both by collecting better information about how effectively government servants, programs, and policies are addressing challenges and by helping government deliver citizen-centric services based on real-time and situation-specific conditions. The various stakeholders leveraging IoT in connected government are depicted below.


Information Flow in an IoT Scenario
The information flow in government using IoT has five stages (5C): Collection, Communication, Consolidation, Conclusion, and Choice.

  1. Collection: Sensors/devices collect data on the physical environment, for example measuring air temperature, location, or device status. Sensors passively measure or capture information with no human intervention.
  2. Communication: Devices share the information with other devices or with a centralized platform. Data is seamlessly transmitted among objects or from objects to a central repository.
  3. Consolidation: The information from multiple sources is captured and combined at one point. Data is aggregated as devices communicate with each other, with rules and standards determining the quality and importance of the data.
  4. Conclusion: Analytical tools help detect patterns that signal a need for action, or anomalies that require further investigation.
  5. Choice: Insights derived from analysis either initiate an action or frame a choice for the user. Real time signals make the insights actionable, either presenting choices without emotional bias or directly initiating the action.

Figure 2: IoT Information Flow
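The five stages above can be sketched as a tiny pipeline. The sensor name, the stubbed reading, and the alert threshold are all invented for illustration:

```python
def collect(sensor):
    # 1. Collection: read the physical environment (stubbed here).
    return {"sensor": sensor, "temp_c": 41.0}

def communicate(reading, repository):
    # 2. Communication: transmit the reading to a central repository.
    repository.append(reading)

def consolidate(repository):
    # 3. Consolidation: combine data from multiple sources at one point.
    return [r["temp_c"] for r in repository]

def conclude(values, limit=40.0):
    # 4. Conclusion: detect values that signal a need for action.
    return [v for v in values if v > limit]

def choose(alerts):
    # 5. Choice: turn the insight into an action or a decision.
    return "dispatch-inspection" if alerts else "no-action"

repo = []
communicate(collect("substation-7"), repo)
decision = choose(conclude(consolidate(repo)))
print(decision)  # → dispatch-inspection
```

Each stage has a single responsibility, so any one of them can be swapped out (a different sensor, a different analytic, a human-in-the-loop choice) without disturbing the rest of the flow.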

Role of IoT in Connected Government
The following section highlights the various government domains and typical use cases in the connected government.

Figure 3: IoT Usage in Connected Government

a. Health
IoT-based healthcare applications and systems enhance the traditional technology used today. These devices help increase the accuracy of the medical data collected from the large set of devices connected to various applications and systems. They also help gather data to improve the precision of medical care delivered through sophisticated, integrated healthcare systems.

IoT devices give direct, 24/7/365 access to the patient in a less intrusive way than other options. IoT-based analytics and automation allow providers to access patient reports prior to arrival at the hospital, improving responsiveness in emergency healthcare.

IoT-driven systems are used for continuous monitoring of patients’ status. These monitoring systems employ sensors to collect physiological information that is analyzed and stored in the cloud, where doctors can access it for further analysis and review. This provides a continuous, automated flow of information and helps improve the quality of care through its alerting system.

A patient’s health data is captured using various sensors, analyzed, and sent to a medical professional for proper remote medical assistance.

b. Education
IoT customizes and enhances education by allowing optimization of all content and forms of delivery. It reduces costs and labor of education through automation of common tasks outside of the actual education process.

IoT technology improves the quality of education, professional development, and facility management. The key areas in which IoT helps are:

  • Student tracking: IoT facilitates the customization of education to give every student access to what they need. Each student can control the experience and participate in instructional design, with performance data primarily shaping that design. This delivers highly effective education while reducing costs.
  • Instructor tracking: IoT provides instructors with easy access to powerful educational tools. Educators can use IoT to perform as a one-on-one instructor, providing specific instructional designs for each student.
  • Facility monitoring and maintenance: The application of the technology improves the professional development of educators.
  • Data from other facilities: IoT also enhances the knowledge base used to devise education standards and practices, introducing large, high-quality, real-world datasets into the foundation of educational design.

c. Construction
IoT-enabled devices/sensors are used for automatic monitoring of public sector buildings, facilities, and large infrastructure, and for managing energy use such as air conditioning and electricity. For example, lights or air conditioners left on in empty rooms result in revenue loss.

d. Transport
IoT can be used across transport systems such as traffic control and parking. It provides improved communication, control, and data distribution.

IoT-based sensor information obtained from street cameras, motion sensors, and officers on patrol is used to evaluate the traffic patterns of crowded areas. Commuters are informed of the best possible routes to take, using real-time traffic sensor data, to avoid being stuck in traffic jams.

e. Smart City
IoT simplifies examining various factors such as population growth, zoning, mapping, water supply, transportation patterns, food supply, social services, and land use. It supports cities through its implementation in major services and infrastructure such as transportation and healthcare, and it manages other areas like water control, waste management, and emergency management. Its real-time, detailed information facilitates prompt decisions in emergency management. IoT can also automate motor vehicle services for testing, permits, and licensing.

f. Power
IoT simplifies the process of energy monitoring and management while maintaining low cost and a high level of precision. IoT-based solutions are used for efficient and smart utilization of energy, such as in smart grid and smart meter implementations.

Energy system reliability is achieved through IoT-based analytics, which help prevent system overloading or throttling and detect threats to system performance and stability, protecting against losses such as downtime, damaged equipment, and injuries.

g. Agriculture
IoT minimizes human intervention in farming functions, analysis, and monitoring. IoT-based systems detect changes to crops, the soil environment, and more.

IoT in agriculture contributes to:

  • Crop monitoring: Sensors can be used to monitor crops and the health of plants using the data collected. Sensors can also be used for early detection of pests and disease.
  • Food safety: The entire supply chain (farm, logistics, and retail) is becoming connected. Farm products can be tagged with RFID, increasing customer confidence.
  • Climate monitoring: Sensors can be used to monitor temperature, humidity, light intensity, and soil moisture. These data can be sent to the central system to trigger alerts and automate water, air, and crop control.
  • Logistics monitoring: Location-based sensors can be used to track vegetables and other farm products during transport and storage. This enhances scheduling and automates the supply chain.
  • Livestock monitoring: Farm animals can be monitored via sensors to detect potential signs of disease. The data can be analyzed in the central system and relevant information sent to farmers.
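As a concrete instance of the climate-monitoring idea, a simple automated water-control rule might look like the sketch below. The thresholds are hypothetical; a real deployment would calibrate them per crop and soil type:

```python
def irrigation_action(soil_moisture_pct, rain_forecast_pct):
    """Decide whether to trigger watering from two sensor-derived inputs.

    soil_moisture_pct: current soil moisture reading (percent).
    rain_forecast_pct: probability of rain from a weather feed (percent).
    """
    if soil_moisture_pct < 25 and rain_forecast_pct < 40:
        return "water"          # dry soil and no rain expected: irrigate now
    if soil_moisture_pct < 25:
        return "wait-for-rain"  # dry, but rain is likely: hold off
    return "ok"                 # moist enough: no action

print(irrigation_action(18, 10))  # → water
print(irrigation_action(18, 80))  # → wait-for-rain
print(irrigation_action(45, 10))  # → ok
```

Combining a local sensor with an external data feed is what turns a simple alert into the kind of water-conserving automation the bullet describes.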

There are many opportunities for government to use IoT to make government services more efficient. IoT cannot be analyzed or implemented properly without collaborative efforts between industry, government, and agencies, which need to work together to build a consistent set of standards that everyone can follow.

On the domain front, Connected Government solutions use IoT as follows:

  • Public safety departments leverage IoT for the protection of citizens. One method is using video images and sensors to provide predictive analysis, so that government can provide security for citizens gathered at parades or inaugural events.
  • On the healthcare front, advanced IoT analytics deliver better, more granular patient care. Real-time access to patients’ reports and monitoring of their health status improve emergency healthcare.
  • In education, IoT helps with content delivery, monitoring of students and faculty, and improving the quality of education and professional development.
  • In the energy sector, IoT allows a variety of energy controls and monitoring functions. It simplifies the process of energy monitoring and management while maintaining low cost and a high level of precision, and it helps prevent system overloading while improving system performance and stability.
  • IoT strategy is being utilized in the agricultural industry for productivity, pest control, water conservation, and continuous production based on improved technology and methods.

In the technology front:

  • IoT connects billions of devices and sensors to create new and innovative applications. In order to support these applications, a reliable, elastic, and agile platform is essential. Cloud computing is one of the enabling platforms to support IoT.
  • A Connected Government solution can manage the large number of devices and the volume of data emitted by IoT. This large volume of new information allows new collaboration between government, industry, and citizens, and helps in rapidly developing IoT-focused preventive and predictive analytics.
  • Optimizing the business processes with process automation and prebuilt integrations across multiple departmental applications. This opens up the new opportunities for government to share information, innovate, save lives, make more informed decisions, and actually extend the scope of machine and human interaction.




IoT, encryption, and AI lead top security trends for 2017

28 Apr

The Internet of Things (IoT), encryption, and artificial intelligence (AI) top the list of cybersecurity trends that vendors are trying to help enterprises address, according to a Forrester report released Wednesday.

As more and more breaches hit headlines, CXOs can find a flood of new cybersecurity startups and solutions on the market. More than 600 exhibitors attended RSA 2017—up 56% from 2014, Forrester noted, with a waiting list rumored to be several hundred vendors long. And more than 300 of these companies self-identify as data security solutions, up 50% from just a year ago.

“You realize that finding the optimal security solution for your organization is becoming more and more challenging,” the report stated.

In the report, titled The Top Security Technology Trends To Watch, 2017, Forrester examined the 14 most important cybersecurity trends of 2017, based on the team’s observations from the 2017 RSA Conference. Here are the top five security challenges facing enterprises this year, and advice for how to mitigate them.

1. IoT-specific security products are emerging, but challenges remain

The adoption of consumer and enterprise IoT devices and applications continues to grow, along with concerns that these tools can increase an enterprise’s attack surface, Forrester said. The Mirai botnet attacks of October 2016 raised awareness about the need to protect IoT devices, and many vendors at RSA used this as an example of the threats facing businesses. While a growing number of companies claim to address these threats, the market is still underdeveloped, and IoT security will require people and policies as much as technological solutions, Forrester stated.

“[Security and risk] pros need to be a part of the IoT initiative and extend security processes to encompass these IoT changes,” the report stated. “For tools, seek solutions that can inventory IoT devices and provide full visibility into the network traffic operating in the environment.”
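Forrester's advice to inventory IoT devices and gain visibility into their traffic can be illustrated with a toy passive-inventory sketch. This is not any vendor's product API; the class, the record fields, and the port baseline are all hypothetical, purely to show the idea of learning devices from observed traffic and flagging behavior outside their expected profile.

```python
from collections import defaultdict
from datetime import datetime, timezone

class DeviceInventory:
    """Toy passive inventory: learns devices from observed traffic records."""
    def __init__(self):
        self.devices = {}                 # MAC -> profile dict
        self.traffic = defaultdict(int)   # (src_mac, dst_port) -> packet count

    def observe(self, src_mac, dst_port, timestamp=None):
        ts = timestamp or datetime.now(timezone.utc)
        profile = self.devices.setdefault(src_mac, {"first_seen": ts, "ports": set()})
        profile["last_seen"] = ts
        profile["ports"].add(dst_port)
        self.traffic[(src_mac, dst_port)] += 1

    def unknown_ports(self, src_mac, baseline_ports):
        """Flag traffic to ports outside the device's expected baseline."""
        seen = self.devices.get(src_mac, {}).get("ports", set())
        return sorted(seen - baseline_ports)

inv = DeviceInventory()
inv.observe("aa:bb:cc:00:00:01", 1883)   # MQTT from a sensor -- expected
inv.observe("aa:bb:cc:00:00:01", 23)     # Telnet from the same sensor -- suspicious
print(inv.unknown_ports("aa:bb:cc:00:00:01", baseline_ports={1883}))  # [23]
```

A real solution would feed this from packet captures or flow logs, but the core loop is the same: build the inventory passively, then compare each device against its baseline.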

2. Encryption of data in use becomes practical

Encryption of data at rest and in transit has become easier to implement in recent years, and is key for protecting sensitive data generated by IoT devices. However, many security professionals struggle to overcome encryption challenges such as classification and key management.

Enterprises should consider homomorphic encryption, a system that allows you to keep data encrypted as you query, process, and analyze it. Forrester offers the example of a retailer who could use this method to encrypt a customer’s credit card number, and keep it to use for future transactions without fear, because it would never need to be decrypted.

3. Threat intelligence vendors clarify and target their services

A strong threat intelligence partner can help organizations avoid attacks and adjust security policies to address vulnerabilities. However, it can be difficult to cut through the marketing jargon used by these vendors to determine the value of the solution. At RSA 2017, Forrester noted that vendors are trying to improve their messaging to help customers distinguish between services. For example, companies including Digital Shadows, RiskIQ, and ZeroFOX have embraced the concept of “digital risk monitoring” as a complementary category to the massive “threat intelligence” market.

“This trend of vendors using more targeted, specific messaging to articulate their capabilities and value is in turn helping customers avoid selection frustrations and develop more comprehensive, and less redundant, capabilities,” the report stated. To find the best solution for your enterprise, you can start by developing a cybersecurity strategy based on your vertical, size, maturity, and other factors, so you can better assess what vendors offer and if they can meet your needs.

4. Implicit and behavioral authentication solutions help fight cyberattacks

A recent Forrester survey found that, of firms that experienced at least one breach from an external threat actor, 37% reported that stolen credentials were used as a means of attack. “Using password-based, legacy authentication methods is not only insecure and damaging to the employee experience, but it also places a heavy administrative burden (especially in large organizations) on S&R professionals,” the report stated.

Vendors have responded: Identity and access management solutions are incorporating a number of data sources, such as network forensic information, security analytics data, user store logs, and shared hacked account information, into their IAM policy enforcement solutions. Forrester also found authentication solutions that use signals such as device location, sensor data, and mouse and touchscreen movement to establish normal baseline behavior for users and devices, and then detect anomalies against that baseline.

Forrester recommends verifying vendors’ claims about automatic behavioral profile building, and asking the following questions:

  • Does the solution really detect behavioral anomalies?
  • Does the solution provide true interception and policy enforcement features?
  • Does the solution integrate with existing SIEM and incident management solutions in the SOC?
  • How does the solution affect employee experience?
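The baseline-and-anomaly idea behind behavioral authentication can be sketched in a few lines. This is a toy illustration, not any vendor's method: the keystroke-interval feature and the three-standard-deviation threshold are assumptions chosen for the example.

```python
import statistics

def build_baseline(samples):
    """Summarize a user's historical behavior, e.g. mean keystroke
    interval in milliseconds per session."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, baseline, threshold=3.0):
    """Flag sessions more than `threshold` standard deviations from baseline."""
    mean, stdev = baseline
    return abs(value - mean) / stdev > threshold

history = [182, 175, 190, 185, 178, 188, 181, 176]  # normal sessions (ms)
baseline = build_baseline(history)
print(is_anomalous(184, baseline))   # False: typical session
print(is_anomalous(95, baseline))    # True: attacker types very differently
```

Production systems combine many such signals (location, sensor data, pointer movement) and learn richer models, but each signal reduces to the same question: how far does this session sit from the user's established baseline?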

5. Algorithm wars heat up

Vendors at RSA 2017 latched onto terms such as machine learning, security analytics, and artificial intelligence (AI) to solve enterprise security problems, Forrester noted. While these areas hold great promise, “current vendor product capabilities in these areas vary greatly,” the report stated. Therefore, it’s imperative for tech leaders to verify that vendor capabilities match their marketing messaging, to make sure that the solution you purchase can actually deliver results, Forrester said.

While machine learning and AI do have roles to play in security, they are not a silver bullet, Forrester noted. Security professionals should focus instead on finding vendors that solve problems you are dealing with, and have referenceable customers in your industry.


You Can’t Hack What You Can’t See

1 Apr
A different approach to networking leaves potential intruders in the dark.
Traditional networks consist of layers that increase cyber vulnerabilities. A new approach features a single non-Internet protocol layer that does not stand out to hackers.

A new way of configuring networks eliminates security vulnerabilities that date back to the Internet’s origins. Instead of building multilayered protocols that act like flashing lights to alert hackers to their presence, network managers apply a single layer that is virtually invisible to cybermarauders. The result is a nearly hack-proof network that could bolster security for users fed up with phishing scams and countless other problems.

The digital world of the future has arrived, and citizens expect anytime-anywhere, secure access to services and information. Today’s work force also expects modern, innovative digital tools to perform efficiently and effectively. But companies are neither ready for the coming tsunami of data, nor are they properly armored to defend against cyber attacks.

The amount of data created in the past two years alone has eclipsed the amount of data consumed since the beginning of recorded history. Incredibly, this amount is expected to double every few years. There are more than 7 billion people on the planet and nearly 7 billion devices connected to the Internet. In another few years, given the adoption of the Internet of Things (IoT), there could be 20 billion or more devices connected to the Internet.

And these are conservative estimates. Everyone, everywhere will be connected in some fashion, and many people will have their identities on several different devices. Recently, IoT devices have been hacked and used in distributed denial-of-service (DDoS) attacks against corporations. Coupled with the advent of bring your own device (BYOD) policies, this creates a recipe for widespread disaster.

Internet protocol (IP) networks are, by their nature, vulnerable to hacking. Most, if not all, of these networks were put together by stacking protocols to solve different problems in the network. This starts with 802.1X at the lowest layer, the IEEE standard for port-based access control on local area networks (LANs) and wide area networks (WANs). Stacked on top of that is usually the Spanning Tree Protocol, designed to eliminate loops on redundant paths in a network; such loops are deadly to a network.

Other layers are added to provide functionality (see The Rise of the IP Network and Its Vulnerabilities). The result is a network constructed on stacks of protocols, and those stacks are replicated throughout every node in the network. Each node passes traffic to the next before it reaches its destination, which could be 50 nodes away.

This M.O. is the legacy of IP networks. They are complex, have a steep learning curve, take a long time to deploy, are difficult to troubleshoot, lack resilience and are expensive. But there is an alternative.

A better way to build a network is based on a single protocol—an IEEE standard labeled 802.1aq, more commonly known as Shortest Path Bridging (SPB), which was designed to replace the Spanning Tree Protocol. SPB’s real value is its hyperflexibility when building, deploying and managing Ethernet networks. Existing networks do not have to be ripped out to accommodate this new protocol. SPB can be added as an overlay, providing all its inherent benefits in a cost-effective manner.

Some very interesting and powerful effects are associated with SPB. Because it uses what is known as a media-access-control-in-media-access-control (MAC-in-MAC) scheme to communicate, it naturally shields any IP addresses in the network from being sniffed or seen by hackers outside the network. If the IP address cannot be seen, a hacker has no idea that the network is even there. Combined with hypersegmentation into as many as 16 million different virtual network services, this makes it almost impossible to hack anything in a meaningful manner. Each network segment knows only which devices belong to it, and there is no way to cross from one segment to another. For example, a hacker who gained access to an HVAC segment could not also reach a credit card segment.

As virtual LANs (VLANs) allow for the design of a single network, SPB enables a distributed, interconnected, high-performance enterprise networking infrastructure. Based on a proven routing protocol, SPB combines decades of experience with intermediate system to intermediate system (IS-IS) and Ethernet to deliver more power and scalability than any of its predecessors. Using the IEEE's next-generation VLAN, the service instance identifier (I-SID), SPB supports 16 million unique services, compared with the VLAN limit of about 4,000. Once SPB is provisioned at the edge, the network core automatically interconnects endpoints with matching I-SIDs to create an attached service that leverages all links and equal-cost connections using an enhanced shortest path algorithm.
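A simplified sketch makes the MAC-in-MAC shielding concrete: backbone nodes forward on their own outer header, and the customer frame, IP addresses included, is opaque payload to them. This toy encapsulation omits the real 802.1ah EtherTypes and tag flags; the exact field layout here is an assumption for illustration, but the 24-bit I-SID field is why 16 million services fit where a 12-bit VLAN ID allows only about 4,000.

```python
import struct

def mac(addr: str) -> bytes:
    """Convert a colon-separated MAC string to 6 raw bytes."""
    return bytes.fromhex(addr.replace(":", ""))

def encapsulate(b_da, b_sa, i_sid, customer_frame):
    """Simplified 802.1ah-style encapsulation: backbone MACs plus a
    24-bit I-SID, followed by the untouched customer frame."""
    assert 0 <= i_sid < 2 ** 24, "I-SID is a 24-bit field"
    i_tag = struct.pack("!I", i_sid)[1:]   # low 3 bytes carry the I-SID
    return mac(b_da) + mac(b_sa) + i_tag + customer_frame

inner = b"\xaa" * 12 + b"\x08\x00" + b"customer IP packet..."
frame = encapsulate("02:00:00:00:00:01", "02:00:00:00:00:02",
                    i_sid=15_999_999, customer_frame=inner)
print(len(frame) - len(inner))  # 15: only these backbone bytes are inspected
print(2 ** 24)                  # 16777216 possible I-SIDs vs ~4,000 VLAN IDs
```

Nothing in the backbone header references the inner addresses, which is the property the article describes: a sniffer on the backbone sees backbone MACs and an I-SID, not the customer's IP topology.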

Making Ethernet networks easier to use, SPB preserves the plug-and-play nature that established Ethernet as the de facto protocol at Layer 2, just as IP dominates at Layer 3. And, because improving Ethernet enhances IP management, SPB enables more dynamic deployments that are easier to maintain than attempts that tap other technologies.

Implementing SPB obviates the need for the hop-by-hop forwarding of legacy systems. If a user needs to communicate with a device at the network edge, perhaps in another state or country, that device is now only one hop away from any other device in the network. And because SPB is built on IS-IS and a MAC-in-MAC scheme, everything can be added instantly at the edge of the network.

This accomplishes two major points. First, adding devices at the edge allows almost anyone to add to the network, rather than relying on highly trained technicians alone. In most cases, a device can be scanned into the network via a bar code before installation, and a profile authorizing that device on the network can also be set up in advance. Then, once the device has been installed, the network instantly recognizes it and allows it to communicate with other network devices. This implementation is tailor-made for IoT and BYOD environments.

Second, if a device is disconnected or unplugged from the network, its profile evaporates, and it cannot reconnect to the network without an administrator reauthorizing it. This way, the network cannot be compromised by unplugging a device and plugging in another for evil purposes.
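The provision-then-evaporate lifecycle described above can be sketched as a small registry. The class and method names are hypothetical, purely to illustrate the flow: a profile is pre-authorized (say, from a scanned bar code), consumed on connect, and gone after disconnect until an administrator re-authorizes the device.

```python
class EdgeRegistry:
    """Toy model of edge provisioning with evaporating profiles."""
    def __init__(self):
        self.authorized = {}   # device_id -> pre-authorized profile
        self.connected = {}    # device_id -> active profile

    def preauthorize(self, device_id, profile):
        """Admin registers a device in advance, e.g. via bar-code scan."""
        self.authorized[device_id] = profile

    def connect(self, device_id):
        """Consume the pre-authorized profile; fail if none exists."""
        profile = self.authorized.pop(device_id, None)
        if profile is None:
            return False       # never authorized, or profile has evaporated
        self.connected[device_id] = profile
        return True

    def disconnect(self, device_id):
        """Drop the active profile; it is NOT returned to the pool."""
        self.connected.pop(device_id, None)

reg = EdgeRegistry()
reg.preauthorize("cam-042", {"segment": "video"})
print(reg.connect("cam-042"))   # True: pre-authorized device joins
reg.disconnect("cam-042")
print(reg.connect("cam-042"))   # False: must be re-authorized by an admin
```

The security property is in `connect` consuming the profile: unplugging an authorized device and plugging in a substitute gains nothing, because the substitute finds no profile waiting for it.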

SPB has emerged as an unhackable network. Over the past three years, U.S. multinational technology company Avaya has used it for quarterly hackathons, and no one has been able to penetrate the network in those 12 attempts. In this regard, it truly is a stealth network implementation. But it also is a network designed to thrive at the edge, where today’s most relevant data is being created and consumed, capable of scaling as data grows while protecting itself from harm. As billions of devices are added to the Internet, experts may want to rethink the underlying protocol and take a long, hard look at switching to SPB.

