Archive | Connectivity

IoT devices in private 5G networks bring new verification tests

19 Mar

With private networks connecting many IoT devices, testing each device’s user equipment (UE) functions requires updated test processes.

Many IoT use cases rely on private 5G networks because they offer greater network control, better security, more reliable performance, and dedicated coverage and capacity as opposed to using a public network. With these advantages, private networks play an important role in specialized use cases for vertical markets.

Based on current GSA data (Figure 1), manufacturing is the top industry vertical for private 5G, followed by mining and education. Other industries are expected to grow as long as one key hurdle — UE operation — can be overcome.

Figure 1. Vertical markets implementing private 5G networks (image: GSA).

Device performance in private 5G is a challenge because, while operator and private 5G networks have similar building blocks, UE behavior is highly device-centric and varies by use case. Additionally, 5G introduces control and user plane separation (CUPS), which enables vendors to combine RAN and core network hardware components with software from other sources. With so many vendor combinations in play, testing only for compliance with 3GPP specifications is not enough.

You should test IoT devices against different configurations and combinations and ensure the key performance indicators (KPIs) are properly measured. For engineers, understanding all the ways users can use the UE, as well as the environments in which devices are deployed, is necessary to ensure devices meet performance parameters.

3GPP Release 16 opens doors
3GPP Release 16 paves the way to private 5G networks. It lets 5G become a substitute for private wired Ethernet, Wi-Fi, and LTE networks by including multiple capabilities for industrial environments.

3GPP also provides standards and guidance on private 5G network deployment. Network architecture and deployment environment affect how you need to test an IoT device’s UE.

The most “private” architecture is a non-public network (NPN), which is an enterprise with a dedicated, on-premises network. 3GPP categorizes NPNs in two ways:

  • Stand-alone non-public network (SNPN): this design does not rely on network functions from a public land mobile network (PLMN). An SNPN-enabled UE must be configured with a subscription permanent identifier (SUPI) and credentials for each subscribed SNPN, identified by the combination of PLMN ID and NID (network identifier). In addition, 3GPP Release 16 specifies the ability for a UE to obtain PLMN services while on a stand-alone non-public RAN. This applies when the UE has a subscription and credentials to obtain services from both a PLMN and an SNPN.
  • Public network integrated NPN (PNI-NPN): in this model, a PLMN ID recognizes the network, while a closed-access group (CAG) ID locates appropriate cells. A CAG cell broadcasts the designated CAG identifiers per PLMN, which must be supported by UE operating on the network. Only devices that have access credentials for that specific CAG ID can latch on to such cells, thus providing access restriction.
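The two access-control schemes above can be sketched in a few lines of code. This is an illustrative model only (the identifier values and class names are invented for the example), not conformant 3GPP UE behavior:

```python
# Sketch of NPN access control: an SNPN is named by (PLMN ID, NID);
# a PNI-NPN cell broadcasts CAG IDs per PLMN, and only UE holding
# credentials for one of those CAG IDs may attach.
from dataclasses import dataclass, field

@dataclass
class SnpnCell:
    plmn_id: str   # PLMN ID (MCC + MNC)
    nid: str       # network identifier; (plmn_id, nid) names the SNPN

@dataclass
class CagCell:
    plmn_id: str
    cag_ids: frozenset  # CAG identifiers broadcast by this cell

@dataclass
class Ue:
    subscribed_snpns: set = field(default_factory=set)  # {(plmn_id, nid)}
    allowed_cags: set = field(default_factory=set)      # {(plmn_id, cag_id)}

    def may_attach(self, cell) -> bool:
        if isinstance(cell, SnpnCell):
            return (cell.plmn_id, cell.nid) in self.subscribed_snpns
        # PNI-NPN: any broadcast CAG ID the UE is authorized for will do
        return any((cell.plmn_id, c) in self.allowed_cags for c in cell.cag_ids)

ue = Ue(subscribed_snpns={("99970", "000007")},
        allowed_cags={("31017", "cag-42")})
assert ue.may_attach(SnpnCell("99970", "000007"))       # subscribed SNPN
assert not ue.may_attach(CagCell("31017", frozenset({"cag-9"})))  # wrong CAG
```

The same check, run from the network side, is what a CAG cell uses to reject unauthorized UE.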

Hybrid private 5G networks use a mix of public mobile network components and dedicated on-premises elements. UE for hybrid networks has its own set of performance parameters, depending on network configuration. Three hybrid designs exist:

  • Radio access is shared with the public network; everything else is private.
  • The user plane is private, but the control plane and radio access are shared.
  • Network slice: one virtual slice is dedicated to the private network, while all other elements reside on a public network.

Because private 5G networks use unlicensed and shared spectrum, device integration can become complex. Systems integrators, who have become key players in private 5G, must verify that UE operates according to specification, elements are integrated properly to guarantee end-to-end quality of service (QoS), and connectivity between UE and network is reliable.

Ensuring UE performance in private 5G
QoS and connectivity take on an added layer of complexity in many private 5G use cases. For example, in a smart factory, robots with hundreds of sensors and machinery with multiple actuators may operate in an environment with considerable interference sources. Such settings have created the need for stress testing to determine how the UE will operate under extreme conditions.

Given the proprietary nature of many private 5G networks, the prevalence of Open RAN architecture, and data sensitivities, security is a main priority. Many UE manufacturers employ practical security testing, which uses a network simulator to conduct necessary tests, such as functional security measurements. They thoroughly test all security-related functions inside UE or other systems under test to ensure correct behavior and operational robustness (Figure 2).

Figure 2. A typical test configuration for cybersecurity covers functionality, vulnerability, fuzzing, and penetration.

Stress tests and security are primary considerations but hardly the only issues for engineers. Private 5G networks have unique requirements that are more specific and varied than those of open public networks. Not only is there a tremendous number of frequency/band combinations to consider for sunny-day testing, but attention must also be given to ensuring that devices meant to work exclusively in an NPN environment do not connect to macro networks and that unauthorized UE does not connect to an NPN. For this reason, other tests must be conducted to ensure performance, including:

  • Connectivity — 5G IoT devices need proper testing to verify call connection, cell selection/reselection, access control, and any mobility implications in NPN environments. There are new features of 5G NPN that allow the device to selectively connect to the correct network. Verify that a private 5G network is truly only catering to private 5G devices.
  • Compatibility — many devices used in a private network support cellular, Wi-Fi, and short-range wireless technologies, such as Bluetooth and Zigbee. Ensuring UE can seamlessly transfer from one technology to another is essential to private 5G network performance.
  • Interference — given most private 5G network use cases, interference testing is critical. In addition to supporting multiple technologies, devices must operate in less-than-ideal real-world environments and in mission-critical scenarios. Engineers must have confidence product performance will not degrade due to interference before they are shipped to customers.
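As a rough illustration of how quickly the sunny-day matrix mentioned earlier grows, a test plan generator might enumerate band/bandwidth/subcarrier-spacing combinations. The specific values below are examples chosen for the sketch, not a complete 3GPP list:

```python
# Enumerate a configuration matrix for sunny-day connectivity tests.
# Bands, channel bandwidths, and subcarrier spacings are illustrative.
from itertools import product

bands = ["n77", "n78", "n79"]      # mid-band NR bands often used privately
bandwidths_mhz = [20, 40, 100]     # example channel bandwidths
scs_khz = [15, 30]                 # example subcarrier spacings

test_cases = [
    {"band": b, "bw_mhz": bw, "scs_khz": scs}
    for b, bw, scs in product(bands, bandwidths_mhz, scs_khz)
]
print(len(test_cases))  # 3 * 3 * 2 = 18 combinations before any filtering
```

Even this tiny example produces 18 cases; adding duplex modes, power classes, and mobility scenarios multiplies the matrix further, which is why intuitive test automation matters.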

Creating a test environment
Implementing a test process to support private 5G UE requires a practical approach. The test environment must simulate real-world scenarios to efficiently verify that the UE will perform when deployed into a private 5G network. Design your test system with intuitive software to more efficiently support various and ever-changing test conditions and evolving standards, which will help to control test costs.

Conclusion
Private 5G networks play a significant role in the fourth industrial revolution. Engineers responsible for developing UE in these use cases must implement test processes that follow 3GPP standards and create real-world scenarios that precisely mirror the specific private 5G network environment. Such an approach will provide greater confidence that the UE will meet established KPIs.

Source: by Emma Lutjen – https://www.testandmeasurementtips.com/iot-devices-in-private-5g-networks-bring-new-verification-tests/

Megatrends for this decade

17 Nov
A rendering of CubeSat deployed in orbit to provide high bandwidth communication coverage all over the planet. Image credit: Alén Space

Global gigabit connectivity at ultra-low cost

This megatrend is actually the convergence of three: “global”, “gigabit”, and “ultra-low cost”. The quest for coverage and performance is nothing new; the novelty is in the “quality” and “quantity” foreseen by this trend.

Connectivity has kept improving over the last 150 years. However, it is only in the last 20 years, with the advent of low-cost wireless technology (particularly on the handset side, the cellphone), that we have seen tremendous growth.

An interesting representation of the world population vs. the number of mobile users, smartphones, and PCs. Notice that if we compare the number of mobile users with the number of people aged 10 and older (second bar), we get almost a one-to-one relation: 5.8 vs. 5.5 billion. Also notable is the increase in mobile users vs. the increase in population in 2015 (grey) and 2020 (orange). The acceleration is clear. Image credit: World Bank

What used to take 50 years in terms of usage adoption has been squeezed into a few years. There are now 5.5 billion people connected via a cellphone, 90% of them with a smartphone.

It is no longer just about people. If we look at the numbers, object connectivity is already dwarfing people connectivity in number of devices and number of transactions (though not in terms of bandwidth: our use of bandwidth for movies keeps bandwidth usage on our side, but this too will change in the coming years as streaming video from safety cameras takes the upper hand in bandwidth usage).

In this decade connectivity is expected to increase further in two dimensions:

  • broader area of coverage, with the expectation of full-planet coverage by 2035, accessible through normal consumer cellphones (today, accessing satellite networks, the only ones providing full coverage, requires a special and expensive phone). This is expected to be achieved by new generations of satellites (like OneWeb, planning to have 48,000 satellites in its constellation, and Starlink, already serving the US and Canada with 540 satellites and expected to expand coverage in 2021/22 once 1,500 more satellites are deployed), both low-orbit constellations and cube-satellite constellations, plus the capability of cellphones to operate in the THz band, expected to become reality with 6G;
  • higher bandwidth, delivered through greater spectrum availability (thanks to higher frequencies: the mmWave range, 30-300 GHz, in the second part of this decade, and the sub-THz range, above 300 GHz, in the next decade) coupled with denser networks (10 to 1,000 times the number of access points existing today, with the higher multiplier becoming effective with the deployment of 6G and networks dynamically set up from the edges).

If “global” is easy to understand, the “gigabit” part is not so straightforward, because it raises the question of whether “gigabit” connectivity will make a difference (there are then sub-questions like “for whom” and “who will be willing to pay for it, and how much”, but these can be superseded by the third forecast, i.e. “at ultra-low cost”).

Expected use of bandwidth by residential customers. Figures are expressed in Mbps. Image credit: Cisco

As it happened in the past we can assume that once increased bandwidth is available someone will find a way to exploit it and that eventually many people will be using it.

The graphic on the side presents a forecast from Cisco of the possible bandwidth demand of future applications, in temporal order (from near term to far term):

  • Ultra-high definition security cameras: 15 Mbps
  • Ultra-high definition streaming (4k): 15 Mbps
  • Virtual Reality streaming: 17 Mbps
  • Self-driving vehicles diagnostics: 20 Mbps
  • Cloud Gaming: 30 Mbps
  • Ultra-high definition IP video: 51 Mbps
  • 8K wall television: 100 Mbps
  • High definition virtual reality: 167 Mbps
  • Ultra-High Definition Virtual Reality: 500 Mbps
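A quick sanity check on the figures above: summing the quoted rates gives the worst-case demand of a household running all of these applications concurrently (a simplifying assumption, since real usage is rarely simultaneous):

```python
# Aggregate the Cisco per-application forecasts quoted in the list above.
apps_mbps = {
    "UHD security camera": 15,
    "UHD streaming (4K)": 15,
    "VR streaming": 17,
    "Self-driving vehicle diagnostics": 20,
    "Cloud gaming": 30,
    "UHD IP video": 51,
    "8K wall television": 100,
    "HD virtual reality": 167,
    "UHD virtual reality": 500,
}
total = sum(apps_mbps.values())
print(total)  # 915 Mbps if everything ran at once
```

Even without full concurrency, a handful of the later applications already push a household toward the gigabit mark.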

Some of the above applications may require low latency (<10 ms) or very low latency (<2 ms) and will therefore require edge computing and edge/peer-to-peer communication, hence a quite different network architecture that, in principle, is already possible with 5G but will surely be implemented for 6G.

Delivering gigabit capacity to a single user (not to a single cell) requires very dense networks and, of course, adequate technology. On the wireline side, optical fibre can deliver multiple Gbps already today. On the wireless side, we need sufficient spectrum to funnel 1 Gbps. Considering 20 bits per Hz (a very high spectral efficiency, never reached in normal conditions, where 4-6 b/Hz would be considered very good), to get 1 Gbps you need 50 MHz of spectrum (today’s 5G allocated spectrum in Italy has a maximum of 80 MHz, and that is for the whole cell, not for a single user!). Hence the need to use mmWaves and sub-mm waves (in the THz range), which allow allocation of broad spectrum. The evolution of electronics will make this feasible in the last part of this decade.
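The back-of-envelope calculation above is easy to reproduce: required spectrum is simply the target rate divided by the spectral efficiency.

```python
# Spectrum needed (MHz) to deliver a target rate at a given spectral
# efficiency, as in the text's 1 Gbps / 20 b/s/Hz example.
def spectrum_needed_mhz(rate_gbps: float, bits_per_hz: float) -> float:
    return rate_gbps * 1e9 / bits_per_hz / 1e6

print(spectrum_needed_mhz(1, 20))  # 50.0 MHz at an optimistic 20 b/s/Hz
print(spectrum_needed_mhz(1, 5))   # 200.0 MHz at a more realistic 5 b/s/Hz
```

At realistic efficiencies the requirement quadruples, which is exactly why sub-6 GHz allocations cannot serve per-user gigabit rates and the mmWave/THz bands become necessary.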

Recapping: “global” and “gigabit” are reasonable targets. What about “ultra-low cost”?

Here is where I feel it gets really interesting!

If we look back, we can see that the shift from wireline to wireless has dramatically slashed the cost of delivering bits. This is due:

  • first, to the fact that wireless infrastructures can scale (almost) in sync with demand. When traffic demand grows, you can deploy one more cell, and then another, right where it is needed. This makes investment much more effective.
  • second, to the shift of part of the infrastructure investment onto the customer. In fact, cellphones and smartphones are network equipment; they carry out actions that were once part of the infrastructure, like digitisation, access selection, and so on. Smartphones represent something like 70% of the overall cost of the end-to-end wireless infrastructure. Hence, telecom operators cover only 30% of the cost, whilst in a wireline infrastructure they have to sustain 100% of it!

This is decreasing the perception (and reality) of cost to the end user. As the cost of the smartphones decreases so does the cost of connectivity.

This trend will continue in this decade and it will have a further acceleration by the end of the decade, beginning of the next, as communication will start to be provided by the edges (networks deployed by third parties that are not interested in charging for the access) and by objects themselves. 6G will be the first system designed to create edge networks, in part formed by meshing networks created by objects. This is what will lead to ultra-low cost connectivity.

Source: https://cmte.ieee.org/futuredirections/2020/11/17/megatrends-for-this-decade-iii/ 17 11 20

The 5G edge computing value opportunity

14 Jun

A key pillar in the global economic recovery.

5G was one of the most exciting prospects as we entered 2020, but the world has now changed forever. COVID-19 has delivered a terrible blow causing loss, pain and heartache, and an economic downturn.
An indisputable fact is that the need for organizations to be digital has never been greater, and 5G along with edge computing will likely play a key role in the global economic recovery.

Across all industries, our clients are telling us about the imperative to accelerate their digital transformation. Prior to COVID-19, digital had momentum – everyone could see the potential. Now the need is greater, perhaps even one of survival. Connectivity plays a vital role in the evolution of digital, and this has been seen in both fixed and wireless over the last decade. While the 5G business case was strong before, it’s even stronger now.

Topics in the report
The enterprise value: The deployment of 5G technology can help companies solve business challenges, reshape value chains, enhance revenue models, and optimize operations across many industries.

The relevance of 5G+Edge: Edge computing is key to unlocking the financial benefits of 5G, and in thoughtful combination 5G+Edge can create significant incremental economic value for those in the ecosystem.

Everywhere, anytime communications: Few doubt the incredible potential of 5G. It’s set to unleash the power of digital across multiple sectors, making factories, warehouses, workplaces, hospitals and homes more efficient, transportation faster and more convenient, and cities smarter.

5G+Edge ecosystem: When it comes to the intersection between 5G and edge computing, telcos once more have to look beyond mere connectivity and identify new ways to realize value.

The 5G+Edge opportunity: While every sector is likely to benefit from 5G and edge computing, we wanted to highlight the sectors where the new services delivered as a result of their combination were most illuminating.

Expanding the core: Connectivity alone may not be enough. Telcos’ traditional revenue sources are barely tapping into the huge potential of 5G and edge computing.

Changing roles for telcos: There are a number of things that telcos can do to enhance their position in order to capture the maximum amount of value through the ecosystem.

Outpacers can capture more: At KPMG we believe that 5G and edge computing represent the platform on which the next industrial revolution will be delivered. We call companies that lead in digital the ‘Outpacers.’

Download report: 5g-edge-computing-value-opportunity

Source: https://assets.kpmg/content/dam/kpmg/xx/pdf/2020/06/5g-edge-computing-value-opportunity.pdf 14 06 20

5G Health Risks: Here’s What the Experts Say

27 Oct
(Image credit: Shutterstock)

This year has delivered a whirlwind of hype surrounding 5G: how it will change lives, where 5G networks are launching around the world and when exactly your smartphone will be capable of lightning-fast speeds. But some people are concerned that the rollout of 5G is happening so quickly that we don’t truly understand if or how the launch of next-generation connectivity will bring unintended health consequences.

The short answer: The scientific consensus is that 5G, like 3G and 4G before it, is not harmful to your health. In August, the U.S. Federal Communications Commission (FCC) officially determined that 5G’s radio waves are safe.

But that finding probably won’t stem the tide of worry over 5G’s rollout, especially as more 5G phones hit the market and the coverage becomes more widespread.

Why are people concerned that 5G is unsafe?

Early 5G networks, including those launched by AT&T, Verizon and T-Mobile, use high-frequency, millimeter-wave (mmWave) spectrum to deliver faster speeds. Some people are concerned that those radio waves, along with the additional cellular infrastructure needed to build out mmWave-based 5G networks in major cities, will increase the amount of radiation in the environment.

Millimeter-wave spectrum has never been used for telecommunications. However, that’s not because it’s dangerous; the higher-frequency bands are just not as effective at transmitting data across distances. An mmWave-based 5G signal can’t penetrate objects, such as glass windows or concrete buildings. It also can’t penetrate the body.

The concerns over 5G are an extension of the worries some people have about cellphones in general.

But 5G is a form of radiation, right?

There are two types of radiation: ionizing and non-ionizing. Ultra-high-frequency ionizing radiation — which includes gamma-rays, UV rays from the sun and X-rays — is harmful to humans because it penetrates the body at the cellular level and causes electrons and atoms to break apart. Ionizing radiation can cause cancer, which is why you’re supposed to wear sunscreen outdoors and avoid unnecessary medical X-rays.

Non-ionizing radiation does not cause cancer, and runs the gamut from FM radio waves to visible light. In between the two is 5G, which operates at a slightly higher frequency than 3G and 4G.
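The ionizing/non-ionizing distinction comes down to per-photon energy, E = h·f. A quick computation shows why mmWave frequencies are nowhere near the roughly 10 eV needed to ionize atoms:

```python
# Photon energy E = h * f, expressed in electronvolts.
H = 6.626e-34   # Planck constant, J*s
EV = 1.602e-19  # joules per electronvolt

def photon_ev(freq_hz: float) -> float:
    return H * freq_hz / EV

print(photon_ev(28e9))    # ~1.2e-4 eV: a 28 GHz 5G mmWave photon
print(photon_ev(1.0e15))  # ~4.1 eV: near-UV light, approaching ionization
```

A 28 GHz photon carries about a hundred-thousandth of the energy needed to knock electrons loose, regardless of how many photons a transmitter emits; intensity can heat tissue, but individual photons cannot ionize it.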

The FCC requires all electronic equipment sold in the U.S. to meet the agency’s safety standards for acceptable radio-frequency (RF) energy by determining the device’s specific absorption rate (SAR), or the rate by which the body absorbs RF energy. The FCC recently reevaluated its standards, which were created in 1996, when determining the safety of 5G. The recommended RF exposure limits remain unchanged.

“The scientific consensus is that there are no known health risks from all forms of RF energy at the low levels approved for everyday consumer use,” a spokesperson for CTIA, a trade group for the wireless communications industry, said in an emailed statement. “The FCC regulates RF emissions, including millimeter waves from 5G devices and equipment, and has adopted the recommendations of expert scientific organizations that have reviewed the science, including dozens of studies focused specifically on millimeter waves, and established safe exposure levels.”

What’s driving the fear of 5G?

There are a few factors contributing to the concern — or outright fear — of 5G’s effects.

The first is scientific research that has been interpreted by some to support concern about cellphone radiation. For instance, a 2018 study released by the National Toxicology Program (NTP) found that when rats and mice were exposed to radio-frequency waves like the kind that emanate from cellphones, they developed malignant tumors. This particular study looked at 2G and 3G phones. However, that doesn’t mean 5G will cause cancerous tumors in humans.


“The exposures used in the studies cannot be compared directly to the exposure that humans experience when using a cellphone,” John Bucher, a senior scientist for the NTP, said when announcing the findings. “In our studies, rats and mice received radio-frequency radiation across their whole bodies. By contrast, people are mostly exposed in specific local tissues close to where they hold the phone. In addition, the exposure levels and durations in our studies were greater than what people experience.”

The NTP has said it plans to develop thorough studies to evaluate the safety of 5G.

The World Health Organization’s International Agency for Research on Cancer has categorized RF waves from cellphones as a possible carcinogen, which is another factor contributing to the concerns over 5G. But, for context, an ingredient in coffee is also considered a possible carcinogen. Red meat is categorized as a probable carcinogen, which means it has a stronger link to cancer than cellphones do.

The New York Times reported earlier this year that one of the primary 5G fearmongers is Russian propaganda spreading on YouTube, Facebook and blogs across the internet. Videos and news articles filled with misinformation are scaring U.S. consumers even as Russia proceeds with its own 5G plans.

Have there been studies to prove that 5G isn’t a health risk to humans?

5G is a new standard for wireless communication, but from a technological standpoint, it isn’t all that different from 3G and 4G. The radio-frequency waves from 5G cellphones are akin to the RF waves from LTE devices (i.e., non-ionizing). According to the American Cancer Society, most studies have shown that “the RF waves given off by cell phones don’t have enough energy to damage DNA directly or to heat body tissues.”

In 2000, a now-debunked study on the effect of radio waves on brain tissue fueled conspiracy theories about cellphones and radiation. The author of the study, physicist Bill Curry, claimed that wireless devices could cause brain cancer in humans. According to The New York Times, Curry neglected to take into account that our skin protects our internal tissues from high-frequency radio waves (which is, again, why you need to wear sunscreen to protect the skin from even higher-frequency UV rays).


However, because 5G networks are just now getting off the ground with a new roster of 5G phones, no long-term studies of the network or the devices and their effects on humans have been conducted. In addition, the types of devices we use and the way we use them are constantly changing. For that reason, skeptics such as Joel Moskowitz, director of the Center for Family and Community Health at the University of California, Berkeley School of Public Health, are calling for a halt to 5G’s rollout.

Moskowitz said it would be unethical to conduct a conclusive scientific study on human beings controlling for the health effects of cellphone radiation, so researchers rely on observational and animal studies. Those studies haven’t proved conclusively that cellphones are harmful to humans, but Moskowitz thinks there’s enough evidence to “put a moratorium on the rollout of new technologies” like 5G infrastructure build-out until more research is done.

“I’m certain that, within the next five years, radio-frequency radiation will be declared at least probably carcinogenic [by the WHO],” Moskowitz said.

But Christopher Collins, a professor in New York University’s radiology department who studies the safety of electromagnetic fields, said the lack of 5G-specific research doesn’t mean researchers are starting from scratch when evaluating 5G’s potential effects on human health.

“A lot of the premise of people who advocate against 5G or wireless communications fields in general seem to suggest that we just don’t know and we need to do more studies,” Collins said. “We know a lot. We’ve been doing experiments on humans and animals for decades over this entire spectrum.”

Collins said scientists “never want to say the book is closed,” but based on what we already know, there’s no evidence to suggest that 5G will cause cancer or other detrimental health effects in most people.

So why are some local governments putting a stop to 5G development?

Prior to the FCC’s 5G safety determination, city and state regulators were hearing from residents who were concerned that not enough was known about 5G. Specifically, people are concerned that the density of small cell sites required to build out mmWave-based 5G networks would emit dangerous amounts of radiation.

The FCC’s 5G FAST Plan, which requires municipalities to approve 5G cell sites within 60 to 90 days, has caused concern. Carriers are moving quickly to build out infrastructure without giving residents notice, The Wall Street Journal reported, and local legislators are pushing back. Some 90 cities and counties have filed suit against the FCC in a case currently pending in the Ninth Circuit Court of Appeals.

Homeowners may not want new antennas outside their homes for aesthetic reasons, or because they want advance notice when changes occur in their communities, but the FCC, industry trade groups and many scientists maintain there is no proven health risk.

“Typical exposure to 5G devices — such as small cells attached to phone poles or the sides of buildings — is far below the permissible levels and comparable to Bluetooth devices and baby monitors,” the CTIA spokesperson said. “The FCC continues to monitor the science to ensure that its regulations are protective of public health.”

Or, as NYU’s Chris Collins put it:

“One thing that we know can cause cancer is sunlight. People would generally do better to worry about that than the exposure levels we’re talking about with cellphones. If you’re more concerned about the base station on your building than you are [about] spending an hour in the noonday sun without any protection, you might want to think about your priorities.”

Bottom line

“Is it time to stop questioning? No, it’s never time to stop questioning,” Collins said. “It’s important to remember that, based on what we know now, there is no effect except for heating. This is based on many decades’ worth of study in these fields. It’s another thing to say, ‘Should we stop progress?’ based on what I would call unfounded concerns. I am quite certain there’s nothing to be alarmed about for millimeter waves.”

If you are concerned, you can mitigate your personal exposure to cellphone radiation by using fewer wireless devices. That might do wonders for your mental health, too.

Key Drivers and Research Challenges for 6G Ubiquitous Wireless Intelligence

25 Sep

The University of Oulu in Finland has published the world’s first white paper on 6G wireless technology. The white paper is titled ‘Key Drivers and Research Challenges for 6G Ubiquitous Wireless Intelligence’ and is based on information gathered at a summit of experts in the emerging 6G wireless capability sector held in Levi, Finnish Lapland, in March this year. It focuses on the key drivers and research priorities for the development of 6G technology, which the experts estimated would result in ‘ubiquitous wireless intelligence’ by 2030.

The paper consists of seven themes:

  • Social and business drivers of 6G wireless innovation, including adherence to the United Nations’ Sustainable Development Goals and the evolving needs of the data market: the paper notes that while the technical success of 5G has relied on new developments in many areas and will deliver a much wider range of data rates to a much broader variety of devices and users, 6G will require a substantially more holistic approach to identify future communication needs, embracing a much wider community to shape its requirements;
  • 6G use cases and new devices – the paper predicts a shift in user devices from smartphones toward wearable devices with virtual, augmented or mixed reality capability, along with the emergence of other innovations in technological engagement such as telepresence, mobile robots and autonomous vehicles; and identifies these as factors to be considered when constructing 6G-enabled networks;
  • Key performance indicators and projected spectrum capability for 6G wireless connectivity, which the experts say should aim to transmit at rates of up to 1Tbps per user;
  • Progress and challenges of the necessary radio hardware – communications applications and architecture must merge in order to offer the spectrum needed to achieve the requisite speeds for 6G connectivity;
  • Wireless systems and the physical layer of development – the paper highlights issues of increased energy consumption and data processing, saying: ‘Meeting all the challenging requirements identified requires a hyper-flexible network with configurable radios. AI and machine learning will be used in concert with radio sensing and positioning to learn about the static and dynamic components of the radio environment’;
  • 6G wireless networking, including secure privacy protection protocols and the growing role of Artificial Intelligence and blockchain capability; and
  • New service enablers – the paper highlights the growth of edge and cloud computing, machine learning and Artificial Intelligence and highlights the importance of shoring up privacy and trust in the network.

As 5G research matures and continues to support global standardization, we must start discussing what 6G can become and how to get there. Company representatives, researchers, decision-makers and other builders and members of the smart society are invited to join this effort.

Download the 6G White Paper from everything RF or via the White paper 6G link.

IoT: New Paradigm for Connected Government

9 May

The Internet of Things (IoT) is an uninterrupted connected network of embedded objects/devices, each with an identifier, communicating without human intervention over standard communication protocols. It provides encryption, authorization and identification, using device protocols such as MQTT, STOMP or AMQP to securely move data from one network to another. IoT in connected government helps deliver better citizen services and provides transparency. It improves employee productivity and saves costs, helps deliver contextual and personalized services to citizens, enhances security and improves quality of life. With secure and accessible information, government business becomes more efficient and data-driven, changing the lives of citizens for the better. An IoT-focused connected-government solution helps in rapidly developing preventive and predictive analytics, in optimizing business processes, and in providing prebuilt integrations across multiple departmental applications. In summary, this opens up new opportunities for government to share information, innovate, make more informed decisions, and extend the scope of machine-human interaction.
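As a small sketch of the kind of message such protocols carry, the snippet below packages a sensor reading for an MQTT-style publish. The topic scheme, department names and field names are invented for illustration; a real deployment would use a client library (e.g. paho-mqtt) with TLS and authentication:

```python
# Build an MQTT-style topic and JSON payload for a government sensor
# reading. The "gov/<dept>/<device>/<metric>" scheme is hypothetical.
import json
from datetime import datetime, timezone

def make_message(dept: str, device_id: str, metric: str, value: float):
    topic = f"gov/{dept}/{device_id}/{metric}"  # hypothetical topic scheme
    payload = json.dumps({
        "device": device_id,
        "metric": metric,
        "value": value,
        "ts": datetime.now(timezone.utc).isoformat(),  # UTC timestamp
    })
    return topic, payload

topic, payload = make_message("transport", "cam-031", "vehicle_count", 42)
print(topic)  # gov/transport/cam-031/vehicle_count
```

A subscriber in another department could filter on the topic hierarchy (for example `gov/transport/#`) to receive only the feeds it needs, which is what makes cross-agency data sharing tractable.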

Introduction
The Internet of Things (IoT) is a seamless connected system of embedded sensors/devices in which communication is done using standard and interoperable communication protocols without human intervention.

The vision of any connected government in the digital era is to develop connected and intelligent IoT-based systems that contribute to the government's economy, citizen satisfaction, a safe society, environmental sustainability, and city management, and that address global needs.

IoT has data feeds from various sources such as cameras, weather and environmental sensors, traffic signals, parking zones, and shared video surveillance services. Processing this data leads to better coordination between government and IoT agencies and to better services for citizens.

Market research predicts that, by 2020, up to 30 billion devices with unique IP addresses will be connected to the Internet [1]. The "Internet of Everything" will have an economic impact of more than $14 trillion by 2020 [2], and by 2020 the "Internet of Things" will be powered by a trillion sensors [3]. In 2019, the "Internet of Things" device market will be double the size of the smartphone, PC, tablet, connected car, and wearable markets combined [4]. By 2020, component costs will have come down to the point that connectivity becomes a standard feature even for processors costing less than $1 [5].

This article articulates the drivers for connected government using IoT and its objectives. It also describes various scenarios in which IoT is used across departments in connected government.

IoT Challenges Today
The trend in government seems to be IoT adoption on an agency-by-agency basis, leading to different policies, strategies, and standards, and to fragmented analysis and use of data. There are a number of challenges preventing the adoption of IoT in governments. The main challenges are:

  • Complexity: A lack of funding, digital skills, and strategic leadership commitment, along with organizational culture, hampers adoption today.
  • Data Management: Government must manage huge volumes of data related to departments, citizens, land, and GIS. This data needs to be encrypted and secured; maintaining data privacy and integrity is a major challenge.
  • Connectivity: IoT devices require good network connectivity to deliver their data payloads and continuous streams of unstructured data, such as patient medical records, rainfall reports, and disaster information. Maintaining continuous network connectivity is a challenge.
  • Security: Moving information between departments, citizens, and third parties securely is a basic requirement in government, yet IoT introduces new risks and vulnerabilities that leave users exposed to various kinds of threats.
  • Interoperability: This requires not only that systems be networked together, but also that data from each system be interoperable. In the majority of cases, IoT is fragmented and lacks interoperability due to different OEMs, operating systems, versions, connectors, and protocols.
  • Risk and Privacy: Devices sometimes gather and provide personal data without the user’s active participation or approval, and sometimes collect very private information about individuals through indirect interactions, violating privacy policies.
  • Integration: Governments need an integration platform that can connect any application, service, data source, or device with the government ecosystem. Building an integrated “all-in-one” platform that provides device connectivity, event analytics, and enterprise connectivity is a big challenge.
  • Regulatory and Compliance: Adoption of regulations by IoT agencies is a challenge.
  • Governance: A major concern across government agencies is the lack of a big picture or integrated view of IoT implementation, which has been pushed by various departments in a siloed fashion. Government leaders also lack a complete understanding of IoT technology and its potential benefits.

IoT: Drivers for Connected Government
IoT can increase value both by collecting better information about how effectively government servants, programs, and policies are addressing challenges and by helping government deliver citizen-centric services based on real-time and situation-specific conditions. The various stakeholders leveraging IoT in connected government are depicted below.

 

Information Flow in an IoT Scenario
The information flow in government using IoT has five stages (5C): Collection, Communication, Consolidation, Conclusion, and Choice.

  1. Collection: Sensors/devices collect data on the physical environment, measuring things such as air temperature, location, or device status. Sensors passively measure or capture information with no human intervention.
  2. Communication: Devices share the information with other devices or with a centralized platform. Data is seamlessly transmitted among objects or from objects to a central repository.
  3. Consolidation: Information from multiple sources is captured and combined at one point. Data is aggregated as devices communicate with each other, and rules determine the quality and importance of the data.
  4. Conclusion: Analytical tools help detect patterns that signal a need for action, or anomalies that require further investigation.
  5. Choice: Insights derived from analysis either initiate an action or frame a choice for the user. Real-time signals make the insights actionable, either presenting choices without emotional bias or directly initiating the action.
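The five stages above can be sketched as a minimal pipeline. This is an illustrative sketch only; the sensor readings, threshold, and action names are hypothetical.

```python
from statistics import mean

def collect():
    # Collection: passive sensor readings (hypothetical temperatures in °C)
    return [21.4, 21.9, 35.2, 22.1]

def communicate(readings):
    # Communication: package readings for transmission to a central repository
    return [{"sensor": i, "temp_c": t} for i, t in enumerate(readings)]

def consolidate(messages):
    # Consolidation: aggregate data from multiple sources at one point
    temps = [m["temp_c"] for m in messages]
    return {"avg": mean(temps), "max": max(temps)}

def conclude(summary, limit=30.0):
    # Conclusion: analytics flag anomalies that need further investigation
    return summary["max"] > limit

def choose(anomaly):
    # Choice: the insight either initiates an action or frames one for the user
    return "dispatch-inspection" if anomaly else "no-action"

action = choose(conclude(consolidate(communicate(collect()))))
print(action)  # -> dispatch-inspection
```

In a real deployment each stage would cross a system boundary (device, network, platform, analytics), but the data handoffs follow the same shape.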

Figure 2: IoT Information Flow

Role of IoT in Connected Government
The following section highlights the various government domains and typical use cases in the connected government.

Figure 3: IoT Usage in Connected Government

a. Health
IoT-based healthcare applications and systems enhance the traditional technology used today. These devices help increase the accuracy of medical data collected from a large set of devices connected to various applications and systems. They also help in gathering data to improve the precision of medical care delivered through sophisticated integrated healthcare systems.

IoT devices give direct, 24/7/365 access to the patient in a less intrusive way than other options. IoT-based analytics and automation allow providers to access patient reports prior to patients' arrival at the hospital, improving responsiveness in emergency healthcare.

IoT-driven systems are used for continuous monitoring of patients' status. These monitoring systems employ sensors to collect physiological information that is analyzed and stored on the cloud, where doctors can access it for further analysis and review. This provides a continuous, automated flow of information and helps improve the quality of care through alerting systems.

A patient's health data is captured using various sensors, analyzed, and sent to the medical professional for proper remote medical assistance.

b. Education
IoT customizes and enhances education by allowing optimization of all content and forms of delivery. It reduces costs and labor of education through automation of common tasks outside of the actual education process.

IoT technology improves the quality of education, professional development, and facility management. The key areas in which IoT helps are:

  • Student tracking: IoT facilitates the customization of education, giving every student access to what they need. Each student can control their experience and participate in instructional design, with performance data primarily shaping that design. This delivers highly effective education while reducing costs.
  • Instructor tracking: IoT provides instructors with easy access to powerful educational tools. Educators can use IoT to act as one-on-one instructors, providing specific instructional designs for each student.
  • Facility monitoring and maintenance: The application of IoT technology improves facility management and the professional development of educators.
  • Data from other facilities: IoT also enhances the knowledge base used to devise education standards and practices, introducing large, high-quality, real-world datasets into the foundation of educational design.

c. Construction
IoT-enabled devices/sensors are used for automatic monitoring of public-sector buildings, facilities, and large infrastructure. They are used for managing energy consumption, such as air conditioning and electricity usage; for example, lights or air conditioners left on in empty rooms result in revenue loss.
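A rule like "flag lights or air conditioning left on in empty rooms" is straightforward once occupancy and equipment state are available as telemetry. A minimal sketch, with hypothetical room names and sensor fields:

```python
# Hypothetical room telemetry: an occupancy sensor plus lighting/AC state.
rooms = [
    {"room": "A-101", "occupied": False, "lights_on": True,  "ac_on": True},
    {"room": "A-102", "occupied": True,  "lights_on": True,  "ac_on": False},
    {"room": "B-201", "occupied": False, "lights_on": False, "ac_on": True},
]

def wasted_energy(rooms):
    # Flag unoccupied rooms where lights or air conditioning are still on.
    return [r["room"] for r in rooms
            if not r["occupied"] and (r["lights_on"] or r["ac_on"])]

print(wasted_energy(rooms))  # -> ['A-101', 'B-201']
```

A facility-management system would feed such flags into work orders or directly into automated shutoff.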

d. Transport
IoT can be used across transport systems for functions such as traffic control and parking, providing improved communication, control, and data distribution.

IoT-based sensor information obtained from street cameras, motion sensors, and officers on patrol is used to evaluate the traffic patterns of crowded areas. Commuters can be informed of the best possible routes to take, using real-time traffic sensor data, to avoid being stuck in traffic jams.

e. Smart City
IoT simplifies examining various factors such as population growth, zoning, mapping, water supply, transportation patterns, food supply, social services, and land use. It supports cities through its implementation in major services and infrastructure such as transportation and healthcare, and it manages other areas like water control, waste management, and emergency management. Its real-time, detailed information facilitates prompt decisions in emergency management. IoT can also automate motor vehicle services for testing, permits, and licensing.

f. Power
IoT simplifies the process of energy monitoring and management while maintaining low cost and a high level of precision. IoT-based solutions are used for efficient and smart utilization of energy, for example in smart grid and smart meter implementations.

Energy system reliability is achieved through IoT-based analytics systems, which help prevent system overloading or throttling and detect threats to system performance and stability, protecting against losses such as downtime, damaged equipment, and injuries.

g. Agriculture
IoT minimizes human intervention in farming operations, analysis, and monitoring. IoT-based systems detect changes to crops, the soil environment, and more.

IoT in agriculture contributes to:

  • Crop monitoring: Sensors can be used to monitor crops and plant health using the data collected, and for early detection of pests and disease.
  • Food safety: The entire supply chain (farm, logistics, and retail) is becoming connected. Farm products can be tagged with RFID, increasing customer confidence.
  • Climate monitoring: Sensors can be used to monitor temperature, humidity, light intensity, and soil moisture. These data can be sent to a central system to trigger alerts and automate water, air, and crop control.
  • Logistics monitoring: Location-based sensors can be used to track vegetables and other farm products during transport and storage, enhancing scheduling and automating the supply chain.
  • Livestock farming monitoring: Farm animals can be monitored via sensors to detect potential signs of disease. The data can be analysed from the central system and relevant information sent to the farmers.
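The climate-monitoring bullet above amounts to threshold checks on sensor readings, with alerts driving irrigation or ventilation. A minimal sketch; the thresholds and field names are hypothetical, not agronomic recommendations:

```python
# Hypothetical acceptable bands per climate variable: (low, high).
THRESHOLDS = {
    "temp_c":        (10.0, 35.0),
    "humidity_pct":  (30.0, 80.0),
    "soil_moisture": (0.20, 0.60),   # volumetric fraction
}

def alerts(reading):
    # Return every variable whose value falls outside its band.
    out = []
    for key, (low, high) in THRESHOLDS.items():
        value = reading[key]
        if not low <= value <= high:
            out.append((key, value))
    return out

reading = {"temp_c": 38.5, "humidity_pct": 55.0, "soil_moisture": 0.12}
print(alerts(reading))  # -> [('temp_c', 38.5), ('soil_moisture', 0.12)]
```

The central system would map each alert to an action, such as starting irrigation when soil moisture drops below its band.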

Conclusion
There are many opportunities for government to use the IoT to make government services more efficient. IoT cannot be analyzed or implemented properly without collaborative efforts between industry, government, and agencies, which need to work together to build a consistent set of standards that everyone can follow.

On the domain front, connected government solutions use IoT as follows:

  • Public safety departments can leverage IoT to protect citizens, for example by using video images and sensors for predictive analysis so that government can secure citizens gathering at parades or inaugural events.
  • On the healthcare front, advanced IoT analytics delivers better, more granular patient care. Real-time access to patients' reports and monitoring of their health status improve emergency healthcare.
  • In education, IoT helps with content delivery, monitoring of students and faculty, and improving the quality of education and professional development.
  • In the energy sector, IoT enables a variety of energy control and monitoring functions. It simplifies energy monitoring and management while maintaining low cost and a high level of precision, helping prevent system overloading and improving system performance and stability.
  • IoT strategies are being utilized in the agricultural industry to improve productivity, pest control, water conservation, and continuous production through improved technology and methods.

On the technology front:

  • IoT connects billions of devices and sensors to create new and innovative applications. To support these applications, a reliable, elastic, and agile platform is essential; cloud computing is one of the enabling platforms for IoT.
  • Connected government solutions can manage the large number of devices and the volume of data emitted by IoT. This large volume of new information allows new collaboration between government, industry, and citizens, and helps in rapidly developing IoT-focused preventive and predictive analytics.
  • IoT optimizes business processes through process automation and prebuilt integrations across multiple departmental applications. This opens up new opportunities for government to share information, innovate, save lives, make more informed decisions, and extend the scope of machine and human interaction.

References

  1. “Gartner Says It’s the Beginning of a New Era: The Digital Industrial Economy.” Gartner.
  2. “Embracing the Internet of Everything to Capture Your Share of $14.4 Trillion.” Cisco.
  3. “With a Trillion Sensors, the Internet of Things Would Be the ‘Biggest Business in the History of Electronics.’” Motherboard.
  4. “The ‘Internet of Things’ Will Be the World’s Most Massive Device Market and Save Companies Billions of Dollars.” Business Insider.
  5. “Facts and Forecasts: Billions of Things, Trillions of Dollars.” Siemens.

Source: http://iotbootcamp.sys-con.com/node/4074527

The IoT: It’s a question of scope

1 Apr

One part of the rich history of software development can serve as a guiding light and support the creation of the software that will run the Internet of Things (IoT). It's all a question of scope.

Figure 1 is a six-layer architecture, showing what I consider to be key functional and technology groupings that will define software structure in a smart connected product.

Figure 1

The physical product is on the left. “Connectivity” in the third box allows the software in the physical product to connect to back-end application software on the right. Compared to a technical architecture, this is an oversimplification. But it will help me explain why I believe the concept of “scope” is so important for everyone in the software development team.

Scope is a big deal
The “scope” I want to focus on is a well-established term used to explain name binding in computer languages. There are other uses, even within computer science, but for now, please just exclude them from your thinking, as I am going to do.

The concept of scope can be truly simple. Take the name of some item in a software system. Now decide where within the total system this name is a valid way to refer to the item. That’s the scope of this particular name.

(Related: What newcomers to IoT plan for its future)

I don’t have evidence, but I imagine that the concept arose naturally in the earliest days of software, with programs written in machine code. The easiest way to handle variables is to give them each a specific memory location. These are global variables; any part of the software that knows the address can access and use these variables.

But wait! It’s 1950 and we’ve used all 1KB of memory! One way forward is to recognize that some variables are used only by localized parts of the software. So we can squeeze more into our 1KB by sharing memory locations. By the time we get to section two of the software, section one has no more use for some of its variables, so section two can reuse those addresses. These are local variables, and as machine code gave way to assembler languages and high-level languages, addresses gave way to names, and the concept of scope was needed.

But scope turned out to be much more useful than just a way to share precious memory. With well-chosen rules on scope, computer languages used names to define not only variables, but whole data structures, functions, and connections to peripherals as well. You name it, and, well yes, you could give it a name. This created new ways of thinking about software structure. Different parts of a system could be separated from other parts and developed independently.
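The progression described above, from shared global addresses to names with well-defined scope, is easy to see in a modern language. A short Python illustration (the names here are invented for the example):

```python
counter = 0  # global scope: visible throughout the module

def make_counter():
    count = 0  # local to make_counter, but captured by the closure below
    def increment():
        nonlocal count          # rebind the enclosing (not global) name
        count += 1
        return count
    return increment

tick = make_counter()
tock = make_counter()
tick(); tick()
print(tick())   # -> 3   each closure keeps its own private 'count'
print(tock())   # -> 1
print(counter)  # -> 0   the global was never touched
```

Because `count` is scoped to each `make_counter` call, the two counters are fully separated, exactly the kind of independence between parts of a system that scope rules make possible.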

A new software challenge
There's a new challenge for IoT software, and it applies to all the software across the six boxes in Figure 1: the embedded software in the smart connected device, the enterprise applications that monitor and control the device, and the software handling access control and product-specific functions.

The challenge is the new environment for this software. These software types and the development teams behind them are very comfortable operating in essentially “closed” environments. For example, the embedded software used to be just a control system; its universe was the real-time world of sensors and actuators together with its memory space and operating system. Complicated, but there was a boundary.

Now, it’s connected to a network, and it has to send and receive messages, some of which may cause it to update itself. Still complicated, and it has no control over the timing, sequence or content of the messages it receives. Timing and sequence shouldn’t be a problem; that’s like handling unpredictable screen clicks or button presses from a control panel. But content? That’s different.

Connectivity creates broadly similar questions about the environment for the software across all the six layers. Imagine implementing a software-feature upgrade capability. Whether it’s try-before-you-buy or a confirmed order, the sales-order processing system is the one that holds the official view of what the customer has ordered. So a safe transaction-oriented application like SOP is now exposed to challenging real-world questions. For example, how many times, and at what frequency, should it retry after a device fails to acknowledge an upgrade command within the specified time?
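The retry question at the end of that paragraph is a policy the SOP system has to make explicit. A minimal sketch of one common answer, exponential backoff with a bounded attempt count; the function names and the specific values are illustrative, not a recommendation:

```python
import time

def retry_upgrade(send_command, attempts=3, base_delay=1.0):
    """Retry a device upgrade command with exponential backoff.

    send_command() returns True when the device acknowledges in time.
    The attempt count and delay schedule are policy choices; the
    values here are purely illustrative.
    """
    for attempt in range(attempts):
        if send_command():
            return True
        # Back off between retries: base_delay, 2x, 4x, ...
        time.sleep(base_delay * (2 ** attempt))
    return False  # escalate: flag the order for manual follow-up

# Simulated device that only acknowledges on the third try.
acks = iter([False, False, True])
print(retry_upgrade(lambda: next(acks), base_delay=0.01))  # -> True
```

The interesting design work is in what happens on the `False` path: the order's state, the customer's billing, and the device's reported feature set all have to stay consistent.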

An extensible notion
The notion of scope can be extended to help development teams handle this challenge. It doesn’t deliver the solutions, but it will help team members think about and define structure for possible solution architectures.

For example, Figure 2 looks at software in a factory, where the local scope of sensor readings and actuator actions in a work-cell automation system are in contrast to the much broader scope of quality and production metrics, which can drive re-planning of production, adjustment of machinery, or discussions with suppliers about material quality.

Figure 2

Figure 3 puts this example from production in the context of the preceding engineering development work, and the in-service life of this product after it leaves the factory.

Figure 3

Figure 4 adds three examples of new IoT capabilities that will need new software: one in service (predictive maintenance), and two in the development phase (calibration of manufacturing models to realities in the factory, and engineering access to in-service performance data).

Figure 4

Each box is the first step to describing and later defining the scope of the data items, messages, and sub-systems involved in the application. Just like the 1950s machine code programmers, one answer is “make everything global”—or, in today’s terms, “put everything in a database in the cloud.” And as in 1950, that approach will probably be a bit heavy on resources, and therefore fail to scale.

Dare I say data dictionary?
A bit old school, but there are some important extensions to ensure a data dictionary articulates not only the basic semantics of a data item, but also its reliability, availability, and likely update frequency. IoT data may not all be in a database; a lot of it starts out there in the real world, so attributes like time and cost of updates may be relevant. For the development team, stories, scrums and sprints come first. But after a few cycles, the data dictionary can be the single reference that ensures everyone can discuss the required scope for every artifact in the system-of-systems.
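One way to make those extended attributes concrete is a structured record per dictionary entry. A sketch, with field names and the sample entry invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    """One data-dictionary entry, extended with the IoT-relevant
    attributes discussed above. All field names are illustrative."""
    name: str
    semantics: str          # what the item means
    scope: str              # where the name is valid (subsystem, site, global)
    reliability: str        # e.g. guaranteed, best-effort
    availability: str       # e.g. online, cached, on-request
    update_frequency: str   # e.g. 100 Hz, daily batch
    update_cost: str        # time/cost of refreshing a real-world value

vibration = DictionaryEntry(
    name="spindle_vibration_rms",
    semantics="RMS vibration of the work-cell spindle, mm/s",
    scope="work-cell controller (local)",
    reliability="best-effort",
    availability="online",
    update_frequency="100 Hz",
    update_cost="negligible (already sampled)",
)
print(vibration.scope)  # -> work-cell controller (local)
```

Whether this lives in code, a wiki, or a schema registry matters less than having one shared reference the whole team can argue about.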

Software development teams for every type of software involved in an IoT solution (for example, embedded, enterprise, desktop, web and cloud) will have an approach (and possibly different approaches) to naming, documenting, and handling design questions: Who creates, reads, updates or deletes this artifact? What formats do we use to move data inside one subsystem, or between subsystems? Which subsystem is responsible for orchestrating a response to a change in a data value? Given a data dictionary, and a discussion about the importance of scope, these teams should be able to discuss everything that happens at their interfaces.

Different programming languages have different ways of defining scope. I believe it’s worth reviewing a few of these, maybe explore some boundaries by looking at some more esoteric languages. This will remind you of all the wonderful possibilities and unexpected pitfalls of using, communicating, and sharing data and other information technology artifacts. The rules the language designers have created may well inspire you to develop guidelines and maybe specific rules for your IoT system. You’ll be saving your IoT system development team a lot of time.

Source: http://sdtimes.com/analyst-view-iot-question-scope/

Building the IoT – Connectivity and Security

25 Jul

Short-range wireless networking, for instance, is another major IoT building block that needs work. It is used in local networks, such as:

and more. With the latest versions of Bluetooth and Zigbee, both protocols can now transport an IP packet, allowing, as IDC puts it, a uniquely identifiable endpoint. A gateway/hub/concentrator is still required to move from the short-range wireless domain to the internet domain; with Bluetooth, for example, a smartphone or tablet can serve as this gateway.

The main R&D efforts for local area networking are focused on radio hardware and power consumption so that we can avoid needing a power cable or batteries for wireless devices, network topologies and software stacks. 6LoWPAN and its latest evolution under Google’s direction, Thread, are pushing the limits in this area. Because consumers have become accustomed to regularly changing their technology, such as updating their computers and smartphones every few years, the consumer market is a good laboratory for this development.

There is also a need for long-range wireless networking in the IoT to mature. Connectivity for things relies on existing IP networks. For mobile IoT devices and difficult-to-reach areas, IP networking is mainly achieved via cellular systems. However, there are many locations with no cellular coverage. Further, although cellular is effective, it becomes too expensive as the number of end-devices grows large. A user can pay for a single data plan (the use of cellular modems in cars to provide Wi-Fi, for example), but that cost rapidly becomes prohibitive when operating a large fleet.

For end-devices without a stable power supply—such as in farming applications or pipeline monitoring and control—the use of cellular is also not a good option. A cellular modem is fairly power-hungry.

Accordingly, we are beginning to see new contenders for IoT device traffic in long-range wireless connections. A new class of wireless networks, called low-power wide-area networks (LPWAN), has begun to emerge. Whereas previously you could choose low power with limited distance (802.15.4) or greater distance with high power, LPWANs offer a good compromise: battery-powered operation with ranges up to 30 km.
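Why battery-powered operation is feasible at all comes down to duty cycling: the radio sleeps at microamp levels and transmits for only seconds per day. A back-of-envelope estimate, with every figure hypothetical rather than taken from any specific LPWAN radio:

```python
def battery_life_years(battery_mah, sleep_ua, tx_ma, tx_s_per_day):
    # Average daily charge draw: long deep-sleep periods plus short TX bursts.
    tx_mah_per_day = tx_ma * tx_s_per_day / 3600.0
    sleep_mah_per_day = (sleep_ua / 1000.0) * 24.0
    per_day = tx_mah_per_day + sleep_mah_per_day
    return battery_mah / per_day / 365.0

# 2400 mAh cell, 10 µA sleep current, 40 mA transmit current,
# 24 one-second uplinks per day (all hypothetical values).
print(round(battery_life_years(2400, 10, 40, 24), 1))  # -> 13.0
```

The same arithmetic shows why a cellular modem, with far higher transmit and idle currents, drains the same cell in a fraction of the time.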

There are a number of competing technologies for LPWAN, but two approaches of particular significance are LoRa and SIGFOX.

LoRa provides an open specification for the protocol, and most importantly, an open business model. The latter means that anyone can build a LoRa network—from an individual or a private company to a network operator.

SIGFOX is an ultra-narrowband technology. It requires an inexpensive endpoint radio and a more sophisticated base station to manage the network. Telecommunication operators usually carry the largest amounts of data over high frequencies (such as 5G), whereas SIGFOX takes the opposite approach, using lower frequencies. SIGFOX advertises that its messages can travel up to 1,000 kilometers (620 miles), and that each base station can handle up to 1 million objects while consuming 1/1000th the energy of a standard cellular system. SIGFOX communication tends to work better headed up from the endpoint to the base station, because the receive sensitivity of the endpoint is not as good as that of the expensive base station. It has bidirectional functionality, but its capacity from the base station back to the endpoint is constrained, and you'll have less link budget going down than going up.

SIGFOX and LoRa have been competitors in the LPWAN space for several years. Yet even with different business models and technologies, SIGFOX and LoRa have the same end-goal: to be adopted for IoT deployments over both city and nationwide LPWAN. For the IoT, LPWAN solves the connectivity problem for simple coverage of complete buildings, campuses or cities without the need for complex mesh or densely populated star networks.

The advantage of LPWAN is well-understood by the cellular operators; so well, in fact, that Nokia, Ericsson and Intel are collaborating on narrowband-LTE (NB-LTE). They argue it is the best path forward for using LTE to power IoT devices. NB-LTE represents an optimized variant of LTE. According to them, it is well-suited for the IoT market segment because it is cheap to deploy, easy to use and delivers strong power efficiency. The three partners face an array of competing interests supporting alternative technologies. Those include Huawei and other companies supporting the existing narrowband cellular IoT proposal.

These technologies are part of the solution to solve some of the cloud-centric network challenges. It is happening, but we can’t say this is mainstream technology today.

Internet concerns

Beyond the issue of wireless connectivity to the internet lie questions about the internet itself. There is no doubt that IoT devices use the Internet Protocol (IP). The IPSO Alliance was founded in 2008 to promote IP adoption. Last year, the Alliance publicly declared that the use of IP in IoT devices was now well understood by all industries. The question now is, “How to best use IP?”

For example, is the current IP networking topology and hierarchy the right one to meet IoT requirements? When we start thinking of using gateways/hubs/concentrators in a network, it also raises the question of network equipment usage and data processing locations. Does it make sense to take the data from the end-points and send it all the way to a back-end system (cloud), or would some local processing offer a better system design?

Global-industry thinking right now is that distributed processing is a better solution, but the internet was not built that way. The predicted sheer breadth and scale of IoT systems requires collaboration at a number of levels, including hardware, software across edge and cloud, plus the protocols and data model standards that enable all of the “things” to communicate and interoperate. The world networking experts know that the current infrastructure made up of constrained devices and networks simply can’t keep up with the volume of data traffic created by IoT devices, nor can it meet the low-latency response times demanded by some systems. Given the predicted IoT growth, this problem will only get worse.

In his article, The IoT Needs Fog Computing, Angelo Corsaro, chief technology officer of PrismTech, makes many good points about why the internet as we know it today is not adequate. He states that it must change from cloud to fog to support new IoT networking, data storage, and data processing requirements.

The main challenges of the existing cloud-centric network for broad IoT application are:

  • Connectivity (one connection for each device)
  • Bandwidth (the number of devices will far exceed the number of humans communicating)
  • Latency (the reaction time must be compatible with the dynamics of the physical entity or process with which the application interacts)
  • Cost (for a system owner, the cost of each connection multiplied by the number of devices can sour the ROI on a system)
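The bandwidth and cost points above are exactly what edge processing addresses: summarize at the gateway so only condensed records cross the expensive uplink. A minimal sketch with hypothetical readings and window size:

```python
def edge_summarize(readings, window=60):
    """Reduce each window of raw samples to one summary record,
    so only the summaries cross the uplink to the cloud."""
    out = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        out.append({
            "n": len(chunk),
            "min": min(chunk),
            "max": max(chunk),
            "avg": sum(chunk) / len(chunk),
        })
    return out

raw = [20.0 + (i % 5) * 0.1 for i in range(180)]  # 180 samples at the edge
uplink = edge_summarize(raw)
print(len(raw), "->", len(uplink))  # -> 180 -> 3
```

Here a 60-sample window cuts 180 raw readings to three upstream records; the same idea scales to filtering, anomaly detection, or local control loops at the fog layer.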

These issues led to the creation of the OpenFog Consortium (OFC). OFC was created to define a composability architecture and approach to fog/edge/distributed computing, including creating a reference design that delivers interoperability close to the end-devices. OFC’s efforts will define an architecture of distributed computing, network, storage, control, and resources that will support intelligence at the edge of IoT, including autonomous and self-aware machines, things, devices, and smart objects. OFC is one more example that an important building block to achieve a scalable IoT is under development. This supports Gartner’s belief that the IoT will take five to 10 years to achieve mainstream adoption.

Yet the majority of media coverage about the IoT is still very cloud-centric, sharing the IT viewpoint. In my opinion, IT-driven cloud initiatives make one significant mistake. For many of the IoT building blocks, IT is trying to push its technologies to the other end of the spectrum—the devices. Applying IT know-how to embedded devices requires more hardware and software, which currently inflates the cost of IoT devices. For the IoT to become a reality, the edge device unit cost needs to be a lot lower than what we can achieve today. If we try to apply IT technologies and processes to OT devices, we are missing the point.

IT assumes large processors with lots of storage and memory. The programming languages and other software technologies of IT rely on the availability of these resources. Applying the IT cost infrastructure to OT devices is not the right approach. More development is required not only in hardware, but in system management. Managing a network of thousands or millions of computing devices is a significant challenge.

Securing the IoT

The existing internet architecture compounds another impediment to IoT growth: security. Not a single day goes by that I don’t read an article about IoT security requirements. The industry is still analyzing what it means. We understand IT security, but IT is just a part of the IoT. The IoT brings new challenges, especially in terms of networking architecture and device variety.

For example, recent studies are demonstrating that device-to-device interaction complexity doesn’t scale when we include security. With a highly diverse vendor community, it is clear the IoT requires interoperability. We also understand that device trust, which includes device authentication and attestation, is essential to securing the IoT. But device manufacturer-issued attestation keys compromise user privacy. Proprietary solutions may exist for third-party attestation, but again, they do not scale. Security in an IoT system must start with the end-device. The device must have an immutable identity.

Unfortunately, this problem has no general answer today. Some chip vendors do have solutions for it, but those solutions are proprietary, which means the software running on the device must be customized for each silicon vendor.
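To make the trust requirement concrete, here is a minimal sketch of challenge-response device attestation. The device ID, key, and message format are illustrative assumptions, not any vendor's scheme; a real device would keep the key in secure silicon rather than in software.

```python
import hashlib
import hmac
import os

# Hypothetical device identity, provisioned at manufacture time.
DEVICE_ID = b"sensor-0042"
DEVICE_KEY = b"factory-provisioned-secret"  # stand-in for a key in secure hardware

def attest(challenge: bytes) -> bytes:
    """Device side: prove possession of the key without revealing it."""
    return hmac.new(DEVICE_KEY, DEVICE_ID + challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes) -> bool:
    """Verifier side: recompute the tag and compare in constant time."""
    expected = hmac.new(key, DEVICE_ID + challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # a fresh nonce for each attestation prevents replay
response = attest(challenge)
print(verify(challenge, response, DEVICE_KEY))
```

Note that this symmetric-key sketch illustrates the mechanism only; it still exhibits the privacy problem described above, since the verifier must hold the same device-linked key.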

Security in a closed proprietary system is achievable, especially as the attack surface is smaller. As soon as we open the systems to public networking technologies, however, and look at the exponential gain of data correlation from multiple sources, security becomes a combinatory problem that will not soon be solved. Because exchanging data between systems requires both semantic and application-layer protocol interoperability, translation gateways become necessary; they introduce trusted third parties as well as new and different data models and serialization formats, which further add to the combined systems’ complexity.

The IT realm has had the benefit of running on Intel or similar architectures, with Windows or Linux as the main operating system. In the embedded realm there is no common architecture: the core is most often ARM, but the peripherals all differ, even within a single silicon vendor’s product portfolio. There are also many real-time operating systems (RTOSes) for the microcontrollers and microprocessors used in embedded systems, from open-source projects to commercial offerings. To lower embedded system costs and achieve economies of scale, the industry will need to standardize the hardware and software used. Otherwise, development and production costs of the “things” will remain high and jeopardize reaching the predicted billions of devices.

Fortunately, the technology community has identified several IoT design patterns. A design pattern is a general reusable solution to a commonly occurring problem. While not a finished design that can be transformed directly into hardware or code, a design pattern is a description or template for how to solve a problem that can be used in many different situations.

These IoT design patterns are described in IETF RFC 7452 and in a recent Internet Society IoT white paper. In general, we recognize five classes of patterns:

  • Device-to-Device
  • Device-to-Cloud
  • Gateway
  • Back-end Data Portability
  • IP-based Device-to-Device

Security solutions for each of these design patterns are under development. But considerable work remains.
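The Gateway pattern from the list above can be sketched in a few lines: a gateway parses frames from a local, non-IP protocol and translates them into a cloud-friendly payload. The frame layout, field names, and scaling here are illustrative assumptions, not taken from RFC 7452 or any real protocol.

```python
import json

def from_local_frame(frame: bytes) -> dict:
    """Parse a hypothetical 4-byte local frame: [device_id, type, value_hi, value_lo]."""
    device_id, msg_type, hi, lo = frame
    return {"device": device_id, "type": msg_type, "value": (hi << 8) | lo}

def to_cloud_payload(reading: dict) -> str:
    """Serialize a parsed reading as JSON for the cloud back end."""
    return json.dumps({"deviceId": f"local-{reading['device']}",
                       "temperatureC": reading["value"] / 100})

frame = bytes([7, 1, 0x09, 0xC4])  # device 7 reporting a raw value of 2500 (25.00 °C)
print(to_cloud_payload(from_local_frame(frame)))
```

The translation step is exactly where the trusted-third-party and data-model concerns discussed earlier enter the system: the gateway sees every reading in the clear.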

Finally, all of this work leads to data privacy, which, unfortunately, is not only a technical question, but also a legal one. Who owns the data, and what can the owner do with it? Can it be sold? Can it be made public?

As you can see, there are years of work ahead of us before we can provide solutions to these security questions. But the questions are being asked and, according to the saying, asking the question is already 50% of the answer!

Conclusion

My goal here is not to discourage anyone from developing and deploying an IoT system—quite the contrary, in fact. The building blocks to develop IoT systems exist. These blocks may be too expensive, too bulky, may not achieve an acceptable performance level, and may not be secure, but they exist.

Our position today is similar to that at the beginning of the automobile era. The first cars did not move that fast, and had myriad safety issues! A century later, we are contemplating the advent of the self-driving car. For IoT, it will not take a century. As noted before, Gartner believes IoT will take five to ten years to reach mainstream adoption. I agree, and I am personally contributing and putting in the effort to develop some of the parts required to achieve this goal.

Many questions remain. About 10 years ago, the industry was asking whether IP was the right networking technology to use. Today it is clear: IP is a must. The question now is, “How do we use it?” Another question we are beginning to hear frequently is, “What is the RoI (return on investment) of the IoT?” What are the costs and the revenue (or cost savings) that such technology can bring? Such questions will need solid answers before the IoT can really take off.

Challenges also abound. When designing your system, you may find limitations in the sensors/actuators, processors, networking technologies, storage, data processing, and analytics that your design needs. The IoT is not possible without software, and where there is software, there are bug fixes and feature enhancements. To allow software upgrades, systems need to be designed for that functionality from the start; hardware and operating costs may be higher as a result of attaining the planned system life.
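A minimal sketch of what designed-in upgradability can look like: the device compares its firmware version against an update manifest and checks the image digest before applying anything. The manifest format and field names are assumptions for illustration; a production scheme would also verify a cryptographic signature on the manifest itself.

```python
import hashlib

CURRENT_VERSION = (1, 2, 0)  # version of the firmware currently running

def needs_update(manifest: dict) -> bool:
    """True if the manifest advertises a newer semantic version."""
    return tuple(manifest["version"]) > CURRENT_VERSION

def image_is_valid(image: bytes, manifest: dict) -> bool:
    """Reject images whose digest does not match the manifest."""
    return hashlib.sha256(image).hexdigest() == manifest["sha256"]

image = b"new-firmware-bytes"
manifest = {"version": [1, 3, 0], "sha256": hashlib.sha256(image).hexdigest()}

if needs_update(manifest) and image_is_valid(image, manifest):
    print("applying update", manifest["version"])
```

Even this tiny flow implies lifetime costs the text alludes to: spare flash for a second image, a rollback path, and an update server kept running for the planned system life.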

All that said, it is possible to develop and deploy an IoT system today. And as new technologies are introduced, more and more system concepts can have a positive RoI. Good examples of such systems include fleet management and many consumer initiatives. The IoT is composed of many moving parts, many of which have current major R&D programs. In the coming years, we will see great improvements in many sectors.

The real challenge for the IoT to materialize, then, is not technologies. They exist. The challenge is for their combined costs and performance to reach the level needed to enable the deployment of the forecasted billions of IoT devices.

Source: http://www.edn.com/electronics-blogs/eye-on-iot-/4442411/Building-the-IoT—Connectivity-and-Security

Dawn of the Gigabit Internet Age

14 Mar

The availability of speedier Internet connections will likely transform a variety of products and services for businesses and consumers, according to research from Deloitte Global.

Deloitte Touche Tohmatsu Limited (Deloitte Global) predicts that the number of gigabit-per-second (Gbit/s) Internet connections, which offer significantly faster service than average broadband speeds, will surge to 10 million by the end of the year, a tenfold increase. As average data connections get faster and the number of providers offering gigabit services grows, we expect businesses and consumers to steadily use more bandwidth, and a range of new data-intensive services and devices to come to market.

The expansion of gigabit connections will increasingly enable users to take advantage of high-speed data. For instance, the quality of both video streaming and video calling has already ticked up steadily along with data connection speeds over the past 10 years, and both services are now supported by billions of smartphones, tablets, and PCs. In the enterprise, significantly faster Internet speeds could enhance the ability of remote teams to work together: Large video screens could remain on throughout the work day, linking dispersed team members and enabling them to collaborate “side by side” even when they are thousands of miles apart.

Moreover, as available bandwidth increases, we expect many aspects of communication will be affected. Instant messages, for example, have already evolved from being predominantly text-based to incorporating photos and videos in ever-higher resolution and frame rates. Social networks, too, are hosting growing volumes of video views: As of November, there were 8 billion daily video views on Facebook, double the quantity from just seven months prior.¹

The expansion of gigabit services could reinvent the public sector and social services as well. A range of processes, from crowd monitoring to caring for the elderly, could be significantly enhanced through the availability of high-quality video surveillance. Crowd-control systems could use video feeds to accurately measure a sudden swarm of people to an area, while panic buttons used in the event an elderly person falls could be replaced by high-definition cameras.

Gigabit connections may also change home security solutions. Historically, connected home security relied on a call center making a telephone call to the residence, and many home video camera solutions currently record onto hard drives. As network connection speeds increase, however, cameras are likely to stream video, back up online, and offer better resolution and higher frame rates.² As video resolution increases and cameras proliferate, network demand will likely grow, too.
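A back-of-envelope calculation shows why camera counts and resolutions drive network demand. The per-stream bitrates below are illustrative assumptions (typical H.264-era figures), not measurements from the Deloitte research.

```python
# Assumed continuous upload bitrates per camera stream, in Mbit/s.
BITRATE_MBPS = {"720p": 2.0, "1080p": 5.0, "4k": 15.0}

def aggregate_mbps(cameras: dict) -> float:
    """Total streaming upload bandwidth for a mix of cameras, in Mbit/s."""
    return sum(BITRATE_MBPS[resolution] * count
               for resolution, count in cameras.items())

# A hypothetical home with three 1080p cameras and one 4K camera.
home = {"1080p": 3, "4k": 1}
print(aggregate_mbps(home))  # 3*5 + 1*15 = 30.0 Mbit/s of continuous upload
```

Even this modest setup saturates many of today's asymmetric upload links, which is one reason always-on cloud-backed cameras pair naturally with gigabit service.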

Additionally, some homes have already accumulated a dozen connected devices and will likely accrue more, with the bandwidth demand of each device expected to rise steadily over time. Background data usage will also likely grow, as each device added to a network, from smartphones to smart lighting hubs, requires online updates for its apps or operating system.

The Internet speed race is not likely to conclude with gigabit service. Deloitte Global expects Internet speeds to continue rising in the long term: 10 Gbit/s service has already been announced, and 50 Gbit/s connections are being contemplated for the future.³ CIOs should maintain teams that can monitor the progress of bandwidth speeds, not only those serving businesses and homes, but also emerging gigabit options available via cellular networks and Wi-Fi hotspots.

Source: http://deloitte.wsj.com/cio/2016/03/09/dawn-of-the-gigabit-internet-age/?id=us:2sm:3tw:ciojournal:eng:cons:031316:deloitteontech&linkId=22081109

Smart Home: Which company will lead the 2014 Trends?

11 Dec

International research firm Parks Associates will provide an update on the connected home market and analyze the key trends and upcoming announcements ahead of the 2014 International CES. Parks Associates estimates that by 2017 more than 11 million U.S. broadband households will have some type of smart home controller, up from two million in 2013. With companies such as Control4, Lutron, Crestron, AMX, and power-line players like Wulian all in the marketplace, home automation is shaping up to be a hotly contested area.

So which company will win and lead the 2014 trend? AMX is a famous brand with a long history in home automation, but its technology is wired, and wireless is the trend, so it is likely to fall behind. Lutron, Crestron, and Control4 are well known, and many people consider their products good, but not all of their products are wireless; the wired portions cannot be installed entirely DIY, so customers must pay installation fees. Is there a company that supplies a complete set of home automation products that can be installed fully DIY? In China, Wulian offers a complete range of home automation products to meet such requirements, and at a competitive price-performance ratio.

Apple has also announced its entry into home automation, and many companies claim to have the best wireless technology: Wi-Fi, Bluetooth, ZigBee, Z-Wave, and so on. Wi-Fi’s advantage is transporting large volumes of data such as video, but that is also its disadvantage: apart from video, most home automation products need low power dissipation and low energy consumption. Bluetooth is a point-to-point technology and will not see wide application in this space. Many investors now consider ZigBee the best choice for home automation, backed by a complete industry chain that sustains innovation. As for Z-Wave, which in theory supports only around 232 devices per network, its range of application in home or building automation is more limited.