Archive | Connectivity

5G Health Risks: Here’s What the Experts Say

27 Oct
(Image credit: Shutterstock)

This year has delivered a whirlwind of hype surrounding 5G: how it will change lives, where 5G networks are launching around the world and when exactly your smartphone will be capable of lightning-fast speeds. But some people are concerned that the rollout of 5G is happening so quickly that we don’t truly understand if or how the launch of next-generation connectivity will bring unintended health consequences.

The short answer: The scientific consensus is that 5G, like 3G and 4G before it, is not harmful to your health. In August, the U.S. Federal Communications Commission (FCC) officially determined that 5G’s radio waves are safe.

But that finding probably won’t stem the tide of worry over 5G’s rollout, especially as more 5G phones hit the market and the coverage becomes more widespread.

Why are people concerned that 5G is unsafe?

Early 5G networks — including those launched by AT&T, Verizon and T-Mobile — use high-frequency, millimeter-wave (mmWave) spectrum to deliver faster speeds. Some people are concerned that those radio waves, along with the additional cellular infrastructure needed to build out mmWave-based 5G networks in major cities, will increase the amount of radiation in the environment.

Millimeter-wave spectrum had not been used for consumer telecommunications before 5G. However, that’s not because it’s dangerous; the higher-frequency bands are just not as effective at transmitting data across distances. An mmWave-based 5G signal can’t penetrate objects, such as glass windows or concrete buildings. It also can’t penetrate the body.

The concerns over 5G are an extension of the worries some people have about cellphones in general.

But 5G is a form of radiation, right?

There are two types of radiation: ionizing and non-ionizing. Ultra-high-frequency ionizing radiation — which includes gamma-rays, UV rays from the sun and X-rays — is harmful to humans because it penetrates the body at the cellular level and causes electrons and atoms to break apart. Ionizing radiation can cause cancer, which is why you’re supposed to wear sunscreen outdoors and avoid unnecessary medical X-rays.

Non-ionizing radiation does not cause cancer, and runs the gamut from FM radio waves to visible light. 5G falls within that non-ionizing range, operating at a slightly higher frequency than 3G and 4G but far below visible light.
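The ionizing/non-ionizing line is a matter of photon energy, E = h·f. A back-of-the-envelope Python comparison makes the gap concrete (28 GHz is picked here as a representative mmWave band; the frequencies are illustrative):

```python
PLANCK_H = 6.626e-34   # Planck constant, J*s
EV_IN_J  = 1.602e-19   # one electronvolt in joules

def photon_energy_ev(freq_hz):
    """Energy of a single photon at the given frequency, in eV (E = h*f)."""
    return PLANCK_H * freq_hz / EV_IN_J

mmwave  = photon_energy_ev(28e9)   # 28 GHz, a representative 5G mmWave band
near_uv = photon_energy_ev(1e15)   # near-ultraviolet sunlight

print(f"5G mmWave photon: {mmwave:.1e} eV")   # ~1.2e-04 eV
print(f"Near-UV photon:   {near_uv:.1f} eV")  # ~4.1 eV
# Ionizing molecular bonds takes on the order of 10 eV, so a 5G photon
# falls short of that threshold by roughly five orders of magnitude.
```

This is why the frequency step from 4G to 5G does not move cellular signals any closer to the ionizing regime.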

The FCC requires all electronic equipment sold in the U.S. to meet the agency’s safety standards for acceptable radio-frequency (RF) energy by determining the device’s specific absorption rate (SAR), or the rate by which the body absorbs RF energy. The FCC recently reevaluated its standards, which were created in 1996, when determining the safety of 5G. The recommended RF exposure limits remain unchanged.

“The scientific consensus is that there are no known health risks from all forms of RF energy at the low levels approved for everyday consumer use,” a spokesperson for CTIA, a trade group for the wireless communications industry, said in an emailed statement. “The FCC regulates RF emissions, including millimeter waves from 5G devices and equipment, and has adopted the recommendations of expert scientific organizations that have reviewed the science, including dozens of studies focused specifically on millimeter waves, and established safe exposure levels.”

What’s driving the fear of 5G?

There are a few factors contributing to the concern — or outright fear — of 5G’s effects.

The first is scientific research that has been interpreted by some to support concern about cellphone radiation. For instance, a 2018 study released by the National Toxicology Program (NTP) found that when rats and mice were exposed to radio-frequency waves like the kind that emanate from cellphones, they developed malignant tumors. This particular study looked at 2G and 3G phones. However, that doesn’t mean 5G will cause cancerous tumors in humans.

“The exposures used in the studies cannot be compared directly to the exposure that humans experience when using a cellphone,” John Bucher, a senior scientist for the NTP, said when announcing the findings. “In our studies, rats and mice received radio-frequency radiation across their whole bodies. By contrast, people are mostly exposed in specific local tissues close to where they hold the phone. In addition, the exposure levels and durations in our studies were greater than what people experience.”

The NTP has said it plans to develop thorough studies to evaluate the safety of 5G.

The World Health Organization’s International Agency for Research on Cancer has categorized RF waves from cellphones as a possible carcinogen, which is another factor contributing to the concerns over 5G. But, for context, an ingredient in coffee is also considered a possible carcinogen. Red meat is categorized as a probable carcinogen, which means it has a stronger link to cancer than cellphones do.

The New York Times reported earlier this year that one of the primary 5G fearmongers is Russian propaganda spreading on YouTube, Facebook and blogs across the internet. Videos and news articles filled with misinformation are scaring U.S. consumers even as Russia proceeds with its own 5G plans.

Have there been studies to prove that 5G isn’t a health risk to humans?

5G is a new standard for wireless communication, but from a technological standpoint, it isn’t all that different from 3G and 4G. The radio-frequency waves from 5G cellphones are akin to the RF waves from LTE devices (i.e., non-ionizing). According to the American Cancer Society, most studies have shown that “the RF waves given off by cell phones don’t have enough energy to damage DNA directly or to heat body tissues.”

In 2000, a now-debunked study on the effect of radio waves on brain tissue fueled conspiracy theories about cellphones and radiation. The author of the study, physicist Bill Curry, claimed that wireless devices could cause brain cancer in humans. According to The New York Times, Curry neglected to take into account that our skin protects our internal tissues from high-frequency radio waves (which is, again, why you need to wear sunscreen to protect the skin from even higher-frequency UV rays).

However, because 5G networks are just now getting off the ground with a new roster of 5G phones, no long-term studies of the network or the devices and their effects on humans have been conducted. In addition, the types of devices we use and the way we use them are constantly changing. For that reason, skeptics such as Joel Moskowitz, director of the Center for Family and Community Health at the University of California, Berkeley School of Public Health, are calling for a halt to 5G’s rollout.

Moskowitz said it would be unethical to conduct a conclusive scientific study on human beings controlling for the health effects of cellphone radiation, so researchers rely on observational and animal studies. Those studies haven’t proved conclusively that cellphones are harmful to humans, but Moskowitz thinks there’s enough evidence to “put a moratorium on the rollout of new technologies” like 5G infrastructure build-out until more research is done.

“I’m certain that, within the next five years, radio-frequency radiation will be declared at least probably carcinogenic [by the WHO],” Moskowitz said.

But Christopher Collins, a professor in New York University’s radiology department who studies the safety of electromagnetic fields, said the lack of 5G-specific research doesn’t mean researchers are starting from scratch when evaluating 5G’s potential effects on human health.

“A lot of the premise of people who advocate against 5G or wireless communications fields in general seem to suggest that we just don’t know and we need to do more studies,” Collins said. “We know a lot. We’ve been doing experiments on humans and animals for decades over this entire spectrum.”

Collins said scientists “never want to say the book is closed,” but based on what we already know, there’s no evidence to suggest that 5G will cause cancer or other detrimental health effects in most people.

So why are some local governments putting a stop to 5G development?

Prior to the FCC’s 5G safety determination, city and state regulators were hearing from residents who were concerned that not enough was known about 5G. Specifically, people are concerned that the density of small cell sites required to build out mmWave-based 5G networks would emit dangerous amounts of radiation.

The FCC’s 5G FAST Plan, which requires municipalities to approve 5G cell sites within 60 to 90 days, has caused concern. Carriers are moving quickly to build out infrastructure without giving residents notice, The Wall Street Journal reported, and local legislators are pushing back. Some 90 cities and counties have filed suit against the FCC in a case currently pending in the Ninth Circuit Court of Appeals.

Homeowners may not want new antennas outside their homes for aesthetic reasons, or because they want advance notice when changes occur in their communities, but the FCC, industry trade groups and many scientists maintain there is no proven health risk.

“Typical exposure to 5G devices — such as small cells attached to phone poles or the sides of buildings — is far below the permissible levels and comparable to Bluetooth devices and baby monitors,” the CTIA spokesperson said. “The FCC continues to monitor the science to ensure that its regulations are protective of public health.”

Or, as NYU’s Chris Collins put it:

“One thing that we know can cause cancer is sunlight. People would generally do better to worry about that than the exposure levels we’re talking about with cellphones. If you’re more concerned about the base station on your building than you are [about] spending an hour in the noonday sun without any protection, you might want to think about your priorities.”

Bottom line

“Is it time to stop questioning? No, it’s never time to stop questioning,” he said. “It’s important to remember that, based on what we know now, there is no effect except for heating. This is based on many decades’ worth of study in these fields. It’s another thing to say, ‘Should we stop progress?’ based on what I would call unfounded concerns. I am quite certain there’s nothing to be alarmed about for millimeter waves.”

If you are concerned, you can mitigate your personal exposure to cellphone radiation by using fewer wireless devices. That might do wonders for your mental health, too.

Key Drivers and Research Challenges for 6G Ubiquitous Wireless Intelligence

25 Sep

The University of Oulu in Finland has published the world’s first white paper on 6G wireless technology. The white paper is titled ‘Key Drivers and Research Challenges for 6G Ubiquitous Wireless Intelligence’ and is based on information gathered at a summit of experts in the emerging 6G wireless capability sector held in Levi, Finnish Lapland, in March this year. It focuses on the key drivers and research priorities for the development of 6G technology, which the experts estimated would result in ‘ubiquitous wireless intelligence’ by 2030.

The paper consists of seven themes:

  • Social and business drivers of 6G wireless innovation, including adherence to the United Nations’ Sustainable Development Goals and the evolving needs of the data market: the paper notes that while 5G’s technical success has relied on new developments in many areas and will deliver a much wider range of data rates to a much broader variety of devices and users, 6G will require a substantially more holistic approach to identifying future communication needs, embracing a much wider community to shape its requirements;
  • 6G use cases and new devices – the paper predicts a shift in user devices from smartphones toward wearable devices with virtual, augmented or mixed reality capability, along with the emergence of other innovations in technological engagement such as telepresence, mobile robots and autonomous vehicles; and identifies these as factors to be considered when constructing 6G-enabled networks;
  • Key performance indicators and projected spectrum capability for 6G wireless connectivity, which the experts say should aim to transmit at rates of up to 1 Tbps per user;
  • Progress and challenges of the necessary radio hardware – communications applications and architecture must merge in order to offer the spectrum needed to achieve the requisite speeds for 6G connectivity;
  • Wireless systems and the physical layer of development – the paper highlights issues of increased energy consumption and data processing, saying: ‘Meeting all the challenging requirements identified requires a hyper-flexible network with configurable radios. AI and machine learning will be used in concert with radio sensing and positioning to learn about the static and dynamic components of the radio environment’;
  • 6G wireless networking, including secure privacy protection protocols and the growing role of Artificial Intelligence and blockchain capability; and
  • New service enablers – the paper highlights the growth of edge and cloud computing, machine learning and Artificial Intelligence and highlights the importance of shoring up privacy and trust in the network.

As 5G research matures and continues to support global standardization, we must start discussing what 6G can become and how to get there. Company representatives, researchers, decision-makers and other builders and members of smart society are invited to join this effort.

The 6G white paper is available for download from everything RF or from the University of Oulu.

IoT: New Paradigm for Connected Government

9 May

The Internet of Things (IoT) is a continuously connected network of embedded objects and devices, each with a unique identifier, that communicate over standard protocols without human intervention. It provides encryption, authorization and identification over device protocols such as MQTT, STOMP or AMQP to move data securely from one network to another. In connected government, IoT helps deliver better citizen services and provides transparency. It improves employee productivity and yields cost savings. It helps deliver contextual, personalized services to citizens, enhances security and improves quality of life. With secure, accessible information, government business becomes more efficient and data-driven, changing the lives of citizens for the better. An IoT-focused connected government solution supports rapid development of preventive and predictive analytics. It also helps optimize business processes and offers prebuilt integrations across multiple departmental applications. In summary, IoT opens up new opportunities for government to share information, innovate, make more informed decisions and extend the scope of machine and human interaction.

The Internet of Things (IoT) is a seamlessly connected system of embedded sensors and devices that communicate using standard, interoperable protocols without human intervention.

The vision of any Connected Government in the digital era is “To develop connected and intelligent IoT based systems to contribute to government’s economy, improving citizen satisfaction, safe society, environment sustainability, city management and global need.”

IoT draws data feeds from various sources such as cameras, weather and environmental sensors, traffic signals, parking zones and shared video surveillance services. Processing this data leads to better coordination between government and IoT agencies and to better services for citizens.

Market research predicts that, by 2020, up to 30 billion devices with unique IP addresses will be connected to the Internet [1], and the “Internet of Everything” will have an economic impact of more than $14 trillion [2]. By 2020, the Internet of Things will be powered by a trillion sensors [3]. In 2019, the Internet of Things device market was already double the size of the smartphone, PC, tablet, connected car and wearable markets combined [4]. And by 2020, component costs will have come down to the point that connectivity becomes a standard feature, even for processors costing less than $1 [5].

This article articulates the drivers for connected government using IoT and its objectives. It also describes various scenarios in which IoT is used across departments in connected government.

IoT Challenges Today
The trend in government seems to be IoT adoption on an agency-by-agency basis, leading to differing policies, strategies and standards, and to divergent analysis and use of data. There are a number of challenges preventing the adoption of IoT in government. The main challenges are:

  • Complexity: Lack of funding, shortages of digital skills and experience, and gaps in culture and strategic leadership commitment are the challenges today.
  • Data Management: Government must manage huge volumes of data related to departments, citizens, land and GIS. This data needs to be encrypted and secured; maintaining data privacy and data integrity is a big challenge.
  • Connectivity: IoT devices require good network connectivity to deliver their data payloads and to stream unstructured data continuously; examples include patient medical records, rainfall reports and disaster information. Maintaining continuous network connectivity is a challenge.
  • Security: Moving information back and forth between departments, citizens and third parties in a secure mode is a basic requirement in government, and IoT introduces new risks and vulnerabilities that leave users exposed to various kinds of threats.
  • Interoperability: This requires not only that systems be networked together, but also that data from each system be interoperable. In the majority of cases, IoT deployments are fragmented and lack interoperability due to differing OEMs, operating systems, versions, connectors and protocols.
  • Risk and Privacy: Devices sometimes gather and provide personal data without the user’s active participation or approval, and sometimes collect very private information about individuals from indirect interactions, violating privacy policies.
  • Integration: Governments need an integration platform that can connect any application, service, data source or device with the government ecosystem. Building a solution that comprises an integrated “all-in-one” platform providing device connectivity, event analytics and enterprise connectivity capabilities is a big challenge.
  • Regulatory and Compliance: Adoption of regulations by IoT agencies is a challenge.
  • Governance: One of the major concerns across government agencies is the lack of a big picture, or integrated view, of the IoT implementation; it has been pushed by various departments in a siloed fashion. In addition, government leaders often lack a complete understanding of IoT technology and its potential benefits.

IoT: Drivers for Connected Government
IoT can increase value both by collecting better information about how effectively government servants, programs and policies are addressing challenges, and by helping government deliver citizen-centric services based on real-time, situation-specific conditions. The various stakeholders leveraging IoT in connected government are depicted below.


Information Flow in an IoT Scenario
The information flow in government using IoT has five stages (5C): Collection, Communication, Consolidation, Conclusion and Choice.

  1. Collection: Sensors/devices collect data on the physical environment, for example measuring things such as air temperature, location, or device status. Sensors passively measure or capture information with no human intervention.
  2. Communication: Devices share the information with other devices or with a centralized platform. Data is seamlessly transmitted among objects or from objects to a central repository.
  3. Consolidation: Information from multiple sources is captured and combined at one point. Data is aggregated as devices communicate with each other, and rules determine its quality and importance.
  4. Conclusion: Analytical tools help detect patterns that signal a need for action, or anomalies that require further investigation.
  5. Choice: Insights derived from analysis either initiate an action or frame a choice for the user. Real time signals make the insights actionable, either presenting choices without emotional bias or directly initiating the action.

Figure 2: IoT Information Flow
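The five-stage flow above can be sketched as a toy pipeline in Python. The sensor names, readings and anomaly threshold are all illustrative, not from any real deployment:

```python
from statistics import mean

# 1. Collection: sensors passively capture readings (simulated here).
readings = [
    {"sensor": "air-temp-01", "value": 21.5},
    {"sensor": "air-temp-02", "value": 22.1},
    {"sensor": "air-temp-03", "value": 35.8},  # an outlier
]

# 2. Communication: each device transmits its reading to a central repository.
repository = []
for r in readings:
    repository.append(r)  # stands in for a network send (e.g., MQTT publish)

# 3. Consolidation: data from multiple sources is combined at one point.
average = mean(r["value"] for r in repository)

# 4. Conclusion: analytics flag anomalies that need investigation.
anomalies = [r for r in repository if abs(r["value"] - average) > 5]

# 5. Choice: the insight frames an action (or triggers one directly).
action = "dispatch inspection" if anomalies else "no action needed"
print(average, [a["sensor"] for a in anomalies], action)
```

In a real system each stage would be a separate service; the point here is only how the 5C stages hand data to one another.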

Role of IoT in Connected Government
The following section highlights the various government domains and typical use cases in the connected government.

Figure 3: IoT Usage in Connected Government

a. Health
IoT-based healthcare applications and systems enhance the traditional technology used today. These devices help increase the accuracy of medical data collected from the large set of devices connected to various applications and systems. They also help gather data that improves the precision of medical care delivered through sophisticated, integrated healthcare systems.

IoT devices give direct, round-the-clock access to the patient in a less intrusive way than other options. IoT-based analytics and automation allow providers to access patient reports before the patient arrives at the hospital, improving responsiveness in emergency healthcare.

IoT-driven systems are used for continuous monitoring of patients’ status. These monitoring systems employ sensors to collect physiological information, which is analyzed and stored on the cloud. Doctors can access this information for further analysis and review, giving them a continuous, automated flow of information. This helps improve the quality of care through alerting systems.

A patient’s health data is captured using various sensors, then analyzed and sent to a medical professional for appropriate remote medical assistance.

b. Education
IoT customizes and enhances education by allowing optimization of all content and forms of delivery. It reduces the costs and labor of education by automating common tasks outside of the actual teaching process.

IoT technology improves the quality of education, professional development, and facility management. The key areas in which IoT helps are:

  • Student tracking: IoT facilitates the customization of education, giving every student access to what they need. Each student can control their experience and participate in instructional design, with the student’s own performance data primarily shaping that design. This delivers highly effective education while reducing costs.
  • Instructor tracking: IoT provides instructors with easy access to powerful educational tools. Educators can use IoT to act as one-on-one instructors, providing specific instructional designs for each student.
  • Facility monitoring and maintenance: The application of IoT technology improves facility management and the professional development of educators.
  • Data from other facilities: IoT also enhances the knowledge base used to devise education standards and practices, introducing large, high-quality, real-world datasets into the foundation of educational design.

c. Construction
IoT-enabled devices and sensors are used for automatic monitoring of public-sector buildings, facilities and large infrastructure, for example by managing energy levels for air conditioning and overall electricity usage. Lights or air conditioners left on in empty rooms translate directly into lost revenue.

d. Transport
IoT can be used across transport systems for functions such as traffic control and parking, providing improved communication, control and data distribution.

Sensor information obtained from street cameras, motion sensors and officers on patrol is used to evaluate the traffic patterns of crowded areas. Commuters are then informed of the best possible routes to take, based on real-time traffic sensor data, to avoid being stuck in traffic jams.
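Route recommendation of this kind typically reduces to a shortest-path search over a road graph whose edge weights come from live sensor readings. A minimal sketch using Dijkstra's algorithm, where the road names and travel times are invented for illustration:

```python
import heapq

# Toy road graph: edge weights are travel times (minutes) that a real
# system would derive from live traffic sensor data.
roads = {
    "home":    [("main-st", 10), ("ring-rd", 4)],
    "main-st": [("center", 12)],
    "ring-rd": [("center", 7)],
    "center":  [],
}

def best_route(graph, start, goal):
    """Dijkstra's algorithm: returns (total_minutes, route) or (inf, [])."""
    queue = [(0, start, [start])]   # priority queue ordered by travel time
    visited = set()
    while queue:
        minutes, node, route = heapq.heappop(queue)
        if node == goal:
            return minutes, route
        if node in visited:
            continue
        visited.add(node)
        for neighbor, cost in graph[node]:
            if neighbor not in visited:
                heapq.heappush(queue, (minutes + cost, neighbor, route + [neighbor]))
    return float("inf"), []

print(best_route(roads, "home", "center"))  # (11, ['home', 'ring-rd', 'center'])
```

When a sensor reports congestion, the system simply raises the affected edge weights and re-runs the search, so recommendations track conditions in near real time.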

e. Smart City
IoT simplifies examining various factors such as population growth, zoning, mapping, water supply, transportation patterns, food supply, social services, and land use. It supports cities through its implementation in major services and infrastructure such as transportation and healthcare. It also manages other areas like water control, waste management, and emergency management. Its real-time and detailed information facilitate prompt decisions in emergency management.  IoT can automate motor vehicle services for testing, permits, and licensing.

f. Power
IoT simplifies the process of energy monitoring and management while maintaining low cost and a high level of precision. IoT-based solutions are used for efficient, smart utilization of energy, for example in smart grid and smart meter implementations.

Energy system reliability is achieved through IoT-based analytics, which help prevent system overloading or throttling and detect threats to system performance and stability, protecting against losses such as downtime, damaged equipment and injuries.

g. Agriculture
IoT minimizes human intervention in farming operations, analysis and monitoring. IoT-based systems detect changes to crops, the soil environment and more.

IoT in agriculture contributes to:

  • Crop monitoring: Sensors can monitor crops and plant health using the data collected, and can also provide early detection of pests and disease.
  • Food safety: The entire supply chain (farm, logistics and retail) is becoming connected. Farm products can be tagged with RFID, increasing customer confidence.
  • Climate monitoring: Sensors can monitor temperature, humidity, light intensity and soil moisture. These data can be sent to a central system to trigger alerts and automate water, air and crop control.
  • Logistics monitoring: Location-based sensors can track vegetables and other farm products during transport and storage, enhancing scheduling and automating the supply chain.
  • Livestock monitoring: Farm animals can be monitored via sensors to detect potential signs of disease. The data can be analyzed by the central system and relevant information sent to farmers.

There are many opportunities for government to use IoT to make public services more efficient. But IoT cannot be analyzed or implemented properly without collaborative effort between industry, government and agencies: they need to work together to build a consistent set of standards that everyone can follow.

On the domain front, connected government solutions use IoT as follows:

  • Public safety departments leverage IoT for the protection of citizens, for example by using video images and sensors for predictive analysis so that government can secure citizen gatherings during parades or inaugural events.
  • On the healthcare front, advanced IoT analytics delivers better, more granular care of patients. Real-time access to patients’ reports and monitoring of their health status improve emergency healthcare.
  • In education, IoT helps with content delivery, monitoring of students and faculty, and improving the quality of education and professional development.
  • In the energy sector, IoT enables a variety of energy control and monitoring functions. It simplifies energy monitoring and management while maintaining low cost and a high level of precision, and it helps prevent system overloading while improving system performance and stability.
  • In agriculture, IoT strategies are being used to improve productivity, pest control, water conservation and continuous production through improved technology and methods.

On the technology front:

  • IoT connects billions of devices and sensors to create new and innovative applications. To support these applications, a reliable, elastic and agile platform is essential; cloud computing is one of the enabling platforms for IoT.
  • A connected government solution can manage the large number of devices and the volume of data emitted by IoT. This large volume of new information allows new collaboration between government, industry and citizens, and helps rapidly develop IoT-focused preventive and predictive analytics.
  • IoT optimizes business processes through process automation and prebuilt integrations across multiple departmental applications. This opens up new opportunities for government to share information, innovate, save lives, make more informed decisions and extend the scope of machine and human interaction.


  1. “Gartner Says It’s the Beginning of a New Era: The Digital Industrial Economy.” Gartner.
  2. “Embracing the Internet of Everything to Capture Your Share of $14.4 Trillion.” Cisco.
  3. “With a Trillion Sensors, the Internet of Things Would Be the ‘Biggest Business in the History of Electronics.’” Motherboard.
  4. “The ‘Internet of Things’ Will Be the World’s Most Massive Device Market and Save Companies Billions of Dollars.” Business Insider.
  5. “Facts and Forecasts: Billions of Things, Trillions of Dollars.” Siemens.


The IoT: It’s a question of scope

1 Apr

There is a part of the rich history of software development that can serve as a guiding light and support the creation of the software that will run the Internet of Things (IoT). It’s all a question of scope.

Figure 1 is a six-layer architecture, showing what I consider to be key functional and technology groupings that will define software structure in a smart connected product.

Figure 1

The physical product is on the left. “Connectivity” in the third box allows the software in the physical product to connect to back-end application software on the right. Compared to a technical architecture, this is an oversimplification. But it will help me explain why I believe the concept of “scope” is so important for everyone in the software development team.

Scope is a big deal
The “scope” I want to focus on is a well-established term used to explain name binding in computer languages. There are other uses, even within computer science, but for now, please just exclude them from your thinking, as I am going to do.

The concept of scope can be truly simple. Take the name of some item in a software system. Now decide where within the total system this name is a valid way to refer to the item. That’s the scope of this particular name.

(Related: What newcomers to IoT plan for its future)

I don’t have evidence, but I imagine that the concept arose naturally in the earliest days of software, with programs written in machine code. The easiest way to handle variables is to give them each a specific memory location. These are global variables; any part of the software that knows the address can access and use these variables.

But wait! It’s 1950 and we’ve used all 1KB of memory! One way forward is to recognize that some variables are used only by localized parts of the software. So we can squeeze more into our 1KB by sharing memory locations. By the time we get to section two of the software, section one has no more use for some of its variables, so section two can reuse those addresses. These are local variables, and as machine code gave way to assembler languages and high-level languages, addresses gave way to names, and the concept of scope was needed.

But scope turned out to be much more useful than just a way to share precious memory. With well-chosen rules on scope, computer languages used names to define not only variables, but whole data structures, functions, and connections to peripherals as well. You name it, and, well yes, you could give it a name. This created new ways of thinking about software structure. Different parts of a system could be separated from other parts and developed independently.
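In today’s languages the same idea survives as lexical scope, where a name’s validity is determined by where it is defined. A minimal Python sketch of global versus local (and enclosing) scope:

```python
counter = 0              # global scope: this name is valid module-wide

def make_counter():
    count = 0            # local scope: valid only inside make_counter

    def increment():
        nonlocal count   # rebind the name from the enclosing scope
        count += 1
        return count

    return increment

tick = make_counter()
print(tick())            # prints 1
print(tick())            # prints 2: `count` lives on inside the closure
# Referencing `count` out here would raise NameError (it is out of scope),
# while the global `counter` is still 0 and untouched.
```

The closure illustrates the structural payoff described above: `count` is completely hidden from the rest of the program, so it can be developed and reasoned about independently.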

A new software challenge
There’s a new challenge for IoT software, and it applies to all the software across the six boxes in Figure 1: the embedded software in the smart connected device, the enterprise applications that monitor and control the device, and the software handling access control and product-specific functions.

The challenge is the new environment for this software. These software types and the development teams behind them are very comfortable operating in essentially “closed” environments. For example, the embedded software used to be just a control system; its universe was the real-time world of sensors and actuators together with its memory space and operating system. Complicated, but there was a boundary.

Now, it’s connected to a network, and it has to send and receive messages, some of which may cause it to update itself. Still complicated, but now it has no control over the timing, sequence or content of the messages it receives. Timing and sequence shouldn’t be a problem; that’s like handling unpredictable screen clicks or button presses from a control panel. But content? That’s different.

Connectivity creates broadly similar questions about the environment for the software across all the six layers. Imagine implementing a software-feature upgrade capability. Whether it’s try-before-you-buy or a confirmed order, the sales-order processing system is the one that holds the official view of what the customer has ordered. So a safe transaction-oriented application like SOP is now exposed to challenging real-world questions. For example, how many times, and at what frequency, should it retry after a device fails to acknowledge an upgrade command within the specified time?
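The retry question at the end of that paragraph has no single right answer, but the shape of a policy can be sketched. The function below is a minimal illustration, assuming a hypothetical `send` callable that returns True when the device acknowledges in time; the retry count and backoff factor are invented values, not recommendations:

```python
import time

def send_with_retries(send, max_retries=3, initial_delay_s=5.0, backoff=2.0):
    """Bounded retries with exponential backoff for an upgrade command.

    `send` is a hypothetical callable: it returns True if the device
    acknowledged within the allowed time, False otherwise.
    """
    delay = initial_delay_s
    for attempt in range(1 + max_retries):
        if send():
            return True                # device acknowledged
        if attempt < max_retries:
            time.sleep(delay)          # wait before the next attempt
            delay *= backoff           # back off rather than hammer the device
    return False                       # give up; escalate to an operator
```

Whatever the numbers, the point stands: the SOP system now needs an explicit policy, and an escalation path, for a class of failure it never used to see.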

An extensible notion
The notion of scope can be extended to help development teams handle this challenge. It doesn’t deliver the solutions, but it will help team members think about and define structure for possible solution architectures.

For example, Figure 2 looks at software in a factory, where the local scope of sensor readings and actuator actions in a work-cell automation system is in contrast to the much broader scope of quality and production metrics, which can drive re-planning of production, adjustment of machinery, or discussions with suppliers about material quality.

Figure 2

Figure 3 puts this example from production in the context of the preceding engineering development work, and the in-service life of this product after it leaves the factory.

Figure 3

Figure 4 adds three examples of new IoT capabilities that will need new software: one in service (predictive maintenance), and two in the development phase (calibration of manufacturing models to realities in the factory, and engineering access to in-service performance data).

Figure 4

Each box is the first step to describing and later defining the scope of the data items, messages, and sub-systems involved in the application. Just like the 1950s machine code programmers, one answer is “make everything global”—or, in today’s terms, “put everything in a database in the cloud.” And as in 1950, that approach will probably be a bit heavy on resources, and therefore fail to scale.

Dare I say data dictionary?
A bit old school, but there are some important extensions to ensure a data dictionary articulates not only the basic semantics of a data item, but also its reliability, availability, and likely update frequency. IoT data may not all be in a database; a lot of it starts out there in the real world, so attributes like time and cost of updates may be relevant. For the development team, stories, scrums and sprints come first. But after a few cycles, the data dictionary can be the single reference that ensures everyone can discuss the required scope for every artifact in the system-of-systems.
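As a sketch, one such extended entry might look like this; the field names and values are illustrative assumptions, not a standard schema:

```python
# One entry of an extended data dictionary; all names and values invented.
data_dictionary = {
    "motor_temperature": {
        "semantics": "drive-motor winding temperature, degrees Celsius",
        "scope": "work-cell controller; aggregated hourly at plant level",
        "reliability": "sensor accuracy +/- 2 C",
        "availability": "only while the device is powered and connected",
        "update_frequency_s": 10,       # expected seconds between readings
        "update_cost": "one wireless message per reading",
    },
}

def scope_of(name):
    # The single reference every team consults when discussing scope.
    return data_dictionary[name]["scope"]

print(scope_of("motor_temperature"))
```

The exact fields matter less than having one agreed place where every team can look up what a name means and where it is valid.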

Software development teams for every type of software involved in an IoT solution (for example, embedded, enterprise, desktop, web and cloud) will have an approach (and possibly different approaches) to naming, documenting, and handling design questions: Who creates, reads, updates or deletes this artifact? What formats do we use to move data inside one subsystem, or between subsystems? Which subsystem is responsible for orchestrating a response to a change in a data value? Given a data dictionary, and a discussion about the importance of scope, these teams should be able to discuss everything that happens at their interfaces.

Different programming languages have different ways of defining scope. I believe it’s worth reviewing a few of these, and perhaps exploring the boundaries by looking at some more esoteric languages. This will remind you of all the wonderful possibilities and unexpected pitfalls of using, communicating, and sharing data and other information-technology artifacts. The rules the language designers have created may well inspire you to develop guidelines, and maybe specific rules, for your IoT system. You’ll be saving your IoT system development team a lot of time.
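As one concrete example of the pitfalls worth reviewing, Python closures capture names rather than values, a rule that regularly surprises developers arriving from other languages:

```python
# Python closures bind *names*, not values; a classic scope pitfall.

def make_handlers():
    handlers = []
    for device_id in range(3):
        handlers.append(lambda: device_id)     # late binding: all share one name
    fixed = []
    for device_id in range(3):
        fixed.append(lambda d=device_id: d)    # default argument freezes the value
    return handlers, fixed

handlers, fixed = make_handlers()
print([h() for h in handlers])  # [2, 2, 2]
print([f() for f in fixed])     # [0, 1, 2]
```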


Building the IoT – Connectivity and Security

25 Jul

Short-range wireless networking, for instance, is another major IoT building block that needs work. It is used in local networks built on protocols such as Bluetooth and Zigbee.

With the latest versions of Bluetooth and Zigbee, both protocols can now transport an IP packet, allowing, as IDC represents it, a uniquely identifiable endpoint. A gateway/hub/concentrator is still required to move from the short-range wireless domain to the internet domain. With Bluetooth, for example, a smartphone or tablet can be this gateway.

The main R&D efforts for local-area networking are focused on radio hardware, power consumption (so that wireless devices can avoid needing a power cable or batteries), network topologies and software stacks. 6LoWPAN and its latest evolution under Google’s direction, Thread, are pushing the limits in this area. Because consumers have become accustomed to regularly replacing their technology, such as updating their computers and smartphones every few years, the consumer market is a good laboratory for this development.

There is also a need for long-range wireless networking in the IoT to mature. Connectivity for things relies on existing IP networks. For mobile IoT devices and difficult-to-reach areas, IP networking is mainly achieved via cellular systems. However, there are many locations with no cellular coverage. Further, although cellular is effective, it becomes too expensive as the number of end-devices grows large. A user can pay for a single data plan (the use of cellular modems in cars to provide Wi-Fi, for example), but that cost rapidly becomes prohibitive when operating a large fleet.

For end-devices without a stable power supply—such as in farming applications or pipeline monitoring and control—the use of cellular is also not a good option. A cellular modem is fairly power-hungry.

Accordingly, we are beginning to see new contenders for IoT device traffic in long-range wireless connections. A new class of wireless, called low-power wide-area networks (LPWAN), has begun to emerge. Whereas previously you could choose low power with limited distance (802.15.4) or greater distance with high power, LPWANs offer a good compromise: battery-powered operation at distances up to 30 km.

There are a number of competing technologies for LPWAN, but two approaches of particular significance are LoRa and SIGFOX.

LoRa provides an open specification for the protocol, and most importantly, an open business model. The latter means that anyone can build a LoRa network—from an individual or a private company to a network operator.

SIGFOX is an ultra-narrowband technology. It requires an inexpensive endpoint radio and a more sophisticated base station to manage the network. Telecommunications operators usually aim to carry as much data as possible, typically over higher frequencies (as with 5G); SIGFOX intends to do the opposite, sending small messages over lower frequencies. SIGFOX advertises that its messages can travel up to 1,000 kilometers (620 miles), and that each base station can handle up to 1 million objects while consuming 1/1000th the energy of a standard cellular system. SIGFOX communication tends to be better headed up from the endpoint to the base station, because the receive sensitivity of the endpoint is not as good as that of the expensive base station. It has bidirectional functionality, but capacity from the base station back to the endpoint is constrained: there is less link budget going down than going up.

SIGFOX and LoRa have been competitors in the LPWAN space for several years. Yet even with different business models and technologies, SIGFOX and LoRa have the same end-goal: to be adopted for IoT deployments over both city and nationwide LPWAN. For the IoT, LPWAN solves the connectivity problem for simple coverage of complete buildings, campuses or cities without the need for complex mesh or densely populated star networks.

The advantage of LPWAN is well-understood by the cellular operators; so well, in fact, that Nokia, Ericsson and Intel are collaborating on narrowband-LTE (NB-LTE). They argue it is the best path forward for using LTE to power IoT devices. NB-LTE represents an optimized variant of LTE. According to them, it is well-suited for the IoT market segment because it is cheap to deploy, easy to use and delivers strong power efficiency. The three partners face an array of competing interests supporting alternative technologies. Those include Huawei and other companies supporting the existing narrowband cellular IoT proposal.

These technologies are part of the solution to some of the cloud-centric network challenges. Adoption is happening, but we can’t yet call this mainstream technology.

Internet concerns

Beyond the issue of wireless connectivity to the internet lie questions about the internet itself. There is no doubt that IoT devices use the Internet Protocol (IP). The IPSO Alliance was founded in 2008 to promote IP adoption. Last year, the Alliance publicly declared that the use of IP in IoT devices was now well understood by all industries. The question now is, “How best to use IP?”

For example, is the current IP networking topology and hierarchy the right one to meet IoT requirements? When we start thinking of using gateways/hubs/concentrators in a network, it also raises the question of network equipment usage and data processing locations. Does it make sense to take the data from the end-points and send it all the way to a back-end system (cloud), or would some local processing offer a better system design?

Global industry thinking right now is that distributed processing is a better solution, but the internet was not built that way. The predicted sheer breadth and scale of IoT systems requires collaboration at a number of levels, including hardware, software across edge and cloud, plus the protocols and data-model standards that enable all of the “things” to communicate and interoperate. The world’s networking experts know that the current infrastructure, made up of constrained devices and networks, simply can’t keep up with the volume of data traffic created by IoT devices, nor can it meet the low-latency response times demanded by some systems. Given the predicted IoT growth, this problem will only get worse.

In his article, “The IoT Needs Fog Computing,” Angelo Corsaro, chief technology officer of PrismTech, makes many good points about why the internet as we know it today is not adequate. He argues that it must change from cloud to fog to support the new IoT networking, data storage and data processing requirements.

The main challenges of the existing cloud-centric network for broad IoT application are:

  • Connectivity (one connection for each device)
  • Bandwidth (the number of communicating devices will exceed the number of communicating humans)
  • Latency (the reaction time must be compatible with the dynamics of the physical entity or process with which the application interacts)
  • Cost (for a system owner, the cost of each connection multiplied by the number of devices can sour the ROI of a system)
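The cost point lends itself to back-of-envelope arithmetic. The figures below are hypothetical assumptions chosen only to show the shape of the calculation:

```python
# Hypothetical figures; only the shape of the calculation matters.
monthly_fee_per_connection = 2.00    # dollars per device, assumed plan price
devices = 50_000
monthly_value_per_device = 1.50      # assumed savings each device generates

connectivity_cost = monthly_fee_per_connection * devices  # 100,000.00 per month
value_created = monthly_value_per_device * devices        #  75,000.00 per month
print(value_created > connectivity_cost)  # False: per-device fees sour the ROI
```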

These issues led to the creation of the OpenFog Consortium (OFC). OFC was created to define a composable architecture and approach to fog/edge/distributed computing, including a reference design that delivers interoperability close to the end-devices. OFC’s efforts will define an architecture of distributed computing, network, storage, control, and resources that will support intelligence at the edge of the IoT, including autonomous and self-aware machines, things, devices, and smart objects. OFC is one more example of an important building block for a scalable IoT that is still under development. This supports Gartner’s belief that the IoT will take five to 10 years to achieve mainstream adoption.

Yet the majority of media coverage about the IoT is still very cloud-centric, sharing the IT viewpoint. In my opinion, IT-driven cloud initiatives make one significant mistake. For many of the IoT building blocks, IT is trying to push its technologies to the other end of the spectrum—the devices. Applying IT know-how to embedded devices requires more hardware and software, which currently inflates the cost of IoT devices. For the IoT to become a reality, the edge device unit cost needs to be a lot lower than what we can achieve today. If we try to apply IT technologies and processes to OT devices, we are missing the point.

IT assumes large processors with lots of storage and memory. The programming languages and other software technologies of IT rely on the availability of these resources. Applying the IT cost infrastructure to OT devices is not the right approach. More development is required not only in hardware, but in system management. Managing a network of thousands or millions of computing devices is a significant challenge.

Securing the IoT

The existing internet architecture compounds another impediment to IoT growth: security. Not a single day goes by that I don’t read an article about IoT security requirements. The industry is still analyzing what it means. We understand IT security, but IT is just a part of the IoT. The IoT brings new challenges, especially in terms of networking architecture and device variety.

For example, recent studies are demonstrating that device-to-device interaction complexity doesn’t scale when we include security. With a highly diverse vendor community, it is clear the IoT requires interoperability. We also understand that device trust, which includes device authentication and attestation, is essential to securing the IoT. But device manufacturer-issued attestation keys compromise user privacy. Proprietary solutions may exist for third-party attestation, but again, they do not scale. Security in an IoT system must start with the end-device. The device must have an immutable identity.

Unfortunately, today this situation does not have an answer. Some chip vendors do have solutions for it. However, they are proprietary solutions, which means the software running on the device must be customized for each silicon vendor.

Security in a closed proprietary system is achievable, especially as the attack surface is smaller. As soon as we open systems up to public networking technologies, however, and seek the exponential gains of correlating data from multiple sources, security becomes a combinatorial problem that will not soon be solved. Because semantic interoperability and application-layer protocol interoperability are required to exchange data between systems, translation gateways introduce trusted third parties and new or different data models and serialization formats that further complicate the combined systems.

The IT realm has had the benefit of running on Intel or similar architectures, with Windows or Linux as the main operating system. In the embedded realm there is no such thing as a common architecture (other than the core, which most of the time is ARM; the peripherals are all different, even within a single silicon vendor’s product portfolio). There are also a number of real-time operating systems (RTOS) for the microcontrollers and microprocessors used in embedded systems, ranging from open source to commercial. To lower embedded-system costs and achieve economies of scale, the industry will need to standardize the hardware and software used. Otherwise, development and production costs of the “things” will remain high and jeopardize reaching the predicted billions of devices.

Fortunately, the technology community has identified several IoT design patterns. A design pattern is a general reusable solution to a commonly occurring problem. While not a finished design that can be transformed directly into hardware or code, a design pattern is a description or template for how to solve a problem that can be used in many different situations.

These IoT design patterns are described in IETF RFC 7452 and in a recent Internet Society IoT white paper. In general, we recognize five classes of patterns:

  • Device-to-Device
  • Device-to-Cloud
  • Gateway
  • Back-end Data Portability
  • IP-based Device-to-Device

Security solutions for each of these design patterns are under development. But considerable work remains.

Finally, all of this work leads to data privacy, which, unfortunately, is not only a technical question, but also a legal one. Who owns the data, and what can the owner do with it? Can it be sold? Can it be made public?

As you can see, there are years of work ahead of us before we can provide solutions to these security questions. But the questions are being asked and, according to the saying, asking the question is already 50% of the answer!


My goal here is not to discourage anyone from developing and deploying an IoT system—quite the contrary, in fact. The building blocks to develop IoT systems exist. These blocks may be too expensive, too bulky, may not achieve an acceptable performance level, and may not be secure, but they exist.

Our position today is similar to that at the beginning of the automobile era. The first cars did not move that fast, and had myriad safety issues! A century later, we are contemplating the advent of the self-driving car. For the IoT, it will not take a century. As noted before, Gartner believes the IoT will take five to ten years to reach mainstream adoption. I agree, and I am personally contributing and putting in the effort to develop some of the parts required to achieve this goal.

Many questions remain. About 10 years ago, the industry was asking whether IP was the right networking technology to use. Today it is clear: IP is a must. The question now is, “How do we use it?” Another question we are beginning to hear frequently is, “What is the ROI (return on investment) of the IoT?” What are the costs and the revenue (or cost savings) that such technology can bring? Such questions will need solid answers before the IoT can really take off.

Challenges also abound. When designing your system, you may find limitations in the sensors/actuators, processors, networking technologies, storage, data processing, and analytics that your design needs. The IoT is not possible without software, and where there is software, there are bug fixes and feature enhancements. To achieve software upgradability, systems need to be designed for it from the start, and hardware and operating costs may be higher in order to attain the planned system life.

All that said, it is possible to develop and deploy an IoT system today. And as new technologies are introduced, more and more system concepts can have a positive RoI. Good examples of such systems include fleet management and many consumer initiatives. The IoT is composed of many moving parts, many of which have current major R&D programs. In the coming years, we will see great improvements in many sectors.

The real challenge for the IoT to materialize, then, is not technologies. They exist. The challenge is for their combined costs and performance to reach the level needed to enable the deployment of the forecasted billions of IoT devices.


Dawn of the Gigabit Internet Age

14 Mar

The availability of speedier Internet connections will likely transform a variety of products and services for businesses and consumers, according to research from Deloitte Global.

Deloitte Touche Tohmatsu Limited (Deloitte Global) predicts that the number of gigabit-per-second (gbit/s) Internet connections, which offer significantly faster service than average broadband speeds, will surge to 10 million by the end of the year, a tenfold increase. As average data connections get faster and the number of providers offering gigabit services grows, we expect businesses and consumers will steadily use more bandwidth, and a range of new data-intensive services and devices will come to market.

The expansion of gigabit connections will increasingly enable users to take advantage of high-speed data. For instance, the quality of both video streaming and video calling has already ticked up steadily along with data connection speeds over the past 10 years, and both services are now supported by billions of smartphones, tablets, and PCs. In the enterprise, significantly faster Internet speeds could enhance the ability of remote teams to work together: Large video screens could remain on throughout the work day, linking dispersed team members and enabling them to collaborate “side by side” even when they are thousands of miles apart.

Moreover, as available bandwidth increases, we expect many aspects of communication will be affected. Instant messages, for example, have already evolved from being predominantly text-based to incorporating photos and videos in ever-higher resolution and frame rates. Social networks, too, are hosting growing volumes of video views: As of November, there were 8 billion daily video views on Facebook, double the quantity from just seven months prior.¹

The expansion of gigabit services could reinvent the public sector and social services as well. A range of processes, from crowd monitoring to caring for the elderly, could be significantly enhanced through the availability of high-quality video surveillance. Crowd-control systems could use video feeds to accurately measure a sudden swarm of people to an area, while panic buttons used in the event an elderly person falls could be replaced by high-definition cameras.

Gigabit connections may also change home security solutions. Historically, connected home security relied on a call center making a telephone call to the residence, and many home video camera solutions currently record onto hard drives. As network connection speeds increase, however, cameras are likely to stream video, back up online, and offer better resolution and higher frame rates.² As video resolution increases and cameras proliferate, network demand will likely grow, too.
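A rough calculation shows how quickly this compounds. The bitrates below are assumed ballpark figures for illustration, not measurements:

```python
# Assumed ballpark bitrates, for illustration only.
sd_stream_mbps = 1.0    # older standard-definition camera stream
hd_stream_mbps = 5.0    # 1080p stream at a higher frame rate
cameras_before = 2
cameras_after = 6       # cameras proliferate around the home

upstream_before = sd_stream_mbps * cameras_before   # 2.0 Mbit/s
upstream_after = hd_stream_mbps * cameras_after     # 30.0 Mbit/s
print(upstream_after / upstream_before)  # 15.0x more upstream demand
```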

Additionally, some homes have already accumulated a dozen connected devices and will likely accrue more, with bandwidth demand for each device expected to rise steadily over time. There will also likely be a growing volume of background data usage, as an increased number of devices added to a network, from smartphones to smart lighting hubs, would require online updates for apps or for operating systems.

The Internet speed race is not likely to conclude with gigabit service. Deloitte Global expects Internet speeds to continue rising in the long term: 10 gigabits per second has already been announced, and 50 gigabit-per-second connections are being contemplated for the future.³ CIOs should maintain teams that can monitor the progress of bandwidth speeds—and not only those serving businesses and homes, but emerging gigabit options available via cellular networks and Wi-Fi hotspots as well.


Smart Home: Which company will lead the 2014 Trends?

11 Dec


International research firm Parks Associates will provide an update on the connected-home market and analyze the key trends and upcoming announcements ahead of the 2014 International CES. Parks Associates estimates that in 2017, more than 11 million U.S. broadband households will have some type of smart home controller, up from two million in 2013. In the marketplace we are already seeing Control4, Lutron, Crestron, AMX and other companies such as Wulian, so home automation promises to be a hotly contested area.

So which company will win and lead the 2014 trend? AMX is a famous brand with a long history in home automation, but its technology is wired, and wireless is the trend, so it may fall behind. Lutron, Crestron and Control4 are well known in the market, and many people consider their products good; in fact, though, not all of their products are wireless, and some are wired. That means you cannot install everything yourself, and you must pay installation fees. So is there a company that can supply a complete set of home automation products that you can install entirely yourself? Look to China: there is one company, Wulian, whose range can meet almost any home automation requirement, and at a very competitive price.

Apple has also said it is entering the home automation area, and many companies claim to have the best wireless technology: Wi-Fi, Bluetooth, ZigBee, Z-Wave and so on. Wi-Fi has an advantage in moving large amounts of data, such as video, but that is also its disadvantage: apart from video, most home automation products need low power dissipation and low energy consumption. Bluetooth is a point-to-point technology, which limits its range of application. ZigBee is seen by many investors as the best choice for home automation, with a complete industry chain to sustain innovation. Z-Wave, which in theory can support only just over 200 devices, has a limited range of application in home or building automation.

More airlines relax in-flight gadget policy following recent FAA ruling

7 Nov

You can stop pretending to turn your phone off during flights now, as more airlines respond to the Federal Aviation Administration’s newly relaxed regulations on in-flight gadget usage.

United Airlines and American Airlines announced that their passengers no longer have to switch off their mobile devices during takeoff and landing.

The FAA officially loosened its rules on October 31st 2013 after facing pressure from passengers, politicians, and the press to update its antiquated regulations. The regulations were initially implemented decades ago — electronics had to stay off while planes traveled under 10,000 feet to avoid wireless signals interfering with the plane’s navigational tech.

While airlines and the FAA have long feared that tablets and e-readers interfere with in-flight systems, there’s no real proof that such interference exists.

The FAA left it up to the airlines whether or not to take advantage of the newly-eased standards. Jet Blue and Delta modified their policies right away, with United and AA not too far behind.

The battle between flight attendants policing the aisles for signs of illuminated screens and passengers who a) don’t really believe their phone could mess with the plane’s navigational system and b) are bored may finally be coming to an end.

The same FAA panel also decided last month that Wi-Fi is safe to use throughout an entire flight. As with handheld electronics, planes could previously turn their Wi-Fi systems on only after reaching 10,000 feet.

Slowly but surely these draconian rules are adapting to modern times. People already have to deal with enough crap when they fly — long lines, taking off their shoes, having their $30 tub of dead sea mineral face cream unceremoniously thrown out by a 200 pound security agent. Airlines don’t even feed us proper meals anymore.

The least they can do is let us play dots.


4G cars are coming, but we won’t have much choice in how we connect them

30 Sep
connected car logo

photo: GigaOM
SUMMARY: Soon we’ll be able to connect our cars directly to the mobile internet just like our smartphones, but unlike your smartphone, your new car is going to be linked to a specific carrier.

4G cars are making their way to the U.S., starting first with the Audi A3 and eventually a whole fleet of GM vehicles. Embedded LTE could soon be streaming music to our dashboards, providing real-time traffic alerts to our nav systems and downloading Thomas the Tank Engine reruns for Junior to watch in his car seat.

The car will become a new type of connected device like our smartphones and tablets, and like those gadgets our 4G cars will require data plans. But unlike the smartphone and tablet, we’re not going to have a choice on what carrier we buy those plans from. It might seem absurd, but in the U.S. our 4G cars are going to be linked to a specific carrier, just as the first three generations of iPhones were tied to AT&T.

Gemalto’s LTE connected car module

That’s the opposite approach to what automakers are doing in Europe. The Audi S3 debuted in Europe with a distinctly European mobile connectivity model. A slot in the dash will take any carrier’s SIM card, and the Gemalto machine-to-machine communications model embedded in Audis supports multiple European GSM, HSPA and LTE bands. You can thank Europe’s coordinated approach to mobility for that flexibility — a single module can cover almost every carrier’s network in almost every country on the continent.

But pulling off such a feat in the U.S. is a much different story, said Andreas Hägele, who heads up Gemalto’s M2M portfolio. Not only does the U.S. host multiple mobile standards (CDMA and GSM), but its LTE networks are spread all over the radio-frequency spectrum.

Each of the four major carriers has deployed their initial LTE networks on completely separate bands, and most of them are targeting equally distinct bands for future 4G expansions. Add to that the car’s need for ubiquitous coverage, and any universal module would have to support multiple 2G and 3G technologies on multiple bands. Building a single module that supports all carriers isn’t impossible, but it might as well be, Hägele said; it’s like shooting at a moving target.

“We can do it technically,” Hägele said. “It’s a question of economics on one hand, and strategy on the other.”

Connected services versus simple connectivity

Automakers aren’t selling rote connectivity. They’re selling services ranging from turn-by-turn navigation to emergency roadside assistance to telematics services like remote start. Since they’ll have to vouch for those services, many of them will be very careful about the carrier partners they pick.

GM connected car demo

Starting with model year 2015 vehicles, GM will start connecting all cars sold in the U.S. to AT&T’s 2G, 3G and 4G networks. Customers will be able to buy data plans from AT&T to power in-car Wi-Fi and connect their infotainment apps, but GM is also moving its entire OnStar vehicle safety, navigation and telematics platform onto AT&T’s network. In that deal, GM has stipulated that AT&T sign roaming agreements with rural carriers and provide service guarantees to ensure OnStar services will work across the country. An emergency roadside assistance service doesn’t do you much good if the carrier connecting your car doesn’t have coverage where you’ve broken down.

In that scenario, GM is the service provider, not AT&T, so it shouldn’t matter to us whose network we’re connected to. For a decade, GM has relied on Verizon to power OnStar, and most consumers were none the wiser. If in-car connectivity were only about powering these kinds of vehicle-specific services, it wouldn’t be an issue.

But we’re entering an age where car connectivity is enabling a plethora of in-vehicle apps that aren’t provided by the automaker — a trend we’ll be tracking in detail at GigaOM’s Mobilize conference in October. All of those services will require data plans, and the way the connected car market is evolving, we’re basically going to be held captive by a single carrier to provide us those plans.

Apple’s Eyes Free in a BMW

With today’s emerging connected car systems, many automakers have adopted a bring-your-own-connectivity model in which your smartphone provides the link back to the internet. I don’t anticipate that will always be the case, though.

As apps and user interfaces become more sophisticated and more closely tied to the vehicle’s core functions, integrated connectivity will likely take precedence over simple tethering — and it should. A radio powered by the car’s electrical system and an antenna mounted on the roof are going to deliver a much better mobile data experience than a smartphone linked to the dash by Bluetooth.

But where does that leave the consumer? If I’m an AT&T customer buying a GM vehicle, then I’m set. I merely have to attach my car to my shared data plan. But if I’m a customer of Verizon Wireless, Sprint, T-Mobile or one of a hundred other regional or


Speed Test: 16 fast connectivity facts

23 Dec
We’ve been gathering a wealth of data from users of ZDNet’s Broadband Speed Test. As the year draws to a close, what have we learned?

Throughout the year, people have been testing their connection speeds with ZDNet’s Broadband Speed Test. Since February, we have asked people to enter their postcode and connection type, so that we could compare the various technologies. We lost some data in June, as ZDNet Australia was migrated to the international version of ZDNet. Still, up until last week (December 12), we had 602,831 records from Australian users. This was enough to uncover some interesting facts about internet connectivity Down Under.

Overall, it paints a positive picture. Speeds are increasing, not just through the adoption of new technologies (like fibre and 4G), but also because we’re getting more out of DSL and 3G.

As always, a word of caution on these figures: They are not a fully representative sample. They are the results of tests, often taken by people who want to see why their connection is slow, or how fast their new connection is. That’ll polarise the results a little. There’s also the geek factor: The results will be heavily skewed in favour of people who get a kick out of seeing how fast their internet connection is. That could push the averages up a little.

That said, these caveats apply equally to all the results, irrespective of which connection type or internet service provider (ISP) the user selected. So while the absolute figures should be treated with care, the comparisons between technologies and ISPs remain valid relative to one another.
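The aggregation behind these comparisons is simple: group the test records by connection type and average the speeds within each group. A minimal sketch of that idea, using a handful of made-up records (the raw ZDNet test data isn’t public, and the real records also carried postcode, ISP and timestamp fields):

```python
from collections import defaultdict

# Hypothetical speed-test records: (connection_type, speed_in_mbps).
records = [
    ("fibre", 25.1), ("fibre", 24.5),
    ("cable", 20.7),
    ("dsl", 6.3), ("dsl", 5.9),
    ("3g", 3.5),
]

def average_by_type(records):
    """Group test results by connection type and average the speeds."""
    totals = defaultdict(lambda: [0.0, 0])  # type -> [speed sum, test count]
    for conn_type, speed in records:
        totals[conn_type][0] += speed
        totals[conn_type][1] += 1
    return {t: round(total / count, 1) for t, (total, count) in totals.items()}

print(average_by_type(records))
```

Run over the full year of tests, this kind of grouping produces the per-technology averages reported below; the same pattern, keyed on ISP or state instead of connection type, yields the other breakdowns in this article.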


The quickest


Fibre provided the fastest average connection speed from the 602,831 tests taken over the year, at 24.8Mbps, followed by cable (20.7Mbps), 4G (10.7Mbps), DSL (6.1Mbps), and 3G (3.5Mbps).


Wireless connectivity


Despite the emerging availability of 4G, it accounted for just 3 percent of all tests, and that figure showed no sign of increasing over the year. There were more than twice as many tests for 3G.


(Credit: Phil Dobbie/ZDNet)


Both 3G and 4G speeds seem to have increased over the year. 3G speeds have risen from just 2.5Mbps in February up to 5.1Mbps so far this month.

Average 3G speeds were slowest in New South Wales (2Mbps), compared to 2.7Mbps in Victoria, 2.8Mbps in Western Australia, and 5.8Mbps in Queensland.


DSL facts


DSL speeds averaged 6Mbps for home users, 7.4Mbps for those at work, and 11.3Mbps for school users.

Home DSL speeds have been increasing, although they slipped a little around Easter time. April was the slowest month, at an average of 5.9Mbps, while December finished the year at 6.7Mbps.


(Credit: Phil Dobbie/ZDNet)


Victoria had the fastest home DSL speed results (6.7Mbps), followed by South Australia (6.1Mbps), NSW (6Mbps), WA (5.5Mbps), Tasmania (5.3Mbps), Queensland (5.1Mbps), and the Australian Capital Territory (5.1Mbps).

Over the year, Telstra has offered the fastest home DSL access speeds. Its average of 6.5Mbps was well ahead of TPG (6.1Mbps), Internode (6Mbps), iiNet (5.7Mbps), OptusNet (5.5Mbps), and Dodo (5.5Mbps).

Telstra lost its lead position recently, however, with TPG beating Telstra for the top spot for the last three months. Telstra’s average speeds have been sliding since the middle of the year, while TPG’s have risen.


(Credit: Phil Dobbie/ZDNet)


Although it’s not a precise indicator of market size, it is worth noting that 29 percent of all home DSL speed tests were by BigPond users, followed by TPG (18 percent), iiNet (15 percent), OptusNet (7 percent), Internode (4 percent), and Dodo (3 percent).

Most ISPs retained a similar share of tests throughout the year, although OptusNet slipped from 8 percent in February and March down to 5 percent in August and September, finishing at 6 percent for the last few months of the year. Dodo and TPG also accounted for a smaller proportion of tests at the end of the year.


Fibre and cable facts


Cable users made up 20 percent of the tests. Only 2 percent of tests over the year were from fibre connections.

Telstra’s cable speeds seem to be streets ahead, averaging 33Mbps (2,780 tests), compared to 22.9Mbps for Internode (199 tests), 20.6Mbps for OptusNet (360 tests), 17.9Mbps for iiNet (626 tests), and 9.5Mbps for TPG (254 tests).

Average fibre speeds seem to have slowed during the year — perhaps as new users sign up for lower-speed plans. Over the year, fibre speeds averaged 26.5Mbps for home users, 18.9Mbps for those at work, and just 16.1Mbps for schools.

Telstra accounted for 35 percent of all fibre tests, and, with an average of 33Mbps, beat the rest in terms of speed.


(Credit: Phil Dobbie/ZDNet)


At 24.6Mbps, Victoria had the fastest average speed from fibre (239 tests), followed by NSW, at 23.3Mbps (250 tests), and Queensland, at 19.9Mbps (102 tests).

