Tag Archives: IoT

Private 5G Mobile Networks for Industrial IoT

31 Jul


Dedicated 5G campus networks, designed to meet the coverage, performance and security requirements of industrial users, are one of the most exciting — and tangible — advanced 5G use-cases under development.

Part of the reason for this is that the private mobile network market in general is taking off. These networks enable enterprises to optimize and redefine business processes in ways that are not possible, or are impractical, within the limitations of wired and WiFi networks, and also cannot be reliably served by wide-area cellular. Right now, this means using LTE technology. Backed by a robust ecosystem of suppliers and integrators, private LTE is a growth market, with deployment activity across diverse industry sectors in all global regions.

Looking one step farther out, however, to scenarios where users have more demanding performance requirements — for example, the cyber-physical systems that characterize Industry 4.0 — 5G technology comes into the picture, offering an investment path that can support these new-wave applications at scale. Building on the existing LTE ecosystem, private 5G campus networks are emerging to address the performance requirements of production-critical processes in sectors such as smart factories, logistics/warehouses, container ports, oil & gas production, chemical plants, energy generation and distribution, and more.

In my new white paper, “Private 5G Networks for Industrial IoT,” I discuss how 5G technology meets the performance requirements of industrial users and why it will integrate with the next generation of Operational Technologies (OT) used in these markets. The paper discusses how private 5G can be deployed across licensed, shared-licensed and unlicensed spectrum bands, and investigates key 5G radio innovations. Specifically, it addresses the use of time synchronization in shared spectrum to ensure predictable performance.

Among the key findings in the paper — available for download here — are:

  • The strategic importance of private networks is reflected in 5G R&D. Whereas private networking was an add-on capability to public cellular in previous generations, in 5G these requirements are addressed directly in the initial specification phase.
  • The first 5G standards release (3GPP Release 15) contains many of the critical features that will underpin the performance needed in the industrial IoT segment. In addition, to support the advanced capabilities needed for cyber-physical industrial communication networks, an enormous amount of work is underway in Release 16, scheduled for functional freeze in March 2020 and ASN.1 freeze (i.e. protocols stable) in June 2020.
  • 5G offers the opportunity to consolidate industrial networking complexity onto a common network platform. An example is the cross-industry effort to transition diverse fieldbuses to the Time Sensitive Networking (TSN) Ethernet standard, and the mapping of TSN requirements to the 5G system specifications, such that a 5G campus network can transport TSN within the required latency, jitter and timing bounds.
  • There are a range of spectrum options that will accelerate private network adoption. In some markets, regulators are investigating, or already allocating, dedicated spectrum to enterprises to run private networks; these allocations are often targeted at industrial verticals.
  • Unlicensed spectrum is also attractive, with new radio techniques emerging to increase reliability in shared bands. Time synchronized sharing in unlicensed spectrum, in combination with other advanced 5G radio capabilities, can deliver highly predictable performance.
  • Heavy Reading believes spectrum will, in many cases, be decoupled from the decision about which party designs, operates and maintains private networks. There is evidence that operators themselves see opportunities in dedicated enterprise spectrum and are preparing to offer managed private networks in these bands. Other active parties include systems integrators and specialist OT companies.
  • In the radio domain, multiple techniques are under development that will enable 5G to meet extreme industrial IoT performance requirements. These include flexible numerology, ultra-reliable low-latency communications (URLLC), spatial diversity, Coordinated MultiPoint (CoMP), cm-accurate positioning, QoS, spectrum flexibility (including NR-Unlicensed) and more (see the numerology sketch after this list).
  • At the system level, capabilities such as network slicing, improved security, new authentication methods, edge-cloud deployment, TSN support (with synchronization) and API exposure make 5G suitable for the private industrial IoT market.
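
To make the flexible-numerology point in the list above concrete, here is a minimal Python sketch (my own illustration, based on the subcarrier-spacing scaling defined in 3GPP TS 38.211, not taken from the white paper). It shows how slot duration shrinks as the subcarrier spacing scales up, one of the radio levers behind low-latency operation:

    # 5G NR flexible numerology: subcarrier spacing scales as 15 kHz * 2^mu,
    # and the 14-symbol slot shortens accordingly (per 3GPP TS 38.211).
    for mu in range(5):
        scs_khz = 15 * 2 ** mu       # subcarrier spacing in kHz
        slot_ms = 1.0 / 2 ** mu      # slot duration in milliseconds
        print(f"mu={mu}: {scs_khz:>3} kHz spacing -> {slot_ms:.4f} ms slot")

At mu=0 a slot lasts a full millisecond; at mu=4 (240 kHz spacing) it shrinks to 62.5 microseconds, which is what makes very short over-the-air transmission times possible.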

The investment the global 3GPP community — which includes leading technology vendors, research organizations and network operators — is making in industrial IoT is very significant. This multi-year commitment draws deeply on R&D capabilities at these organizations and creates confidence in the technology and roadmap.

Source: https://www.lightreading.com/mobile/5g/private-5g-mobile-networks-for-industrial-iot/a/d-id/753123


Innovation Through Acceleration: How 5G Will Change Mobile Landscape

31 Jul

Pushed by technological progress, the pace of modern life has accelerated dramatically. People are always in motion, literally. Their demand for high-tech solutions grows exponentially, and telecommunication is no exception. We need fast, quality access to data anytime and anywhere.
The rapid development of 5G technology is what we need to meet the ever-growing demands arising across various business domains, as it marks the onset of a new era of mobile broadband.
Contrary to popular belief, 5G is not a modified version of 4G. It is an entirely new technology that will change our lives dramatically.
In fact, 5G will significantly expand the existing 4G capabilities. It will not only help improve the quality of mobile communication but will change the mobile application market in general.
5G is already referred to as a network capable of transmitting data at speeds of up to 20 Gbps. Just for comparison, the maximum speed of 4G today is 150 Mbps, which is 133 times slower!
Experts say that 5G is the missing step required for the development of a long list of technologies of the future, including IoT, data analytics, blockchain, the semantic Web, artificial intelligence, etc.
Thanks to 5G, we will be able to watch “spherical videos” and online broadcasts in Ultra HD quality. The quality and accuracy of color rendering will be improved significantly.
Another essential factor enabled by 5G is connection stability. It's frustrating to be disconnected repeatedly during an important video conference, isn't it? 5G promises to remove such shortcomings.
Besides, 5G will provide new opportunities for the media business and will make many types of users happy, from gamers and esports fans to live-broadcast viewers and IoT users.

5G Adoption Forecasts

Although 5G has yet to gain momentum, catch on and go mainstream, its adoption is inevitable due to huge demand from nearly all verticals and industries.

“75% of entrepreneurs are already willing to pay for the opportunity to use the 5G network,” Gartner

According to Umbrella IT, 5G will be used by 25 million people all over the world in 2021; this number will most likely increase to 1.2 billion users by 2025.
Mainstream adoption of 5G will give birth to a whole new set of mobile services and business models to benefit any business. On the other hand, it may leave many countries and industries behind the tech evolution if they fail to invest in new digital solutions and optimize existing ones for the 5G realm.

How 5G Will Affect Mobile Application Development Across Domains

5G and IoT
There is no doubt that the Internet of Things (IoT) will be most affected by 5G tech developments. 5G will bring ultra-low latency (less than one millisecond) and extended battery life to industries that leverage smart and connected solutions.
According to Gartner, 57% of businesses admit IoT is the most promising niche for 5G adoption.
5G will make it possible to work with those IoT areas where the difference between 40 milliseconds and four milliseconds is crucial. One vivid example of effective 5G use is the drone industry. With the near-instantaneous exchange of information between driverless vehicles, the risk of an accident can be reduced dramatically.
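
To put the 40-versus-4-millisecond difference in physical terms, here is a back-of-the-envelope Python sketch (the 100 km/h speed is an illustrative assumption, not a figure from the article) computing how far a vehicle travels while a message is still in flight:

    # Distance a vehicle covers 'blind' during one network latency window.
    def blind_distance_m(speed_kmh: float, latency_ms: float) -> float:
        return speed_kmh / 3.6 * latency_ms / 1000.0

    for latency_ms in (40, 4, 1):
        d = blind_distance_m(100, latency_ms)  # 100 km/h, illustrative
        print(f"{latency_ms:>2} ms -> {d:.2f} m travelled before the message lands")

At 100 km/h, 40 ms of latency means more than a meter of blind travel; at 4 ms it drops to about 11 centimeters.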
5G and Smart cities
5G adoption will not be limited to the IoT space only. It will enable the creation of smart houses and entire “smart cities” equipped with billions of sensors and interconnected devices.
Smart cities will become a unique platform for business development where millions of various mobile applications will be synchronized in order to reduce crime, create real-time disaster alerts, prevent pollution, etc.
As 5G will enable enhanced connectivity, it’ll create new collaboration opportunities for both competing and non-competing businesses.
Ultra HD broadcast
Everybody loves positive emotions, but not everyone has a chance to attend events that give good vibes and inspiration.
Thanks to mobile multimedia applications running on 5G, you can watch your favorite soccer matches or concerts remotely from anywhere.
A 360-degree broadcast, combined with augmented reality and 5G technology, will allow us to see the world more versatilely. Excursions to historic sites, educational films about the structure of atoms or the universe, and the broadcasting of surgical operations to educate young professionals… the list goes on and on.
Manufacturing and logistics
Using mobile applications based on 5G will help improve the remote control of machines in complex industries. Industrial automation systems will become more stable, which will result in improved overall product quality.
Combined with 3D printing and robotics, 5G technology will allow you to construct highly-efficient smart plants and reduce the risks for people working in hazardous industries.
Augmented reality (AR)
5G will unlock the potential of AR. Users will be able to experience the full-presence effect and enjoy the benefits of using the full range of AR-based mobile applications, from education to entertainment.
The Tactile Internet will cease to be a fantasy and become reality. Driven by 5G, AR will allow people to master various hard and soft skills without going to a brick-and-mortar classroom.
5G will take business communication to the next technology level. Using AR, entrepreneurs will be able to experience the full immersion in a physical meeting regardless of their current location. As such, a startup from NYC will be able to literally join an important VC meeting in Seoul and pitch their product/talk to investors without spending money on business trips and accommodation. That being said, the whole world will become extremely lean after 5G goes mainstream.
Healthcare
Virtual reality (VR) technologies are already in very high demand across medicine and healthcare domains and are poised to remove critical barriers and impediments to quality healthcare provision, especially in underserved areas.
Remote medicine will play an important role in the remote monitoring of patient health. Physicians will be able to react more quickly when they receive information in real-time.
Used together with ehealth mobile applications, wearable devices will help keep more detailed records of patient conditions in certain areas, observe trends and make informed and shared decisions regarding choice of treatment.
“The extremely high data transfer speed of 5G will enable fast DNA decryption. Today, we need a 140 GB file to store information about a single genome. And the speed of processing DNA is very slow. With 5G, we’ll be able to decrypt DNA in less than a minute!” says Vlad Potapenko, CEO of 8allocate, a company that helps other businesses optimize their mobile solutions for 5G.
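
A quick back-of-the-envelope check on that claim, using the peak rates quoted earlier in this article (transfer time only; the processing itself is a separate matter):

    # Time to move a 140 GB genome file at the peak rates quoted above.
    FILE_GB = 140
    for label, mbps in (("4G peak (150 Mbps)", 150), ("5G peak (20 Gbps)", 20_000)):
        seconds = FILE_GB * 8_000 / mbps  # GB -> megabits, divided by rate
        print(f"{label}: {seconds:,.0f} s (~{seconds / 60:.1f} min)")

At the 5G peak rate the transfer alone indeed fits in under a minute (about 56 seconds), whereas at 4G peak rates it takes roughly two hours.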
3D modeling
The emergence of mobile apps capable of creating different 3D models is another benefit of using 5G. Combining a 5G mobile application with the latest 3D printers will allow you to create higher quality 3D models of objects. Such rich-media apps will be used across an array of niches and verticals, from healthcare to real estate and beyond.
Personalized AI chatbots
Although AI-based chatbots can’t completely replace a human being yet, they can help significantly improve the quality of any service and increase customer loyalty.
The symbiosis of chatbots and 5G will provide more opportunities for prompt feedback and informed decision making.
We’ve identified four key benefits of using 5G for mobile application development:
  • 5G is 100x faster than 4G;
  • 5G extends the autonomous operation (battery life) of IoT/connected devices up to tenfold;
  • 5G reduces wait time up to 1 millisecond or even less (i.e., apps will open and perform at lightning speed);
  • 5G will help extend many services to remote/far areas.
Most likely, 5G technology will be implemented as a supplement to existing networks. This will result in lower energy costs. The battery will consume less power as the increase in speed will make it possible to perform almost all of the calculations on the server side (and not on the user’s device).
5G technology will be a real breakthrough due to its ability to effectively adapt to the full range of requirements for new-gen applications:
  • security;
  • speed;
  • lower battery consumption;
  • reliable and continuous communication between devices.
To wrap it up, 5G is more than just a new technology; it’s a door-opener to the whole new world of opportunities that will help make our life better and more meaningful.

And how else do you believe 5G will change the way we make apps now?

Source: https://hackernoon.com/innovation-through-acceleration-how-5g-will-change-mobile-landscape-x5bqh3jfj

Unlearn to Unleash Your Data Lake

16 Sep

The Data Science Process is about exploring, experimenting, and testing new data sources and analytic tools quickly.

The Challenge of Unlearning
For the first two decades of my career, I worked to perfect the art of data warehousing. I was fortunate to be at Metaphor Computers in the 1980s, where we refined the art of dimensional modeling and star schemas. I had many years working to perfect my star schema and dimensional modeling skills with data warehouse luminaries like Ralph Kimball, Margy Ross, Warren Thornthwaite, and Bob Becker. It became ingrained in every customer conversation; I’d build the star schema and the conformed dimensions in my head as the client explained their data analysis requirements.

Then Yahoo happened to me and soon everything that I held as absolute truth was turned upside down. I was thrown into a brave new world of analytics based upon petabytes of semi-structured and unstructured data, hundreds of millions of customers with 70 to 80 dimensions and hundreds of metrics, and the need to make campaign decisions in fractions of a second. There was no way that my batch “slice and dice” business intelligence and highly structured data warehouse approach was going to work in this brave new world of real-time, predictive and prescriptive analytics.

I struggled to unlearn ingrained data warehousing concepts in order to embrace this new real-time, predictive and prescriptive world. And this is one of the biggest challenges facing IT leaders today – how to unlearn what they’ve held as gospel and embrace what is new and different. And nowhere do I see that challenge more evident than when I’m discussing Data Science and the Data Lake.

Embracing The “Art of Failure” and The Data Science Process
Nowadays, Chief Information Officers (CIOs) are being asked to lead the digital transformation from a batch world that uses data and analytics to monitor the business to a real-time world that exploits internal and external, structured and unstructured data, to predict what is likely to happen and prescribe recommendations. To power this transition, CIOs must embrace a new approach for deriving customer, product, and operational insights – the Data Science Process (see Figure 2).

Figure 2:  Data Science Engagement Process

The Data Science Process is about exploring, experimenting, and testing new data sources and analytic tools quickly, failing fast but learning faster. The Data Science Process requires business leaders to get comfortable with “good enough” and with failing enough times before becoming comfortable with the analytic results. Predictions never come with 100% accuracy. As Yogi Berra famously stated:

“It’s tough to make predictions, especially about the future.”

This highly iterative, fail-fast-but-learn-faster process is the heart of digital transformation – to uncover new customer, product, and operational insights that can optimize key business and operational processes, mitigate regulatory and compliance risks, uncover new revenue streams and create a more compelling, more prescriptive customer engagement. And the platform that is enabling digital transformation is the Data Lake.

The Power of the Data Lake
The data lake exploits the economics of big data: coupling commodity, low-cost servers and storage with open source tools and technologies makes it 50x to 100x cheaper to store, manage and analyze data than traditional, proprietary data warehousing technologies. However, it’s not just cost that makes the data lake a more compelling platform than the data warehouse. The data lake also provides a new way to power the business, based upon new data and analytics capabilities, agility, speed, and flexibility (see Table 1).

Data Warehouse | Data Lake
Data structured in heavily-engineered dimensional schemas | Data structured as-is (structured, semi-structured and unstructured formats)
Heavily-engineered, pre-processed data ingestion | Rapid as-is data ingestion
Generates retrospective reports from historical, operational data sources | Generates predictions and prescriptions from a wide variety of internal and external data sources
100% accurate results of past events and performance | “Good enough” predictions of future events and performance
Schema-on-load to support historical reporting on what the business did | Schema-on-query to support rapid data exploration and hypothesis testing
Extremely difficult to ingest and explore new data sources (measured in weeks or months) | Easy and fast to ingest and explore new data sources (measured in hours or days)
Monolithic design and implementation (waterfall) | Natively parallel, scale-out design and implementation (scrum)
Expensive and proprietary | Cheap and open source
Widespread data proliferation (data warehouses and data marts) | Single managed source of organizational data
Rigid; hard to change | Agile; relatively easy to change

Table 1:  Data Warehouse versus Data Lake
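
To make the schema-on-load versus schema-on-query rows of Table 1 concrete, here is a toy Python sketch (illustrative field names, not from the article). Raw events are ingested as-is; structure is applied only when a question is asked, so a brand-new field requires no schema change:

    import json

    # Schema-on-query: ingest raw events as-is, interpret at question time.
    raw_events = [
        '{"user": "a1", "ts": "2017-01-03", "clicks": 4}',
        '{"user": "b2", "ts": "2017-01-03", "device": "mobile"}',  # new field
    ]

    events = [json.loads(line) for line in raw_events]
    mobile_users = [e["user"] for e in events if e.get("device") == "mobile"]
    print(mobile_users)  # ['b2'] -- no upfront schema migration was needed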

The data lake supports the unique requirements of the data science team to:

  • Rapidly explore and vet new structured and unstructured data sources
  • Experiment with new analytics algorithms and techniques
  • Quantify cause and effect
  • Measure goodness of fit (see the sketch after this list)
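
As a toy illustration of that last bullet (synthetic data, my own sketch rather than anything from the article), the following snippet fits a line to noisy observations and computes R^2, a standard goodness-of-fit measure that captures the “good enough” mindset described above:

    import numpy as np

    # Fit a line to noisy data and measure goodness of fit via R^2.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 10, 50)
    y = 2.0 * x + 1.0 + rng.normal(scale=2.0, size=x.size)

    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    r2 = 1 - residuals.var() / y.var()
    print(f"R^2 = {r2:.3f}")  # 'good enough', not 100% accurate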

The data science team needs to be able to perform this cycle in hours or days, not weeks or months. The data warehouse cannot support these data science requirements. The data warehouse cannot rapidly explore internal and external structured and unstructured data sources. The data warehouse cannot leverage the growing field of deep learning/machine learning/artificial intelligence tools to quantify cause and effect. Thinking that the data lake is “cold storage for our data warehouse” – as one data warehouse expert told me – misses the bigger opportunity. That’s yesterday’s “triangle offense” thinking. The world has changed, and just like how the game of basketball is being changed by the “economics of the 3-point shot,” business models are being changed by the “economics of big data.”

But a data lake is more than just a technology stack. To truly exploit the economic potential of the organization’s data, the data lake must come with data management services covering data accuracy, quality, security, completeness and governance. See “Data Lake Plumbers: Operationalizing the Data Lake” for more details (see Figure 3).

Figure 3:  Components of a Data Lake

If the data lake is only going to be used as another data repository, then go ahead and toss your data into your unmanageable gaggle of data warehouses and data marts.

BUT if you are looking to exploit the unique characteristics of data and analytics – assets that never deplete, never wear out and can be used across an infinite number of use cases at zero marginal cost – then the data lake is your “collaborative value creation” platform. The data lake becomes the platform that supports the capture, refinement, protection and re-use of your data and analytic assets across the organization.

But one must be ready to unlearn what they held as the gospel truth with respect to data and analytics; to be ready to throw away what they have mastered to embrace new concepts, technologies, and approaches. It’s challenging, but the economics of big data are too compelling to ignore. In the end, the transition will be enlightening and rewarding. I know, because I have made that journey.

Source: http://cloudcomputing.sys-con.com/node/4157284

How connected cars are turning into revenue-generating machines

29 Aug

 

At some point within the next two to three years, consumers will come to expect car connectivity to be standard, similar to the adoption curve for GPS navigation. As this new era begins, the telecom metric of ARPU (average revenue per user) will morph into ARPC (average revenue per car).

In that time frame, automotive OEMs will see a variety of revenue-generating touch points for connected vehicles at gas stations, electric charging stations and more. We also should expect progressive mobile carriers to gain prominence as essential links in the automotive value chain within those same two to three years.

Early in 2016, that transitional process began with the quiet but dramatic announcement of a statistic that few noted at the time. The industry crossed a critical threshold in the first quarter when net adds of connected cars (32 percent) rose above the net adds of smartphones (31 percent) for the very first time. At the top of the mobile carrier chain, AT&T led the world with around eight million connected cars already plugged into its network.

The next big event to watch for in the development of ARPC will be when connected cars trigger a significant redistribution of revenue among the value chain players. In this article, I will focus mostly on recurring connectivity-driven revenue. I will also explore why automakers must develop deep relationships with mobile carriers and Tier-1s to hold on to their pieces of the pie in the connected-car market by establishing control points.

After phones, cars will be the biggest category for mobile-data consumption.

It’s important to note here that my conclusions on the future of connected cars are not shared by everyone. One top industry executive at a large mobile carrier recently asked me, “Why do we need any other form of connectivity when we already have mobile phones?” Along the same lines, some connected-car analysts have suggested that eSIM technology will encourage consumers to simply add connectivity in their cars to their existing wireless plans.

Although there are differing points of view, it’s clear to me that built-in embedded-SIM for connectivity will prevail over tethering with smartphones. The role of Tier-1s will be decisive for both carriers and automakers as they build out the future of the in-car experience, including infotainment, telematics, safety, security and system integration services.

The sunset of smartphone growth

Consider the U.S. mobile market as a trendsetter for the developed world in terms of data-infused technology. You’ll notice that phone revenues are declining. Year-over-year sales of mobile phones have registered a 6.5 percent drop in North America and an even more dramatic 10.8 percent drop in Europe. This is because of a combination of total market saturation and economic uncertainty, which encourages consumers to hold onto their phones longer.

While consumer phone upgrades have slowed, non-phone connected devices are becoming a significant portion of net-adds and new subscriptions. TBR analyst Chris Antlitz summed up the future mobile market: “What we are seeing is that the traditional market that both carriers [AT&T and Verizon] go after is saturated, since pretty much everyone who has wanted a cell phone already has one… Both companies are getting big into IoT and machine-to-machine and that’s a big growth engine.”

At the same time, AT&T and Verizon are both showing a significant uptick in IoT revenue, even though we are still in the early days of this industry. AT&T crossed the $1 billion mark and Verizon posted earnings of $690 million in the IoT category for last year, with 29 percent of that total in the fourth quarter alone.

Data and telematics

While ARPU is on the decline, data is consuming a larger portion of the pie. Just consider some astonishing facts about data usage growth from Cisco’s Visual Networking Index 2016. Global mobile data traffic grew 74 percent over the past year, to more than 3.7 exabytes per month. Over the past 10 years, we’ve seen a 4,000X growth in data usage. After phones, cars will be the biggest category for mobile-data consumption.
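
A quick sanity check on those figures (simple arithmetic, not from the Cisco report): 4,000X growth over ten years implies an average annual rate well above the 74 percent seen last year.

    # Implied compound annual growth rate behind '4,000X in 10 years'.
    growth = 4000
    years = 10
    cagr = growth ** (1 / years) - 1
    print(f"implied average annual growth: {cagr:.0%}")  # roughly 129%/year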

Most cars have around 150 different microprocessor-controlled sub-systems built by different functional units. The complexity of integrating these systems adds to the time and cost of manufacturing. Disruptive companies like Tesla are challenging that model with a holistic design of telematics. As eSIM becomes a standard part of the telematics control unit (TCU), it could create one of the biggest disruptive domino effects the industry has seen in recent years. That’s why automakers must develop deep relationships with mobile carriers and Tier-1s.

The consumer life cycle for connected cars will initially have to be much longer than it is for smartphones.

Virtualization of our cars is inevitable. It will have to involve separate but interconnected systems because the infrastructure is inherently different for control versus convenience networks. Specifically, instrument clusters, telematics and infotainment environments have very different requirements than those of computing, storage and networking. To create a high-quality experience, automakers will have to work through hardware and software issues holistically.

Already we see Apple’s two-year iPhone release schedule expanding to a three-year span because of gentler innovations and increasing complexity. The consumer life cycle for connected cars will initially have to be much longer than it is for smartphones because of this deep integration required for all the devices, instruments and functionalities that operate the vehicle.

Five factors unique to connected cars

Disruption is everywhere within the auto industry, similar to the disruption that shook out telecom. However, there are several critical differences:

  • Interactive/informative surface. The mobile phone has one small screen with all the technology packed in behind it. Inside a car, nearly every surface could be transformed into an interactive interface. Beyond the instrumentation panel, which has been gradually claiming more real estate on the steering wheel, there will be growth in backseat and rider-side infotainment screens. (Semi-) autonomous cars will present many more possibilities.
  • Processing power. The cloud turned mobile phones into smart clients with all the heavy processing elsewhere, but each car can contain a portable data center all its own. Right now, the NVIDIA Tegra X1 mobile processor for connected cars, used to demonstrate its Drive CX cockpit visualizations, can handle one trillion floating-point operations per second (flops). That’s roughly the same computing power as a 1,600-square-foot supercomputer from the year 2000.
  • Power management. The size and weight of phones were constrained for many years by the size of the battery required. The same is true of cars, but in terms of power and processing instead of the physical size and shape of the body frame. Consider apps like Pokémon Go, which are known as battery killers because of their extensive use of the camera for augmented reality and constant GPS usage. In the backseat of a car, Pokémon Go could run phenomenally with practically no effect on the car battery. Perhaps car windows could even serve as augmented reality screens.
  • Risk factors. This is the No. 1 roadblock to connected cars right now. The jump from consumer-grade to automotive-grade security is just too great for comfort. Normally, when somebody hacks a phone, nobody gets hurt physically. A cybersecurity report this year pointed out that connected cars average 100 million lines of code, compared to only 8 million for a Lockheed Martin F-35 Lightning II fighter jet. In other words, security experts have a great deal of work to do to protect connected cars from hackers and random computer errors.
  • Emotional affinity. Phones are accessories, but a car is really an extension of the driver. You can see this aspect in the pride people display when showing off their cars and their emotional attachment to their cars. This also explains why driverless cars and services like Uber are experiencing a hard limit on their market penetration. For the same reasons, companies that can’t provide flawless connectivity in cars could face long-lasting damage to their brand reputations.

Software over hardware

The value in connected cars will increasingly concentrate in software and applications over the hardware. The connected car will have a vertical hardware stack closely integrated with a horizontal software stack. To dominate the market, a player would need to decide where their niche lies within the solution matrix.

However, no matter how you view the hardware players and service stack, there is a critical role for mobility, software and services. These three will form the framework for experiences, powered by analytics, data and connectivity. Just as content delivered over the car radio grew to be an essential channel for ad revenue in the past, the same will be true in the future as newer forms of content consumption arise from innovative content delivery systems in the connected car.

In the big picture, though, connectivity is only part of the story.

As the second-most expensive lifetime purchase (after a home) for the majority of consumers, a car is an investment unlike any other. Like fuel and maintenance, consumers will fund connectivity as a recurring expense, which we could see through a variety of vehicle touch points. There’s the potential for carriers to partner with every vehicle interaction that’s currently on the market, as well as those that will be developed in the future.

When consumers are filling up at the gas pump, they could pay via their connected car wallet. In the instance of charging electric cars while inside a store, consumers could also make payments on the go using their vehicles. The possibilities for revenue generation through connected cars are endless. Some automakers may try the Kindle-like model to bundle the hardware cost into the price of the car, but most mobile carriers will prefer it to be spread out into a more familiar pricing model with a steady stream of income.

Monetization of the connected car

Once this happens and carriers start measuring ARPC, it will force other industry players to rethink their approach more strategically. For example, bundling of mobile, car and home connectivity will be inevitable for app, data and entertainment services as an integrated experience. In the big picture, though, connectivity is only part of the story. Innovative carriers will succeed by going further and perfecting an in-car user experience that will excite consumers in ways no one can predict right now. As electric vehicles (EVs), hydrogen-powered fuel cells and advances in solar gain market practicality, cars may run without gas, but they will not run without connectivity.

The first true killer app for connected cars is likely to be some form of new media, and the monetization potential will be vast. With Gartner forecasting a market of 250 million connected cars on the road by 2020, creative methods for generating revenue streams in connected cars won’t stop there. Over the next few years, we will see partnerships proliferate among industry players, particularly mobile carriers. The ones who act fast enough to assume a leadership role in the market now will drive away with an influential status and a long-term win — if history has anything to say about it.

Note: In this case, the term “connected” brings together related concepts, such as Wi-Fi, Bluetooth and evolving cellular networks, including 3G, 4G/LTE, 5G, etc.

Source: http://cooltechreview.net/startups/how-connected-cars-are-turning-into-revenue-generating-machines/

Is 2016 Half Empty or Half Full?

11 Aug

With 2016 crossing the half way point, let’s take a look at some technology trends thus far.

Breaches: Well, many databases are half empty due to the continued rash of intrusions while the crooks are half full with our personal information. According to the Identity Theft Resource Center (ITRC), there have been 522 breaches thus far in 2016 exposing almost 13,000,000 records. Many are health care providers as our medical information is becoming the gold mine of stolen info. Not really surprising since the health care wearable market is set to explode in the coming years. Many of those wearables will be transmitting our health data back to providers. There were also a bunch of very recognizable names getting blasted in the media: IRS, Snapchat, Wendy’s and LinkedIn. And the best advice we got? Don’t use the same password across multiple sites. Updating passwords is a huge trend in 2016.

Cloud Computing: According to IDC, public cloud IaaS revenues are on pace to more than triple by 2020, from $12.6 billion in 2015 to $43.6 billion. The public cloud IaaS market grew 51% in 2015 but will slow slightly after 2017 as enterprises get past the wonder and move more toward cloud optimization rather than simply testing the waters. IDC also noted that four out of five IT organizations will be committed to hybrid architectures by 2018. While hybrid is the new normal, remember: The Cloud is Still just a Datacenter Somewhere. Cloud seems to be more than half full, and this comes at a time when ISO compliance in the cloud is becoming even more important.

DNS: I’ve said it before and I’ll say it again: DNS is one of the most important components of a functioning internet. With that, it presents unique challenges to organizations. Recently, Infoblox released its Q1 2016 Security Assessment Report and off the bat said, ‘In the first quarter of 2016, 519 files capturing DNS traffic were uploaded by 235 customers and prospects for security assessments by Infoblox. The results: 83% of all files uploaded showed evidence of suspicious activity (429 files).’ They list the specific threats, from botnets to protocol anomalies to Zeus and DDoS. A 2014 vulnerability, Heartbleed, still appears around 11% of the time. DevOps is even in the DNS game. In half-full news, VeriSign filed two patent applications describing the use of various DNS components to manage IoT devices. One is for systems and methods for establishing ownership and delegation of IoT devices using DNS services, and the other is for systems and methods for registering, managing, and communicating with IoT devices using DNS processes. Find that half-full smart mug…by name!

IoT: What can I say? The cup runneth over. Wearables are expected to close in on 215 million units shipped by 2020, with 102 million this year alone. I think that number is conservative, with smart eyewear, watches and clothing grabbing consumers’ attention. Then there’s the whole realm of industrial solutions like smart tractors, HVAC systems and other sensors tied to smart offices, factories and cities. In fact, utilities are among the largest IoT spenders and will be the third-largest industry by expenditure in IoT products and services. Over $69 billion has already been spent worldwide, according to the IDC Energy Insights/Ericsson report. And we haven’t even touched on all the smart appliances, robots and media devices finding spots in our homes. Get ready for Big Data regulations as more of our personal (and bodily) data gets pushed to the cloud. And we’re talking a lot of data.

Mobile: We are mobile, our devices are mobile and the applications we access are mobile. Mobility, in all its iterations, is a huge enabler of and concern for enterprises, and it’ll only get worse as we start wearing our connected clothing to the office. The Digital Dress Code has emerged. With 5G on the way, mobile is certainly half full and there is no emptying it now.
Of course, F5 has solutions to address many of these challenges, whether you’re boiling over or bone dry. Our security solutions, including Silverline, can protect against malicious attacks; no matter the cloud – private, public or hybrid – our cloud solutions can get you there and back; BIG-IP DNS, particularly DNS Express, can handle the incredible name-request boom as more ‘things’ get connected; and speaking of things, your data center will need to be agile enough to handle all the nouns requesting access. Also check out how TCP Fast Open can optimize your mobile communications.

That’s what I got so far and I’m sure 2016’s second half will bring more amazement, questions and wonders. We’ll do our year-end reviews and predictions for 2017 as we all lament, where did the Year of the Monkey go?

There’s that old notion that if you see a glass as half full, you’re an optimist, and if you see it as half empty, you’re a pessimist. I think you need to understand what state the glass was in before the question: was it empty and filled halfway, or was it full and poured out? There’s your answer!

Source: http://wireless.sys-con.com/node/3877543

5G: solving all the CTO’s problems?

18 Jan

 

Just weeks after the 3GPP kicked off its initial work on 5G standards, we are already hearing talk about ‘pre-5G’ or ‘proto-5G’ deployments; a second wave of standards (before the first has been defined); and forecasts for subscriber numbers (150 million in the first year, apparently). All this is terribly familiar – just as in 4G, there is a barrage of hype and false expectations, which will be followed by a cold reckoning, when mobile operators realize that they have no clear idea how to turn all those shiny new technologies into profits.

This time, however, operators’ shareholders and economic situation will not allow them to indulge in the spectrum frenzy of 3G or the architectural clean break of 4G. There will have to be a complete rethink of the costs and the return on investment before significant 5G deployments can start, otherwise there is little prospect of the new networks solving the CTO’s problems.

Those problems are varied and complex, but most are rooted in the fact that mobile broadband traffic is exploding, and while the cost per megabyte is falling, the price users are willing to pay for that megabyte is falling more steeply still. Increasingly, network upgrades are about clinging on to customers, not boosting margins. If 5G is to have any real value for the operator, it must accomplish two core objectives – reduce the cost of delivery by a far greater amount than 4G has done, and support brand new revenue and profit models, which can justify network investments. These use cases will vary for each operator, and some are not yet visible, so as well as supporting ubiquitous coverage and dense capacity at low cost, 5G will need to be infinitely flexible.

Dense, ubiquitous, software-driven, flexible – this will be an extremely hard combination to achieve, and it will be accomplished not just by core standards, but by the way that 5G is planned and managed. The network will need to be optimized and automated in real time, and constantly recalibrated to meet changing traffic patterns or applications. That will go a lot further to easing the CTO’s headaches than any step change in the air interface – and will become even more crucial as mobile networks become increasingly virtualized.

Virtualization and software-defined networking will improve mobile economics and broaden the addressable applications, but these are standards which are coming from the IT world, not from the traditional mobile arena. The CTO will increasingly be an IT as well as a telecoms executive, and the way that intelligence from the network is harnessed by IT systems, especially big data analytics, will be critical to the success of 5G.

The prospect of deploying a dense, ubiquitous, software-driven network, with rich links to IT platforms and extreme responsiveness, may sound like an even greater nightmare for the CTO. But without it, the operator will be unable to take advantage of the genuine new opportunities that lie ahead – particularly in some of the emerging Internet of Things markets – and will be stuck forever with a failing smartphone data business model.

The IoT will be extremely diverse and not all its needs will be met by cellular – indeed, its unifying technology will not be the access network, but the IT platforms which coordinate all the moving parts via a holistic view of each customer and service.

This is just the most extreme example of how MNOs will fail if they regard 5G merely as a new network architecture, however modern and virtualized. To turn that architecture into new services and new revenues, it must be planned and managed in a ‘5G’ way as well. 5G is a business model transformation for telcos rather than a purely technological one, so much of the ROI will come from the systems that manage and control service delivery and customer relationships, even when that customer may be a machine. There is a viable business case for the 5G mobile network operator, but it means making some dramatic changes in thinking as well as in networks.

Find out more. Download the ‘Why 5G won’t be enough’ analyst opinion paper.

Source: http://blogs.amdocs.com/network/2016/01/17/5g-solving-all-the-ctos-problems/#.Vpy3FirhC70


Power-Grid Hacked? Where’s the IoT?

1 Apr

Writing about the IoT (Internet of Things), or what was once called M2M, is something that people want to read about, a lot. It’s only recently that people are really catching on that everything is going to be connected. So when an article appeared on the front page of USA Today stating that the smart grid was open to hack, it certainly deserved a chuckle or two, especially from IoT advocates. No offense to my colleagues at USA Today, but this nationally syndicated newspaper chain was covering the topic as if the fact that vulnerabilities could threaten lives was a breaking news story.

Ironically, there are days people talk about the IoT as if it were something brand spanking new. Today, newspapers and the broadcast news eagerly espouse the virtues of connected devices because there are apps or gadgets for just about everything imaginable in the IoT. We are now seeing a consumer frenzy surrounding smartphones, fitness trackers, lights, toasters, automobiles, and even baby bottles being connected.

Many people are just beginning to understand the IoT is more than connecting a computer to the Internet, surfing the Web or watching a YouTube video. To really understand the Internet of Things is to recognize it is more than the everyday consumer gadgets that are getting all the media play these days. As USA Today so eloquently pointed out, the power grid is under attack every day, and, as the author stated clearly, at any moment an attack could leave millions of people without power for days or weeks. And that’s not even the worst of what could happen. Most residents do not equate the average brownout they experience for a few hours with the blackout that could be on the horizon in their neighborhood.

But again most people don’t give the IoT much thought. It’s kind of like their cellphones. Most people don’t know how they work. Nor do they care. They only care they work when and where they need it. The same holds true about their connected gadgets. Most consumers really don’t give their connected gadgets much thought until they need them for tracking their fitness, or turning on their lights or thermostats, or for finding the closest fast food restaurant when traveling in their cars. However, as more and more consumers adopt and adapt to electronic devices as part of their everyday lifestyle, this will change their attitudes and perceptions forever and the excitement for connected devices will trickle over into the enterprise. It is already happening with smart cities, with parking meters, trash pickups, snow removal, first responders, and smart utility meters.

Perhaps that is why the USA Today story has some real significance now and enterprise companies are starting to move away from just talking about the IoT to figuring out ways to implement solutions and services.

Part of the problem with the grid today is that it was designed with OMS (outage-management systems) that were configured to react to signals that indicated outages and to manage restoration. Going forward, however, the IoT systems being designed are able to prevent outages and restore services. These services, as one analyst firm says, could lead to a very bright future for the smart grid, and as a result, projections based on these services make sense and are very tangible.

While enterprises are looking to adopt the IoT, there seems to be a blurring of the lines between actual growth and hyperbole in market estimates. Vendors want to make huge growth predictions—50 billion devices—which currently is the buzz of the industry. However, these enormous market amplifications have already proven to stall growth.

Corporate America seeks growth forecasts that are meaningful and that help deliver solutions to improve bottom-line results and shareholder value. Again, one network carrier’s conjecture that the number of connections could quadruple by 2020, reaching more than 5 billion, doesn’t mean anything if all of these devices and connections are going to be hacked and CEOs’ heads are on the chopping block.

The same carrier was even quoted as saying in order for the IoT to reach these prognostications, networks must be reliable, the data from all of these connected endpoints must be able to be stored reliably and securely, infrastructures must be secure, and there must be ways to achieve device management.

If all the stars are in alignment, there is no question the IoT is poised for growth. But, that means everyone has to focus on making security a top priority to fend off the bad guys and to consider the market unknowns that can slow or delay IoT development.

That’s why the formation of groups like the ITA (Illinois Technology Assn.), www.illinoistech.org, and the Internet of Things Council, a public/private partnership that aims to assure civic leadership in the Internet of Things, will help companies sort the facts from the fiction and jumpstart their initiatives.

Thus, the more the industry does its crystal-ball gazing, the more we do a disservice to IoT’s true potential. Even Federal Energy Regulatory Commission Chairwoman Cheryl LaFleur was pretty poignant in her remarks when she was quoted in the USA Today article referring to the potential of an attack: “One is too many, so that’s why we have to pay attention. The threats continue to evolve and we have to continue to evolve as well.”

Makes you wonder if the industry is evolving or just continuing to bandy about forecasts with little or no regard for living up to market or shareholding expectations much like it has for the past 15 years. Regardless of what you believe in all of this, the IoT is changing our lives one way or the other and it will certainly have an even greater impact on each and every business. How and when, those are the billion dollar questions.

Source: http://connectedworld.com/power-grid-hacked-wheres-the-iot/

Smart cities to quadruple by the year 2025

4 Aug

The number of global smart cities is expected to grow from 21 in 2013 to an estimated 88 in 2025, according to a new report from IHS Technology. These smart cities will possess energy efficient infrastructures as well as keep a maintained focus on security and streamlined transportation efforts.


Lisa Arrowsmith, IHS Associate Director, defines a smart city as a city that has deployed “the integration of information and communications technology (ICT) solutions across three or more different functional areas of a city.” She further adds that these implementations could be in the realms of mobile and transport, energy and sustainability, physical infrastructure, governance, and safety and security.

Among the 21 cities IHS currently categorizes as smart are five in the U.S. – San Francisco, Los Angeles, Boston, Chicago and New York. According to the study, “Asia-Pacific will account for 32 smart cities of the total in nine years’ time, Europe will have 31, and the Americas will contribute 25.”


“London, for example, is retrofitting both residential and commercial buildings to lessen carbon dioxide emissions,” the study notes. “The city is also adopting charging infrastructure to support the introduction of 100,000 electric vehicles.” In Santander, Spain, it adds, “soil-humidity sensors detect when land requires irrigating for more sustainable water use.”

The IHS report titled, “Smart Cities: Business Models, Technologies and Existing Projects,” also finds that the current $1 billion worldwide annual investment in smart cities will grow to over $12 billion by the year 2025. The report continues on to demonstrate the need for smart cities as a response to increasingly congested and polluted cities.

With a global population that is becoming overly urbanized, certain resources are becoming scarce in these densely populated areas. Smart cities and tech based city organization can focus on these limited resources and assure they are managed in a way that provides the best solutions for inhabitants.

While today’s smart cities may not be the most cost-friendly option when reorganizing an urban area, Arrowsmith lauds the possibilities that smart planning could provide. She notes the collaboration of public and private sectors could unquestionably boost a local economy. Incorporating technology applications into city planning could in turn create jobs or even foster a high tech culture within the municipality.


The glowing example of a global smart city is Santander, Spain. After obtaining an EU grant, the aging port town organized a team to install over 12,000 sensors within city limits. BusinessWeek’s Carol Matlack writes that the sensors track everything from surfing conditions to traffic congestion. The city has even placed sensors deep in the ground of their parks to measure soil humidity and can then properly determine sprinkler usage. In all, Santander is a prime example of how technology and communication can work in unison to better organize the smart city of the future.

With the example Santander has provided as well as what plans for cities across the globe have in store, it’s certainly not far-fetched to believe in the projections provided by the IHS report. You can read the IHS document in its entirety here.

Source: http://atmelcorporation.wordpress.com/2014/08/01/report-smart-cities-to-quadruple-by-the-year-2025/

Big Data – Trends & Trajectories

4 Aug

Would you be taken aback if Big Data were declared the word of the year for 2014? Well, I certainly wouldn’t be. Although it initially started off as a paradigm, Big Data is permeating all facets of business at a fast pace. Digital data is everywhere, and there is a tremendous wave of innovation in the ways big data can be used to generate value across sectors of the global economy.

In this blog we shall discuss a few big data trends which will have immense significance in the days ahead.

Internet of customers:

In a panel discussion at the World Economic Forum 2014, when asked what will be important in the next 5 years, Marc Benioff, CEO of salesforce.com, elucidated the importance of big data in enhancing and maintaining the customer base. As we talk about mobility and the internet of things, we should recognize that behind every such device is a customer. It is not an “internet of things” but an “internet of customers”.

The catchword here is “Context”. With data explosion happening in every industry, we are gathering unprecedented amount of user contexts. Big data provides tremendous opportunities to harness these contexts to gain actionable insights on consumer behavior. It doesn’t really matter if you are a B2C or a B2B company but what actually matters is how effectively you utilize the potential of big data to extract useful contextual information and use it to build a 1:1 relationship with individual customers. The companies that use this opportunity to enhance their customer base will be the most successful in the future.

Good Data > Big Data: One of the most prominent illustrations of big data in action is Google Flu Trends (GFT), which uses aggregated Google search data to monitor real-time flu cases worldwide. Google used specific search terms and patterns to correlate how many people searched for flu-related topics with how many people actually had flu symptoms. With over 500 million Google searches made every day, this may seem to be the perfect big data case study, but as it turns out, GFT failed to perform as well as it was expected to. GFT overestimated the prevalence of flu in the 2012-2013 and 2011-2012 seasons by more than 50% and also completely missed the swine flu epidemic in 2009.

This has led many analysts to step back and reflect on the big data strategies which caused this failure. The fallacy that a huge amount of data leads to better analysis should be recognized. Rather than taking into consideration indiscriminate and unrelated datasets, which worsen the problem, the analysis should be based on precisely defined data aligned to the objectives. Big data methodologies can be successful, but only if they are based on accurate assumptions and are relevant.
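
The point about indiscriminate datasets can be demonstrated in a few lines of Python (a toy simulation of my own, not Google’s data): screen enough unrelated series against a target and some will correlate strongly by pure chance.

    import numpy as np

    # Spurious correlation: many unrelated series vs. one target signal.
    rng = np.random.default_rng(1)
    target = rng.normal(size=52)                # e.g. a year of weekly counts
    candidates = rng.normal(size=(50_000, 52))  # 50,000 unrelated signals

    t = target - target.mean()
    c = candidates - candidates.mean(axis=1, keepdims=True)
    corrs = c @ t / (np.linalg.norm(c, axis=1) * np.linalg.norm(t))
    print(f"strongest spurious correlation: {np.abs(corrs).max():.2f}")

With fifty thousand candidate predictors and only a year of weekly data, correlations around 0.5 appear with no underlying relationship at all, which is exactly how an unconstrained search over search terms can overfit.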

Open Sourcing: Google never made public the criteria it used to establish the search patterns and has hence hindered further analysis on the failure. This experiment necessitates the introduction of the open source culture in big data technologies. Studies involving astounding amount of data should involve greater cooperation and transparency between participating organizations which would in turn help build robust predictive models.

Visualization/User experience: Presenting data in an understandable and rational way is another issue concomitant with big data technologies. Software that helps deduce insights from big, complex datasets will be much in demand in the near future. Analytical business software with user-friendly and intuitive user interfaces will form a critical component of the sales of big data technologies.

Many technology giants have started to focus on building easy-to-use and engaging user experiences which would make them popular facilitators of big data. In one of his all-hands speeches in the second half of 2013, Jim Hagemann Snabe, Co-CEO of SAP AG, outlined SAP’s vision to change the perception that its software is complex and difficult to use. As far as SAP is concerned in this context, user experience is one of the focus points of SAP’s strategy, and it will go a long way in helping SAP further cement its position as one of the analytics market leaders and a promising enabler of big data technologies.

Source: http://chayani.wordpress.com/2014/08/03/big-data-trends-trajectories/

First Test with the EVRYTHNG Cloud, Raspberry Pi and Python

12 May

During my constant research and work on the topic of the “Internet of Everything” or “Internet of Things” (to be more specific: searching for a cool middleware), I came across a very interesting website: https://www.evrythng.com/. The whole concept behind it looks very promising, and I was tempted to give these guys a try. I have a few sensors, some knowledge of programming, and I need some online space to store my data. So the basic motivation is there to look for good concepts that are not doomed to be killed soon after their first release by losing the race to be the best middleware. As a start I registered an account, and my first idea was: connect a sensor to their cloud-based storage system and simply try out how it might look. Of course they offer a variety of other services, like:

  • Products, which describe a class of THNGs (THNG = smallest unit)
  • Applications, which can be used to connect e.g. a web browser
  • Campaigns
  • Analytics
  • and many other very useful services.

Basically, you can build a whole ecosystem on their online spaces. So where to start? First things first: I decided to build a simple use case connecting only one sensor, which seemed sufficient for a small test. The sensor is the basic entity in their terms: a THNG. So for a fast start: build a small sensor and then try to connect it via a Python script. My self-imposed task was to use Python as the programming language. (I could have used a different language, like Java; that would have been much easier because a ready-to-use Java API already exists here ..)

At first I thought about a sensor architecture that included a sensor gateway to connect my local sensors (picture: Basic Sensor Architecture). Why? A gateway can:

  • protect the communication from the sensors to the cloud-based services
  • use stronger cryptographic algorithms, since it has more CPU power
  • protect the sensors from being accessed from the Internet
  • store data when the connection to the cloud services is lost
  • lower the traffic volume by sending data only when a value changes (saving bandwidth)
  • bridge connectivity options, e.g. between Ethernet and Bluetooth
  • translate IPv4 into IPv6 or vice versa
  • etc …

(A minimal sketch of the store-and-forward idea follows the architecture figure below.)

[Figure: Basic Sensor Architecture]
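Since I did not build the gateway, here is a minimal sketch (my own illustration, not part of the PoC below) of the store-and-forward and send-on-change behaviour such a gateway could implement. The class and function names are assumptions, not EVRYTHNG APIs:

import time

class SensorGateway:
    def __init__(self, upload_func):
        self.upload = upload_func   # e.g. a call into a cloud API
        self.buffer = []            # readings queued while the uplink is down
        self.last_value = None

    def handle_reading(self, value):
        if value == self.last_value:
            return                  # send-on-change: skip unchanged values
        self.last_value = value
        self.buffer.append({'key': 'temperature', 'value': value,
                            'timestamp': int(time.time() * 1000)})
        self.flush()

    def flush(self):
        while self.buffer:
            try:
                self.upload(self.buffer[0])
                self.buffer.pop(0)  # only drop a reading after a successful upload
            except IOError:
                break               # uplink down: keep the data buffered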

But this would have taken some time, so I simplified the architecture a little bit and connected a sensing device directly to the EVRYTHNG cloud, which is reflected by the following picture: Proof of Concept Architecture.

[Figure: Proof of Concept Architecture]

Of course, the sensing device must have enough CPU and storage capacity to run some functionality (Python, ssh, etc.) on it, and there must be at least Ethernet connectivity to transport its data to its final destination. Given these basic requirements, I made my choice: one of my Raspberry Pis. A breadboard, some wires, a resistor and a waterproof temperature sensor did the trick:

[Figure: Raspberry Pi as a sensor]

Next step: getting data from the sensor via Python. This was not a difficult task, and there are a lot of very good tutorials on how to read sensor values in Python: e.g. here. So I skipped that step in my description to concentrate on the new things.
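For completeness, here is a minimal sketch of the step I skipped, assuming the common DS18B20 1-Wire waterproof sensor (a typical choice for this kind of Raspberry Pi setup; whether it matches my exact hardware is an assumption). The helper name read_ctemperature() is what the PoC loop below expects:

import glob

def read_ctemperature():
    # the w1-gpio / w1-therm kernel modules expose the sensor as a file
    device_file = glob.glob('/sys/bus/w1/devices/28-*/w1_slave')[0]
    with open(device_file) as f:
        lines = f.readlines()
    if lines[0].strip().endswith('YES'):        # CRC check passed
        t_pos = lines[1].find('t=')
        return float(lines[1][t_pos + 2:]) / 1000.0   # value is in milli-degrees C
    return None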

I signed up for a test account, and after registering I got my personal access. The next step was to set up a THNG as the base entity to send data to. I decided not to do it via API calls (which would have forced me to write another script); instead I did it the traditional way, using the EVRYTHNG web GUI:

[Figure: THNG Creation]

After creating the THNG (this is also where you find the THNG ID, greyed out in the screenshot, which you need to access it via API calls from your sensor), you can define some properties that will be filled with data from your devices. (A really nice one is geo location: if you have a moving sensor, e.g. in a car, this can easily become quite handy.) For this THNG I used a single property: temperature. Of course you can use multiple data fields, but again, my task was only a simple test to see whether I could get the whole story up and running.
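For reference, creating the THNG via the REST API instead of the GUI would look roughly like this. This is a hedged sketch based on my reading of the REST conventions used below, not something I ran; the payload fields are assumptions:

import urllib2
import json

def create_thng(api_token, name):
    # assumed payload fields; the real API may expect more or different keys
    payload = json.dumps({'name': name, 'description': 'Raspberry Pi sensor'})
    request = urllib2.Request("https://api.evrythng.com/thngs", payload,
                              {'Content-Type': 'application/json',
                               'Authorization': api_token})
    # the response should contain the new THNG, including its id
    return json.loads(urllib2.urlopen(request).read())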

But the most interesting part was getting the EVRYTHNG REST API to work with Python. No Python API exists so far, so I had to experiment a little on how to write PoC code for this task. Again, it was not really a challenge, because the EVRYTHNG API is very well documented; the documentation can be found here. It contains a lot of examples of how the REST interface works, which return codes you can get back, how to create and delete entities, and most importantly, how to store and access your data. I decided to build a small class suitable for a sensor with a limited amount of space. The basic task was sending data via http, and before you can do that, converting the Python data dictionary to JSON. (Yes, https is also an option, and it is recommended once you use this in a production environment!) I used urllib2 for the transport and the json encoder/decoder for the conversion. Nearly everything worked straight away, except one little glitch which cost me some time: to convert the data to a format the EVRYTHNG API accepts, I had to wrap the final JSON in brackets, like "[" + self.DATA + "]", so it is sent as a JSON array. After that, my little program worked and I was able to push valid data into EVRYTHNG. You can find the Python class below. Feel free to use it (at your own risk) …

'''
@author: Axel Dittmann
'''
import urllib2
import json

class sensorthng:
    '''
    a very small class just to update sensors .. no further functionality provided
    '''
    def __init__(self, API_Token):
        self.API_TOKEN = API_Token
        self.HEADER_UPDATE = {'Content-Type': 'application/json',
                              'Authorization': self.API_TOKEN}
        self.URL = "https://api.evrythng.com/thngs/"

    def open_query(self, query_URL, data, header):
        self.URL_REQUEST = query_URL
        self.DATA = data
        self.HEADER = header
        # request is GET if data is None, POST if data carries valid JSON
        self.REQUEST = urllib2.Request(self.URL_REQUEST, self.DATA, self.HEADER)
        self.RESPONSE = urllib2.urlopen(self.REQUEST)
        return self.RESPONSE.read()

    # updating properties of a THNG
    def updating_properties(self, thng_id, data):
        self.DATA = data
        self.QUERY_URL = self.URL + thng_id + "/properties"
        # convert the Python dictionary to JSON
        self.DATA = json.dumps(self.DATA)
        # now the important part: wrap the JSON in [] (the API expects an
        # array), otherwise the call would not work
        self.DATA = "[" + self.DATA + "]"
        return self.open_query(self.QUERY_URL, self.DATA, self.HEADER_UPDATE)

My basic Proof-of-Concept code looked like this:

import time

mythng = sensorthng("USE your API-Access-Code here")

while True:
    try:
        # read_ctemperature() comes from the sensor-reading step above
        c_temp = read_ctemperature()
        print c_temp
        data = {'key': 'temperature', 'value': c_temp}
        print mythng.updating_properties("USE your THNG-ID here", data)
        time.sleep(60)
    except:
        print "Error occurred .. next try"

It does nothing more than read the Celsius temperature from the sensor and send the data; if an error occurs, e.g. my Internet connection goes down, it simply keeps looping until the connection is restored. So no interruption of service in case of failure …
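A possible refinement of this retry behaviour (my own suggestion, not part of the original script) would be to back off exponentially on repeated failures instead of retrying at a fixed 60-second interval. A minimal sketch:

import time

def send_with_backoff(send_once, max_delay=900):
    # send_once is assumed to perform one read-and-upload cycle and to raise
    # IOError on failure (urllib2 errors are IOError subclasses)
    delay = 60
    while True:
        try:
            send_once()
            delay = 60                         # success: back to the normal interval
        except IOError:
            delay = min(delay * 2, max_delay)  # failure: wait longer each time
        time.sleep(delay)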

So basically: write a little code around the class, use the API token from your EVRYTHNG login and the THNG ID from your THNG screen, and it should just work; the script updates your data. On my Raspberry Pi I decided to send a continuous stream, one reading every 60 seconds:

[Figure: Temperature and EVRYTHNG API return data]

In this picture you can see that EVRYTHNG adds a timestamp to your data if you do not provide one. The output also shows the return info from the EVRYTHNG REST API once the data has been sent, confirming that everything worked and the data was successfully delivered.
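If you would rather supply the timestamp yourself, the property update can carry one explicitly. This is a hedged sketch based on the timestamp behaviour observed above; the exact field name is an assumption:

import time

def make_property_update(value):
    # attach an explicit timestamp (milliseconds since the epoch) instead of
    # letting the server assign one on arrival; the 'timestamp' field name
    # is an assumption, not confirmed against the API documentation
    return {'key': 'temperature', 'value': value,
            'timestamp': int(time.time() * 1000)}

And now let's take a look at how it appears in the EVRYTHNG web GUI: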

[Figure: THNG Overview]

After clicking on the properties of my PI_temp_sensor, you can then see a nice little graphical view of the submitted data:

[Figure: Temperature Data]

The graphical view reflects the real-life situation: at 8:10 it started to rain, which caused the temperature to drop 🙂.

So, first test done. In my opinion it was very easy, and for me this little test was a great success, because everything worked "out of the box" and straightforward! The EVRYTHNG idea has great potential, and I am quite sure they will make their way to becoming one of the leading THNG / IoT providers in the near future. The topics I haven't covered yet are really interesting: using coordinates on a map to get a better overview of your THNGs, grouping THNGs, etc. I haven't discovered EVRYTHNG's full potential yet, but since every day brings something new to learn, there must be something left for tomorrow 🙂 …

Source: http://ipv6poclab.org/2014/05/11/first-test-with-evrythng-cloud-raspberry-pi-and-python/
