Tag Archives: IoT

Unlearn to Unleash Your Data Lake

16 Sep

The Data Science Process is about exploring, experimenting, and testing new data sources and analytic tools quickly.

The Challenge of Unlearning
For the first two decades of my career, I worked to perfect the art of data warehousing. I was fortunate to be at Metaphor Computers in the 1980s, where we refined the art of dimensional modeling and star schemas. I spent many years honing my star schema and dimensional modeling skills with data warehouse luminaries like Ralph Kimball, Margy Ross, Warren Thornthwaite, and Bob Becker. It became ingrained in every customer conversation; I’d build the star schema and the conformed dimensions in my head as the client explained their data analysis requirements.

Then Yahoo happened to me and soon everything that I held as absolute truth was turned upside down. I was thrown into a brave new world of analytics based upon petabytes of semi-structured and unstructured data, hundreds of millions of customers with 70 to 80 dimensions and hundreds of metrics, and the need to make campaign decisions in fractions of a second. There was no way that my batch “slice and dice” business intelligence and highly structured data warehouse approach was going to work in this brave new world of real-time, predictive and prescriptive analytics.

I struggled to unlearn ingrained data warehousing concepts in order to embrace this new real-time, predictive and prescriptive world. And this is one of the biggest challenges facing IT leaders today – how to unlearn what they’ve held as gospel and embrace what is new and different. And nowhere do I see that challenge more evident than when I’m discussing Data Science and the Data Lake.

Embracing The “Art of Failure” and The Data Science Process
Nowadays, Chief Information Officers (CIOs) are being asked to lead the digital transformation from a batch world that uses data and analytics to monitor the business to a real-time world that exploits internal and external, structured and unstructured data to predict what is likely to happen and prescribe recommendations. To power this transition, CIOs must embrace a new approach for deriving customer, product, and operational insights – the Data Science Process (see Figure 2).

Figure 2:  Data Science Engagement Process

The Data Science Process is about exploring, experimenting, and testing new data sources and analytic tools quickly, failing fast but learning faster. It requires business leaders to get comfortable with “good enough” and with failing enough times to trust imperfect analytic results. Predictions are never 100% accurate. As Yogi Berra famously stated:

“It’s tough to make predictions, especially about the future.”

This highly iterative, fail-fast-but-learn-faster process is the heart of digital transformation – to uncover new customer, product, and operational insights that can optimize key business and operational processes, mitigate regulatory and compliance risks, uncover new revenue streams and create a more compelling, more prescriptive customer engagement. And the platform that is enabling digital transformation is the Data Lake.

The Power of the Data Lake
The data lake exploits the economics of big data. By coupling commodity, low-cost servers and storage with open source tools and technologies, it is 50x to 100x cheaper to store, manage and analyze data than traditional, proprietary data warehousing technologies. However, it’s not just cost that makes the data lake a more compelling platform than the data warehouse. The data lake also provides a new way to power the business, based upon new data and analytics capabilities, agility, speed, and flexibility (see Table 1).

| Data Warehouse | Data Lake |
| --- | --- |
| Data structured in heavily-engineered dimensional schemas | Data structured as-is (structured, semi-structured, and unstructured formats) |
| Heavily-engineered, pre-processed data ingestion | Rapid as-is data ingestion |
| Generates retrospective reports from historical, operational data sources | Generates predictions and prescriptions from a wide variety of internal and external data sources |
| 100% accurate results of past events and performance | “Good enough” predictions of future events and performance |
| Schema-on-load to support historical reporting on what the business did | Schema-on-query to support rapid data exploration and hypothesis testing |
| Extremely difficult to ingest and explore new data sources (measured in weeks or months) | Easy and fast to ingest and explore new data sources (measured in hours or days) |
| Monolithic design and implementation (waterfall) | Natively parallel, scale-out design and implementation (scrum) |
| Expensive and proprietary | Cheap and open source |
| Widespread data proliferation (data warehouses and data marts) | Single managed source of organizational data |
| Rigid; hard to change | Agile; relatively easy to change |

Table 1:  Data Warehouse versus Data Lake
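
To make the schema-on-query distinction in Table 1 concrete, here is a minimal Python sketch (the event records and field names are hypothetical): raw, semi-structured records land in the lake as-is, and a structure is imposed only at query time.

```python
import json

# Hypothetical semi-structured events, landed in the lake "as-is" --
# note the records do not all share the same fields.
raw_events = [
    '{"user": "a1", "event": "click", "ts": 1}',
    '{"user": "b2", "event": "view", "ts": 2, "device": "mobile"}',
    '{"user": "a1", "event": "click", "ts": 3}',
]

# Schema-on-query: parse and project fields only when a question is asked.
records = [json.loads(line) for line in raw_events]
clicks_per_user = {}
for r in records:
    if r["event"] == "click":
        clicks_per_user[r["user"]] = clicks_per_user.get(r["user"], 0) + 1

print(clicks_per_user)  # {'a1': 2}
```

A data warehouse would require these fields to be modeled and conformed before ingestion; the lake defers that work until exploration time.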

The data lake supports the unique requirements of the data science team to:

  • Rapidly explore and vet new structured and unstructured data sources
  • Experiment with new analytics algorithms and techniques
  • Quantify cause and effect
  • Measure goodness of fit
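
As a concrete illustration of that last bullet, “goodness of fit” is often quantified with a statistic such as R²; here is a minimal Python sketch using made-up observed and predicted values:

```python
# Hedged sketch: compute R^2 (coefficient of determination) for a
# hypothetical set of predictions -- the numbers are illustrative only.
observed = [3.0, 5.0, 7.0, 9.0]
predicted = [2.8, 5.1, 7.3, 8.8]

mean_obs = sum(observed) / len(observed)
ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))  # residual sum of squares
ss_tot = sum((o - mean_obs) ** 2 for o in observed)              # total sum of squares
r_squared = 1 - ss_res / ss_tot

print(round(r_squared, 3))  # 0.991
```

An R² near 1.0 indicates a tight fit; a data science team iterating fast will often accept a “good enough” value well below that.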

The data science team needs to be able to perform this cycle in hours or days, not weeks or months. The data warehouse cannot support these data science requirements. It cannot support rapid exploration of internal and external, structured and unstructured data sources. It cannot leverage the growing field of deep learning, machine learning, and artificial intelligence tools to quantify cause and effect. Thinking that the data lake is “cold storage for our data warehouse” – as one data warehouse expert told me – misses the bigger opportunity. That’s yesterday’s “triangle offense” thinking. The world has changed, and just as the game of basketball is being changed by the “economics of the 3-point shot,” business models are being changed by the “economics of big data.”

But a data lake is more than just a technology stack. To truly exploit the economic potential of the organization’s data, the data lake must come with data management services covering data accuracy, quality, security, completeness and governance. See “Data Lake Plumbers: Operationalizing the Data Lake” for more details (see Figure 3).

Figure 3:  Components of a Data Lake

If the data lake is only going to be used as another data repository, then go ahead and toss your data into your unmanageable gaggle of data warehouses and data marts.

BUT if you are looking to exploit the unique characteristics of data and analytics – assets that never deplete, never wear out and can be used across an infinite number of use cases at zero marginal cost – then the data lake is your “collaborative value creation” platform. The data lake becomes the platform that supports the capture, refinement, protection and re-use of your data and analytic assets across the organization.

But you must be ready to unlearn what you have held as gospel truth with respect to data and analytics; to throw away what you have mastered in order to embrace new concepts, technologies, and approaches. It’s challenging, but the economics of big data are too compelling to ignore. In the end, the transition will be enlightening and rewarding. I know, because I have made that journey.

Source: http://cloudcomputing.sys-con.com/node/4157284


How connected cars are turning into revenue-generating machines

29 Aug


At some point within the next two to three years, consumers will come to expect car connectivity to be standard, similar to the adoption curve for GPS navigation. As this new era begins, the telecom metric of ARPU (average revenue per user) will morph into ARPC (average revenue per car).

In that time frame, automotive OEMs will see a variety of revenue-generating touch points for connected vehicles at gas stations, electric charging stations and more. We also should expect progressive mobile carriers to gain prominence as essential links in the automotive value chain within those same two to three years.

Early in 2016, that transitional process began with the quiet but dramatic announcement of a statistic that few noted at the time. The industry crossed a critical threshold in the first quarter when net adds of connected cars (32 percent) rose above the net adds of smartphones (31 percent) for the very first time. At the top of the mobile carrier chain, AT&T led the world with around eight million connected cars already plugged into its network.

The next big event to watch for in the development of ARPC will be when connected cars trigger a significant redistribution of revenue among the value chain players. In this article, I will focus mostly on recurring connectivity-driven revenue. I will also explore why automakers must develop deep relationships with mobile carriers and Tier-1s to hold on to their pieces of the pie in the connected-car market by establishing control points.

After phones, cars will be the biggest category for mobile-data consumption.

It’s important to note here that my conclusions on the future of connected cars are not shared by everyone. One top industry executive at a large mobile carrier recently asked me, “Why do we need any other form of connectivity when we already have mobile phones?” Along the same lines, some connected-car analysts have suggested that eSIM technology will encourage consumers to simply add in-car connectivity to their existing wireless plans.

Although there are differing points of view, it’s clear to me that built-in embedded-SIM for connectivity will prevail over tethering with smartphones. The role of Tier-1s will be decisive for both carriers and automakers as they build out the future of the in-car experience, including infotainment, telematics, safety, security and system integration services.

The sunset of smartphone growth

Consider the U.S. mobile market as a trendsetter for the developed world in terms of data-infused technology. You’ll notice that phone revenues are declining. Year-over-year sales of mobile phones have registered a 6.5 percent drop in North America and an even more dramatic 10.8 percent drop in Europe. This is because of a combination of total market saturation and economic uncertainty, which encourages consumers to hold onto their phones longer.

While consumer phone upgrades have slowed, non-phone connected devices are becoming a significant portion of net-adds and new subscriptions. TBR analyst Chris Antlitz summed up the future mobile market: “What we are seeing is that the traditional market that both carriers [AT&T and Verizon] go after is saturated, since pretty much everyone who has wanted a cell phone already has one… Both companies are getting big into IoT and machine-to-machine and that’s a big growth engine.”

At the same time, AT&T and Verizon are both showing a significant uptick in IoT revenue, even though we are still in the early days of this industry. AT&T crossed the $1 billion mark and Verizon posted earnings of $690 million in the IoT category for last year, with 29 percent of that total in the fourth quarter alone.

Data and telematics

While ARPU is on the decline, data is consuming a larger portion of the pie. Just consider some astonishing facts about data usage growth from Cisco’s Visual Networking Index 2016. Global mobile data traffic grew 74 percent over the past year, to more than 3.7 exabytes per month. Over the past 10 years, we’ve seen a 4,000X growth in data usage. After phones, cars will be the biggest category for mobile-data consumption.
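
As a back-of-the-envelope check on those figures (assuming simple compound growth), a 4,000x increase over 10 years implies a compound annual growth rate far above even the 74 percent year-over-year figure:

```python
# Implied compound annual growth rate (CAGR) for a 4,000x rise over 10 years.
growth_factor = 4000
years = 10
cagr = growth_factor ** (1 / years) - 1
print(f"implied CAGR: {cagr:.0%}")  # roughly 129% per year
```

That gap suggests the 74 percent annual figure and the 4,000x decade figure describe different points on the curve, with the earliest years growing much faster than the most recent one.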

Most cars have around 150 different microprocessor-controlled sub-systems built by different functional units. The complexity of integrating these systems adds to the time and cost of manufacturing. Disruptive companies like Tesla are challenging that model with a holistic design of telematics. As eSIM becomes a standard part of the telematics control unit (TCU), it could create one of the biggest disruptive domino effects the industry has seen in recent years. That’s why automakers must develop deep relationships with mobile carriers and Tier-1s.

The consumer life cycle for connected cars will initially have to be much longer than it is for smartphones.

Virtualization of our cars is inevitable. It will have to involve separate but interconnected systems because the infrastructure is inherently different for control versus convenience networks. Specifically, instrument clusters, telematics and infotainment environments have very different requirements than those of computing, storage and networking. To create a high-quality experience, automakers will have to work through hardware and software issues holistically.

Already we see Apple’s two-year iPhone release schedule expanding to a three-year span because of gentler innovations and increasing complexity. The consumer life cycle for connected cars will initially have to be much longer than it is for smartphones because of this deep integration required for all the devices, instruments and functionalities that operate the vehicle.

Five factors unique to connected cars

Disruption is everywhere within the auto industry, similar to the disruption that shook out telecom. However, there are several critical differences:

  • Interactive/informative surface. The mobile phone has one small screen with all the technology packed in behind it. Inside a car, nearly every surface could be transformed into an interactive interface. Beyond the instrumentation panel, which has been gradually claiming more real estate on the steering wheel, there will be growth in backseat and rider-side infotainment screens. (Semi-) autonomous cars will present many more possibilities.
  • Processing power. The cloud turned mobile phones into smart clients with all the heavy processing elsewhere, but each car can contain a portable data center all its own. Right now, the NVIDIA Tegra X1 mobile processor for connected cars, used to demonstrate its Drive CX cockpit visualizations, can handle one trillion floating-point operations per second (flops). That’s roughly the same computing power as a 1,600-square-foot supercomputer from the year 2000.
  • Power management. The size and weight of phones were constrained for many years by the size of the battery required. The same is true of cars, but in terms of power and processing instead of the physical size and shape of the body frame. Consider apps like Pokémon Go, which are known as battery killers because of their extensive use of the camera for augmented reality and constant GPS usage. In the backseat of a car, Pokémon Go could run phenomenally with practically no effect on the car battery. Perhaps car windows could even serve as augmented reality screens.
  • Risk factors. This is the No. 1 roadblock to connected cars right now. The jump from consumer-grade to automotive-grade security is just too great for comfort. Normally, when somebody hacks a phone, nobody gets hurt physically. A cybersecurity report this year pointed out that connected cars average 100 million lines of code, compared to only 8 million for a Lockheed Martin F-35 Lightning II fighter jet. In other words, security experts have a great deal of work to do to protect connected cars from hackers and random computer errors.
  • Emotional affinity. Phones are accessories, but a car is really an extension of the driver. You can see this aspect in the pride people display when showing off their cars and their emotional attachment to their cars. This also explains why driverless cars and services like Uber are experiencing a hard limit on their market penetration. For the same reasons, companies that can’t provide flawless connectivity in cars could face long-lasting damage to their brand reputations.

Software over hardware

The value in connected cars will increasingly concentrate in software and applications over the hardware. The connected car will have a vertical hardware stack closely integrated with a horizontal software stack. To dominate the market, a player would need to decide where their niche lies within the solution matrix.

However, no matter how you view the hardware players and service stack, there is a critical role for mobility, software and services. These three will form the framework for experiences, powered by analytics, data and connectivity. Just as content delivered over the car radio grew to be an essential channel for ad revenue in the past, the same will be true in the future as newer forms of content consumption arise from innovative content delivery systems in the connected car.

In the big picture, though, connectivity is only part of the story.

As the second-most expensive lifetime purchase (after a home) for the majority of consumers, a car is an investment unlike any other. Like fuel and maintenance, consumers will fund connectivity as a recurring expense, which we could see through a variety of vehicle touch points. There’s the potential for carriers to partner with every vehicle interaction that’s currently on the market, as well as those that will be developed in the future.

When consumers are filling up at the gas pump, they could pay via their connected car wallet. In the instance of charging electric cars while inside a store, consumers could also make payments on the go using their vehicles. The possibilities for revenue generation through connected cars are endless. Some automakers may try the Kindle-like model to bundle the hardware cost into the price of the car, but most mobile carriers will prefer it to be spread out into a more familiar pricing model with a steady stream of income.

Monetization of the connected car

Once this happens and carriers start measuring ARPC, it will force other industry players to rethink their approach more strategically. For example, bundling of mobile, car and home connectivity will be inevitable for app, data and entertainment services as an integrated experience. In the big picture, though, connectivity is only part of the story. Innovative carriers will succeed by going further and perfecting an in-car user experience that will excite consumers in ways no one can predict right now. As electric vehicles (EVs), hydrogen-powered fuel cells and advances in solar gain market practicality, cars may run without gas, but they will not run without connectivity.

The first true killer app for connected cars is likely to be some form of new media, and the monetization potential will be vast. With Gartner forecasting a market of 250 million connected cars on the road by 2020, creative methods for generating revenue streams in connected cars won’t stop there. Over the next few years, we will see partnerships proliferate among industry players, particularly mobile carriers. The ones who act fast enough to assume a leadership role in the market now will drive away with an influential status and a long-term win — if history has anything to say about it.

Note: In this case, the term “connected” brings together related concepts, such as Wi-Fi, Bluetooth and evolving cellular networks, including 3G, 4G/LTE, 5G, etc.

Featured Image: shansekala/Getty Images
Source: http://cooltechreview.net/startups/how-connected-cars-are-turning-into-revenue-generating-machines/

Is 2016 Half Empty or Half Full?

11 Aug

With 2016 crossing the half way point, let’s take a look at some technology trends thus far.

Breaches: Well, many databases are half empty due to the continued rash of intrusions while the crooks are half full with our personal information. According to the Identity Theft Resource Center (ITRC), there have been 522 breaches thus far in 2016 exposing almost 13,000,000 records. Many are health care providers as our medical information is becoming the gold mine of stolen info. Not really surprising since the health care wearable market is set to explode in the coming years. Many of those wearables will be transmitting our health data back to providers. There were also a bunch of very recognizable names getting blasted in the media: IRS, Snapchat, Wendy’s and LinkedIn. And the best advice we got? Don’t use the same password across multiple sites. Updating passwords is a huge trend in 2016.

Cloud Computing: According to IDC, public cloud IaaS revenues are on pace to more than triple by 2020, from $12.6 billion in 2015 to $43.6 billion in 2020. The public cloud IaaS market grew 51% in 2015 but will slow slightly after 2017 as enterprises get past the wonder and move toward cloud optimization rather than simply testing the waters. IDC also noted that four out of five IT organizations will be committed to hybrid architectures by 2018. While hybrid is the new normal, remember: The Cloud is Still just a Datacenter Somewhere. Cloud seems to be more than half full, and this comes at a time when ISO compliance in the cloud is becoming even more important.
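
A quick sanity check on IDC’s “more than triple” claim, using simple arithmetic on the cited figures:

```python
# IDC figures cited above: $12.6B (2015) growing to $43.6B (2020).
start_revenue, end_revenue = 12.6, 43.6  # $ billions
factor = end_revenue / start_revenue
cagr = factor ** (1 / 5) - 1  # five years, 2015 -> 2020
print(f"{factor:.2f}x growth, about {cagr:.0%} per year")
```

So “more than triple” works out to roughly 3.5x, or a compound annual growth rate in the high twenties.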

DNS: I’ve said it before and I’ll say it again: DNS is one of the most important components of a functioning internet. With that, it presents unique challenges to organizations. Recently, Infoblox released its Q1 2016 Security Assessment Report and off the bat said, ‘In the first quarter of 2016, 519 files capturing DNS traffic were uploaded by 235 customers and prospects for security assessments by Infoblox. The results: 83% of all files uploaded showed evidence of suspicious activity (429 files).’ They list the specific threats, from botnets to protocol anomalies to Zeus and DDoS. A 2014 vulnerability, Heartbleed, still appears around 11% of the time. DevOps is even in the DNS game. In half-full news, VeriSign filed two patent applications describing the use of various DNS components to manage IoT devices. One is for systems and methods for establishing ownership and delegation of IoT devices using DNS services, and the other is for systems and methods for registering, managing, and communicating with IoT devices using DNS processes. Find that half full smart mug…by name!
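
The Infoblox percentages quoted above check out with simple arithmetic:

```python
# 429 suspicious files out of 519 uploaded, per the Infoblox report.
total_files, suspicious_files = 519, 429
print(f"{suspicious_files / total_files:.0%}")  # 83%
```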

IoT: What can I say? The cup runneth over. Wearables are expected to close in on 215 million units shipped by 2020, with 102 million this year alone. I think that number is conservative, with smart eyewear, watches and clothing grabbing consumers’ attention. Then there’s the whole realm of industrial solutions like smart tractors, HVAC systems and other sensors tied to smart offices, factories and cities. In fact, utilities are among the largest IoT spenders and will be the third-largest industry by expenditure in IoT products and services. Over $69 billion has already been spent worldwide, according to the IDC Energy Insights/Ericsson report. And we haven’t even touched on all the smart appliances, robots and media devices finding spots in our homes. Get ready for Big Data regulations as more of our personal (and bodily) data gets pushed to the cloud. And we’re talking a lot of data.

Mobile: We are mobile, our devices are mobile and the applications we access are mobile. Mobility, in all its iterations, is a huge enabler and concern for enterprises, and it’ll only get worse as we start wearing our connected clothing to the office. The Digital Dress Code has emerged. With 5G on the way, mobile is certainly half full and there is no emptying it now.

Of course, F5 has solutions to address many of these challenges, whether you’re boiling over or bone dry. Our security solutions, including Silverline, can protect against malicious attacks; no matter the cloud – private, public or hybrid – our Cloud solutions can get you there and back; BIG-IP DNS, particularly DNS Express, can handle the incredible name-request boom as more ‘things’ get connected; and speaking of things, your data center will need to be agile enough to handle all the nouns requesting access. Also check out how TCP Fast Open can optimize your mobile communications.

That’s what I got so far and I’m sure 2016’s second half will bring more amazement, questions and wonders. We’ll do our year-end reviews and predictions for 2017 as we all lament, where did the Year of the Monkey go?

There’s that old notion that if you see a glass half full, you’re an optimist, and if you see it half empty, you’re a pessimist. I think you need to understand what state the glass was in before the question was asked. Was it empty and filled halfway, or was it full and poured out? There’s your answer!

Source: http://wireless.sys-con.com/node/3877543

5G: solving all the CTO’s problems?

18 Jan


Just weeks after the 3GPP kicked off its initial work on 5G standards, we are already hearing talk about ‘pre-5G’ or ‘proto-5G’ deployments; a second wave of standards (before the first has been defined); and forecasts for subscriber numbers (150 million in the first year, apparently). All this is terribly familiar – just as in 4G, there is a barrage of hype and false expectations, which will be followed by a cold reckoning, when mobile operators realize that they have no clear idea how to turn all those shiny new technologies into profits.

This time, however, operators’ shareholders and economic situation will not allow them to indulge in the spectrum frenzy of 3G or the architectural clean break of 4G. There will have to be a complete rethink of the costs and the return on investment before significant 5G deployments can start, otherwise there is little prospect of the new networks solving the CTO’s problems.

Those problems are varied and complex, but most are rooted in the fact that mobile broadband traffic is exploding, and while the cost per megabyte is falling, the price users are willing to pay for that megabyte is falling more steeply still. Increasingly, network upgrades are about clinging on to customers, not boosting margins. If 5G is to have any real value for the operator, it must accomplish two core objectives – reduce the cost of delivery by a far greater amount than 4G has done, and support brand new revenue and profit models, which can justify network investments. These use cases will vary for each operator, and some are not yet visible, so as well as supporting ubiquitous coverage and dense capacity at low cost, 5G will need to be infinitely flexible.

Dense, ubiquitous, software-driven, flexible – this will be an extremely hard combination to achieve, and it will be accomplished not just by core standards, but by the way that 5G is planned and managed. The network will need to be optimized and automated in real time, and constantly recalibrated to meet changing traffic patterns or applications. That will go a lot further to easing the CTO’s headaches than any step change in the air interface – and will become even more crucial as mobile networks become increasingly virtualized.

Virtualization and software-defined networking will improve mobile economics and broaden the addressable applications, but these are standards which are coming from the IT world, not from the traditional mobile arena. The CTO will increasingly be an IT as well as a telecoms executive, and the way that intelligence from the network is harnessed by IT systems, especially big data analytics, will be critical to the success of 5G.

The prospect of deploying a dense, ubiquitous, software-driven network, with rich links to IT platforms and extreme responsiveness, may sound like an even greater nightmare for the CTO. But without it, the operator will be unable to take advantage of the genuine new opportunities that lie ahead – particularly in some of the emerging Internet of Things markets – and will be stuck forever with a failing smartphone data business model.

The IoT will be extremely diverse and not all its needs will be met by cellular – indeed, its unifying technology will not be the access network, but the IT platforms which coordinate all the moving parts via a holistic view of each customer and service.

This is just the most extreme example of how MNOs will fail if they regard 5G merely as a new network architecture, however modern and virtualized. To turn that architecture into new services and new revenues, it must be planned and managed in a ‘5G’ way also. 5G is a business model transformation for telcos rather than a technological one and so much of the ROI will come from the systems that manage and control service delivery and customer relationships, even when that customer may be a machine. There is a viable business case for the 5G mobile network operator, but it means making some dramatic changes in thinking as well as in networks.

Find out more. Download the ‘Why 5G won’t be enough’ analyst opinion paper.

Source: http://blogs.amdocs.com/network/2016/01/17/5g-solving-all-the-ctos-problems/#.Vpy3FirhC70


Power-Grid Hacked? Where’s the IoT?

1 Apr

Writing about the IoT (Internet of Things), or what was once called M2M, is something that people want to read about, a lot. It’s only recently that people are really catching on that everything is going to be connected. So when an article appeared on the front page of the USA Today stating that the smart grid was open to hacking, it certainly deserved a chuckle or two, especially from IoT advocates. No offense to my colleagues at the USA Today, but this nationally syndicated newspaper chain was covering the topic as if the fact that vulnerabilities could threaten lives was a breaking news story.

Ironically, there are days people talk about the IoT as if it were something brand spanking new. Today, newspapers and the broadcast news eagerly espouse the virtues of connected devices because there are apps or gadgets for just about everything imaginable in the IoT. We are now seeing a consumer frenzy surrounding smartphones, fitness trackers, lights, toasters, automobiles, and even baby bottles being connected.

Many people are just beginning to understand that the IoT is more than connecting a computer to the Internet, surfing the Web or watching a YouTube video. To really understand the Internet of Things is to recognize it is more than the everyday consumer gadgets that are getting all the media play these days. As the USA Today so eloquently pointed out, the power grid is under attack every day, and, as the author stated so clearly, an attack could at any moment leave millions of people without power for days or weeks. And that’s not even the worst of what could happen. Most residents do not equate the average brownout they experience for a few hours with the blackout that could be on the horizon in their neighborhood.

But again most people don’t give the IoT much thought. It’s kind of like their cellphones. Most people don’t know how they work. Nor do they care. They only care they work when and where they need it. The same holds true about their connected gadgets. Most consumers really don’t give their connected gadgets much thought until they need them for tracking their fitness, or turning on their lights or thermostats, or for finding the closest fast food restaurant when traveling in their cars. However, as more and more consumers adopt and adapt to electronic devices as part of their everyday lifestyle, this will change their attitudes and perceptions forever and the excitement for connected devices will trickle over into the enterprise. It is already happening with smart cities, with parking meters, trash pickups, snow removal, first responders, and smart utility meters.

Perhaps that is why the USA Today story has some real significance now and enterprise companies are starting to move away from just talking about the IoT to figuring out ways to implement solutions and services.

Part of the problem with the grid today is that it was designed with OMS (outage-management systems) configured to react to signals that indicated outages and to manage restoration. Going forward, however, the IoT systems being designed are able to prevent outages and restore services. These services, as one analyst firm says, could lead to a very bright future for the smart grid, and as a result, projections based on these services make sense and are very tangible.

While enterprises are looking to adopt the IoT, the lines between actual growth and hyperbole in market estimates are blurring. Vendors like to make huge growth predictions (50 billion devices is currently the buzz of the industry). However, enormous market amplifications like these have stalled growth before and will undoubtedly do so again.

Corporate America seeks growth forecasts that are meaningful and that help deliver solutions to improve bottom-line results and shareholder value. Again, one network carrier's conjecture that the number of connections could quadruple by 2020, reaching more than 5 billion, doesn't mean anything if all of these devices and connections are going to be hacked and CEOs' heads end up on the chopping block.

The same carrier was even quoted as saying that for the IoT to reach these prognostications, networks must be reliable, the data from all of these connected endpoints must be stored reliably and securely, infrastructures must be secure, and there must be ways to achieve device management.

If all the stars are in alignment, there is no question the IoT is poised for growth. But, that means everyone has to focus on making security a top priority to fend off the bad guys and to consider the market unknowns that can slow or delay IoT development.

That's why the formation of groups like the ITA (Illinois Technology Assn., www.illinoistech.org) Internet of Things Council, a public/private partnership that aims to assure civic leadership in the Internet of Things, will help companies sort the facts from the fiction and jumpstart their initiatives.

Thus, the more the industry indulges in crystal-ball gazing, the more we do a disservice to the IoT's true potential. Even Federal Energy Regulatory Commission Chairwoman Cheryl LaFleur was pretty poignant when quoted in the USA Today article on the potential of an attack: "One is too many, so that's why we have to pay attention. The threats continue to evolve and we have to continue to evolve as well."

It makes you wonder whether the industry is evolving or just continuing to bandy about forecasts with little or no regard for living up to market or shareholder expectations, much as it has for the past 15 years. Regardless of what you believe, the IoT is changing our lives one way or the other, and it will certainly have an even greater impact on each and every business. How and when? Those are the billion-dollar questions.

Source: http://connectedworld.com/power-grid-hacked-wheres-the-iot/

Smart cities to quadruple by the year 2025

4 Aug

The number of global smart cities is expected to grow from 21 in 2013 to an estimated 88 in 2025, according to a new report from IHS Technology. These smart cities will possess energy efficient infrastructures as well as keep a maintained focus on security and streamlined transportation efforts.


Lisa Arrowsmith, IHS Associate Director, defines a smart city as a city that has deployed “the integration of information, communications and technology (ICT) solutions across three or more different functional areas of a city.” She further adds that these implementations could be in the realms of mobile and transport, energy and sustainability, physical infrastructure, governance, and safety and security.

Among the 21 cities IHS currently categorizes as smart are five in the U.S. – San Francisco, Los Angeles, Boston, Chicago and New York. According to the study, “Asia-Pacific will account for 32 smart cities of the total in nine years’ time, Europe will have 31, and the Americas will contribute 25.”


“London, for example, is retrofitting both residential and commercial buildings to lessen carbon dioxide emissions,” the study notes. “The city is also adopting charging infrastructure to support the introduction of 100,000 electric vehicles.” In Santander, Spain, it adds, “soil-humidity sensors detect when land requires irrigating for more sustainable water use.”

The IHS report titled, “Smart Cities: Business Models, Technologies and Existing Projects,” also finds that the current $1 billion worldwide annual investment in smart cities will grow to over $12 billion by the year 2025. The report continues on to demonstrate the need for smart cities as a response to increasingly congested and polluted cities.

With a global population that is becoming overly urbanized, certain resources are becoming scarce in these densely populated areas. Smart cities and tech based city organization can focus on these limited resources and assure they are managed in a way that provides the best solutions for inhabitants.

While today’s smart cities may not be the most cost-friendly option when reorganizing an urban area, Arrowsmith lauds the possibilities that smart planning could provide. She notes the collaboration of public and private sectors could unquestionably boost a local economy. Incorporating technology applications into city planning could in turn create jobs or even foster a high tech culture within the municipality.


The glowing example of a global smart city is Santander, Spain. After obtaining an EU grant, the aging port town organized a team to install over 12,000 sensors within city limits. BusinessWeek’s Carol Matlack writes that the sensors track everything from surfing conditions to traffic congestion. The city has even placed sensors deep in the ground of their parks to measure soil humidity and can then properly determine sprinkler usage. In all, Santander is a prime example of how technology and communication can work in unison to better organize the smart city of the future.

With the example Santander has provided, as well as the plans cities across the globe have in store, it's certainly not far-fetched to believe the projections in the IHS report. You can read the IHS document in its entirety here.

Source: http://atmelcorporation.wordpress.com/2014/08/01/report-smart-cities-to-quadruple-by-the-year-2025/

Big Data – Trends & Trajectories

4 Aug

Would you be taken aback if Big Data is declared as the word of the year 2014? Well, I certainly wouldn’t be. Although initially it started off as a paradigm, Big Data is permeating all facets of business at a fast pace. Digital data is everywhere and there is a tremendous wave of innovation on the ways big data can be used to generate value across sectors of the global economy.

In this blog we shall discuss few big data trends which will have immense significance in the upcoming days.

Internet of customers:

In a panel discussion at the World Economic Forum, 2014, when asked what will be important in the next 5 years, Marc Benioff, CEO of salesforce.com, elucidated on the importance of big data in enhancing and maintaining the customer base. As we talk about mobility and the internet of things, we should recognize that behind every such device is a customer. It is not an “internet of things” but an “internet of customers”.

The catchword here is “Context”. With data explosion happening in every industry, we are gathering unprecedented amount of user contexts. Big data provides tremendous opportunities to harness these contexts to gain actionable insights on consumer behavior. It doesn’t really matter if you are a B2C or a B2B company but what actually matters is how effectively you utilize the potential of big data to extract useful contextual information and use it to build a 1:1 relationship with individual customers. The companies that use this opportunity to enhance their customer base will be the most successful in the future.

Good Data > Big Data: One of the most prominent illustrations of big data in action is Google Flu Trends (GFT), which uses aggregated Google search data to monitor real-time flu cases the world over. Google used specific search terms and patterns to correlate how many people searched for flu-related topics with how many people actually had flu symptoms. With over 500 million Google searches made every day, this may seem the perfect big data case study, but as it turns out, GFT failed to perform as well as expected. GFT overestimated the prevalence of flu in the 2012-2013 and 2011-2012 seasons by more than 50% and completely missed the swine flu epidemic in 2009.

This has led many analysts to sit back and reflect on the big data strategies that caused this failure. The fallacy that a huge amount of data automatically leads to better analysis should be recognized. Rather than taking indiscriminate and unrelated datasets into consideration, which worsens the problem, the analysis should study data that is specifically defined and aligned to the objectives. Big data methodologies can be successful, but only if they are based on accurate assumptions and relevant data.
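The trap described here, that screening enough unrelated series will always turn up one that looks predictive, is easy to demonstrate. The sketch below uses purely synthetic random data; it illustrates the statistical pitfall and is not GFT's actual method:

```python
import random

def pearson(xs, ys):
    # plain Pearson correlation coefficient
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
weeks = 52
target = [random.gauss(0, 1) for _ in range(weeks)]  # stand-in for weekly flu counts

# A single unrelated predictor is usually only weakly correlated...
single = abs(pearson([random.gauss(0, 1) for _ in range(weeks)], target))

# ...but screen 1,000 unrelated predictors and the best one looks "predictive".
best = max(abs(pearson([random.gauss(0, 1) for _ in range(weeks)], target))
           for _ in range(1000))

print("one random predictor: %.2f" % single)
print("best of 1,000:        %.2f" % best)
```

The "best" series predicts nothing; it merely survived a large screening. This is why indiscriminate feature hunting over huge datasets needs out-of-sample validation before any claims are made.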

Open Sourcing: Google never made public the criteria it used to establish the search patterns, which has hindered further analysis of the failure. This experiment highlights the need for an open source culture in big data technologies. Studies involving astounding amounts of data should involve greater cooperation and transparency between participating organizations, which would in turn help build robust predictive models.

Visualization / User experience: Presenting data in an understandable and rational way is another issue concomitant with big data technologies. Software that helps deduce insights from big, complex datasets will be much in demand in the near future. Analytical business software with user-friendly and intuitive interfaces will form a critical component of sales of big data technologies.

Many technology giants have started to focus on building easy-to-use and engaging user experiences that would make them popular facilitators of big data. In one of his all-hands speeches in the second half of 2013, Jim Hagemann Snabe, Co-CEO of SAP AG, outlined SAP's vision to change the perception that its software is complex and difficult to use. In this context, user experience is one of the focus points of SAP's strategy, and it will go a long way in helping SAP further cement its position as one of the analytics market leaders and a promising enabler of big data technologies.

Source: http://chayani.wordpress.com/2014/08/03/big-data-trends-trajectories/

First Test with Evrythng Cloud, Raspberry Pi and Python

12 May

During my constant research and work on the topic "Internet of Everything" or "Internet of Things", and more specifically while searching for a good middleware, I came across a very interesting website: https://www.evrythng.com/. The whole concept behind it looks very promising, and I was tempted to give these guys a try. I have a few sensors, some knowledge of programming, and I need some online space to store my data. So the basic motivation is there to look for solid concepts that are not doomed to be killed soon after their first release for losing the race to be the best middleware. As a start I registered an account, and my first idea was to connect a sensor to their cloud-based storage system and simply try out how it works. Of course they offer a variety of other services, like:

  • Products, which describe a class of THNGs (THNG = smallest unit)
  • Applications, which can be used to connect e.g. a web browser
  • Campaigns
  • Analytics
  • and many other very useful services.

Basically you can build a whole economic system on their online spaces. So where to start? First things first: I decided to build a simple use case connecting a single sensor, which seemed sufficient for a small test. The sensor is the basic entity in their terms: a THNG. So for a fast start: build a small sensor and then try to connect it via a Python script. My self-imposed task was to use Python as the programming language. (I could have used a different language, like Java; this would have been much easier because a ready-to-use Java API exists here.)

At first I thought about a sensor architecture that included building a sensor gateway to connect my local sensors (picture: Basic Sensor Architecture). Why? A gateway can:

  • protect the communication from the sensors to cloud-based services
  • use its greater CPU power to run stronger cryptographic algorithms
  • protect the sensors from being accessed from the Internet
  • store data when the connection to the cloud services is lost
  • lower the volume of traffic by sending data only when a value changes (saving bandwidth)
  • bridge connectivity options, e.g. between Ethernet and Bluetooth
  • translate IPv4 into IPv6 or vice versa
  • etc.
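Two of these gateway roles, buffering during outages and sending only on change, can be sketched in a few lines. This is an illustrative Python sketch under assumed names (`SensorGateway`, `upload`), not EVRYTHNG- or vendor-specific code:

```python
import collections

class SensorGateway:
    """Buffers readings locally and forwards only meaningful changes."""

    def __init__(self, upload, threshold=0.5, max_buffer=1000):
        self.upload = upload          # callable that pushes one reading to the cloud
        self.threshold = threshold    # minimum change worth reporting
        self.last_sent = None
        self.buffer = collections.deque(maxlen=max_buffer)  # survives outages (in RAM)

    def on_reading(self, value):
        # Send only when the value moved enough -- saves bandwidth.
        if self.last_sent is not None and abs(value - self.last_sent) < self.threshold:
            return False
        self.buffer.append(value)
        self.flush()
        return True

    def flush(self):
        # Drain the buffer; on a failed upload, keep readings for the next attempt.
        while self.buffer:
            value = self.buffer[0]
            try:
                self.upload(value)
            except IOError:
                return  # connection down: keep buffering
            self.buffer.popleft()
            self.last_sent = value

sent = []
gw = SensorGateway(sent.append, threshold=0.5)
for v in [20.0, 20.1, 20.2, 21.0, 21.1]:
    gw.on_reading(v)
print(sent)  # only the readings that changed by at least 0.5
```

With a 0.5-degree threshold, only the first reading and the jump to 21.0 are forwarded; the rest never leave the gateway.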


Basic Sensor Architecture

But this would have taken some time, so I simplified the architecture a little and connected a sensing device directly to the EVRYTHNG cloud, as reflected in the following picture: Proof of Concept Architecture.


Proof of Concept Architecture

And of course the sensor must have enough CPU and storage capacity to run the required functionality (Python, ssh, etc.) on it, and there must be at least Ethernet connectivity to transport its data to its final destination. Given these basic requirements, I made my choice: I used one of my Raspberry Pis for the experiment. A breadboard, some wires, a resistor, and a waterproof temperature sensor did the trick:


Raspberry Pi as a sensor

Next step: getting data from the sensor via Python. This was not a very difficult task, and there are a lot of very good tutorials on how to read sensor values in Python, e.g. here. So I skipped that step in my description to concentrate on the new things.

I headed for a test account, and after registering I got my personal access. The next step was to set up a THNG as a base entity to send data to. I decided not to do it via API calls (which would have forced me to write another script); instead I did it the traditional way, using the web GUI from EVRYTHNG:


THNG Creation

After creating my THNG (the ThngID, greyed out here, is important for accessing it via API calls from your sensor), you can define some properties which will then be filled with data from your devices. (A really nice one is geo location: if you have a moving sensor, e.g. in a car, this can easily become quite handy.) For this THNG I used just one property: temperature. Of course you can use multiple data fields, but again, my task was only a simple test of whether I could get the whole story up and running.

But the most interesting part was getting the REST API from EVRYTHNG to work with Python. No Python API exists so far, and I had to experiment a little with writing proof-of-concept code for the task. It was not really a challenge, though, because the EVRYTHNG API is very well documented and can be found here: lots of examples of how REST works, which return codes you can get back, how to create and delete, and most importantly, how to store and access your data. I decided to build a small class suitable for a sensor with a limited amount of space. The basic task was sending data via HTTP, and before that, converting a Python data dictionary to JSON. (Yes, HTTPS is also an option, and it is recommended once you use this in a production environment!) I used urllib2 for the requests and the json encoder/decoder API for the conversion. Nearly everything worked straight away, except one little glitch that cost me some time: to convert data to an EVRYTHNG-API-readable format I had to add brackets to the final JSON-converted data, like "[" + self.DATA + "]". After that challenge my little program worked and I was able to produce some valid input into EVRYTHNG. You can find the Python class below. Feel free to use it (at your own risk)…

'''
@author: Axel Dittmann
'''
import urllib2
import json

class sensorthng:
    '''
    a very small class just to update sensors .. no further functionality provided
    '''
    def __init__(self, API_Token):
        self.API_TOKEN = API_Token
        self.HEADER_UPDATE = {'Content-Type': 'application/json',
                              'Authorization': self.API_TOKEN}
        self.URL = "https://api.evrythng.com/thngs/"

    def open_query(self, query_URL, data, header):
        self.URL_REQUEST = query_URL
        self.DATA = data
        self.HEADER = header
        # Request is GET if data is None, POST if data has some valid JSON data
        self.REQUEST = urllib2.Request(self.URL_REQUEST, self.DATA, self.HEADER)
        self.RESPONSE = urllib2.urlopen(self.REQUEST)
        return self.RESPONSE.read()

    # Updating properties of a thng
    def updating_properties(self, thng_id, data):
        self.DATA = data
        self.QUERY_URL = self.URL + thng_id + "/properties"
        # convert to JSON
        self.DATA = json.dumps(self.DATA)
        # now the important part: add [] to the JSON data,
        # otherwise the call would not work
        self.DATA = "[" + self.DATA + "]"
        # note the header -> must be the update header
        return self.open_query(self.QUERY_URL, self.DATA, self.HEADER_UPDATE)

My basic proof-of-concept code looked like this:

import time

mythng = sensorthng("USE your API-Access-Code here")

while True:
    try:
        c_temp = read_ctemperature()
        print c_temp
        data = {'key': 'temperature', 'value': c_temp}
        print mythng.updating_properties("USE your THNG-ID here", data)
        time.sleep(60)
    except:
        print "Error occurred .. next try"
It does nothing more than send data (after reading the Celsius temperature from the sensor), and if an error occurs, e.g. my Internet connection goes down, it simply keeps looping until the connection is restored. So there is no interruption of service in case of failure.

So basically: write a little code around the class, and use the API token from your EVRYTHNG login and the THNG ID from your THNG screen. Then it should just work, and the script updates your data. On my Raspberry Pi I decided to send a continuous stream, one reading every 60 seconds:


Temperature and EVRYTHNG API return data

In this picture you can see that EVRYTHNG adds a timestamp to your data if you do not provide one. The output also shows the return info from the EVRYTHNG REST API once the data has been successfully delivered. Now let's take a look at how this appears in the EVRYTHNG web GUI:


THNG Overview

After clicking on the properties of my PI_temp_sensor, you can then see a nice little graphical view of your submitted data:


Temperature Data

The graphical view reflects the real-life situation: at 8:10 it started to rain, which caused the temperature to drop 🙂.

So, first test done. In my opinion it was very easy, and my little test was a great success for me, because everything worked "out of the box" and straightforwardly! The EVRYTHNG idea has great potential, and I am quite sure they will make their way to becoming one of the leading THNG/IoT providers in the near future. The topics I haven't covered so far are really interesting subjects: using coordinates on a map to get a better overview of your THNGs, grouping THNGs, and so on. I haven't discovered EVRYTHNG's full potential yet, but since every day you have to learn new things, there must be something left for tomorrow 🙂…

 

Source: http://ipv6poclab.org/2014/05/11/first-test-with-evrythng-cloud-raspberry-pi-and-python/

5 Signs for a bad M2M connectivity agreement

28 Apr

M2M and IoT solutions are in many ways all the same: a unique problem and a unique solution. "Where is my truck at the moment? Is my office protected well tonight? How is the patient doing?" All these questions are unique to a customer, but there are some common rules that every M2M/IoT solution can benefit from. The solution collects some kind of data, communicates it back to a data centre, the company performs some kind of analysis, and the results are compared against expected or past performance levels.

Connectivity is one of the crucial components of this business flow. While every M2M/IoT solution is specific, there is a common need for a trusted connectivity provider that can cope with your business requirements. But how do you recognize one? Here are 5 big mistakes that will increase your M2M connectivity costs.

1- Fixed contract terms and commitments

Do you need to sign a commitment letter to get the best prices? Then you are in trouble. A trusted M2M connectivity provider should be able to fit into your business model. You should be able to switch your SIM cards between active and inactive mode whenever you need, and you should be billed only while the SIMs are active.

So, do not sign a fixed deal. Look for flexibility and different price plans to match your business model.

2- Inflexible rate plans

Do you have pre-defined, pre-decided price plans tied to your SIM cards? With such plans, it is not a question of if, but when, you will incur expensive overage, for many different reasons. With per-device rate plans you are at the mercy of your M2M connectivity provider. As a customer you should be able to change price plans whenever you want, and you should be billed only for actual usage, not according to fixed price plans.
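A little arithmetic shows why fixed per-device plans hurt at both ends of the usage curve. All figures below are made-up illustrative prices, not any carrier's actual rates:

```python
# Illustrative only: compare a fixed per-device plan against pay-per-use
# billing for a fleet of SIMs. All prices are hypothetical example figures.

devices = 1000
fixed_plan_per_sim = 2.00     # EUR/month, includes 5 MB
included_mb = 5.0
overage_per_mb = 1.50         # EUR/MB beyond the bundle
pay_per_use_per_mb = 0.40     # EUR/MB, no bundle, no minimum

def monthly_cost_fixed(usage_mb):
    # Every SIM pays the plan fee, plus overage beyond the included bundle.
    over = max(0.0, usage_mb - included_mb)
    return devices * (fixed_plan_per_sim + over * overage_per_mb)

def monthly_cost_metered(usage_mb):
    # Pay only for the megabytes actually consumed.
    return devices * usage_mb * pay_per_use_per_mb

for usage in (1.0, 5.0, 12.0):  # MB per SIM per month
    print("%5.1f MB: fixed %8.2f vs metered %8.2f"
          % (usage, monthly_cost_fixed(usage), monthly_cost_metered(usage)))
```

With these example rates, light users overpay for an unused bundle (2,000 vs 400 at 1 MB) and heavy users are punished by overage fees (12,500 vs 4,800 at 12 MB); only at exactly the bundle size do the plans break even.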

3- Complex roaming fees

Do you know what it costs per network, per country? Is it easy for you to understand and predict your costs? Your roaming pricing should be very simple and predictable.

4- Can you do remote diagnostics?
Does your connectivity provider enable you to do your own error corrections or diagnostics? You should be able to see what the problem is and how to solve it. This is crucial for most M2M/IoT products, as your customers depend on your solution. So you need to identify problems quickly and take quick action.

5- Can you automate your flows?

Can you define pro-active alarms and automatic rate-plan changes based on the state of the SIM? What happens if someone physically attacks your device out in the field? What do you do when a SIM consumes much more than it normally should? Can you automate your business flows based on the usage, state, or location of the SIM? You should be able to do all of that, and it should be free, of course!
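An automation flow like this boils down to evaluating rules against each SIM's telemetry. The sketch below is a minimal illustration; the rule names, fields, and thresholds are hypothetical, not any provider's actual API:

```python
def evaluate_sim(sim, rules):
    """Apply simple automation rules to one SIM's telemetry; return triggered actions."""
    actions = []
    for condition, action in rules:
        if condition(sim):
            actions.append(action)
    return actions

# Example rules (illustrative): usage anomalies and unexpected movement.
rules = [
    (lambda s: s["usage_mb"] > s["expected_mb"] * 3, "alert: abnormal usage"),
    (lambda s: s["usage_mb"] > s["expected_mb"] * 10, "deactivate SIM"),
    (lambda s: s["moved_km"] > 0 and s["device_type"] == "fixed-meter",
     "alert: possible theft"),
]

# A fixed utility meter that suddenly consumes 250 MB and has moved 3.5 km.
sim = {"usage_mb": 250.0, "expected_mb": 20.0, "moved_km": 3.5,
       "device_type": "fixed-meter"}
actions = evaluate_sim(sim, rules)
print(actions)
```

A real platform would run such rules continuously and feed the actions back into the provisioning system (alarm, rate-plan change, deactivation) without manual intervention.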

Traditional operators around the globe promise the lowest rates and the best quality, but is this really the case? I am in Amsterdam today meeting some customers, and I must say I am surprised at how inflexible and pre-defined the services they receive are. I think your company needs a reliable and predictable M2M/IoT connectivity service provider that can help your business grow. If you want to know more about how to lower your M2M connectivity costs, please go ahead and contact me.

Source: http://celikalper.wordpress.com/2014/04/25/5-signs-for-a-bad-m2m-connectivity-agreement/

The Internet of Things Ecosystem

7 Apr

The Internet of Things Ecosystem framework was introduced this past week and was well received for providing a holistic view of the different segments that make up the IoT. This post provides a snapshot of the companies, organizations, technologies and trends within the IoT Ecosystem, and the full list of articles within the segment. Ideally, over time, key players and technologies will start to emerge through the ongoing analysis of the Internet of Things Ecosystem framework.

 

IoT in the News (3/31-4/4)

 

Scientists Have Created An Incredible Patient-Monitoring Device That Is The Size Of A Band-Aid

Medical engineers said Sunday they had created a device the size of a plaster which can monitor patients by tracking their muscle activity before administering their medication.

Methods for monitoring so-called “movement disorders” such as epilepsy and Parkinson’s disease have traditionally included video recordings or wearable devices, but these tend to be bulky and inflexible.

The new gadget, which is worn on the skin, looks like a Band-Aid but uses nanotechnology — in which building blocks as small as atoms and molecules are harnessed to bypass problems of bulkiness and stiffness — to monitor the patient.

http://www.businessinsider.com/nanotechnology-patient-monitoring-device-2014-3#ixzz2yA3oYeD0

Hand Hygiene Technology Startup Hyginex Receives Investment from Persistent Systems

Hyginex has committed to saving patient lives through use of its novel hand hygiene improvement and monitoring technology. The system uses patented wearable technology and sensors in healthcare facilities to help doctors and nurses improve hand hygiene practices.

“Hyginex’s unique approach based on innovative wearable technology protects patients in hospitals and improves global healthcare,” says Dr. Sridhar Jagannathan, chief innovation officer for Persistent Systems, Inc. and head of the Persistent Venture Fund. “To this investment in Hyginex, Persistent brings its expertise and focus on emerging medical technologies.”

‘Internet of things’ will significantly alter supply chains

Michael Burkett, managing vice president at Gartner, said: “It’s important to put IoT maturity into perspective, because of the fast pace at which it is emerging, so supply chain strategists need to be looking at its potential now.

“Some IoT devices are more mature, such as commercial telematics now used in trucking fleets to improve logistics efficiency. Some, such as smart fabrics that use sensors within clothing and industrial fabrics to monitor human health or manufacturing processes, are just emerging.”

http://www.supplymanagement.com/news/2014/internet-of-things-will-significantly-alter-supply-chains

Wearable Technology Appeals to Jewelry Consumers

High-tech watches were the most preferred wearable technology, with 55 percent of participants saying they would buy the device, followed by bracelets and wristbands with a 27 percent share.

As for where consumers would make their purchase of wearable devices, there was no clear trend; however, respondents stated that the local independent jeweler was their least preferred retailer. This could still present an opportunity for local jewelers to capture the space by increasing their visibility overall and building a reputation as a "go-to" source for wearable tech devices.

http://www.diamonds.net/News/NewsItem.aspx?ArticleID=46519&ArticleTitle=Wearable%2BTechnology%2BAppeals%2Bto%2BJewelry%2BConsumers

Microsoft Paid Up To $150M To Buy Wearable Computing IP From The Osterhout Design Group

Microsoft, we have discovered, has paid up to $150 million to buy IP assets related to augmented reality, head-borne computers, and related items from the Osterhout Design Group, a low-profile company that develops wearable computing devices and other gadgets, these days primarily for the military and other government organizations.

As you might remember, we first broke the news that Microsoft was looking at acquiring ODG, or part of its assets, in September 2013, at a price tag of up to $200 million, depending on what went into the deal.

The government continues to be ODG’s primary customer, although as Osterhout reminds us, the pace of technology right now is such that the kinds of innovations being created for enterprises and other organizations has very direct applicability to the consumer market, too — and the reverse as well when you think about the wider trend of the consumerization of IT.

“In terms of what we’re doing [at ODG], we don’t make weapons. We make things that can help people do their jobs,” he says. “The real focus are features that are applicable in the consumer space, too.” In other words, ODG may already be talking to other companies for consumer products; or its door is open to that possibility.

http://techcrunch.com/2014/03/27/microsoft-paid-up-to-150m-to-buy-wearable-computing-ip-from-the-osterhout-design-group/

SITA and Virgin Atlantic win Smart technology award

LONDON – SITA and Virgin Atlantic Airways have received a Smart Technology Award from The Wearables 2014, the leading awards for wearable technology. Part of the 2014 Wearable Technology Show, the award recognized the companies for a pioneering pilot, which tested how Google Glass and Sony Smartwatches could enhance the passenger experience.

http://www.traveldailynews.com/news/article/59781/sita-and-virgin-atlantic-win

Microsoft acquires 80 wearable tech assets in $150m deal

The technology juggernaut was in talks to purchase ODG entirely in September last year, an unnamed source told TechCrunch. However, the deal later switched to an IP acquisition. The transaction was completed last year and all related patents and IP transferred in January.

The hefty price tag of $100 million to $150 million was attached by another source. Microsoft has given no official comment so far, but the deal has been confirmed by ODG founder and inventor Ralph Osterhout.

http://www.mcvuk.com/news/read/microsoft-acquires-80-wearable-tech-assets-in-150m-deal/0130473

Future of Wearable Tech: Solar-Powered Dresses and Wi-Fi Suits

The designers at Studio Roosegaarde have created a provocative dress called “Intimacy” that becomes transparent based on “close and personal encounters” with other people. The dress is made of smart electronic foil that gradually becomes see-through as its wearer’s heartbeat increases.

http://www.livescience.com/44486-wearable-tech-disappearing-dresses-wifi-suits.html

Wearable Tech Conference arrives in NYC this summer

A conference focused on one of the hottest tech growth areas will launch this summer in New York City. Organized by media group TMC, the Wearable Tech Conference & Expo will be held July 23-24 at The Javits Center.

The agenda for the event will provide attendees with new perspectives on wearable tech devices and technology, how and why they work the way they do and what lies ahead for the billion-dollar industry. Individuals and companies attending the conference will have the opportunity to interact with colleagues, meet new partners, see live demos and participate in in-depth discussions.

“Wearable technology is a rapidly growing trend and one that will reach a broad set of markets, from fashion to healthcare to fitness,” says Rich Tehrani, TMC CEO and conference chairman. “We look forward to providing a robust and interactive experience for our event attendees, providing them the tools and information needed to take advantage of every opportunity in the wearable industry and achieve business success.”

http://www.ept.ca/news/wearable-tech-conference-arrives-in-nyc-this-summer/1002986506/?&er=NA

Marc Newson’s Wearing Technology (Vogue)

MARC NEWSON, designer of some of the world’s most desirable industrial objects, says there’s no question that wearable technology is “the future” but counters that, for now, it falls way short. “I wouldn’t be seen dead wearing [Google Glass],” he says. “It looks pretty stupid. It’s a little bit like that wonderful invention called the Segway.  Such a fantastic piece of technology but you just look like a complete dick when you drive around on it. That’s precisely the moment when I think the fashion world laughs at the world of industrial design.”

http://www.vogue.co.uk/news/2014/04/01/marc-newson-wearable-technology

Could wearables become bigger than tablets?

Mobile gadgets won’t just be tucked into your purse or your pocket. Soon, they’ll increasingly be on your wrist, as part of your glasses and even in your clothing. While still in its infancy, wearable technology is poised to take off. The market for the wearables business is expected to exceed $1.5 billion in 2014, double its value last year, according to a report from Juniper Research.

http://tech.fortune.cnn.com/2014/04/01/could-wearables-become-bigger-than-tablets/

The explanation for recent wearable technology abandonment

The Guardian reported today that one-third of consumers are abandoning their wearable tech devices. The author, Charles Arthur, references research from Endeavour Partners which states, “one-third of American consumers who have owned a wearable product stopped using it within six months.”

http://www.examiner.com/article/the-explantion-for-recent-wearable-technology-abandonment

How Wearable Tech Could Improve Your Mental Health

Smart wristbands have become increasingly popular tools among people interested in tracking data about themselves, from their heart rate to their movement during daily activities. In the future, these devices could also help people understand the symptoms of conditions such as autism and depression, researchers say.

These researchers have recently focused their work on children with autism, and have found that these children’s expression of their emotions often does not correlate with their internal arousal state as indicated by wristband data. For instance, a child with autism might appear to be experiencing a high-energy episode when, in fact, wristband data indicates that their internal state is calm.

http://www.livescience.com/44516-future-of-smart-wristbands.html

DRESS USES TECHNOLOGY TO GUARANTEE NIP SLIP WHENEVER YOUR HEARTRATE IS ELEVATED

Wearable technology is a sci-fi idea that’s just starting to become a reality, so it’s to be expected that for a while its reach is going to exceed its grasp. In the world of fashion, one must make allowances that one typically doesn’t for technology, for pieces that blur the line between art and a functional object. But dang, wearable tech, if a dress that turns more translucent based on the speed of your heart rate doesn’t perfectly embody the vast gulf between the dreams of wearable tech and the reality.

http://www.themarysue.com/see-through-dress-heartrate/

Google and designer fashion brand Fossil join in wearable tech

Google and the designer brand Fossil announced today that they will join technology and fashion to create a designer-brand smartwatch, a first step in this union with the wearable tech trend, reports Mobile Commerce Press.

Greg McKelvey, chief strategy and marketing officer of the Texas-based Fossil, released a statement explaining the move to create wearable fashion rooted in twenty-first-century technology: “we believe we are uniquely positioned to develop and bring to market products for our fashion customers that marry the beauty of our designs, the promise of our brands and now the function of new technology.”

http://www.examiner.com/article/google-and-designer-fashion-brand-fossil-join-wearable-tech

The Future of Wearable Technology

Wearable technology is on the rise as companies like Sony, Samsung, and Motorola join the ranks of Google and potentially Apple in the development of smartwatches. At present, some 15 percent of consumers are using this wearable technology, a term that includes everything from Google Glass to fitness bands to smartwatches.

If any company were to come to the forefront in designing a smartwatch, it would inevitably be Google. The company recently announced that it would indeed be delving further into wearable technology with a carefully designed and simple smartwatch model known as Android Wear.

http://tier10lab.com/2014/04/01/the-future-of-wearable-technology/
Internet Of Things – Or Business Of Applications?

Devices don’t connect to the Internet to see what their friends are posting on Devicebook. They’re exchanging data and performing tasks via a well-understood paradigm — applications.

There’s a lot of buzz these days about the — forthcoming, already here, arriving between 2013 and 2017 — Internet of Things. Along with that buzz come statistics — particularly from networking expert Cisco, which predicts that billions of devices will connect to the Internet in the next few years.

No one disputes that because it’s obvious. From wearables and Internet-enabled televisions to smart pens and children’s toys, the growth of the Internet of Things seems like an inevitable conclusion.

http://www.informationweek.com/infrastructure/data-center/internet-of-things—or-business-of-applications/d/d-id/1141567

Wearable Technology Innovator Valencell Teams with 3Pillar Global

3Pillar Global, a leading developer of innovative, revenue-generating software products, has announced a product development partnership with Valencell, an innovator in mobile health and fitness technology. 3Pillar and Valencell are collaborating to build a new consumer-facing application that will use industry-leading data from Valencell’s PerformTek sensor technology to help consumers understand a vast array of biometric information.

http://www.broadwayworld.com/bwwgeeks/article/Wearable-Technology-Innovator-Valencell-Teams-with-3Pillar-Global-20140401#S9CAi0JG0A1qmbys.99

Growing Internet Of Things Means Growing Opportunity For Solutions Providers

The Internet of Things gives solutions providers an expanded opportunity. It enables them to grow into the role of their customers’ single, trusted advisor and manager of all IT systems and solutions inside — and outside — of the office.

Len DiCostanzo, senior VP of community and business development for Autotask and former managed services provider (MSP), points out, “There is a tremendous opportunity out there.” To capitalize on this opportunity, solutions providers will have to do what they’ve always done — connect things through a network. The Internet of Things, however, means MSPs now need to create solutions to support their customers’ business objectives not only with technology at the office, but also with technology necessary to manage mobile devices and remote users.

http://www.bsminfo.com/doc/growing-internet-of-things-grows-means-growing-opportunity-0001

F. Scott Moody jumps from iPhone fingerprints to Internet of things

Two years after selling his company and technology to Apple (Nasdaq: AAPL), AuthenTec founder F. Scott Moody is at it again.

Moody, who helped develop the fingerprint security technology Apple uses in its latest iPhone, has taken a slight pivot with his new Raleigh startup, K4 Connect.

“Basically, K4 Connect is the name of the company that’s providing a software platform that connects various things,” he says.

http://upstart.bizjournals.com/companies/startups/2014/04/01/f-scott-moody-apple-internet.html

Microsoft readies to join the ‘Internet of Things’ with Windows on Devices

We are less than 12 hours away from Microsoft’s keynote at Build 2014. We’re really looking forward to all the news surrounding Windows Phone 8.1. That said, Microsoft will be showing off plenty of other products, devices and services. It also looks like Microsoft is ready to join the ‘Internet of Things’. A website has gone live on the night before Build. The site gives us a little insight into what else Microsoft may announce this week.

Windows on Devices (www.windowsondevices.com) is now live on the internet. The site talks about the next big thing being small and how Microsoft plans to bring Windows to a whole new class of devices.

What sort of devices? A coffee mug, a talking bear, a robot or anything else your imagination can dream up. Microsoft even says they’ll demo a connected piano with Windows on Devices at Build 2014.

It looks like Microsoft might be making development tools for developers looking to join the Internet of Things. An SDK is expected to release during spring 2014 with a look at new software and APIs.  Windows on Devices will allow devs to work with development boards like the Intel Galileo.

http://www.wpcentral.com/comment/823526

Tech companies to work together on Internet of Things

One of the group’s goals is to draw up inter-operability standards so that the devices, sensors and networks members create can communicate with each other and the data they exchange can be secure.

The organization is to be managed by the Object Management Group, a Boston-based nonprofit trade association. The coalition is still discussing which industries could serve as test-beds for new standards.

http://www.washingtonpost.com/business/economy/tech-companies-to-work-together-on-internet-of-things/2014/03/28/365ad22c-b436-11e3-8cb6-284052554d74_story.html

Start-Up 1248 Gets Backing From Heavyweight

Currently, 1248 is creating an open standard to allow IoT applications and services to work together as part of a project funded by the U.K. government’s Technology Strategy Board. The U.K.’s tech industry recently received a boost from Prime Minister David Cameron, who has pledged more than £45m to help companies develop IoT technology.

http://www.connectedworldmag.com/latestNews.aspx?id=NEWS140325170804173

85% of the public sector is unprepared for the impact of wearable technology on its IT infrastructure

A Freedom of Information (FOI) request by Ipswitch has revealed that when asked specifically about managing wearable technology entering the workplace – from Google Glass to smart watches – an overwhelming 85% of public sector organisations (PSOs) admitted to having no plan in place.

The request revealed that despite 93% of PSOs having invested in network management tools, less than a quarter (23%) bother to review network performance regularly during office hours.

It also found that, despite the rich feature set offered by these tools, almost two-thirds (65%) of PSOs across the UK are unable to differentiate between wired and wireless devices on their network.

Finally, even though performance was cited as a key priority by 87% of PSOs, only just over a third (34%) review network performance on a weekly basis or less frequently.  One in eight (12%) of PSOs admit to not reviewing network performance at all.

http://www.information-age.com/technology/data-centre-and-it-infrastructure/123457858/85-public-sector-unprepared-impact-wearable-technology-its-it-infrastructure#sthash.2QApmN1c.dpuf

Internet of Things Enables $3.88 Trillion in Potential Value to Manufacturers

ARC Advisory Group believes that the emerging Internet of Things (IoT) will offer value across multiple industrial sectors and applications. Cisco expands on this, using the term, “Internet of Everything” (IoE) to describe its vision of bringing people, process, and data together via the Internet of Things. The company predicts that the IoE could enable manufacturers to generate $3.88 trillion of value through a combination of increased revenues and lower costs over the next ten years.

http://www.ien.eu/article/internet-of-things-enables-388-trillion-in-potential-value-to-manufacturers/

Australian “Internet-of-Things at home” market to hit $1 billion by 2017: Telsyte

The Australian smart-home automation market is set to reach almost $1 billion by 2017 making the connected home a reality for many Australians.

A new study from the technology analyst firm Telsyte has revealed the home automation market will generate $160 million in device revenues in 2014.

This is expected to grow to $917 million by 2017.

http://www.arnnet.com.au/article/541779/australian_internet-of things_home_market_hit_1_billion_by_2017_telsyte_/

Identiv Raises $20M to Deliver Trust to the Internet of Things

FREMONT, Calif., March 31, 2014 (GLOBE NEWSWIRE) — Identiv (Nasdaq:INVE) today announced that it has entered into a $20.0 million term loan and line-of-credit agreement through Opus Bank’s Technology Banking Division. The proceeds of the transaction will be used to retire existing debt and enhance liquidity, creating a stronger financial platform to accelerate growth.

“We are now focusing on delivering trust solutions for the rapidly expanding connected world,” said Jason Hart, CEO of Identiv. “Our ‘Trust Your World’ vision is applicable to billions of everyday items that demand to be trusted. We are expanding from a strong base, having shipped product for well over 100 million everyday items in 2013.”

“We could not be more excited about contributing to this shift in the industry. There is so much potential in the emerging Internet of Things market and in connecting everyday items,” said Kevin McBride, Senior Managing Director and Head of Opus Bank’s Technology Banking Division. “Identiv is a clear market leader with its strong capabilities and core competencies, and we are proud to be joining Identiv on this journey.”

http://online.wsj.com/article/PR-CO-20140331-912568.html

The challenge of the Internet of Things in the workplace and data centre

Peter Wood, ISACA member and CEO of penetration testers First Base Technologies said that as devices are often built small and cheap, they have little in the way of authentication and encryption built-in. It works in the background, but an educated attacker could leverage devices to get access to the rest of the network.

“As we get smart buildings with connected heating, ventilation and air conditioning it is not unfeasible for an attacker to switch off the air conditioning in the data centre, or to turn up the heating so everyone has to leave,” he said.

“The challenge is all of these devices can talk to each other. A smart building will have servers to address all of the endpoints, but it is not difficult for an attacker to impersonate that server or take the devices over completely. In the Far East you will see smart cities sooner rather than later.”

http://www.itproportal.com/2014/04/02/challenge-internet-things-workplace-and-data-centre/#ixzz2xmSLh0YL

Keeping track of athletes with wearable tech

Sports fans among us have seen the proliferation of wearable GPS devices in professional sports such as AFL and the rugby codes, where tracking devices are worn between the shoulder blades of the athletes.

And it is not limited to the professionals, as any Lycra-clad weekend cyclist with a GPS-enabled smartphone will tell you.

By tracking athletes and measuring heart rates it is possible to monitor fatigue, track player movements in relation to each other, plan team strategies and improve training.

The next revolution is to make it all possible indoors and under stadium roofs, and with the new CSIRO indoor tracking system the future is already upon us.

With the addition of the CSIRO wireless ad-hoc system for positioning (WASP) technology, these parameters can all be measured under the roof of the Docklands stadium, in ice hockey rinks, netball centres and indoor velodromes. The device, called ClearSky, is produced by Victorian company Catapult Sports which supplies GPS devices to the international elite sports market, including the US National Football League (NFL) and European football leagues.

http://phys.org/news/2014-03-track-athletes-wearable-tech.html#jCp
Bluetooth Smart charts course to widespread IoT adoption

We expect to see full or nearly full adoption in 2014 and beyond as wellness and wearable use continues to expand. From a market leadership standpoint, three vendors have quickly broken from the pack in the Bluetooth Smart race. In products released in 2013, we found Qualcomm, Broadcom, and MediaTek as the providers of this technology in nearly 90 per cent of devices we analysed. Where Qualcomm and Broadcom are primarily seen in the global who’s who of mobile devices, MediaTek has ramped up quickly and is seen in the leading Chinese devices.

http://www.eetasia.com/ART_8800696690_499488_NT_ec0157eb.HTM

Wearable technology, beacon, augmented reality, and content

Wearable technology was one of the big topics at SXSW 2014. A lot of the panelists spoke about the importance of brands to deliver valuable & unique content and wearable technology will allow for that, although, brands will need to be careful not to overload fans/consumers with content.

Contextual interaction is one of the main drivers of wearable technology and brands will be able to differentiate themselves by the way they connect with consumers through story telling.

http://blogs.imediaconnection.com/blog/2014/03/31/wearable-technology-beacon-augmented-reality-and-content/

Fraunhofer Designs Flexible Energy Harvesters for Wearable Tech

The development of wearable electronics demands new types of power sources that are flexible and compact enough to fit into these devices. Researchers at the Fraunhofer Institute in Germany are working on this problem with the design of a flexible energy harvester that can be manufactured through a low-cost printing process.

The FP7 MATFLEXEND project at the Fraunhofer-Institut für Zuverlässigkeit und Mikrointegration (IZM) is developing harvesters that convert mechanical deformation into energy by using a capacitive converter exploiting a capacitor’s deformation, according to information about the project on the institute’s website.

http://www.designnews.com/author.asp?section_id=1386&doc_id=272547&itc=dn_analysis_element&dfpPParams=ind_184,industry_alt,aid_272547&dfpLayout=blog

ADI Engineering Announces White Oak Canyon Gateway as part of the Intel® Gateway Solution for the Internet of Things (IoT)

ADI Engineering, Inc. today announced availability of its “White Oak Canyon” IoT gateway based on the new Intel® Quark™ SoC X1000. With an integrated and prevalidated IoT software solution from Wind River and McAfee, White Oak Canyon is available from ADI Engineering as a turnkey production solution to OEMs adopting the Intel® Gateway for the Internet of Things (IoT). ADI also supplies the White Oak Canyon hardware by itself to customers preferring to provide their own software. Shipments of White Oak Canyon commence in early May, and detailed product information can be found on ADI’s website.

With its comprehensive communications and security solution from Wind River and McAfee, White Oak Canyon provides seamless connectivity between devices and the cloud, ensuring interoperability of edge devices through an open architecture enabling rapid application and service differentiation. The White Oak Canyon hardware platform provides a full suite of highly integrated features, including the 400 MHz Intel Quark SoC X1000, 2x 10/100Mb Ethernet ports, 1GB DDR3 memory, integrated wireless including ZigBee, 2G/3G, Bluetooth, and Wi-Fi, TPM, analog inputs and digital I/O, optional 1- and 3-phase AC power measurement, RS-232, isolated RS-485, USB, and MicroSD.

http://www.fortmilltimes.com/2014/04/01/3389443/adi-engineering-announces-white.html
‘Internet of things’ will significantly alter supply chains

Michael Burkett, managing vice president at Gartner, said: “It’s important to put IoT maturity into perspective, because of the fast pace at which it is emerging, so supply chain strategists need to be looking at its potential now.

“Some IoT devices are more mature, such as commercial telematics now used in trucking fleets to improve logistics efficiency. Some, such as smart fabrics that use sensors within clothing and industrial fabrics to monitor human health or manufacturing processes, are just emerging.”

http://www.supplymanagement.com/news/2014/internet-of-things-will-significantly-alter-supply-chains

How the Internet of Things is Keeping Trains on Track

Despite 200 years of development, train accidents are still a cause for concern in the rail industry, but now sensor technologies are helping make things safer.

InSync Releases iApp Cobalt Platform for the Internet of Things and RFID Applications

InSync Software, Inc., the leading provider of enterprise IoT and RFID software, today announced the availability of its iApp application platform, Cobalt Release 5. InSync’s award-winning iApp platform powers Fortune 500 companies’ RFID, GPS and sensor-driven asset tracking and management applications, helping these businesses locate and track mission-critical assets and improve efficiency in operations.

Read more: http://www.digitaljournal.com/pr/1822349#ixzz2xegf5iiO

Intel to turn Dublin into world’s first ‘internet of things’ city

Almost a week after revealing a US$5bn investment in Ireland, chip giant Intel is embarking on a plan with Dublin City Council to make Dublin the most densely sensored city in the world.

The project to make Dublin a ‘Global Demonstrator for Smart City Sensors’ will use Intel Quark-based Gateway platforms.

Two hundred of these sensing gateways will be placed around Dublin City to gather and monitor environmental data, in particular noise and air quality. Each of these gateways can deploy up to six sensors.

http://www.siliconrepublic.com/innovation/item/36336-intel-to-turn-dublin-into/
WEARABLE TECH: THE SURVEILLANCE GRID OF THE FUTURE

In the near future, consumers will be adorning themselves with wearable technology that will weave an incredibly detailed picture of their lives. A cloud of information will float around you with details on sleep habits, what you ate for breakfast, who you are meeting for dinner, your heart rate, and much more. Insurance companies will likely harvest this data to adjust your rates. Governments will undoubtedly hack into this cloud of personal data to track down dissidents. Marketers will have access to a goldmine of personalized information that will be used to market products.

http://www.blacklistednews.com/Wearable_Tech%3A_The_Surveillance_Grid_Of_The_Future/34141/0/38/38/Y/M.html

Internet of Things: Mitigating the Risk

Tony Sager, a 30-plus-year National Security Agency information assurance expert, has a new mission: to identify ways to help mitigate the cyberthreats posed by the Internet of Things, those billions upon billions of unmanned devices connected to the Internet.

Since his retirement in 2012 as chief operating officer of the NSA’s information assurance directorate, Sager has focused on getting organizations to adopt cybersecurity best practices. More recently, he has begun to look at the vulnerabilities presented by the Internet of Things as the chief technologist of the Council on Cybersecurity, a not-for-profit group that promotes practices to assure a safe and open Internet.

http://www.bankinfosecurity.com/blogs/internet-things-mitigating-risk-p-1647

Source: http://mattceni.com/2014/04/06/internet-of-things-in-the-news-week-of-331-44/
