Tag Archives: Security

Gartner Identifies the Top 10 Internet of Things Technologies

24 Feb

Gartner, Inc. has highlighted the top 10 Internet of Things (IoT) technologies that should be on every organization’s radar through the next two years.

“The IoT demands an extensive range of new technologies and skills that many organizations have yet to master,” said Nick Jones, vice president and distinguished analyst at Gartner. “A recurring theme in the IoT space is the immaturity of technologies and services and of the vendors providing them. Architecting for this immaturity and managing the risk it creates will be a key challenge for organizations exploiting the IoT. In many technology areas, lack of skills will also pose significant challenges.”

The technologies and principles of IoT will have a very broad impact on organizations, affecting business strategy, risk management and a wide range of technical areas such as architecture and network design. The top 10 IoT technologies for 2017 and 2018 are:

IoT Security

The IoT introduces a wide range of new security risks and challenges to the IoT devices themselves, their platforms and operating systems, their communications, and even the systems to which they’re connected. Security technologies will be required to protect IoT devices and platforms from both information attacks and physical tampering, to encrypt their communications, and to address new challenges such as impersonating “things” or denial-of-sleep attacks that drain batteries. IoT security will be complicated by the fact that many “things” use simple processors and operating systems that may not support sophisticated security approaches.

“Experienced IoT security specialists are scarce, and security solutions are currently fragmented and involve multiple vendors,” said Mr. Jones. “New threats will emerge through 2021 as hackers find new ways to attack IoT devices and protocols, so long-lived ‘things’ may need updatable hardware and software to adapt during their life span.”
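As a concrete illustration of the denial-of-sleep problem mentioned above, one common mitigation is to cap how often a device will let itself be woken. This is a toy sketch under invented parameters, not a description of any real device's firmware:

```python
import time

# Toy mitigation for a "denial-of-sleep" attack: the device refuses to
# service more than N radio wake-ups per hour, so a flood of bogus wake
# requests cannot drain the battery. All parameters are illustrative.

class WakeBudget:
    def __init__(self, max_wakeups_per_hour=60):
        self.max = max_wakeups_per_hour
        self.window_start = 0.0
        self.count = 0

    def allow(self, now=None):
        now = now if now is not None else time.time()
        if now - self.window_start >= 3600:
            # New hour: reset the budget window.
            self.window_start, self.count = now, 0
        if self.count < self.max:
            self.count += 1
            return True
        return False  # over budget: stay asleep, ignore the request

budget = WakeBudget(max_wakeups_per_hour=2)
print([budget.allow(now=t) for t in (0, 1, 2)])  # third request is refused
```

The trade-off, of course, is that a legitimate burst of traffic is also throttled, which is exactly the kind of design tension the simple processors in many "things" force on developers.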

IoT Analytics

IoT business models will exploit the information collected by “things” in many ways — for example, to understand customer behavior, to deliver services, to improve products, and to identify and intercept business moments. However, IoT demands new analytic approaches. New analytic tools and algorithms are needed now, but as data volumes increase through 2021, the needs of the IoT may diverge further from traditional analytics.

IoT Device (Thing) Management

Long-lived nontrivial “things” will require management and monitoring. This includes device monitoring, firmware and software updates, diagnostics, crash analysis and reporting, physical management, and security management. The IoT also brings new problems of scale to the management task. Tools must be capable of managing and monitoring thousands and perhaps even millions of devices.

Low-Power, Short-Range IoT Networks

Selecting a wireless network for an IoT device involves balancing many conflicting requirements, such as range, battery life, bandwidth, density, endpoint cost and operational cost. Low-power, short-range networks will dominate wireless IoT connectivity through 2025, far outnumbering connections using wide-area IoT networks. However, commercial and technical trade-offs mean that many solutions will coexist, with no single dominant winner and clusters emerging around certain technologies, applications and vendor ecosystems.

Low-Power, Wide-Area Networks

Traditional cellular networks don’t deliver a good combination of technical features and operational cost for those IoT applications that need wide-area coverage combined with relatively low bandwidth, good battery life, low hardware and operating cost, and high connection density. The long-term goal of a wide-area IoT network is to deliver data rates from hundreds of bits per second (bps) to tens of kilobits per second (kbps) with nationwide coverage, a battery life of up to 10 years, an endpoint hardware cost of around $5, and support for hundreds of thousands of devices connected to a base station or its equivalent. The first low-power wide-area networks (LPWANs) were based on proprietary technologies, but in the long term emerging standards such as Narrowband IoT (NB-IoT) will likely dominate this space.
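The 10-year battery-life goal is easy to sanity-check with back-of-the-envelope arithmetic. The figures below (battery size, sleep and transmit currents, daily airtime) are illustrative assumptions, not drawn from the article:

```python
# Rough battery-life estimate for a hypothetical LPWAN endpoint.
# All figures are illustrative assumptions, not vendor specifications.

BATTERY_MAH = 2400          # e.g., a pair of lithium primary cells
SLEEP_UA = 5                # sleep current in microamps
TX_MA = 130                 # transmit current in milliamps
TX_SECONDS_PER_DAY = 20     # total radio airtime per day

def battery_life_years(battery_mah, sleep_ua, tx_ma, tx_s_per_day):
    """Average current draw -> runtime in years (ignores self-discharge)."""
    sleep_ma = sleep_ua / 1000.0
    avg_ma = sleep_ma + tx_ma * (tx_s_per_day / 86400.0)
    hours = battery_mah / avg_ma
    return hours / (24 * 365)

print(round(battery_life_years(BATTERY_MAH, SLEEP_UA, TX_MA, TX_SECONDS_PER_DAY), 1))
# -> 7.8 (years, under these assumed figures)
```

Even with aggressive sleep currents, a few tens of seconds of daily airtime dominates the budget, which is why LPWAN protocols are designed around very short, infrequent transmissions.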

IoT Processors

The processors and architectures used by IoT devices define many of their capabilities: whether they can support strong security and encryption, how much power they consume, and whether they are sophisticated enough to support an operating system, updatable firmware, and embedded device management agents. As with all hardware design, there are complex trade-offs between features, hardware cost, software cost, software upgradability and so on. As a result, understanding the implications of processor choices will demand deep technical skills.

IoT Operating Systems

Traditional operating systems (OSs) such as Windows and iOS were not designed for IoT applications. They consume too much power, need fast processors, and in some cases, lack features such as guaranteed real-time response. They also have too large a memory footprint for small devices and may not support the chips that IoT developers use. Consequently, a wide range of IoT-specific operating systems has been developed to suit many different hardware footprints and feature needs.

Event Stream Processing

Some IoT applications will generate extremely high data rates that must be analyzed in real time. Systems creating tens of thousands of events per second are common, and millions of events per second can occur in some telecom and telemetry situations. To address such requirements, distributed stream computing platforms (DSCPs) have emerged. They typically use parallel architectures to process very high-rate data streams to perform tasks such as real-time analytics and pattern identification.
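A single-process sketch of the windowed counting such platforms perform at scale might look like the following; the window length and alert threshold are arbitrary, and a real DSCP would distribute this work across many machines:

```python
from collections import defaultdict, deque
import time

class SlidingWindowCounter:
    """Counts events per key over a rolling time window -- a toy stand-in
    for the real-time pattern detection a distributed stream platform does."""

    def __init__(self, window_seconds=60, alert_threshold=1000):
        self.window = window_seconds
        self.threshold = alert_threshold
        self.events = defaultdict(deque)  # key -> deque of timestamps

    def observe(self, key, ts=None):
        ts = ts if ts is not None else time.time()
        q = self.events[key]
        q.append(ts)
        # Evict timestamps that have fallen out of the window.
        while q and q[0] < ts - self.window:
            q.popleft()
        return len(q) >= self.threshold  # True -> raise an alert

counter = SlidingWindowCounter(window_seconds=10, alert_threshold=3)
alerts = [counter.observe("sensor-42", ts) for ts in (0, 1, 2)]
print(alerts)  # [False, False, True] -- third event crosses the threshold
```

At millions of events per second this per-key state is what gets partitioned across a cluster, but the window-and-threshold logic is essentially the same.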

IoT Platforms

IoT platforms bundle many of the infrastructure components of an IoT system into a single product. The services provided by such platforms fall into three main categories: (1) low-level device control and operations such as communications, device monitoring and management, security, and firmware updates; (2) IoT data acquisition, transformation and management; and (3) IoT application development, including event-driven logic, application programming, visualization, analytics and adapters to connect to enterprise systems.

IoT Standards and Ecosystems

Although ecosystems and standards aren’t precisely technologies, most eventually materialize as application programming interfaces (APIs). Standards and their associated APIs will be essential because IoT devices will need to interoperate and communicate, and many IoT business models will rely on sharing data between multiple devices and organizations.

Many IoT ecosystems will emerge, and commercial and technical battles between these ecosystems will dominate areas such as the smart home, the smart city and healthcare. Organizations creating products may have to develop variants to support multiple standards or ecosystems and be prepared to update products during their life span as the standards evolve and new standards and related APIs emerge.

More detailed analysis is available for Gartner clients in the report “Top 10 IoT Technologies for 2017 and 2018.” This report is part of the Gartner Special Report “The Internet of Things,” which looks at the necessary steps to building and rolling out an IoT strategy.

Source: http://www.fintech.finance/featured/gartner-identifies-the-top-10-internet-of-things-technologies/


Power-Grid Hacked? Where’s the IoT?

1 Apr

Writing about the IoT (Internet of Things), or what was once called M2M, is something people want to read about, a lot. It’s only recently that people are really catching on that everything is going to be connected. So when an article about the smart grid appeared on the front page of the USA Today, stating that the grid was open to hack, it certainly drew a chuckle or two, especially from IoT advocates. No offense to my colleagues at the USA Today, but the nationally syndicated newspaper chain covered the topic as if the fact that these vulnerabilities could threaten lives were a breaking news story.

Ironically, there are days people talk about the IoT as if it is something brand spanking new. Today newspapers and the broadcast news eagerly espouse the virtues of connected devices because there are apps or gadgets for just about everything imaginable in the IoT. We are now seeing a consumer frenzy surrounding smartphones, fitness trackers, lights, toasters, automobiles, and even baby bottles being connected.

Many people are just beginning to understand that the IoT is more than connecting a computer to the Internet, surfing the Web, or watching a YouTube video. To really understand the Internet of Things is to recognize it is more than the everyday consumer gadgets that are getting all the media play these days. As the USA Today so eloquently pointed out, the power grid is under attack every day, and, as the author stated so clearly, at any moment an attack could leave millions of people without power for days or weeks. And that’s not even the worst of what could happen. Most residents do not equate the average brownout they experience for a few hours with the blackout that could be on the horizon in their neighborhood.

But again, most people don’t give the IoT much thought. It’s kind of like their cellphones. Most people don’t know how they work. Nor do they care. They only care that they work when and where they need them. The same holds true for their connected gadgets. Most consumers really don’t give their connected gadgets much thought until they need them for tracking their fitness, turning on their lights or thermostats, or finding the closest fast-food restaurant when traveling in their cars. However, as more and more consumers adopt and adapt to electronic devices as part of their everyday lifestyle, their attitudes and perceptions will change forever, and the excitement for connected devices will trickle over into the enterprise. It is already happening with smart cities, with parking meters, trash pickups, snow removal, first responders, and smart utility meters.

Perhaps that is why the USA Today story has some real significance now and enterprise companies are starting to move away from just talking about the IoT to figuring out ways to implement solutions and services.

Part of the problem with the grid today is that it was designed around OMS (outage-management systems) that were configured to react to signals indicating outages and to manage restoration. Going forward, however, the IoT systems being designed are able to prevent outages and restore services. These services, as one analyst firm says, could lead to a very bright future for the smart grid, and as a result, projections based on these services make sense and are very tangible.

While enterprises are looking to adopt the IoT, the lines between actual growth and hyperbole in market estimates are blurring. Vendors want to make huge growth predictions (50 billion devices is currently the buzz of the industry). However, market amplifications this enormous have already shown they can stall growth rather than spur it.

Corporate America seeks growth forecasts that are meaningful and that help deliver solutions to improve bottom-line results and shareholder value. Again, one network carrier’s conjecture that the number of connections could quadruple by 2020, reaching more than 5 billion, doesn’t mean anything if all of these devices and connections can be hacked and CEOs’ heads are on the chopping block.

The same carrier was even quoted as saying in order for the IoT to reach these prognostications, networks must be reliable, the data from all of these connected endpoints must be able to be stored reliably and securely, infrastructures must be secure, and there must be ways to achieve device management.

If all the stars are in alignment, there is no question the IoT is poised for growth. But, that means everyone has to focus on making security a top priority to fend off the bad guys and to consider the market unknowns that can slow or delay IoT development.

That’s why the formation of groups like the ITA (Illinois Technology Assn., www.illinoistech.org) Internet of Things Council, a public/private partnership that aims to assure civic leadership in the Internet of Things, will help companies separate the facts from the fiction and jumpstart their initiatives.

Thus, it’s no wonder that the more the industry gazes into its crystal ball, the more it does a disservice to the IoT’s true potential. Even Federal Energy Regulatory Commission Chairwoman Cheryl LaFleur was pretty poignant in her remarks when she was quoted in the USA Today article referring to the potential of an attack: “One is too many, so that’s why we have to pay attention. The threats continue to evolve and we have to continue to evolve as well.”

Makes you wonder if the industry is evolving or just continuing to bandy about forecasts with little or no regard for living up to market or shareholding expectations much like it has for the past 15 years. Regardless of what you believe in all of this, the IoT is changing our lives one way or the other and it will certainly have an even greater impact on each and every business. How and when, those are the billion dollar questions.

Source: http://connectedworld.com/power-grid-hacked-wheres-the-iot/

Hacked vs. Hackers: Game On

3 Dec
SAN FRANCISCO — Paul Kocher, one of the country’s leading cryptographers, says he thinks the explanation for the world’s dismal state of digital security may lie in two charts.

One shows the number of airplane deaths per miles flown, which decreased to one-thousandth of what it was in 1945 with the advent of the Federal Aviation Administration in 1958 and stricter security and maintenance protocols. The other, which details the number of new computer security threats, shows the opposite. There has been more than a 10,000-fold increase in the number of new digital threats over the last 12 years.

The problem, Mr. Kocher and security experts reason, is a lack of liability and urgency. The Internet is still largely held together with Band-Aid fixes. Computer security is not well regulated, even as enormous amounts of private, medical and financial data and the nation’s computerized critical infrastructure — oil pipelines, railroad tracks, water treatment facilities and the power grid — move online.

After a year of record-setting hacking incidents, companies and consumers are finally learning how to defend themselves and are altering how they approach computer security.

If a stunning number of airplanes in the United States crashed tomorrow, there would be investigations, lawsuits and a cutback in air travel, and the airlines’ stock prices would most likely plummet. That has not been true for hacking attacks, which surged 62 percent last year, according to the security company Symantec. As for long-term consequences, Home Depot, which suffered the worst security breach of any retailer in history this year, has seen its stock float to a high point.

In a speech two years ago, Leon E. Panetta, the former defense secretary, predicted it would take a “cyber-Pearl Harbor” — a crippling attack that would cause physical destruction and loss of life — to wake up the nation to the vulnerabilities in its computer systems.

No such attack has occurred. Nonetheless, at every level, there has been an awakening that the threats are real and growing worse, and that the prevailing “patch and pray” approach to computer security simply will not do.

So what happened?

The Wake-Up Call

A bleak recap: In the last two years, breaches have hit the White House, the State Department, the top federal intelligence agency, the largest American bank, the top hospital operator, energy companies, retailers and even the Postal Service. In nearly every case, by the time the victims noticed that hackers were inside their systems, their most sensitive government secrets, trade secrets and customer data had already left the building. And in just the last week Sony Pictures Entertainment had to take computer systems offline because of an aggressive attack on its network.

The impact on consumers has been vast. Last year, over 552 million people had their identities stolen, according to Symantec, and nearly 25,000 Americans had sensitive health information compromised — every day — according to the Department of Health and Human Services. Over half of Americans, including President Obama, had to have their credit cards replaced at least once because of a breach, according to the Ponemon Institute, an independent research organization.


And this year, American companies learned it was not just Beijing they were up against. Thanks to revelations by the former intelligence agency contractor Edward J. Snowden, companies worry about protecting their networks from their own government. If the tech sector cannot persuade foreign customers that their data is safe from the National Security Agency, the tech industry analysis firm Forrester Research predicts that America’s cloud computing industry stands to lose $180 billion — a quarter of its current revenue — over the next two years to competitors abroad.

“People are finally realizing that we have a problem that most had not thought about before,” said Peter G. Neumann, a computer security pioneer at SRI International, the Silicon Valley engineering research laboratory. “We may have finally reached a crossroads.”

Is There a Playbook?

Only certain kinds of companies, like hospitals and banks, are held up to scrutiny by government regulators when they are hacked. And legal liability hasn’t been established in the courts, though Target faces dozens of lawsuits related to a hack of that company’s computer network a little over a year ago.

But if there is a silver lining to the current predicament, Mr. Neumann and other security experts say, it is that computer security, long an afterthought, has been forced into the national consciousness.



Customers, particularly those abroad, are demanding greater privacy protections. Corporations are elevating security experts to senior roles and increasing their budgets. At Facebook, the former mantra “move fast and break things” has been replaced. It is now “move slowly and fix things.” Companies in various sectors have started informal information-sharing groups for computer security. And President Obama recently called on Congress to pass a national data breach law to provide “one clear national standard” rather than the current patchwork of state laws that dictate how companies should respond to data breaches.

There is growing recognition that there is no silver bullet. Firewalls and antivirus software alone cannot keep hackers out, so corporations are beginning to take a more layered approach to data protection. Major retailers have pledged to adopt more secure payment schemes by the end of next year. Banks are making it easier for customers to monitor their monthly statements for identity theft. And suddenly, pie-in-the-sky ideas that languished in research labs for years are being evaluated by American hardware makers for use in future products.



“People are recognizing that existing technologies aren’t working,” said Richard A. Clarke, the first cybersecurity czar at the White House. “It’s almost impossible to think of a company that hasn’t been hacked — the Pentagon’s secret network, the White House, JPMorgan — it is pretty obvious that prevention and detection technologies are broken.”

Companies that continue to rely on prevention and detection technologies like firewalls and antivirus products are considered sitting ducks.

“People are still dealing with this problem in a technical way, not a strategic way,” said Scott Borg, the head of the United States Cyber Consequences Unit, a nonprofit organization. “People are not thinking about who would attack us, what their motives would be, what they would try to do. The focus on the technology is allowing these people to be blindsided.

“They are looking obsessively at new penetrations,” Mr. Borg said. “But once someone is inside, they can carry on for months unnoticed.”

The Keys to Preparation

The companies most prepared for online attacks, Mr. Borg and other experts say, are those that have identified their most valuable assets, like a university’s groundbreaking research, a multinational’s acquisition strategy, Boeing’s blueprints to the next generation of stealth bomber or Target’s customer data. Those companies take additional steps to protect that data by isolating it from the rest of their networks and encrypting it.

That approach — what the N.S.A. has termed “defense in depth” — is slowly being adopted by the private sector. Now, in addition to firewalls and antivirus products, companies are incorporating breach detection plans, more secure authentication schemes, technologies that “white list” traffic and allow in only what is known to be good, encryption and the like.
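The “white list” idea can be sketched in a few lines: default-deny, with an explicit list of known-good destinations. The addresses and ports below are invented for illustration:

```python
# Minimal sketch of traffic white-listing: permit only flows that are
# known to be good, instead of blocking only flows known to be bad.
# The rule set below is invented for illustration.

ALLOWED = {
    ("10.0.5.12", 443),   # payment gateway
    ("10.0.5.20", 5432),  # internal database
}

def permit(dest_ip: str, dest_port: int) -> bool:
    """Default-deny: a flow passes only if it matches an allowed pair."""
    return (dest_ip, dest_port) in ALLOWED

print(permit("10.0.5.12", 443))    # known-good destination -> True
print(permit("198.51.100.7", 80))  # anything not listed -> False
```

The inversion matters: an antivirus-style blacklist must enumerate every bad thing, while a white list only has to enumerate the (much smaller) set of legitimate destinations.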



“We’re slowly getting combinations of new technologies that deal with this problem,” Mr. Clarke said.

The most prominent examples are Google, Yahoo, Microsoft and Facebook. Mr. Snowden revealed that the N.S.A. might have been grabbing data from those companies in unencrypted form as it passed between their respective data centers. Now, they all encrypt their traffic as it flows internally between their own data centers.

Though intelligence analysts may disagree, security experts say all of this is a step in the right direction. But security experts acknowledge that even the most advanced security defenses can break down. A widely used technology sold by FireEye, one of the market leaders in breach detection, failed to detect malicious code in an independent lab test this year. The product successfully identified 93 percent of the threats, but as the testers pointed out, it is not the 93 percent of detected threats that matter. It is the 7 percent that are missed that allow hackers to pull off a heist.

Even when security technologies do as advertised, companies are still missing the alerts. Six months before Target was breached last year, it installed a $1.6 million FireEye intrusion detection system. When hackers tripped the system, FireEye sounded alarms to the company’s security team in Bangalore, which flagged the alert for Target’s team at its headquarters in Minneapolis. Then nobody reacted until 40 million credit card numbers and information on 70 million more customers had been sent to computers in Russia, according to several investigators.

Part of the problem, security chiefs say, is “false positives,” the constant pinging of alerts anytime an employee enters a new database or downloads a risky app or email attachment. The result, they complain, is a depletion of resources and attention.

“We don’t need ‘big data.’ We need big information,” said Igor Baikalov, a former senior vice president for global information security at Bank of America, now chief scientist at Securonix, a private company that sells threat intelligence to businesses.

Securonix is part of a growing class of security start-ups, which includes Exabeam and Vectra Networks in Silicon Valley and several other companies that use the deluge of data from employee computers and personal devices to give security officers intelligence they can act on.

Many companies in the Fortune 500 are building their own systems that essentially do the same thing. These technologies correlate unusual activity across multiple locations, then raise an alarm if they start to look like a risk. For example, the technologies would increase the urgency of an alert if an employee suddenly downloaded large amounts of data from a database not regularly used, while simultaneously communicating with a computer in China.
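A toy version of that correlation logic might look like the following; the signal names, weights, and threshold are invented for illustration, not drawn from any vendor’s product:

```python
# Toy correlation of weak signals into one risk score, in the spirit of
# the systems described above. Names and weights are invented.

SIGNAL_WEIGHTS = {
    "bulk_download_unusual_db": 40,
    "connection_to_unfamiliar_country": 35,
    "off_hours_login": 15,
    "new_device": 10,
}

def risk_score(signals):
    """Sum the weights of observed signals; cap the score at 100."""
    return min(100, sum(SIGNAL_WEIGHTS.get(s, 0) for s in signals))

def urgency(signals, alert_at=50):
    score = risk_score(signals)
    return ("ALERT" if score >= alert_at else "log-only"), score

# One signal alone is just noise; two correlated signals trip the alarm.
print(urgency(["off_hours_login"]))                  # ('log-only', 15)
print(urgency(["bulk_download_unusual_db",
               "connection_to_unfamiliar_country"]))  # ('ALERT', 75)
```

Correlating before alerting is also one answer to the false-positive fatigue described below: individually noisy signals stay in the log until they co-occur.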

The future of security, experts say, won’t be based on digital walls and moats but on these kinds of newer data-driven approaches.

“Most large organizations have come to the painful recognition that they are already in some state of break-in today,” said Asheem Chandna, a venture capital investor at Greylock Partners. “They are realizing they need to put new and advanced sensors in their network that continuously monitor what is going on.”

While much progress is being made, security experts bemoan that there is still little to prevent hackers from breaking in in the first place.

In May, the F.B.I. led a crackdown on digital crime that resulted in 90 arrests, and Robert Anderson, one of the F.B.I.’s top officers on such cases, said the agency planned to take a more aggressive stance. “There is a philosophy change. If you are going to attack Americans, we are going to hold you accountable,” he said at a cybersecurity meeting in Washington.

Still, arrests of hackers are few and far between.

“If you look at an attacker’s expected benefit and expected risk, the equation is pretty good for them,” said Howard Shrobe, a computer scientist at the Massachusetts Institute of Technology. “Nothing is going to change until we can get their expected net gain close to zero or — God willing — in the negative.”

Until last year, Dr. Shrobe was a manager at the Defense Advanced Research Projects Agency, known as Darpa, overseeing the agency’s Clean Slate program, a multiproject “Do Over” for the computer security industry. The program included two separate but related projects. Their premise was to reconsider computing from the ground up and design new computer systems that are much harder to break into and that recover quickly when they have been breached.

“ ‘Patch and pray’ is not a strategic answer,” Dr. Shrobe said. “If that’s all you do, you’re going to drown.”

It’s 2014. Do you know where your data is?

5 May

As more and more businesses take advantage of the promise of big data, they’re also putting themselves at increased risk of a security breach. They can protect themselves and their data by centralizing it in one location.

It’s no secret that over the last year, companies have adopted and deployed big data architectures and analytics like never before. They’ve caught the big data bug and they’re using the insights gleaned from that data to anticipate, plan, and react to situations in real time. If your organization is doing this, then you’ve likely seen great results, but have you ever stopped to think about how secure your data and, in turn, your decisions are?

Decisions, changes and new technologies, like data analytics, are being implemented so quickly that legacy processes simply can’t keep up. The gap between new and legacy systems is often just wide enough for a security risk to slip through. Once these risks penetrate the gap, they’re moving at the speed of light themselves, diving into data and jumping back out before businesses even know what hit them. You don’t have to look far to see this happening – from recent and high-profile data breaches to stolen financial information, we’ve seen it all.

Herein lies our dilemma. How can we take full advantage of the volume, velocity and variety of data available today while ensuring the data remains secure?

Out with the old, in with the new

To make changes of any kind, you need to know what you’re working with. For the purposes of this exercise, that means knowing where your data is stored. Strike up a conversation with your security team and they’ll likely tell you that it lives in silos. Each business function, team, person, etc. has its own data silo and the security tools that are in place are responsible for protecting the perimeter of said silo.

It’s the typical “burglar alarm” approach to security, but we’re not dealing with old-school burglars. Today, these guys are savvy. From 17-year-olds who can write code capable of breaking through corporate strongholds to hacker groups bent on causing mayhem, it is absolutely critical that more than just the perimeter is secured. Once that perimeter is breached, the floodgates open and entrants gain access to everything from financial data to confidential emails, supply chain routes, disparate databases and more. Cue the chaos and a lot of late night phone calls.

So what’s the fix?


Data centralization: The new norm?

In order to best secure your data, it needs to be centralized.

As I mentioned on stage at Gigaom’s Structure Data conference in March, companies can be wary of taking this approach. They believe that housing their data in one centralized repository actually makes them a bigger target for hackers. In fact, it does quite the opposite, and it should be the new norm for all companies that are trying to take utmost advantage of their data. Let me explain.

By breaking down the data silos and centralizing your data, it’s possible to increase visibility across your organization, which is critical for those real-time insights and decisions that drive business success and establish competitive edge.

Data centralization enables businesses to protect each separate data point and element. Each data point can be tagged with specific attributes that regulate access by user, location, device, etc. That data can then be tracked as it travels in and out of your organization with the pre-set attributes attached to it. It’s like putting a password-protected GPS on your data.
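The tagging scheme described above can be sketched as attribute checks on each data point; the tag names, user attributes, and all-must-match policy below are invented for illustration:

```python
from dataclasses import dataclass, field

# Sketch of attribute-tagged data points: each record carries tags that
# regulate who may read it. Tag names and the policy are illustrative.

@dataclass
class DataPoint:
    value: str
    tags: dict = field(default_factory=dict)  # e.g. {"role": "finance"}

def can_access(user: dict, point: DataPoint) -> bool:
    """Every tag on the data point must be matched by the user's attributes."""
    return all(user.get(k) == v for k, v in point.tags.items())

record = DataPoint("Q3 revenue", tags={"role": "finance", "region": "EU"})
print(can_access({"role": "finance", "region": "EU"}, record))   # True
print(can_access({"role": "engineer", "region": "EU"}, record))  # False
```

Because the check travels with the data point rather than with the silo’s perimeter, access decisions keep working after the data moves, which is the point of the GPS analogy.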

Keep in mind that implementing this new process of data tagging and centralization isn’t an overnight process. It takes time and close collaboration between IT and the business units. Security teams need to work closely with employees across the business to provide the tools they need to publish and consume data while also ensuring it is protected, audited and logged. This collaboration is crucial because it is the employees who truly understand the value and the necessary security classification for each data point, file or folder.

Despite the work involved, data centralization has real and tangible benefits for the business. Businesses that adopt these models find the outcomes to be extremely valuable. It allows them to work closely with employees and break down organizational hierarchies, which leads to greater collaboration across the organization and, at the same time, secures data and reduces overall risk.


Source: http://gigaom.com/2014/05/04/its-2014-do-you-know-where-your-data-is/

Shedding Light on Dark Fiber

13 Mar

Dark Fiber

What is Dark Fiber?

Dark Fiber gives your company’s network a dedicated fiber optic connection; this connection offers virtually unlimited bandwidth, as capacity is determined solely by the equipment you place on the ends. Dense wavelength-division multiplexing (DWDM), an optical technique that carries many independent wavelengths of light over a single optical fiber, further supports this near-limitless bandwidth capacity. Currently, DWDM systems have a capacity of 8 terabits and growing!
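The 8-terabit figure is easy to sanity-check with one plausible channel plan; the 80-channel, 100 Gbps combination below is an assumption for illustration, not a quoted system spec:

```python
# Back-of-the-envelope check on the 8-terabit DWDM figure above,
# using one plausible (assumed) channel plan.

channels = 80           # DWDM wavelengths carried on one fiber
gbps_per_channel = 100  # line rate per wavelength

total_tbps = channels * gbps_per_channel / 1000
print(total_tbps)  # 8.0 terabits per second on a single fiber
```

Denser grids or higher per-wavelength line rates push this figure higher, which is why the capacity is described as “growing.”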

Frequently, Dark Fiber is sold on a per-pair or single-strand basis, dependent upon what your gear requires. Typically, the purchase of the network occurs via a long-term IRU (Indefeasible Rights of Use) agreement. Traditionally, this lease agreement was for 10- or 20-year terms; however, in recent years companies have begun purchasing on much shorter lease terms.

Benefits of Dark Fiber:

Any Service, Any Protocol, Any Bandwidth:  Dark Fiber is agnostic to the traffic and protocols you allow to traverse the network. It’s yours to use. You control your bandwidth, from 1 Mbps to speeds over 100 Gbps!  However, do be mindful of any distance limitations your protocol may have.

Reliability:  A premier, optimally designed and engineered Dark Fiber network will include redundant paths for diversity. For maximum diversity, multiple carrier networks may be utilized. Always ask for route maps to ensure carrier path diversity, and if you see paths that don’t make sense…ask questions.

Scalability:  The only limiting factor is the equipment you install; Dark Fiber itself is virtually unlimited in its capabilities. You can easily scale your network from 1 Gbps to 100 Gbps and beyond, simply by switching out your equipment.

Security:  Because you place the equipment at each termination point of your Dark Fiber network, you have full control over how you implement your security. With no public routers, switches or central offices in the path, your data stays on private infrastructure.

Flexibility:  The only constraints are the protocols you choose to run across the network and the volume the equipment installed on each end can support. When you lease your own private fiber connection, you control everything.

Purchase Options and Fixed Cost:  Dark Fiber leasing and purchase options provide flexibility for your organization’s financial planning. And because bandwidth is limitless, there is no concern about rising costs for additional bandwidth.

A Dark Fiber network provides a host of premier benefits to the end user. However, when deciding on a network solution it is important to keep in mind the management and support of that network. Unlike a lit solution, Dark Fiber requires in-house maintenance and upkeep of the network. To learn more about the differences between a lit and dark fiber solution, see our previous post.

Ultimately, when choosing a network solution, it is best to discuss your options with a service provider. Each organization will have different pain points and requirements that may or may not fit the scope of Dark Fiber connectivity. But certainly, if you are looking for limitless flexibility and unrivalled bandwidth, Dark Fiber can show you the light.


Source: http://sunlight.sunesys.com/2014/03/11/shedding-light-on-dark-fiber/

Cyber Security is Not Prepared for the Growth of Internet Connected Devices

3 Mar

The estimated growth of devices connected to the Internet is staggering.  Cisco estimates that 50 billion devices will be connected to the Internet by 2020; in contrast, only around 1 percent of devices are connected today. The sheer numbers, as well as the complexity of new types of devices, will be problematic. Although traditional computing devices such as personal computers, tablets and smartphones will increase, it is the Internet of Things (IoT) that will grow most significantly, to around 26 billion units. That represents nearly a 30-fold increase, according to Gartner.


The industry is in a vicious fight protecting current platforms, such as PCs, from malware and compromise. New malware is generated at a mind-boggling rate of roughly 200,000 unique samples each day. With the rise of smartphones and tablets, we are witnessing the fastest growth of malware in this sector and expect the complexity of attacks to increase. Security companies work tirelessly to keep up with the increasing pace.

But the wildcard in this equation will be the radical growth of IoT devices, which have different architectures, software and usages. Wearables, transportation systems and smart appliances will grow at an alarming rate. These represent challenges because they differ greatly from familiar computers, and longstanding security controls will need to be reworked or rethought entirely. The processes and tools currently in use by security organizations are not easily extensible to meet the new challenge. This will give attackers a diverse area to scrutinize for vulnerabilities and new opportunities to exploit for their gain.

Security resources across the industry are already stretched thin. It will be very difficult to adapt to the new scope, which requires new tools, expertise and ways of thinking. The security industry is not giving up and throwing in the towel just yet, but the challenge it faces is undeniable.

Product vendors can play an important role by designing and testing products with security in mind.  Such hardening techniques can reinforce both hardware and software to deny attackers opportunities for compromise. Hardware features, software capabilities, and security services must be designed to work together for maximum effect. This holistic strategy is necessary to establish a common front of cooperative defenses. Security services must look ahead and begin adapting to serve emerging form factors, supporting infrastructures, and user demands.

Perhaps most importantly, everyday users must begin to take responsibility for their own security. Users have a tremendous amount of control over their security and can strongly influence the industry by demanding proper embedded controls. User behaviors must shift to more reasonable actions.  Not every link must be clicked. Not every survey or request for personal information must be fulfilled. Not every application, including those from untrustworthy sources, must be installed. Socially, we must act with more discretion to protect our valuables.

Our world is changing quickly as a staggering number of interconnected devices meld into cyberspace. The security risks rise just as fast. We will face challenges, but it is up to all of us to determine how secure we will be.

Matthew Rosenquist is an information security strategist, with a passion for his chosen profession. Benefiting from nearly 20 years of experience in Fortune 100 corporations, he has thrived on establishing strategic organizations and capabilities which deliver cost effective information security services.

Source: https://communities.intel.com/community/itpeernetwork/blog/2014/02/08/cyber-security-is-not-prepared-for-the-growth-of-internet-connected-devices

NFC: It’s hiding in plain sight

21 Dec

The holiday shopping season is in full swing; we’ve survived Black Friday and Cyber Monday and are rapidly approaching the after-Christmas sales.

This year a key part of that shopping experience has involved mobile payments — the use of a mobile device such as a tablet or smartphone to pay for goods and services. What once was just a concept in the minds of technologists is fast becoming reality and, in the process, providing consumers and vendors alike with greater ease of payment and more efficient tracking.

As a result, the number of merchants accepting mobile payments continues to climb at a dramatic rate, as does the number of consumers trying mobile payments for the first time. The last thing either of these camps wants to worry about as they explore this brave new world of shopping is security. Luckily, a number of emerging technologies may now hold the key to making mobile payments much more secure.

One such technology, Near Field Communication, enables the transfer of data between devices like smartphones, chip cards (a card with an embedded, unique microchip that encrypts or “scrambles” user data, making it virtually impossible to copy) and other similar devices, by simply touching them together or bringing them close to one another (usually just a few centimeters). Unlike Bluetooth, NFC requires no pairing, which makes device authentication easier. Also, since NFC is very low power, a battery is not required in the device being read (e.g., the chip card).

With just a tap of your NFC-enabled smartphone or chip card against an NFC-enabled point-of-sale terminal, a merchant could easily take your payment and even identify things like your specific shopping preferences or apply a customer loyalty program reward. Companies such as Samsung and Visa are certainly working to promote this concept by making mobile payments through smartphones commonplace, but would it surprise anyone to know the technology is already in use today?

For a prime example, look no further than the closest McDonald’s; the chain has already installed contactless payments infrastructure in most of its POS systems worldwide, and now offers mobile contactless payments to NFC-enabled handsets.

Consumers not yet aware

In truth, many consumers currently have NFC technology embedded in their phones and credit cards and don’t even know it. That’s because the technology has advanced well ahead of its consumer awareness and usage models. Consumers are waking up to the technology and its benefits, however. With smaller, more energy-efficient NFC chips in development, and with the full gamut of handset manufacturers, POS terminal manufacturers and payments technology providers making the technology available to their customers, it’s only a matter of time before consumers everywhere will be equipped with NFC-enabled devices capable of interacting with other NFC infrastructure devices for the purposes of mobile payment.

One area where NFC will play a key role is as an enabler of contactless chip cards. Chip card technology comes in two variations. Contacted chip cards, also known as chip-and-pin cards, are slid into a slot in a POS terminal and require a personal ID number or secret numeric password for authentication. In contrast, contactless chip cards rely on NFC technology to securely exchange information. Both chip card variations promise to provide consumers with a mobile payment experience that is simple, quick and highly secure.

One reason chip cards offer better security is their use of dynamic authentication. Essentially, dynamic values are introduced into each transaction, reducing a criminal’s ability to use stolen payment card data. Even if criminals manage to get their hands on this data and create counterfeit cards, those cards would be unusable without the original card’s unique elements. By comparison, traditional magnetic stripe cards are relatively easy for thieves to duplicate.
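The idea behind dynamic authentication can be sketched as a keyed hash over the transaction data plus a per-card counter that increments on every use. This is a deliberately simplified illustration, not the actual EMV cryptogram (ARQC) algorithm, which derives session keys from issuer master keys using 3DES or AES; the key and merchant identifier below are invented:

```python
import hmac
import hashlib

# Simplified sketch of dynamic authentication (NOT the real EMV ARQC
# derivation). The card signs the transaction data together with a
# counter that changes every transaction, so a captured cryptogram
# cannot be replayed for a second purchase.

CARD_KEY = b"per-card-secret-key"   # hypothetical secret held in the chip

def cryptogram(amount_cents: int, merchant_id: str, counter: int) -> str:
    msg = f"{amount_cents}|{merchant_id}|{counter}".encode()
    return hmac.new(CARD_KEY, msg, hashlib.sha256).hexdigest()

c1 = cryptogram(1999, "MCD-4421", counter=41)
c2 = cryptogram(1999, "MCD-4421", counter=42)  # same purchase, next counter
print(c1 != c2)  # True: identical transaction data still yields a new value
```

A stolen copy of `c1` is useless at counter 42, which is exactly the property a skimmed magnetic stripe lacks.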

EMV is coming

Like NFC technology, chip cards are already in use today around the world. Just this year, Visa began supporting contactless chip card payments, though most consumers remain oblivious to this fact. Also, many of the POS systems through which those contactless payments would be made now feature a dual interface, meaning they can accept both contacted and contactless chip card devices. But the terminals are often shipped with this feature turned off because, up to now, security has not been a motivating factor for merchants.

What merchants do care about is complying with industry standards, and the deadline they face for that compliance in the United States is October 2015. That’s the date by which the payment industry must comply with new EMV (an acronym derived from Europay, MasterCard, Visa) standards or be forced to assume liability for fraudulent purchases.

EMV is a global standard governing the security and interoperability of chip-based payment cards, and it will be a formidable tool in combating the high rate of card-cloning fraud associated with current magnetic stripe technology. For merchants who adopt it earlier than October 2015 by deploying dual-interface POS terminals, the benefit will be not only safer customer transactions but also the possible elimination of the requirement to re-certify PCI validation with the payment card industry every year.

Any merchant accepting credit cards today is required to be in compliance with PCI standards, which ensure all payment terminals and companion devices contain the encryption technologies needed to provide the highest level of security for cardholder data. That puts added demands on POS terminal and companion device manufacturers, as well as suppliers of “the brains” into those systems, to continue developing the advanced technologies necessary to help ensure these systems achieve PCI compliance.

Mobile payments are now a reality and gaining traction with each passing day. Thanks to advanced technologies like NFC, chip card devices and compliance to evolving standards like EMV and PCI, today’s consumers can be more assured of the security and reliability of their mobile payments.

Learn more about contactless/NFC and security.

Source: http://www.mobilepaymentstoday.com/article/224907/NFC-It-s-hiding-in-plain-sight?utm_source=NetWorld%20Alliance&utm_medium=email&utm_campaign=EMNAMPT12182013

Top 10 Predictions for 2014

6 Dec

Cybersecurity in 2014: A roundup of predictions: ZDNet might have picked up that I have done this for the past two years, and Charles McLellan put together his own collection.  This is a good place to start, with lists from Symantec, Websense, FireEye, Fortinet and others.  Mobile malware, zero-days, encryption, the ‘Internet of Things,’ and a personal favorite, The Importance of DNS, are among the many predictions.

Eyes on the cloud: Six predictions for 2014: Kent Landry – Senior Consultant at Windstream focuses on Cloud futures in this Help Net Security piece.  Hybrid cloud, mobility and that pesky Internet of Everything make the list.

5 key information security predictions for 2014: InformationWeek has Tarun Kaura, Director, Technology Sales, Symantec discuss the coming enterprise threats for 2014.  Social Networking, targeted attacks, cloud and yet again, The Internet of Things finds a spot.

Top 10 Security Threat Predictions for 2014: This is essentially a slide show of Fortinet’s predictions on Channel Partners Telecom but good to review.  Android malware, increased encryption, and a couple botnet predictions are included.

2014 Cyber Security Forecast: Significant healthcare trends: HealthITSecurity drops some security trends for healthcare IT security professionals in 2014.  Interesting take on areas like standards, audit committees, malicious insiders and supply chain are detailed.

14 IT security predictions for 2014: RealBusiness covers 10 major security threats along with four ways in which defenses will evolve.  Botnets, BYOD, infrastructure attacks and of course, the Internet of Things.

4 Predictions for 2014 Networks: From EETimes, this short list looks at the carrier network concerns.   Mobile AAA, NFV, 5G and once again, the Internet of Things gets exposure.

8 cyber security predictions for 2014: InformationAge goes full cybercriminal with exploits, data destruction, weakest links along with some ‘offensive’ or retaliatory attack information.

Verizon’s 2014 tech predictions for the enterprise: Another ZDNet article covering the key trends Verizon believes will shape technology.  Interest includes the customer experience, IT decentralization, cloud and machine-to-machine solutions.

Research: 41 percent increasing IT security budget in 2014: While not a list of predictions, this article covers the findings of a recent Tech Pro Research survey focused on IT security.  The report, IT Security: Concerns, budgets, trends and plans, noted that 41 percent of survey respondents said they will increase their IT security budget next year.  Probably to counter all the dire predictions.

A lot to consider as you toast the new year with the Internet of Things making many lists.  The key is to examine your own business and determine your own risks for 2014 and tackle those first.

Source: http://www.zdnet.com/cybersecurity-in-2014-a-roundup-of-predictions-7000023729/

Mobile Fourth Wave: The Evolution of the Next Trillion Dollars

2 Sep
Smartphone image copyright Nik Merkulov 

We are entering the golden age of mobile. Mobile has become the most critical tool to enhance productivity and drive human ingenuity and technological growth. The global mobile market will reach $1.65 trillion in revenue this year, and over the next decade that number will more than double. If we segment the sources of this revenue, there will be a drastic shift over the course of the next 10 years. During the last decade, voice accounted for over 55 percent of total revenue, data access 17 percent, and over-the-top and digital services a mere three percent. Over the next decade, we expect mobile digital services to be the leading revenue-generating category for the industry, with approximately 30 percent of total revenue. Voice will represent less than 21 percent.

There is already a significant shift in revenue structures for many players. The traditional revenue curves of voice and messaging are declining in most markets. Mobile data access, while still in its infancy in many markets, is starting to face significant margin pressure. As such, the industry has to invest in building a healthy ecosystem on the back of the fourth wave — the OTT and digital services. The revenue generated on the fourth wave is going to be massive, but much more distributed than the previous curves. It will end up being a multi-trillion-dollar market in a matter of a decade — growing much faster and scaling to much greater heights than previous revenue curves.

Vodafone, one of the biggest mobile operators in the world, recently reported that voice and messaging revenue declined year over year in each of its 21 markets. In some markets, like Italy, even the data access segment suffered negative growth. More disturbing, the increase in access revenue didn’t offset the decline in voice and messaging revenue in any market; net revenue declined in every single one, no matter which geography it belonged to. Overall revenue fell by nine percent, despite data access revenue growing by eight percent, because the voice and messaging revenue streams suffered double-digit losses. Once access revenue starts to decline (and it is already happening to some operators), these companies will have to take drastic measures to attain growth. Investment and a clear strategy on the fourth wave will become even more urgent. They will have to find a way to become Digital Lifestyle Solution Providers.
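The arithmetic behind that result is worth making explicit: a small, growing segment cannot offset a large, shrinking one. The segment shares and growth rates below are assumed for illustration only, not Vodafone’s reported figures:

```python
# Assumed segment mix for illustration (not actual Vodafone numbers):
# voice/messaging still dominates revenue, so its double-digit decline
# swamps healthy growth in the smaller data-access segment.
segments = {
    # name: (share_of_total_revenue, year_over_year_growth)
    "voice_messaging": (0.70, -0.15),   # -15% on 70% of revenue
    "data_access":     (0.30, +0.08),   # +8% on 30% of revenue
}
net_change = sum(share * growth for share, growth in segments.values())
print(f"{net_change:+.1%}")  # -8.1%
```

For data growth to break even here, it would need to grow roughly 35 percent, which is why operators are pushed toward the fourth wave for new revenue.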


So, what is the mobile fourth wave, and who are the dominant players today? The fourth wave is not a single entity or a functional block like voice, messaging or data access, but is made up of dozens of new application areas, some of which have not even been dreamt up yet. As such, this portfolio of services requires a different skill set for both development and monetization. Another key difference in the competitive landscape is that the biggest competitors for these services (depending on the region) might not be another operator but the Internet players who are well funded, nimble and very ambitious. The services range from horizontal offerings such as mobile cloud; commerce and payments; security; analytics; and risk management to mobile being tightly integrated with the vertical industries such as retail, health, education, auto, home, energy and media. Mobile will change every vertical from the ground up, and that’s what will define the mobile fourth wave.

In the past, the Top 10 players by revenue were always mobile operators. If we take a look at the Top 10 players by revenue on the fourth wave, there are only five operators on the list. The Internet players like Apple, Google, Amazon, Starbucks and eBay are generating more revenue on this curve than some of the incumbent players. However, some of the operators like AT&T, KDDI, NTT DoCoMo, Telefonica and Verizon have been investing steadily on the fourth curve for some time. The two Japanese operators on the list have even started to report the digital revenue in their financials.

Just as data represents 50 percent or more of their overall revenue, we expect that, for some of these operators, digital will represent more than 50 percent of their data revenue within five years. Relatively smaller operators like Sprint, Turkcell, SingTel and Telstra are also investing in new service areas that will change how operators see their opportunities, competition and revenue streams.


This shift to digital has larger implications, as well. Countries with archaic labor laws that don’t afford companies the flexibility needed to be digital players are going to be at a disadvantage. It is one thing to have figured out the strategy and the areas to invest in, and it is completely another to execute with the focus and tenacity of an upstart. If companies are not able to assemble the right talents to pursue the virgin markets, someone else will. Such players will see decline in their revenues and become targets for M&A. Some of this is already evident in the European markets, which are also plagued by economic woes. Regulators will have a tough task ahead of them in evaluating some unconventional M&As in the coming years.

The shift to digital will also have an impact on the rest of the ecosystem. The infrastructure providers will have to develop expertise in services that can be sold in partnership with the operators. Device OEMs without a credible digital-services portfolio will find it hard to compete just on product or on price. The Internet players will have to form alliances to find distribution and scale. The emergence of the fourth wave is good news for startups. Instead of just looking toward Google or Apple, the exit route now includes the operator landscape, as well. In fact, some of the operators have been making strategic acquisitions in specific segments over the last few years — Telefonica acquired AxisMed, Brazil’s largest chronic-care management company; Verizon acquired Hughes Telematics; and SingTel acquired Amobee.

For any telecom operator looking to enter the digital realm, the strategic options and road map are fairly clear. First, it has to solidify and protect its core business and assets. A great broadband network is the table stakes to be considered a player in the digital ecosystem. Depending on the financial condition of the operator, the non-core assets should be slowly spun off or sold to potential buyers so that the company can squarely focus on preserving the core and on launching the digital business with full force. The digital business requires a portfolio management approach that requires a completely different mindset and skillset to navigate the competitive landscape.

The first three revenue growth curves have served the industry well, but now it is time for the industry to refocus its energies on the fourth curve that will completely redefine the mobile industry, its players and the revenue opportunities. Several new players will start to emerge that will create new revenue from applications and services that transform every industry vertical that contributes significantly to the global GDP. As players like Apple and Google continue to lead, mobile operators will have to regroup, collaborate and refocus to become digital players.

There will be hardly any vertical that is not transformed by the confluence of mobile broadband, cloud services and applications. In fact, the very notion of computing has changed drastically. The use of tablets and smartphones instead of PCs has altered the computing ecosystem. Players and enterprises who aren’t gearing up for this enormous opportunity will get assimilated.

The future of mobile is not just about the platform, but about what’s built on the platform. It is very clear that the winners will be defined by how they react to the fourth wave that will shape mobile industry’s next trillion dollars.

Source: http://allthingsd.com/20130826/mobile-fourth-wave-the-evolution-of-the-next-trillion-dollars/?mod=atd_homepage_carousel&utm_source=Triggermail&utm_medium=email&utm_term=Mobile+Insights&utm_campaign=Post+Blast+%28sai%29%3A+Where+Will+The+Next+%241+Trillion+In+Mobile+Come+From%3F

SIP Adaptation

23 Jun


The trouble with standards is that they rarely are.  Standard, that is.  It seems that no matter what sort of technology you are dealing with, there are plenty of variations.   In America a wall socket delivers electricity at 120 volts, 60 Hz.  In England they use 230 volts, 50 Hz while in Mexico they’ve “standardized” on 127 volts, 60 Hz.  Even the shapes of electrical plugs vary across the globe.  The standard for digital telephone trunks in the United States is T1 while in Europe it’s E1.  Here in Minnesota we drive on the right side of the road while in Japan they drive on the left.  And don’t expect to drive a train from Brazil to Bolivia. There are eight different gauges of railroad track and South America uses five of them.

So, why should SIP be any different?  Why should you expect that Cisco SIP will fully interoperate with Microsoft, Nortel, Avaya, or Siemens SIP?   Other than because it’s the right thing to do, you really can’t.

However, all is not lost.  SIP is controlled by the Internet Engineering Task Force (IETF), the same people that develop and control the protocols used to run the Internet, and for the most part their documents and recommendations are faithfully followed.  You can build a system comprised of SIP services and devices from different companies and things will generally work just fine.  Still, some companies have taken liberties with SIP and to paraphrase Frank Sinatra, “they did it their way.”

So, what do you do when Cisco Call Manager expects redirect information in a Diversion header and Avaya Communication Manager insists upon putting it into the History-Info header?

You adapt.


Adaptation allows SIP telephones, call servers, applications servers, and trunks from different vendors to communicate with one another.  Adaptation can cure a number of ills.  Sometimes the problem is in the SIP headers.  One vendor might expect data in one header while another vendor puts that data into a different header.  It’s also possible for a vendor to require data that another vendor doesn’t send regardless of the header.  The problem might be in the body of the SIP message.  For instance, Nortel puts multipart MIME into SIP message bodies and no one else in the world understands what to do with that.  Finally, the problem might involve a combination of SIP messages and the VoIP media stream.  For example, Cisco uses SIP Info messages to carry DTMF tones, while most of the rest of the world follows RFC 2833, which puts the tones into RTP data.  Depending upon the problem, SIP adaptation occurs at different places in an Avaya configuration.
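A header rewrite of the kind described above, Diversion to History-Info, can be pictured as a simple transformation pass over the message headers. This is a hypothetical sketch; real adaptation modules also handle multiple diversion entries, header escaping, and reason codes:

```python
# Hypothetical sketch of header adaptation: rewrite a Cisco-style
# Diversion header into the History-Info form that an Avaya system
# expects. Only the basic rewrite idea is shown.

def adapt_diversion_to_history_info(headers: dict) -> dict:
    adapted = dict(headers)                       # leave the original intact
    diversion = adapted.pop("Diversion", None)
    if diversion:
        # e.g. "Diversion: <sip:5551234@example.com>;reason=no-answer"
        uri = diversion.split(";")[0].strip()
        adapted["History-Info"] = f"{uri};index=1"
    return adapted

msg = {"From": "<sip:alice@example.com>",
       "Diversion": "<sip:5551234@example.com>;reason=no-answer"}
print(adapt_diversion_to_history_info(msg))
# Diversion is removed; History-Info carries the redirecting URI instead
```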

Session Manager

Let’s start with the Avaya Session Manager.  Among the many tasks that a Session Manager performs, it has the ability to assign Adaptation Modules against specific SIP Entity Links.  For example, a SIP Entity Link assigned to a Cisco Call Manager would use the module CiscoAdapter to ensure that the Cisco system understands the SIP messages sent to it and the non-Cisco systems understand the SIP messages coming from that link.

Session Manager Adaptation Modules are static in nature: they cannot invoke processing outside the module itself.  In other words, an Adaptation Module cannot reference a database or external service to perform the adaptation.  It’s also important to know that Session Manager Adaptation Modules can only be written by Avaya.  If you have a need for SIP adaptation that is not satisfied by an existing Adaptation Module, you need to go a different route.
Sequenced Applications

The next available place for adaptation is a Sequenced Application.  Like Adaptation Modules, Sequenced Applications run in conjunction with Session Manager, but unlike Adaptation Modules, Sequenced Applications are extremely dynamic and can be written by anyone.  A Sequenced Application can add SIP headers, change SIP headers, remove SIP headers, and change the SIP message body.  Sequenced Applications are developed with ACE and the Foundation Toolkit.

Session Border Controllers

The next important place for SIP adaptation is a Session Border Controller.  Most SBCs have an adaptation layer similar to that of Session Manager, but unlike Session Manager, that layer comes with a user accessible configuration tool that allows for the creation of new adaptation modules.  In the Avaya/Sipera world that tool is called STIM while Acme uses a process they call HMR.  In all cases, these tools create static adaptation modules that do not reach out to other services or databases to perform their adaptations.


The last place for SIP adaptation is a SIP application server.  Remember when I said that Cisco doesn’t follow RFC 2833 for DTMF digits?  Avaya Aura Messaging accepts those non-standard SIP Info messages and processes them as if the touch tones came inside the RTP stream.  This allows for interoperability without having to somehow pull those digits from the SIP packets and stuff them into a G.711 audio stream.
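The SIP Info style of DTMF delivery carries the digit in a small text body (media type application/dtmf-relay), which the receiving server must parse before it can act on the tone. A minimal sketch of that parsing step, assuming a simple two-field body:

```python
# Hypothetical sketch: extract the digit and duration from a
# Cisco-style SIP INFO application/dtmf-relay body so the rest of
# the system can treat it like an RFC 2833 telephone-event.

def parse_info_dtmf(body: str) -> tuple:
    """Parse a 'Signal=5\\nDuration=160' style body into (digit, ms)."""
    fields = dict(line.split("=", 1) for line in body.strip().splitlines())
    return fields["Signal"].strip(), int(fields["Duration"])

digit, duration_ms = parse_info_dtmf("Signal=5\nDuration=160")
print(digit, duration_ms)  # 5 160
```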

Make it So

So, even though the SIP “standard” might not be as standard as we would like it to be, there are several ways to create a network of SIP devices that peacefully coexist.  This is the key to opening up your communications environment to applications from different vendors, trunks from different providers, “Bring Your Own Device” endpoints, and call processing systems of all shapes, sizes, and colors.

Now, if only someone would come up with a single cell phone charger for the one million or so different cell phone models out there.  That’s one standard that I desperately want to see come to light.
