The blog YTD2525 contains a collection of news clippings on telecom and network technology.
BASIC INTRODUCTION INTO MICROWAVE THEORY AND IP APPLICATIONS
The field of terrestrial microwave communications is experiencing constant, steady technological innovation to accommodate the ever more demanding techniques telecom providers and private microwave users employ when deploying microwave radios in their networks.
• At the beginning of this wireless evolution, the ubiquitous DS1s/E1s and DS3s/E3s crisscrossed networks, transporting mainly voice, data, and video.
• With the advent of Carrier Ethernet and IP, new techniques had to be developed to ensure that the new Layer 2 radios were up to par with the new wave of traffic requirements, including wideband streamed media. These techniques come in the form of Quality of Service (QoS), traffic prioritization, RF protection and design, spectrum utilization, and capacity enhancement.
• With Carrier Ethernet and IP, network design becomes more demanding and complex in terms of RF, traffic engineering, and QoS. The propagation concepts, however, remain unchanged from TDM link engineering, while the throughput of L2 radios doubles, triples, or quadruples through enhanced DSP techniques.
Today’s wireless networks – AT&T Mobility, Sprint, T-Mobile US, Verizon Wireless and other regional operators – are designed and maintained to accommodate data-intensive devices and their communicative capabilities. Networks are built to support our interconnectedness – for when we download large files and spreadsheets to get an idea of next quarter’s budget; when we send our favorite images of the day to our family; or when we are streaming our favorite television shows on Netflix. There are many ways in which we depend on our mobile devices, and by extension, the strength of wireless operators.
This said, many networks are not yet ready for connecting a new wave of devices that within the next decade will become commonplace: the “Internet of Things.” With IoT, operators will be able to monitor and integrate widespread systems with one another; or, on a more personal level, with the press of a button, all household electronics will turn on/off and arm a home security alarm. Many devices are already growing in popularity, such as Apple’s iWatch, the Nest thermostat and the August Smart Lock.
Once IoT is fully implemented and past the point of early adoption, it will enable seamless communication between devices. According to a recent Gartner report, the world is on track to have an active installed base of 26 billion supported devices by 2020 (excluding PCs, tablets and smartphones). That is nearly 30 times the 900 million IoT devices reported in 2009 – and the possibilities are virtually limitless.
Supporting the Internet of Things
Technologically, machine-to-machine communication is data-light and intermittent rather than constant and data-heavy, the latter pattern characterizing the primary use of wireless networks today, driven by tablets and smartphones.
Networks must adjust accordingly in order to support this emerging trend of interconnectivity. For example, in the instance of pressing a single button to turn off all electronics in your house while simultaneously arming your security system, a simple internal Wi-Fi or Bluetooth network is all that’s required to facilitate communication in this environment. Project this to a larger network-wide scale, where a great multitude of devices, geographically separated, are required to communicate to one another, and it becomes clear that new types of networks will be needed to optimally manage IoT-specific applications and systems.
Wireless operators face the same challenge. In particular, they must continue to build networks that can support the heavy data use typical of today's apps while creating new networks designed to handle the frequent but brief bursts of data from IoT-related apps. Maintaining always-on connectivity for billions of devices is demanding, but carrier networks will prove to be the true driving force in enabling an IoT future; without them, device-to-device communication will be largely limited to in-home Wi-Fi.
A system of checks and balances
As operators optimize their networks, it’s important to understand and consider the significance of benchmarking from third-party companies (benchmarking refers to testing and measuring network performance on a routine basis, before and after an upgrade, or against other networks). National operators commonly subject their infrastructure to strict benchmarking programs to monitor performance across networks both routinely as well as before and after major enhancement initiatives. This process yields key indicators that give operators a snapshot of performance, educating further infrastructure upgrade decisions and enabling operators to make legally supportable marketing claims.
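In essence, a benchmarking pass boils down to collecting the same metric before and after a change and comparing summary statistics. The sketch below is a minimal illustration of that idea; the sample latency values and KPI names are invented for the example, not any operator's real data:

```python
import statistics

def latency_kpis(samples_ms):
    """Summarize round-trip latency samples into simple benchmark KPIs."""
    ordered = sorted(samples_ms)
    return {
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[int(0.95 * (len(ordered) - 1))],
        "max_ms": ordered[-1],
    }

# Hypothetical samples taken before and after a network upgrade.
before = [42, 45, 44, 51, 48, 47, 60, 43, 46, 49]
after = [31, 33, 30, 35, 34, 32, 41, 30, 33, 36]

kpi_before = latency_kpis(before)
kpi_after = latency_kpis(after)
# The delta between the two snapshots is the "key indicator" an operator
# would use to judge whether the enhancement initiative paid off.
median_improvement_ms = kpi_before["median_ms"] - kpi_after["median_ms"]
```

Real benchmarking programs track many more indicators (throughput, drop rate, accessibility), but the before/after comparison pattern is the same.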
In short, benchmarking acts as a system of checks and balances for operators, and the same practice should be applied to new, emerging IoT-specific networks. If not, performance promised may not be performance delivered. The systems connected by IoT devices may be significant; if certain messages fail to be delivered, the reported analytics could be skewed, negatively impacting the whole system. Benchmarking will be an important driver in ensuring a thriving environment for an era of interconnectivity.
The cryptographer Paul Kocher likes to contrast two charts. One shows the number of airplane deaths per mile flown, which decreased to one-thousandth of what it was in 1945, helped by the advent of the Federal Aviation Administration in 1958 and stricter safety and maintenance protocols. The other, which details the number of new computer security threats, shows the opposite trend: more than a 10,000-fold increase in new digital threats over the last 12 years.
The problem, Mr. Kocher and security experts reason, is a lack of liability and urgency. The Internet is still largely held together with Band-Aid fixes. Computer security is not well regulated, even as enormous amounts of private, medical and financial data and the nation’s computerized critical infrastructure — oil pipelines, railroad tracks, water treatment facilities and the power grid — move online.
After a year of record-setting hacking incidents, companies and consumers are finally learning how to defend themselves and are altering how they approach computer security.
If a stunning number of airplanes in the United States crashed tomorrow, there would be investigations, lawsuits and a cutback in air travel, and the airlines’ stock prices would most likely plummet. That has not been true for hacking attacks, which surged 62 percent last year, according to the security company Symantec. As for long-term consequences, Home Depot, which suffered the worst security breach of any retailer in history this year, has seen its stock float to a high point.
In a speech two years ago, Leon E. Panetta, the former defense secretary, predicted it would take a “cyber-Pearl Harbor” — a crippling attack that would cause physical destruction and loss of life — to wake up the nation to the vulnerabilities in its computer systems.
No such attack has occurred. Nonetheless, at every level, there has been an awakening that the threats are real and growing worse, and that the prevailing “patch and pray” approach to computer security simply will not do.
So what happened?
The Wake-Up Call
A bleak recap: In the last two years, breaches have hit the White House, the State Department, the top federal intelligence agency, the largest American bank, the top hospital operator, energy companies, retailers and even the Postal Service. In nearly every case, by the time the victims noticed that hackers were inside their systems, their most sensitive government secrets, trade secrets and customer data had already left the building. And in just the last week Sony Pictures Entertainment had to take computer systems offline because of an aggressive attack on its network.
The impact on consumers has been vast. Last year, over 552 million people had their identities stolen, according to Symantec, and nearly 25,000 Americans had sensitive health information compromised – every day – according to the Department of Health and Human Services. Over half of Americans, including President Obama, had to have their credit cards replaced at least once because of a breach, according to the Ponemon Institute, an independent research organization.
And this year, American companies learned it was not just Beijing they were up against. Thanks to revelations by the former intelligence agency contractor Edward J. Snowden, companies worry about protecting their networks from their own government. If the tech sector cannot persuade foreign customers that their data is safe from the National Security Agency, the tech industry analysis firm Forrester Research predicts that America’s cloud computing industry stands to lose $180 billion — a quarter of its current revenue — over the next two years to competitors abroad.
“People are finally realizing that we have a problem that most had not thought about before,” said Peter G. Neumann, a computer security pioneer at SRI International, the Silicon Valley engineering research laboratory. “We may have finally reached a crossroads.”
Is There a Playbook?
Only certain kinds of companies, like hospitals and banks, are held up to scrutiny by government regulators when they are hacked. And legal liability hasn’t been established in the courts, though Target faces dozens of lawsuits related to a hack of that company’s computer network a little over a year ago.
But if there is a silver lining to the current predicament, Mr. Neumann and other security experts say, it is that computer security, long an afterthought, has been forced into the national consciousness.
Customers, particularly those abroad, are demanding greater privacy protections. Corporations are elevating security experts to senior roles and increasing their budgets. At Facebook, the former mantra “move fast and break things” has been replaced. It is now “move slowly and fix things.” Companies in various sectors have started informal information-sharing groups for computer security. And President Obama recently called on Congress to pass a national data breach law to provide “one clear national standard” rather than the current patchwork of state laws that dictate how companies should respond to data breaches.
There is growing recognition that there is no silver bullet. Firewalls and antivirus software alone cannot keep hackers out, so corporations are beginning to take a more layered approach to data protection. Major retailers have pledged to adopt more secure payment schemes by the end of next year. Banks are making it easier for customers to monitor their monthly statements for identity theft. And suddenly, pie-in-the-sky ideas that languished in research labs for years are being evaluated by American hardware makers for use in future products.
“People are recognizing that existing technologies aren’t working,” said Richard A. Clarke, the first cybersecurity czar at the White House. “It’s almost impossible to think of a company that hasn’t been hacked — the Pentagon’s secret network, the White House, JPMorgan — it is pretty obvious that prevention and detection technologies are broken.”
Companies that continue to rely on prevention and detection technologies like firewalls and antivirus products are considered sitting ducks.
“People are still dealing with this problem in a technical way, not a strategic way,” said Scott Borg, the head of the United States Cyber Consequences Unit, a nonprofit organization. “People are not thinking about who would attack us, what their motives would be, what they would try to do. The focus on the technology is allowing these people to be blindsided.
“They are looking obsessively at new penetrations,” Mr. Borg said. “But once someone is inside, they can carry on for months unnoticed.”
The Keys to Preparation
The companies most prepared for online attacks, Mr. Borg and other experts say, are those that have identified their most valuable assets, like a university’s groundbreaking research, a multinational’s acquisition strategy, Boeing’s blueprints to the next generation of stealth bomber or Target’s customer data. Those companies take additional steps to protect that data by isolating it from the rest of their networks and encrypting it.
That approach — what the N.S.A. has termed “defense in depth” — is slowly being adopted by the private sector. Now, in addition to firewalls and antivirus products, companies are incorporating breach detection plans, more secure authentication schemes, technologies that “white list” traffic and allow in only what is known to be good, encryption and the like.
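A "white list" in the sense described is simply a default-deny rule: only traffic whose destination is explicitly known to be good is allowed through. A minimal sketch of that policy, with invented placeholder host names:

```python
# Default-deny egress filter: only destinations on the allow list pass.
# The host names below are hypothetical placeholders, not real services.
ALLOWED_DESTINATIONS = {
    "updates.example-vendor.com",  # hypothetical software patch server
    "payments.example-bank.com",   # hypothetical payment gateway
}

def is_allowed(destination: str) -> bool:
    """Permit traffic only to explicitly approved destinations."""
    return destination in ALLOWED_DESTINATIONS

def filter_flows(flows):
    """Split observed outbound flows into permitted and blocked lists."""
    permitted = [f for f in flows if is_allowed(f)]
    blocked = [f for f in flows if not is_allowed(f)]
    return permitted, blocked
```

The design choice is the inversion of a traditional blacklist: anything not proven good is rejected, which is why white-listing catches novel threats that signature-based tools miss.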
“We’re slowly getting combinations of new technologies that deal with this problem,” Mr. Clarke said.
The most prominent examples are Google, Yahoo, Microsoft and Facebook. Mr. Snowden revealed that the N.S.A. might have been grabbing data from those companies in unencrypted form as it passed between their respective data centers. Now, they all encrypt their traffic as it flows internally between their own data centers.
Though intelligence analysts may disagree, security experts say all of this is a step in the right direction. But even the most advanced security defenses can break down, those experts acknowledge. A widely used technology sold by FireEye, one of the market leaders in breach detection, failed to detect malicious code in an independent lab test this year. The product successfully identified 93 percent of the threats, but as the testers pointed out, it is not the percentage of threats detected that matters. It is the small fraction that is missed that allows hackers to pull off a heist.
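The testers' point can be made concrete with a little probability. Assuming each threat is detected independently (a simplification), even a 93 percent per-threat detection rate leaves an attacker good odds over repeated attempts:

```python
def prob_at_least_one_miss(detection_rate: float, attempts: int) -> float:
    """Probability that at least one of `attempts` threats evades detection,
    assuming each is caught independently with the given rate."""
    return 1 - detection_rate ** attempts

# With 93% detection, ten independent attempts already give an attacker
# better-than-even odds of getting one piece of malicious code through:
# 1 - 0.93**10 is roughly 0.52.
p = prob_at_least_one_miss(0.93, 10)
```

This is why a detection rate that sounds high on paper still leaves a persistent attacker with a realistic path inside.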
Even when security technologies work as advertised, companies still miss the alerts. Six months before Target was breached last year, it installed a $1.6 million FireEye intrusion detection system. When hackers tripped the system, FireEye sounded alarms to the company’s security team in Bangalore, which flagged the alert for Target’s team at its headquarters in Minneapolis. But nobody reacted until 40 million credit card numbers and information on 70 million more customers had been sent to computers in Russia, according to several investigators.
Part of the problem, security chiefs say, is “false positives,” the constant pinging of alerts anytime an employee enters a new database or downloads a risky app or email attachment. The result, they complain, is a depletion of resources and attention.
“We don’t need ‘big data.’ We need big information,” said Igor Baikalov, a former senior vice president for global information security at Bank of America, now chief scientist at Securonix, a private company that sells threat intelligence to businesses.
Securonix is part of a growing class of security start-ups, which includes Exabeam and Vectra Networks in Silicon Valley and several other companies that use the deluge of data from employee computers and personal devices to give security officers intelligence they can act on.
Many companies in the Fortune 500 are building their own systems that essentially do the same thing. These technologies correlate unusual activity across multiple locations, then raise an alarm if they start to look like a risk. For example, the technologies would increase the urgency of an alert if an employee suddenly downloaded large amounts of data from a database not regularly used, while simultaneously communicating with a computer in China.
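The kind of correlation described – escalating only when individually unremarkable events coincide – can be sketched as a simple scoring rule. The event names and thresholds below are illustrative inventions, not any vendor's actual product logic:

```python
def alert_urgency(events):
    """Score a set of observed events; coinciding risk signals escalate.

    `events` is a set of illustrative event tags for one employee session.
    """
    score = 0
    if "bulk_download_unusual_db" in events:
        score += 2  # large pull from a database the employee rarely uses
    if "connection_foreign_host" in events:
        score += 2  # simultaneous traffic to an unfamiliar external host
    if "offhours_login" in events:
        score += 1  # login outside the employee's normal working hours
    # Either strong signal alone is probably noise (a "false positive");
    # together they cross the alarm threshold and deserve an analyst's time.
    return "high" if score >= 4 else "low"
```

Production systems replace the hand-set weights with statistical baselines per user and per resource, but the escalate-on-correlation principle is the same.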
The future of security, experts say, won’t be based on digital walls and moats but on these kinds of newer data-driven approaches.
“Most large organizations have come to the painful recognition that they are already in some state of break-in today,” said Asheem Chandna, a venture capital investor at Greylock Partners. “They are realizing they need to put new and advanced sensors in their network that continuously monitor what is going on.”
While much progress is being made, security experts bemoan that there is still little to prevent hackers from breaking in in the first place.
In May, the F.B.I. led a crackdown on digital crime that resulted in 90 arrests, and Robert Anderson, one of the F.B.I.’s top officers on such cases, said the agency planned to take a more aggressive stance. “There is a philosophy change. If you are going to attack Americans, we are going to hold you accountable,” he said at a cybersecurity meeting in Washington.
Still, arrests of hackers are few and far between.
“If you look at an attacker’s expected benefit and expected risk, the equation is pretty good for them,” said Howard Shrobe, a computer scientist at the Massachusetts Institute of Technology. “Nothing is going to change until we can get their expected net gain close to zero or — God willing — in the negative.”
Until last year, Dr. Shrobe was a manager at the Defense Advanced Research Projects Agency, known as Darpa, overseeing the agency’s Clean Slate program, a multiproject “Do Over” for the computer security industry. The program included two separate but related projects. Their premise was to reconsider computing from the ground up and design new computer systems that are much harder to break into and that recover quickly when they have been breached.
“ ‘Patch and pray’ is not a strategic answer,” Dr. Shrobe said. “If that’s all you do, you’re going to drown.”
The Digital Infrastructure, our third mainport, is a driver of the rapidly expanding Online Services sector.
In Europe, the Dutch are among the frontrunners in the area of Digital Infrastructure (Internet connectivity, colocation housing and hosting). In many ways this infrastructure fulfills a gateway function, similar to that of Schiphol Airport and the Rotterdam Harbor. While we all have an idea on how the latter impact our economy, much less is known about the impact of Digital Infrastructure on the Dutch economy. In this report we argue that the Digital Infrastructure, despite its modest size, is a driver of the much larger and rapidly expanding Internet economy, impacting the fortunes of future economic growth in the Netherlands.
Growth and developments in the sector
In 2013 we already concluded that the Dutch have a world-class Digital Infrastructure. This year we see that the importance of this sector – inhabited by Internet exchange points and many housing and hosting parties – is on the rise. Amsterdam-based AMS-IX remains the largest Internet exchange point in the world, and the colocation housing market, centered around Amsterdam, continues to produce strong growth rates. The Dutch also rank among Europe’s elite in hosting.
Significance for the Dutch economy
The high ranking of the Netherlands’ Digital Infrastructure is an interesting observation, but how do we benefit from it? Our research shows that the direct economic impact of the sector itself in terms of employment, and the indirect effects of the sector on e.g. suppliers, construction companies and workforce spending, are limited relative to our total GDP. The real value of the Digital Infrastructure sector, however, lies in its significant impact on the much larger Internet economy and broader digital society. The picture emerges that Digital Infrastructure cannot be separated from a successful digital society, placing the Dutch in a favorable position to profit from digital growth.
Telcos are facing a fierce attack from OTT players. They assumed that ignoring them would solve the problem, but it is only getting more complicated!
At LTE North America in Dallas this week, a panel featuring Wireless 2020, IBM, Allot Communications and US-based MVNO FreedomPop urged the service provider community to develop more flexible and tailored data bundles for customers.
During a discussion focused on future revenue generation opportunities, conversation moved towards the current landscape which is seeing telcos lose ground to over the top content providers. Ken Jackson from IBM Now Factory believes that moving to specific and tailored, content-based services is a feasible opportunity for operators to monetise services in new ways.
“Let’s look at the example of a service called NFL Now, a friend of mine can watch as much NFL as he wants, and doesn’t pay for the data he uses, he pays to watch football,” he said. “We have to return the value proposition to the customer, find out what it is they want to do, and offer it to them. Make everything customer centric, and orchestrate your business around what they need.”
Haig Sarkissian from Wireless 2020 concurred with Jackson, and urged the service provider community to move away from basic all-you-can-eat data plans.
“5% of the heaviest users consume 40-50% of the network, and they pay the same as the majority,” he said. “That’s why unlimited data plans are unfair, because the majority are subsidising the data usage of the minority. Operators find that this is no longer scalable because if you eat more you have to invest more into capacity. I don’t see it changing any time in the future, unless there’s a fair way of using these “dumb pipe” plans. Is there hope for SP differentiation outside of these unlimited data plans?”
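Sarkissian's fairness point is easy to quantify. If 5 percent of users generate 45 percent of the traffic, each heavy user consumes roughly 15 times as much data as an average light user while paying the same flat rate. A quick illustrative calculation (the percentages are taken from the quote's midpoint, not operator data):

```python
def heavy_to_light_ratio(heavy_pop_pct: float, heavy_traffic_pct: float) -> float:
    """How many times more data an average 'heavy' user consumes than an
    average remaining user, given population and traffic shares in percent."""
    heavy_per_user = heavy_traffic_pct / heavy_pop_pct
    light_per_user = (100 - heavy_traffic_pct) / (100 - heavy_pop_pct)
    return heavy_per_user / light_per_user

# 5% of users consuming 45% of the network: each heavy user uses about
# 15.5x the data of an average light user, for the same monthly price.
ratio = heavy_to_light_ratio(5, 45)
```

That ratio is the cross-subsidy Sarkissian argues makes unlimited plans unsustainable as per-user capacity costs rise.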
Allot Communications’ John Priest said service delivery is becoming far more personalised and tailored than in the past, suggesting operators should have a think about how they allow users to access the information and the content they want.
“I think we need things like VAS for the customer to make them feel in control of what they need. It’s not just the volume, it’s about the content they want and delivering that to them. Users want to know what they’re consuming, so the SP has to know what they’re consuming and how they’re consuming it, so that you can be more proactive in the customer care and guarantee a higher quality of service.”
“But here are a couple of examples of opportunities for SPs to generate revenue aside from traditional voice and data pricing models. Sponsored data solutions, for example, sees the user get free data if they’re going to certain sites, which is a partnership between the SP and the content provider. Similarly to music streaming services, SPs can strike up a partnership with advertisers.”
US MVNO FreedomPop’s Mauricio Sastre suggested there are opportunities for partnerships between carriers, app developers and content providers, helping users consume new applications at reduced cost, or for free.
“In an effort to get their app out there, the app developers are considering paying for the data that their app consumes, so there can be a subsidised way for users to consume their application.”
Of course, when debating content services over service provider networks, we naturally reach the inevitable pain point of net neutrality. Wireless 2020’s Sarkissian rounded off the discussion on that note.
“When do we get into the net neutrality debate, the big elephant in the room? Do you decide to charge more for content from Netflix, or NFL Now, or WebRTC services?” he said. “In my opinion, any resource that’s finite, and has the potential to be fully consumed, should be left alone from regulation. Otherwise you jeopardise the investment into, and the competition of, all of these services.”
Telstra switched on the first 150 of its free wi-fi hotspots throughout Australia last Wednesday, and expect a total of 1,000 to be operational before Boxing Day.
The move comes as part of Telstra’s $100m Wifi Nation plan, which pledges to establish two million hotspots across the country over the next five years.
The initial locations are among Australia’s busiest areas, including Melbourne’s Bourke Street Mall, Sydney’s Hyde Park and Brisbane’s King George Square. Future rollouts are expected to target “the heart of local communities”, as well as existing Telstra retail outlets and exchange buildings. Plans are also in place to follow the lead of other trial services in New York City and New Zealand, and repurpose existing payphone locations as wireless hotspots, with up to 8,000 such sites in mind.
Users will be allotted 30 minutes of access time at these hotspots, with existing Telstra customers able to use their home broadband accounts to access these hotspots indefinitely.
“This trial marks the beginning of our ambition to switch on more than two million hotspots across the nation over 5 years and give customers the best Wi-Fi experience in and out of the home,” Gordon Ballantyne, Executive for Telstra Retail, told Mumbrella. “We want customers to have greater options for connecting when they’re out and about. From browsing the web, streaming videos or sharing photos with friends, we want customers to have a taste of what the network will be like next year when Telstra Wi-Fi members will be able to use their home broadband allowance at the hotspots.”
The hotspot locations to be trialled before Christmas will be in towns and regional hubs traditionally popular during the holiday season, Ballantyne added.
Wireless hotspots in a busy shopping district represent a great opportunity for retailers, allowing them to expand online offerings and implement in-depth omnichannel marketing strategies while also providing potential customers the convenience of free internet access. That said, having the hotspots owned and operated by major telecommunications providers exposes a retailer to the potential of restricted utility and competition for bandwidth.
In a further, arguably more controversial move, Telstra have floated the possibility of allowing home broadband customers the option to open their wireless routers to Telstra’s free wi-fi network, allowing strangers on the street access to their bandwidth. Rolling out this service through new model routers for home broadband users, Telstra expect their wi-fi hotspot network to reach two million within the next five years.
This broad-stroke project is mirrored by the approach of competing telco Optus, which, as Power Retail reported last month, has been consulting with local councils in a more selective, pick-and-choose fashion.
It’s been estimated that the volume of global monthly mobile data traffic will exceed 15 exabytes by 2018. LTE is already proving to be a major bandwidth hog. While 4G represents only a fraction of mobile connections today, it accounts for at least 30% of mobile data traffic, thanks to a surge in high-bandwidth content such as video calling and music streaming.
Yet, the growth in bandwidth demand is not only about smartphones, tablets and other mobile computing gadgets. The sales of these devices are set to reach 2.4 billion units this year, but other types of connected ‘things’ will require their share of the already stretched networks too. Industry analysts have estimated that the number of wireless connected things will exceed 16 billion in 2014, up 20% from the year before. This growth is set to continue as the Internet of Things gathers pace, with more than double the number of connected devices – 40.9 billion – forecasted for 2020.
As existing 3G and 4G networks struggle to cope with the influx in data traffic, mobile operators are looking at solutions to offload traffic from their current base station networks. Small cells will be their solution of choice – so the number of small cells networks deployed across Europe is going to increase dramatically over the next few years. Small cells that are connected to city-wide superfast fibre networks will be the most economic and scalable way of ensuring that the needs of mobile users for more and more bandwidth are met in the future. Small cells will also be an enabler for the Internet of Things, paving the way for more connections than ever before.
Shortcomings of rooftop base stations
Today’s badly congested 3G and 4G networks rely on rooftop base stations. Many operators have been scrambling to acquire enough rooftop space for LTE, but 4G networks still often fail to meet their bandwidth-hungry customers’ expectations, especially in dense urban areas such as pedestrian zones. While filling rooftops with base stations might have been a good solution for 3G, in the LTE era the cells are becoming smaller, and mobile operators need ten times more base stations to cover the same footprint of a city.
Imagine a situation today where you have five people waiting for a bus, all with a brand new 150 Mbps iPhone 6. The existing rooftop base station infrastructure is not able to cope with the sudden surge in bandwidth demand as all five try to read the news, order groceries or download a restaurant menu at the same time.
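The arithmetic behind that bus-stop scenario is simple: five devices each capable of 150 Mbps present a 750 Mbps peak demand to whatever cell serves them. A quick sketch, in which the sector capacity figure is an illustrative assumption rather than a measured value:

```python
def peak_demand_mbps(users: int, per_device_mbps: float) -> float:
    """Aggregate offered load if every device pulls at its full rate at once."""
    return users * per_device_mbps

def per_user_share_mbps(cell_capacity_mbps: float, users: int) -> float:
    """Fair share each user actually gets from a saturated cell."""
    return cell_capacity_mbps / users

# Five 150 Mbps devices at one bus stop: 750 Mbps of offered load.
demand = peak_demand_mbps(5, 150)

# If the serving sector can only deliver, say, 100 Mbps of shared
# capacity (an assumed figure), each user gets about 20 Mbps -- a
# fraction of what their handset is rated for.
share = per_user_share_mbps(100, 5)
```

This gap between rated device speed and delivered per-user share is exactly what densifying the network with small cells is meant to close.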
Recognising the need for faster evolution of mobile networks, the European Commission has committed to investing up to €700 million for the developments of ‘ubiquitous 5G communication systems’. This funding is part of a joint public and private sector initiative that aims to overcome today’s data traffic challenges. The ambitious goals of this 5G initiative include increasing wireless area capacity by a factor of 1,000 compared to 2010, creating a high-bandwidth network with 0% downtime, and enabling the roll-out of very dense wireless networks that are able to connect over 7 trillion devices amongst 7 billion people.
Getting ready for the future
As mobile operators gear themselves up for 5G, many of them realise that they can no longer rely on rooftop base stations. Why would a customer splurge on a 5G contract and a 5G-ready smartphone, if they aren’t able to get superfast download speeds? Instead, they will go to an operator that is able to give them the capacity they crave.
To eliminate the well-known capacity problems with rooftop base stations, future-proof their networks and stay competitive, more and more European mobile operators are starting to tap into small cells. They are realising that only small cells connected to fibre can bring mobile users the great experience they expect on their LTE-enabled superfast mobile devices – down at street level where it really matters. When connected to fibre networks, these small cells can collectively deliver gigabits per second of capacity, making entire cities 5G-ready in a cost-effective way.
The mobile operator community has been talking about the potential of small cells for a couple of years, but up until recently, the size of the boxes prevented their widespread use. All leading networking vendors have invested in the development of more suitable equipment, so the technology is now ready to allow mobile operators to start planning their roll-outs in earnest.
To be able to roll out faster than their rivals, many European mobile operators are now starting to buy space on lampposts, billboards, bus stops or even public toilets, and equip them with small cells.
Small cells – the only way to 5G
Still recovering from the substantial investment needed for 4G, some cost-conscious mobile operators might be tempted to tighten the purse strings on small cells to protect their margins.
Yet, they really don’t have a choice but to invest. If they don’t, they will lose customers. It’s as simple as that. Why would a user buy a top of the range LTE-enabled smartphone or smartwatch, if they aren’t able to make the most of its superfast download speeds – unless they are standing on a rooftop? Instead, they will get their device from an operator that is able to give them the capacity they crave.
Other small-cell-ready players aren’t the only competitive threat to mobile operators. Street furniture providers might also eat into the profits of those mobile operators who drag their heels over small cells. Through city-wide wifi schemes, street furniture companies are in some cases completely eliminating the need for mobile users to rely on their operator for data. Why would a mobile user pay a premium for patchy 5G connectivity, if they can get better speeds and coverage with free wifi?
Any way you look at it, 5G will only materialise with small cells connected to existing superfast fibre networks. And every European mobile operator’s competitiveness – and survival – will rely on 5G.