Tag Archives: Beamforming

Top Five Questions About 6G Technology

28 Sep

As 5G continues to roll out, work is already well underway on its successor. 6G wireless technology brings with it a promise for a better future. Among other goals, 6G technology intends to merge the human, physical, and digital worlds. In doing so, there is a hope that 6G can significantly aid in achieving the UN Sustainable Development Goals.


This article answers some of the most common questions surrounding 6G and provides more insight into the vision for 6G and how it will achieve these critical goals.

1. What is 6G?

In a nutshell, 6G is the sixth generation of the wireless communications standard for cellular networks that will succeed today’s 5G (fifth generation). The research community does not expect 6G technology to replace the previous generations, though. Instead, they will work together to provide solutions that enhance our lives.

While 5G will act as a building block for some aspects of 6G, other aspects will need to be new for 6G to meet the technical demands required to revolutionize the way we connect to the world.

The first area of improvement is speed. In theory, 5G can achieve a peak data rate of 20 Gbps, even though the highest speeds recorded in tests so far are around 8 Gbps. In 6G, as we move to higher frequencies – above 100 GHz – the target peak data rate is 1,000 Gbps (1 Tbps), enabling use cases like volumetric video and enhanced virtual reality experiences.
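
To see why the jump to wider spectrum matters, consider the Shannon-style relation C = B × η, where η is the spectral efficiency in bits per second per hertz. The sketch below uses an illustrative η of 10 bps/Hz (an assumption, roughly what dense MIMO with high-order modulation can approach), not a published 6G parameter:

    def required_bandwidth_ghz(peak_rate_gbps, spectral_efficiency_bps_hz):
        # From C = B * eta: B = C / eta. Gbps divided by bps/Hz gives GHz.
        return peak_rate_gbps / spectral_efficiency_bps_hz

    eta = 10.0  # assumed spectral efficiency, bps/Hz
    for rate_gbps in (20, 1000):  # 5G peak vs. the 6G 1 Tbps target
        print(f"{rate_gbps} Gbps needs ~{required_bandwidth_ghz(rate_gbps, eta):.0f} GHz of spectrum")

Roughly 100 GHz of spectrum for a 1 Tbps link is only plausible above 100 GHz, which is why the sub-terahertz bands are in play.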

In fact, we have already demonstrated an over-the-air transmission at 310 GHz with speeds topping 150 Gbps.

In addition to speed, 6G technology will add another crucial advantage: extremely low latency. That means a minimal delay in communications, which will play a pivotal role in unleashing the internet of things (IoT) and industrial applications.

6G technology will enable tomorrow’s IoT through enhanced connectivity. Today’s 5G can handle one million devices connected simultaneously per square kilometer (or 0.38 square miles), but 6G will make that figure jump up to 10 million.

But 6G will be much more than just faster data rates and lower latency. Below we discuss some of the new technologies that will shape the next generation of wireless communications.

2. Who will use 6G technology and what are the use cases?

We began to see the shift to more machine-to-machine communication in 5G, and 6G looks to take this to the next level. While people will be end users for 6G, so will more and more of our devices. This shift will affect daily life as well as businesses and entire industries in a transformational way.

Beyond faster browsing for the end user, we can expect immersive and haptic experiences to enhance human communications. Ericsson, for example, foresees the emergence of the “internet of senses” – the possibility to experience sensations like a scent or a flavor digitally. According to one Next Generation Mobile Networks Alliance (NGMN) report, holographic telepresence and volumetric video – think of it as video in 3D – will also be a use case. All of this points toward virtual, mixed, and augmented reality becoming part of our everyday lives.

However, 6G technology will likely have a bigger impact on business and industry – benefiting us, the end users, as a result. With the ability to handle millions of connections simultaneously, machines will have the power to perform tasks they cannot do today.

The NGMN report anticipates that 6G networks will enable hyper-accurate localization and tracking. This could bring advancements like allowing drones and robots to deliver goods and manage manufacturing plants, improving digital health care and remote health monitoring, and enhancing the use of digital twins.

Digital twin development will be an interesting use case to keep an eye on. It is an important tool that certain industries can use to find the best ways to fix a problem in plants or specific machines – but that is just the tip of the iceberg. Imagine if you could create a digital twin of an entire city and perform tests on the replica to assess which solutions would work best for problems like traffic management. Already in Singapore, the government is working to build a 3D city model that will enable a smart city in the future.

3. What do we need to achieve 6G?

New horizons call for new technology. It is true that 6G will greatly benefit from 5G advances in areas such as edge computing, artificial intelligence (AI), machine learning (ML), network slicing, and others. At the same time, changes are needed to meet new technical requirements.

The most pressing demand is understanding how to work at sub-terahertz frequencies. While 5G needs to operate in the millimeter wave (mmWave) bands of 24.25 GHz to 52.6 GHz to achieve its full potential, the next generation of mobile connectivity will likely move to frequencies above 100 GHz, in the range called sub-terahertz and possibly as high as true terahertz.

Why does this matter? Because as we go up in frequency, the wave behaves in a different way. Before 5G, cellular communications used only spectrum below 6 GHz, and these signals can travel up to 10 miles. As we go up into the mmWave frequency band, the range drops dramatically to around 1,000 feet. With sub-THz signals like those being proposed for 6G, the distance the waves can travel is smaller still – think tens to hundreds of feet, not thousands.
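
A quick free-space path loss calculation shows the scale of the problem. This is the idealized textbook model only (real links also lose power to walls, foliage, rain, and atmospheric absorption), and the carrier frequencies below are illustrative:

    import math

    def fspl_db(freq_hz, distance_m):
        # Free-space path loss: 20*log10(4 * pi * d * f / c), in dB
        c = 3.0e8  # speed of light, m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    for f_ghz in (3.5, 28, 140):  # sub-6 GHz, mmWave, sub-THz examples
        print(f"{f_ghz:>5} GHz: {fspl_db(f_ghz * 1e9, 100):.0f} dB loss over 100 m")

Each step up in frequency costs tens of decibels over the same distance, which is why sub-THz cells shrink to tens or hundreds of feet.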

That said, we can maximize signal propagation and range by using new types of antennas. An antenna’s size is proportional to the signal wavelength, so as the frequency gets higher and the wavelength gets shorter, antennas become small enough to be deployed in large numbers. In addition, this equipment uses a technique known as beamforming – directing the signal toward one specific receiver instead of radiating out in all directions like the omnidirectional antennas commonly used prior to LTE.
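
The wavelength math behind this is simple. A minimal sketch, assuming the usual half-wavelength element spacing rule of thumb for phased arrays:

    def half_wavelength_mm(freq_ghz):
        # lambda = c / f; phased-array elements are typically lambda/2 apart
        c = 299.792458  # speed of light expressed in mm * GHz
        return (c / freq_ghz) / 2

    for f_ghz in (3.5, 28, 140):
        print(f"{f_ghz:>5} GHz: lambda/2 = {half_wavelength_mm(f_ghz):.2f} mm")

At 140 GHz an element is about a millimeter across, so arrays of dozens or hundreds of elements fit in a palm-sized panel – exactly what beamforming needs.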

Another area of interest is designing 6G networks for AI and ML. 5G networks are starting to look at adding AI and ML to existing networks, but with 6G we have the opportunity to build networks from the ground up that are designed to work natively with these technologies.

According to one International Telecommunication Union (ITU) report, the world will generate over 5,000 exabytes of data per month by 2030 – that is, 5 billion terabytes a month. With so many people and devices connected, we will have to rely on AI and ML for tasks such as managing data traffic and allowing smart industrial machines to make real-time decisions and use resources efficiently, among other things.

Another challenge 6G aims to tackle is security – ensuring data is safe and that only authorized people can access it – along with solutions that enable systems to anticipate complex attacks automatically.

One last technical demand is virtualization. As 5G evolves, more and more network functions will move into virtual environments. Open RAN (O-RAN) architectures are already moving processing and functionality into the cloud today, and solutions like edge computing will become more and more common in the future.

4. Will 6G technology be sustainable?

Sustainability is at the core of every conversation in the telecommunications sector today. It is true that as we advance 5G and come closer to 6G, humans and machines will consume increasing amounts of data. To give you an idea of our carbon footprint in the digital world, one simple email is responsible for 4 grams of carbon dioxide in the atmosphere.

However, 6G technology is expected to help humans improve sustainability in a wide array of applications. One example is by optimizing the use of natural resources in farms. Using real-time data, 6G will also enable smart vehicle routing, which will cut carbon emissions, and better energy distribution, which will increase efficiency.

Also, researchers are putting sustainability at the center of their 6G projects. Components like semiconductors using new materials should decrease power consumption. Ultimately, we expect the next generation of mobile connectivity to help achieve the United Nations’ Sustainable Development Goals.

5. When will 6G be available?

The industry consensus is that the first 3rd Generation Partnership Project (3GPP) standards release to include 6G will be completed in 2030. Early versions of 6G technologies could be demonstrated in trials as early as 2028, repeating the 10-year cycle we saw in previous generations. That is the vision made public by the Next G Alliance, a North American initiative of which Keysight is a founding member, to foster 6G development in the United States and Canada.

Before launching the next generation of mobile connectivity into the market, international bodies discuss technical specifications to allow for interoperability. This means, for example, making sure that your phone will work everywhere in the world.

The ITU and the 3GPP are among the most well-known standardization bodies and hold working groups to assess research on 6G globally. Federal agencies also play a significant role, regulating and granting spectrum for research and deployment.

Amid all this, technology development is another aspect to keep in mind. Many 6G capabilities demand new solutions that often use nontraditional materials and approaches. The process of getting these solutions in place will take time.

The good news? The telecommunications sector is making fast progress toward the next G.

Here at Keysight, for instance, we are leveraging our proven track record of collaboration in 5G and Open RAN to pioneer solutions needed to create the foundation of 6G. We partner with market leaders to advance testing and measurement for emerging 6G technologies. Every week brings news of a company or a university making a groundbreaking discovery.

The most exciting thing is that we get an inch closer to 6G every day. Tomorrow’s internet is being built today. Join us in this journey; it is just the beginning.

Learn more about the latest advancements in 6G research.


SOURCE: Keysight Technologies – https://www.accesswire.com/717630/Top-Five-Questions-About-6G-Technology – 28 09 22

NTT Docomo’s 5G RAN Infrastructure

26 Nov

In this post we will look at the 5G infrastructure that Docomo is using in their network. It is detailed in their latest Technical Journal here. We will cover the infrastructure part only.

The 5G network configuration is shown in Figure 4. With a view to 5G service development, NTT DOCOMO developed a Central Unit (CU) that consolidates the Base Band (BB) signal processing section supporting 5G, extended existing BB processing equipment known as high-density Base station Digital processing Equipment (BDE), and developed a 5G Radio Unit (RU) having signal transmit / receive functions. Furthermore, to have a single CU accommodate many RUs, NTT DOCOMO developed a 5G version of the FrontHaul Multiplexer (FHM) deployed in LTE. Each of these three types of equipment is described below.

1) CU
(a) Development concept: With the aim of achieving a smooth rollout of 5G services, NTT DOCOMO developed a CU that enables area construction without having to replace existing equipment while minimizing the construction period and facility investment. This was accomplished by making maximum use of the existing high-density BDE that performs BB signal processing, replacing some of the cards of the high-density BDE, and upgrading the software to support 5G.
(b) CU basic specifications: An external view of this CU is shown in Photo 1. This equipment has the features described below (Table 3). As described above, this equipment enables 5G-supporting functions by replacing some of the cards of the existing high-density BDE. In addition, future software upgrades will load both software supporting conventional 3G/LTE/LTE-Advanced and software supporting 5G. This will enable the construction of a network supporting three generations of mobile communications from 3G to 5G with a single CU.

The existing LTE-Advanced system employs advanced Centralized RAN (C-RAN) architecture proposed by NTT DOCOMO. This architecture is also supported in 5G with the connection between CU and RUs made via the fronthaul. Standardization of this fronthaul was promoted at the Open RAN (O-RAN) Alliance jointly established in February 2018 by five operators including NTT DOCOMO. Since the launch of 5G services, the fronthaul in the NTT DOCOMO network was made to conform to these O-RAN fronthaul specifications that enable interoperability between different vendors, and any CU and RU that conform to these specifications can be interconnected regardless of vendor. The specifications for interconnecting base-station equipment also conform to these O-RAN specifications, which means that a multi-vendor connection can be made between a CU supporting 5G and a high-density BDE supporting LTE-Advanced. This enables NTT DOCOMO to deploy a CU regardless of the vendor of the existing high-density BDE and to quickly and flexibly roll out service areas where needed while making best use of existing assets.

In addition, six or more fronthaul connections can be made per CU and the destination RU of each fronthaul connection can be selected. Since 5G supports wideband transmission beyond that of LTE-Advanced, the fronthaul transmission rate has been extended from the existing peak rate of 9.8 Gbps to a peak rate of 25 Gbps while achieving a CU/RU optical distance equivalent to that of the existing high-density BDE.
2) RU
(a) Development concept: To facilitate flexible area construction right from the launch of 5G services, NTT DOCOMO developed the low-power Small Radio Unit (SRU) as the RU for small cells and developed, in particular, separate SRUs for each of the 3.7 GHz, 4.5 GHz, and 28 GHz frequency bands provided at the launch of the 5G pre-commercial service in September 2019. Furthermore, with an eye to early expansion of the 5G service area, NTT DOCOMO developed the Regular power Radio Unit (RRU) as the RU for macrocells to enable the efficient creation of service areas in suburbs and elsewhere.
A key 5G function is beamforming, which aims to reduce interference with other cells and thereby improve the user’s quality of experience. To support this function, NTT DOCOMO developed a unit that integrates the antenna and 5G radio section (antenna-integrated RU). It also developed a unit that separates the antenna and 5G radio section (antenna-separated RU) to enable an RU to be placed alongside existing 3G/LTE/LTE-Advanced Radio Equipment (RE) and facilitate flexible installation even for locations with limited space or other constraints.

(b) SRU basic specifications: As described above, NTT DOCOMO developed the SRU to enable flexible construction of 5G service areas. It developed, in particular, antenna-integrated SRUs to support each of the 3.7 GHz, 4.5 GHz, and 28 GHz frequency bands provided at the launch of the 5G pre-commercial service and antenna-separated SRUs to support each of the 3.7 GHz and 4.5 GHz frequency bands (Photo 2). These two types of SRUs have the following features (Table 4).

The antenna-integrated RU is equipped with an antenna panel to implement the beamforming function. In the 3.7 GHz and 4.5 GHz bands, specifications call for a maximum of 8 beams, and in the 28 GHz band, for a maximum of 64 beams. An area may be formed with the number of transmit/receive beams tailored to the TDD Config used by NTT DOCOMO. In addition, the number of transmit/receive branches is 4 for the 3.7 GHz and 4.5 GHz bands and 2 for the 28 GHz band, and MIMO transmission/reception can be performed with a maximum of 4 layers for the former bands and a maximum of 2 layers for the latter band.
The antenna-separated SRU is configured with only the radio, as in conventional RE, to save space and facilitate installation. With this type of SRU, the antenna may be installed at a different location. Moreover, compared to the antenna-integrated SRU operating in the same frequency band, the antenna-separated SRU reduces equipment volume to 6.5ℓ or less. The antenna-separated SRU does not support the beamforming function, but features four transmit/receive branches, the same as the antenna-integrated SRU for the same frequency band.
(c) RRU basic specifications: The RRU was developed in conjunction with the 5G service rollout as high-power equipment compared with the SRU with a view to early expansion of the 5G service area (Photo 3). This type of equipment has the following features (Table 5).

Compared with existing Remote Radio Equipment (RRE) for macrocells, the volume of RRU equipment tends to be larger to support 5G broadband, but in view of the latest electronic device trends, NTT DOCOMO took the lead in developing and deploying an antenna-separated RRU that could save space and reduce weight. Maximum transmission power is 36.3 W/100 MHz/branch taking the radius of a macrocell area into account. The RRU features four transmit/receive branches and achieves the same number of MIMO transmission/reception layers as the antenna-separated SRU.
NTT DOCOMO also plans to deploy an antenna-integrated RRU at a later date. The plan here is to construct 5G service areas in a flexible manner making best use of each of these models while taking installation location and other factors into account.
3) 5G FHM
The 5G FHM is equipment having a multiplexing function for splitting and combining a maximum of 12 radio signals on the fronthaul. It was developed in conjunction with the 5G service rollout, the same as the RRU (Photo 4).

If no 5G FHM is being used, each RU is accommodated as one cell, but when using a 5G FHM, a maximum of 12 RUs can be accommodated as one cell in a CU. At the launch of 5G services, this meant that more RUs could be accommodated in a single CU when forming a service area in a location having low required radio capacity (Figure 5). Additionally, since all RUs transmit and receive radio signals of the same cell, the 5G FHM can inhibit inter-RU interference and the occurrence of Hand-Over (HO) control between RUs as in the conventional FHM. Furthermore, the 5G FHM supports all of the 5G frequency bands, that is, the 3.7 GHz, 4.5 GHz, and 28 GHz bands, which means that service areas can be constructed in a flexible manner applying each of these frequency bands as needed.

All the fronthaul and other interfaces that Docomo used in their network were based on O-RAN Alliance specifications. In a future post, we will look at some of the details.

Source: https://www.telecomsinfrastructure.com/2020/11/ntt-docomos-5g-ran-infrastructure.html 26 11 20

The “Next Big Thing” in Technology : 20 Inventions That Will Change the World

23 Jul

I wrote an article titled “I Compiled a List of Tech’s “Next Big Things” So You Wouldn’t Have to” in 2018. As anyone reading it in the 2020s will notice, a lot of what was written then is now obsolete. It is thus necessary to write an update, highlighting the key technologies emerging today that will be all the rage in 2022, 2025 and the 2030s.

Obviously, these dates should be taken with a grain of salt : predictions are wrong more often than not. They are often wrong because we tend to use history, which is at heart the study of surprises and changes, as a guide to the future. This should however not stop us from aiming to better understand the future of technology : the knowledge gained through planning is crucial to the selection of appropriate actions as future events unfold. We don’t know the answer, but we can at least ask useful questions and catalyse the conversation.

The boring, expected stuff (2022 technologies)

1. Blockchain

By now, we’ve all heard about blockchain revolutionising just about every industry imaginable. Banking, politics, healthcare… all could technically benefit from the creation of a decentralised digital ledger which tracks and stores information in various places, thus making forgery practically impossible. Identification is provided through cryptographic calculations, making identity theft virtually impossible, too.

There is however one word which stands out in the description above. Decentralised. Banks, governments, hospitals… these institutions don’t want to see their power curtailed (unless on their own terms). As such, it is likely that we will see some advances in the blockchain space, but it will remain on the fringes of technology, missing the revolution predicted by its (many) fans.

More on blockchain here [Investopedia].

2. Cryptocurrency

Often mentioned in the same breath as blockchain, cryptocurrencies use the principles explained above to facilitate the exchange of goods and services online (again in a decentralised fashion, which is one of its main appeals).

Sounds fantastic, but there are two big issues with cryptocurrencies :

  • Its key appeal (excluding illegal dealings) is that it’s cool and trendy. It was never meant to sustain the attention it got in 2017, and will never recover from the crypto-bros’ unrelenting idiocy. The technology works, there’s just no mass market.

  • Secondly, its value is VERY subjective (unlike gold, don’t @ me). Crypto-currencies are always either in pre-bubble or bubble territory. Add to that the decentralised aspect that governments and banks will seek to discredit, and you can be sure that it will continue to be a mere toy Chad keeps bringing up at frat parties (there’s always a Chad).

More on cryptocurrency here [Investopedia].

3. Affective AI / Affective computing

Artificial Intelligence is already everywhere in 2020; it’s just not as fun as we thought it’d be. If you’ve missed the AI train, it can be described as follows : the increase in storage space (cloud), calculation capabilities (chips) and access to massive datasets (e-commerce, social media…) has allowed companies to create statistical models on steroids which can evolve when fed new information.

Affective AI would take this process one step further and apply it to emotions. Effectively, an algorithm could tell your mood from the way you look (by training a deep learning algorithm on facial data), the way you write, or the way you speak, and offer a product or service accordingly. Feeling happy ? How about a Starbucks advert for a frappuccino to keep the good times coming ? Feeling down ? How about a Starbucks advert for a frozen coffee to turn that frown upside down ?
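
As a deliberately toy illustration of that input-to-offer loop (real affective computing uses deep models trained on facial, voice, or text data; every name below is made up):

    HAPPY_WORDS = {"love", "great", "awesome", "happy"}
    SAD_WORDS = {"tired", "awful", "sad", "gloomy"}

    def detect_mood(text):
        # Crude keyword scoring standing in for a trained emotion classifier
        words = set(text.lower().split())
        score = len(words & HAPPY_WORDS) - len(words & SAD_WORDS)
        return "happy" if score > 0 else "down" if score < 0 else "neutral"

    def pick_advert(mood):
        # The mood -> offer mapping described above
        return {"happy": "frappuccino ad", "down": "frozen coffee ad"}.get(mood, "default ad")

    print(pick_advert(detect_mood("I love this awesome morning")))    # frappuccino ad
    print(pick_advert(detect_mood("so tired of this gloomy weather")))  # frozen coffee ad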

More on Artificial Intelligence here [The Pourquoi Pas], and more on affective computing here [MIT].

4. AI Cloud Services / Data-as-a-service / AI PaaS

Most great technologies aren’t considered to be revolutionary until they reach the public en masse (forgive my French). This may be one of the reasons there’s been so much disappointment in AI of late. Indeed, only major companies have been able to benefit from automating tasks that once required human input, while the petit peuple is forced to continue using comparatively medieval algorithms. This can in part be explained by a lack of computing power within individual households, but is mostly a data problem.

This may not be the case for long. Companies are realising that renting an algorithm gives the double benefit of generating extra revenue from an existing asset, while extracting more data from customers to feed the beast. As such, get ready to witness the rise of AI platforms and marketplaces, which will promise to provide algorithms that specifically match unique customer pain points (chatbots and digital assistants are only the beginning). As devs get automated and join the gig economy, this movement is likely to expand exponentially. This would allow smaller companies, and even individuals, to optimise their day-to-day processes. If that seems harmful to our collective mental health, follow your instincts.

More on AI as a service here [Towards Data Science].

5. Connected Homes / Smart Homes

The trend of artificial intelligence in our homes is already ongoing, and will only accelerate over the next few years. In fact, we’ve already become accustomed to Google’s Nest and Amazon’s Alexa being able to adjust the settings of smart objects within our houses to fit pre-set parameters.

But these two use cases are just the beginning : as with most things internet-related, these services benefit from network effects, and will exponentially gain customer value as functionalities are added. An algorithm that can make a cup of coffee while opening the blinds and increasing the bathroom temperature when it senses someone waking up is a lot more valuable than the sum of three different algorithms doing these tasks.
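
A minimal sketch of that composition argument, with hypothetical device names (the point is one trigger driving three coordinated actions, not any particular smart home API):

    def on_wake_up(home):
        # One event, three coordinated actions: worth more than three gadgets
        home["coffee_maker"] = "brewing"
        home["blinds"] = "open"
        home["bathroom_temp_c"] = 23  # pre-warm the bathroom

    home = {"coffee_maker": "idle", "blinds": "closed", "bathroom_temp_c": 18}
    on_wake_up(home)  # in a real system, triggered by a motion or sleep sensor
    print(home)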

More on connected homes here [McKinsey]

6. 5G

Of course, connected objects cannot afford to be as laggy as the original iPhone (shots fired) : they must transmit massive amounts of data quickly and reliably. That’s where 5G comes in.

5G is the logical successor to 4G, and achieves much greater speeds thanks to higher-frequency radio waves. Though this seems simple enough, a few terms have to be understood to fully capture the difficulty of implementing 5G throughout the world.

  • Millimeter waves : this refers to a specific part of the radio frequency spectrum between 24 GHz and 100 GHz, which has a very short wavelength. Not only is this section of the spectrum pretty much unused, but it can also transfer data incredibly fast, though over shorter distances.

  • Microcells, femtocells, picocells : Small cell towers which act as relays within comparatively small areas such as large buildings. This infrastructure is necessary : as highlighted above, 5G transfer distance is much shorter than that of 4G (and struggles to go through thick walls).

  • Massive MIMO : using a large number of antennas at the base station to transmit and receive much more data than when using 4G, from a wider variety of sources.

  • Beamforming : all these transfers need to be organised and choreographed, and beamforming does just that by focusing each signal toward its intended receiver instead of broadcasting in every direction. Also, it sounds cool.

  • Full Duplex : the ability to send and receive data at the same time, on the same frequency.

The technology will have a huge effect on most industries, as it will improve the speed and quantity of data transmitted, as well as the quality of the connection, by orders of magnitude. It will, among other things, connect autonomous vehicles and drones to the internet, but will also allow major advances in virtual reality and IoT. 5G is therefore not a technology that should be taken lightly.

More on 5G here [CNN]

7. Mega-constellations of satellites / Low-earth orbit satellite systems

Speaking of Internet… Over the next few years, SpaceX plans to deploy up to 42,000 satellites to create an Internet connection anywhere on the planet. The company isn’t alone in this niche: the OneWeb constellation aims to include 600 satellites by 2022, and Amazon has announced plans to launch 3,236 low-orbit satellites to cover dead zones (areas with no connectivity).

All this is made possible thanks to the low cost of launching these nanosatellites, which weigh barely a few pounds. A lower altitude would also make managing fleets a lot easier and cleaner.

The deployment in space of so many objects, however, poses problems in terms of interference with other satellite services, increasing the risk of collision and disturbing astronomical observation.

More on mega-constellations here [The Verge]

The pretty OK stuff (2025 technologies)

8. Autonomous Vehicles

2020 was supposed to be the year of the autonomous car. That’s not worked out quite as expected. The “coronavirus setback” will however not dampen large companies’ spirits, which will continue to update their algorithms to create cars that do away with drivers entirely.

As a quick reminder, it is generally agreed that there are six levels of autonomous driving (0 to 5), ranging from “no automation” to “full automation”. Levels 0 to 2 require extensive human monitoring, while levels 3 to 5 rely on algorithms to monitor the driving environment. The most advanced autonomous cars on the market (Tesla) are currently straddling levels 3 and 4. It is hoped that we can make the jump to level 5 (and full driving automation) by 2025, if not earlier. But the road ahead is long, as issues ranging from ethical dilemmas to statistical headaches still plague the industry.

Even if level 5 is reached, it’s likely that we will never truly replace the car as we know it, but instead create special roads and spaces for autonomous cars, so that the two don’t mix. Indeed, the car as we know it is so central to our daily lives that changing it may mean rebuilding most of our daily world : parking would become less important, charging stations would change, the ways pedestrians interact with safer roads would be forever altered…

More on Autonomous vehicles here [Spectrum].

9. Quantum computing

First things first : scientists have been announcing the arrival of the quantum computer for over 50 years. But this time might be it. In October 2019, Google announced that it had achieved quantum supremacy (superiority of a quantum computer compared to a conventional computer on a particular task) by performing in three minutes a calculation which would require approximately 10,000 years on a conventional supercomputer. These figures were challenged by IBM, which estimates that a conventional computer program could have solved it in just 2.5 days.

Quantum computers, where bits are replaced by qubits with superimposable states (ex : a 0 can also be a 1 at the same time), are in theory much faster and more efficient than their older brothers, but tend to suffer from decoherence issues (loss of information). Nevertheless, developing them for pharmaceutical companies, for example, could theoretically lead to major breakthroughs in medicine creation.
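
Superposition is easy to sketch numerically. The two-entry vector below holds the amplitudes of the 0 and 1 states; applying a Hadamard gate to a qubit initialised to 0 leaves it in both states at once, each measured with 50% probability:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    ket0 = np.array([1.0, 0.0])                   # a classical 0, as a qubit
    state = H @ ket0                              # equal superposition of 0 and 1
    print(np.abs(state) ** 2)                     # [0.5 0.5] measurement probabilities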

More interestingly, quantum computers could eventually break the cryptographic keys protecting blockchains, making the whole thing irrelevant (did I not say earlier that Bitcoin was doomed?).

More on Quantum computing here [MIT Technology Review]. You can also take a Quantum computing class here [Qmunity].

10. Genetic predictions

The raw computing power highlighted above can be used to analyse one’s genome and predict one’s chances of getting conditions such as heart disease or breast cancer. If that sounds exactly like the plot of Gattaca, trust your instincts.

Regardless of the risks of genetic discrimination, DNA-based “predictions” could be the next great public health leap. For example, if women at high risk for breast cancer got more mammograms and those at low risk got fewer, those exams might catch more real cancers and set off fewer false alarms, leading to better treatment rates and lower insurance premiums.
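
The arithmetic behind that claim is just Bayes’ rule. The sensitivity, specificity, and prevalence figures below are illustrative assumptions, not real mammography statistics:

    def positive_predictive_value(prevalence, sensitivity=0.85, specificity=0.90):
        # Probability that a positive screening result is a true cancer
        true_pos = prevalence * sensitivity
        false_pos = (1 - prevalence) * (1 - specificity)
        return true_pos / (true_pos + false_pos)

    print(f"high-risk group (5% prevalence): {positive_predictive_value(0.05):.0%} of positives are real")
    print(f"low-risk group (0.5% prevalence): {positive_predictive_value(0.005):.0%} of positives are real")

Concentrating screening on the high-risk group raises the share of true positives per exam, which is the whole point of DNA-based risk stratification.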

It could also lead to the rise of personalised medicine, though the logistics of such a task would likely be a financial and logistical disaster given the current political climate (at least in the US).

More on Genetic predictions here [MIT Technology Review].

11. CRISPR

Even if a Gattaca-like future does come about from genetic predictions, we might still create a similar situation through straight up genetic engineering. CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) allows researchers to easily alter DNA sequences and modify gene functions. Its many potential applications include correcting genetic defects, treating and preventing the spread of diseases and improving crops.

Editing germs to make new viruses or a “master race” is however a less fun prospect, should this technology get into unethical hands. Either way, I look forward to a time when every man looks like a mix of Tom Hiddleston and Idris Elba.

More on CRISPR here [Live Science].

12. Human Augmentations / enhancements

Thankfully, going genetic is not the answer to everything. Sometimes, some good old ingenuity and robotics is enough to solve our issues.

Slowly but surely, we are seeing more natural, artificial, or technological alteration of the human body in order to enhance physical or mental capabilities, often in the form of bionic limbs. As we begin to better understand how the brain transmits information to the body, more and more companies will see the value of improving people’s lives (for a steep fee) and descend upon this space.

It’s very likely that beyond the arms and legs augmentations that we’re already starting to see, there will be a point at which the back and the eyes are also augmented. Then, slowly but surely, augmentations will become elective, with interesting ethical implications.

More on human enhancements here [International Journal of Human-Computer Studies]

The Very Exciting Stuff (2030 Technologies)

13. Graphene

Though graphene has been over-hyped for many years, we’re finally seeing something good come out of it. If you haven’t paid attention to the hype, graphene is a single atomic layer of graphite, which is itself a form of carbon. It is extremely strong, yet extremely thin, light and flexible (stronger than steel, thinner than paper). Oh, and it also conducts electricity really well.

The applications are numerous, specifically for wearable electronics and space travel, where strength and weight are key considerations. Nevertheless, it will take many years to get to a wide array of use cases : we’ve built the world around silicon, and it’s very hard to displace that kind of well-established, mature technology.

More on graphene here [Digital Trends]

14. Edge Computing / Intelligent Edge

While the vast majority of data processing for connected devices now happens in the cloud, constantly sending data back and forth can take far too long (as much as a few seconds, sometimes). 5G is a temporary answer, as mentioned above, but there might be a simpler solution : allowing objects to process data on their own, without using cloud technology (at the “edge” of the ecosystem). This would resolve a wide variety of issues in manufacturing, transport, and healthcare, where split-second decisions are key to a variety of processes. Even fashion could benefit by creating self-sufficient smart wearables.

As intelligent “things” proliferate, expect a shift from stand-alone intelligent objects to swarms of collaborative intelligent things. In this model, multiple devices would work together, either independently or with human input, by pooling their computing power. The leading edge of this area is the military, which is studying the use of drone swarms to attack or defend military targets, but the concept could go much further, with hundreds of potential civilian uses.

The technology is nearly available, but as with other developments both above and below, we must first let the hardware capabilities catch up before implementing these ideas.

More on Edge computing here [The Verge]

15. Micro-chips / Bio-chips

The current main idea behind micro-chips (which are made from an array of molecular sensors on the chip surface that can analyze biological elements and chemicals) is for tracking biometrics in a medical context. It has also seen use cases emerge within the smart workspace technology ecosystem. It could however have a much wider appeal if customers decide to put their trust into it (such as banking - imagine never having to bring your wallet anywhere ever again).

Unless everyone suddenly agrees to let their blood pressure be monitored daily at work, this type of tracking is likely to remain benign in the near future. One might nevertheless imagine them becoming fairly common in hospitals.

More on micro-chips here [The Guardian].

16. Nanorobotics

For those wanting to go even smaller than micro-chips, allow me to introduce nanorobots. Currently in R&D phases in labs throughout the world, nanorobots are essentially very, very tiny sensors with very limited processing power.

The first useful applications of these nanomachines may very well be in nanomedicine. For example, biological machines could be used to identify and destroy cancer cells or deliver drugs. Another potential application is the detection of toxic chemicals, and the measurement of their concentrations, in the environment.

More on Nanorobotics here [Nature].

17. Smart tattoos

Tattoos that can send signals via touch to interact with the world around us make a lot of sense :

  • It’s wearable, which allows for greater freedom of movement

  • It tackles the issue of waste, which is seldom discussed when imagining the future of technology

  • It can be personalised, a trend towards which we’ve been moving for 15 years now.

In their current form, they would be temporary on the skin. They can however last much longer on prosthetics, and have the benefit of being cheap compared to a lot of the hardware available out there.

More on Smart Tattoos here [Microsoft]

18. Green Tech

Do you want your great-grand-kids to know what it’s like not to despise the sun? Then forget about all the above and concentrate on Green Tech : the science of making the world liveable. Because so much is being done in this space, we will avoid the details and refer to better sources.

The issue with most of the above is that they tend to work well in theory, but their adoption cost is incredibly high, as they often struggle to scale. As much as we’d like to see all of them being implemented yesterday, the road ahead is still long.

More on Green Tech here [CB Insights]

19. Hydrogen fuel cells 

In a fuel cell, hydrogen combines with oxygen from the air to produce electricity, releasing only water. In itself, this isn’t new, as the principle was discovered in 1839; but up until a few years ago, the idea was not profitable enough to allow for large-scale use.
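
For reference, the overall cell reaction is the textbook one:

    2 H₂ + O₂ → 2 H₂O + electrical energy (and some heat)

Hydrogen is oxidised at one electrode and oxygen reduced at the other, with the electrons routed through an external circuit as usable current.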

In fact, there are still some issues with the technology, as it’s easy to store a small amount of energy (hence its use in the space exploration industry), but incredibly hard to do at a larger scale.

See you in 2030 to see if we’ve solved these issues.

More on Hydrogen fuel cells here [FCHEA].

20. Meatless Meat 

I’ve tried it : lab-made meat smells, looks and tastes just like meat (beyond the odd uncanny-valley-like taste). The only things that change : healthier food, no antibiotics, no growth hormones, no emission of greenhouse gases and no animal suffering.

Above all, this is a gigantic market that whets the appetites of industrialists. After targeting vegetarians, they’ve realised that it’s much easier and more rewarding to market these products to flexitarians (back in my days we called them omnivores).

By 2030, 10% of the meat eaten in the world will no longer come from an animal (allegedly). The principle is there, the technology works… all that’s left to see is whether it will be widely adopted.

More on Meatless meat here [Vox].

Conclusion

Technology has a tendency to hold a dark mirror to society, reflecting both what’s great and evil about its makers. It’s important to remember that technology is often value-neutral : it’s what we do with it day in, day out that defines whether or not we are dealing with the “next big thing”.

Good luck out there.

Source: https://www.thepourquoipas.com/post/the-next-big-thing-in-technology-20-inventions 23 07 20

SU-MIMO vs MU-MIMO | Difference between SU-MIMO and MU-MIMO

13 Jun

This page compares SU-MIMO vs MU-MIMO and explains the difference between SU-MIMO and MU-MIMO with respect to 802.11ax (wifi6), 4G/LTE and 5G NR (New Radio) technologies.

Introduction : MIMO refers to multiple input multiple output. It basically refers to a system having more than one antenna element, used either to increase system capacity, throughput or coverage. Beamforming techniques are used to concentrate radiated energy towards the target UE, which reduces interference to other UEs and thereby improves coverage.

There are two major types of MIMO with respect to how the BS (Base Station) transmission is utilized by mobile or fixed users: SU-MIMO and MU-MIMO. Both types are used in the downlink direction, i.e. from the Base Station, eNB or Access Point towards users.

There is another concept called massive MIMO or mMIMO, which combines multiple radio units and antenna elements in a single active antenna unit housing 16/32/64/96 antenna elements. Massive MIMO employs beamforming, which directs energy in the desired user’s direction and thereby reduces interference to undesired users.

SU-MIMO

• In SU-MIMO, all the streams of the antenna array are focused on a single user.
• Hence it is referred to as Single User MIMO.
• It splits the available SINR between multiple data layers sent towards the target UE simultaneously, where each layer is separately beamformed. This increases peak user throughput and system capacity.
• Here the cell communicates with a single user.
• Advantages : No inter-user interference

SU-MIMO vs MU-MIMO

The figure depicts the SU-MIMO and MU-MIMO concepts in an IEEE 802.11ax (wifi6) system. It shows a wifi6-compliant AP (Access Point) and wifi6 stations, i.e. users or clients.

MU-MIMO

• In MU-MIMO, multiple streams are focused on multiple users, and each of these streams provides radiated energy to more than one user.
• Hence it is referred to as Multi User MIMO.
• It shares the available SINR between multiple data layers sent towards multiple UEs simultaneously, where each layer is separately beamformed. This increases system capacity and user-perceived throughput.
• Here the cell communicates with multiple users.
• Advantages : Multiplexing gain

MU-MIMO in 5G NR

The figure depicts MU-MIMO used in an mMIMO system in 5G. As shown, multiple data streams (of multiple users) pass through layer mapping/precoding before being mapped to antenna array elements and transmitted over the air.
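
As a concrete (if simplified) sketch of that precoding step, here is zero-forcing precoding, one common MU-MIMO approach, for a 4-antenna base station serving two single-antenna users over a random channel. This illustrates the principle rather than the exact 5G NR procedure:

    import numpy as np

    rng = np.random.default_rng(0)
    # Channel matrix H: 2 single-antenna users x 4 base-station antennas
    H = (rng.normal(size=(2, 4)) + 1j * rng.normal(size=(2, 4))) / np.sqrt(2)

    # Zero-forcing precoder: right pseudo-inverse of H, so that H @ W = I
    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)

    s = np.array([1 + 0j, -1 + 0j])  # one data symbol per user
    y = H @ (W @ s)                  # what the two users receive (noise ignored)
    print(np.round(y, 6))            # [ 1.-0.j -1.+0.j]: no inter-user interference

The same time/frequency resources carry both users’ symbols; the spatial precoding is what keeps them separated.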

Summary of differences between SU-MIMO and MU-MIMO

The following points summarize the differences between SU-MIMO and MU-MIMO.

• Full Form : SU-MIMO is Single User MIMO; MU-MIMO is Multi User MIMO.
• Function : In SU-MIMO, the information of a single user is transmitted simultaneously over more than one data stream by the BS (Base Station) on the same time/frequency resources. In MU-MIMO, data streams are distributed across multiple users on the same time/frequency resources, relying on spatial separation.
• Major objective : SU-MIMO increases the user/link data rate, which is a function of bandwidth and power availability. MU-MIMO increases system capacity, i.e. the number of users supported by the base station.
• Antenna correlation : SU-MIMO is more susceptible to it; MU-MIMO is less susceptible.
• Source of interference : For SU-MIMO, adjacent co-channel cells. For MU-MIMO, links supporting the same cell and other MU-MIMO users, plus adjacent co-channel cells.
• Power allocation : In SU-MIMO, power is split between multiple layers to the same user and fixed per transmit antenna. In MU-MIMO, power is shared between multiple users and multiple layers, and can be allocated per MU-MIMO user based on channel conditions.
• CSI/feedback process : For SU-MIMO, this varies with implementation (TDD or FDD, reciprocity or feedback based), and performance is less sensitive to feedback granularity and quality. MU-MIMO is very dependent upon CSI for channel estimation accuracy, and more sensitive to feedback granularity and quality.
• Beamforming dependency : For SU-MIMO, this varies with implementation (TDD or FDD, reciprocity or feedback based), and is less sensitive to feedback granularity and quality. MU-MIMO is greatly assisted by appropriate beamforming mechanisms (spatial focusing) that maximize gain towards the intended users, and is more sensitive to feedback granularity and quality.

Source: https://www.rfwireless-world.com/Terminology/Difference-between-SU-MIMO-and-MU-MIMO.html – 13 06 20

Disruptive Beamforming Trends Improving Millimeter-Wave 5G

12 Jun

5G is now a reality and the first stage of its infrastructure (sub-6 GHz) is already deployed in major cities around the world. The high data rate demand of 5G mobile users is being met using the famous Multiple-Input Multiple-Output (MIMO) technology. The next deployment stage of 5G is expected to utilize the millimeter-wave (mmWave) frequency spectrum, and the forthcoming base station antennas will operate at frequency bands centered at 28 GHz and 39 GHz. At these high frequencies, a steerable RF beam can serve a communication device far more reliably than an inefficient isotropic RF radiator, and this is made possible by performing beamforming at the base station end, as illustrated in Fig. 1.

Beamforming is a technique by which a radiator is made to transmit radio signals in a particular direction, and a communication device that performs this function is called a beamformer. The most common and simplest type of beamformer is an array of half-wavelength-spaced antennas connected to a single radio frequency (RF) source via a network of power dividers. Such a beamformer is referred to as a corporate-feed array. More sophisticated beamformers add beam steering capability to a simple corporate-feed array through a bank of phase shifters connected to each antenna element. Advanced beamformers involve digitally controlled phase shifters, lens structures, intelligent surfaces and meta-surfaces, etc., which enhance beamformer performance.


Fig. 1. mmWave beamformer serving mobile terminals in mmWave 5G network.

Disruptive mmWave Beamforming Technologies:

Designing 5G-ready beamformer hardware at mmWave is challenging for three major reasons:
1. Electromagnetic waves face huge losses while propagating through free space, so highly directive radiation is desirable.
2. The network of phase shifters and power dividers required to add steering capabilities is lossy and expensive.
3. The theoretical principles of MIMO require each antenna to be connected separately to the baseband processing unit, making the overall system prohibitively expensive, especially when it comes to implementing a 64- or 128-element mmWave massive MIMO system.

In response to these challenges, disruptive technological trends have emerged that are likely to change the way we look at mmWave beamforming hardware. One such example is the use of a multi-stage lens-based beamformer, in which the requirement for complex phase-shifter and power networks is avoided. As a result, a large number of antennas can be fed using a smaller number of radio frequency chains (power amplifier, mixer, and filter). This way, beamforming gain is achievable thanks to the large number of antennas, while the cost of the system is kept minimal, since the phase shifting required for beamforming is done in low-cost lens structures. A simple example of such a system is shown below, in which a 15-element antenna array generates nine independent radio beams. The system is designed to operate at 28 GHz and is in line with 3GPP standards for 5G. It is scalable to 64 or even 128 antenna elements and still low cost, because the beamforming is possible without complex and costly phase-shifting networks.


Fig. 2. A 28 GHz two-stage Rotman lens-based beamformer.

A second example is related to successful channel sounding in mmWave 5G bands. The classical radio channel sounder hardware that works well in the sub-6 GHz bands of 5G is not efficient enough to support mmWave channels. A new sounding technique requires much simpler beamforming hardware than the conventional fully connected antenna array and can deliver fast and accurate direction-of-arrival estimations in the mmWave bands. It requires only a metallic cavity with sub-wavelength holes on one side and a scatterer placed inside the cavity; an example structure is shown in Fig. 3. The cavity uses a frequency-diverse computational approach for direction-of-arrival estimation, which requires a single radio frequency chain, hence again a low-cost solution.


Fig. 3. Cavity-backed frequency-diverse antenna for mmWave direction-of-arrival estimation.

A third example is related to mmWave 5G field trials. Although it is always better to rely on channel measurements and field trials to test the practical limits of mmWave 5G before commercial deployment, rigorous field trials are often not possible and are too expensive to execute. Because of this limitation, the investigation of novel approaches within a network is not possible. In the past, the network planning sector and researchers often relied on theoretical models to predict network performance. A single antenna used for the network calculations was often treated as an ideal omnidirectional radiator. This approximation was valid because of the simplicity of the system in the sub-6 GHz 5G bands.

For mmWave 5G wireless, the assumption of an antenna as an ideal radiator can easily lead to overestimation of the network performance. The least we can do is to integrate practically measured 3D beamformer radiation patterns with the theoretical channel models. This approach is even more critical for dense urban environments, where the connectivity and reliability of the entire network depend primarily upon the radiation performance of high-directivity beamformers. This new technique can reliably estimate practical mmWave massive MIMO performance by including measured near-field and far-field 3D radiation patterns, captured in an anechoic environment like the one shown below, in the network calculations.


Fig. 4. mmWave anechoic chamber facility at Queen’s University Belfast.

A fourth example is related to very large mmWave array hardware. Beamformers at mmWave 5G can operate at full capacity when they have a very large number of radiating antennas. Each antenna is responsible for transmitting a fraction of the total available radiated power, which means that each antenna must have a direct or indirect connection to the radio power source. This leads to cumbersome hardware at mmWave frequencies, where technology is not yet advanced enough to tolerate high losses between the radio source and the antennas.

Using sparse antenna arrays is an alternative approach in which the total radiated power from the access point stays the same while the number of radiating antennas is smaller than in a conventional antenna array, where adjacent antenna spacing must be no larger than λ/2 to avoid grating lobes. Surprisingly, using the Compressive Sensing technique, the direction of radiation (main lobe and side lobes) of a sparse antenna array can perfectly match that of a conventional antenna array. The randomness of antenna locations in a sparse array avoids the introduction of grating lobes while allowing adjacent antenna spacing greater than λ/2. This means that a larger array size can be implemented using a relatively small number of antennas.
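
A small numerical sketch of that claim, with illustrative element positions (not the actual array design discussed here): a random sparse layout over the same aperture keeps its main lobe where the dense λ/2 array points it:

    import numpy as np

    rng = np.random.default_rng(1)
    dense = np.arange(33) * 0.5                   # 33 elements at lambda/2 spacing
    sparse = np.sort(rng.uniform(0.0, 16.0, 12))  # 12 random positions, same aperture

    angles = np.radians(np.linspace(-90, 90, 721))
    for name, positions in (("dense", dense), ("sparse", sparse)):
        # Broadside array factor: all elements fed in phase (positions in wavelengths)
        af = np.abs(np.exp(2j * np.pi * np.outer(np.sin(angles), positions)).sum(axis=1))
        peak_deg = np.degrees(angles[np.argmax(af)])
        print(f"{name:>6}: {len(positions)} elements, main lobe at {peak_deg:.1f} deg")

With random spacing the side-lobe floor rises, but no grating lobes appear, which is what lets the sparse array trade element count for aperture.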


Fig. 5. A 28 GHz sparse patch antenna array beamformer.

Conclusion:

The radio infrastructure required to support mmWave 5G is not ready yet; however, disruptive technologies are pushing the limits of engineering to make it a reality by 2025. The fastest version of 5G is in fact mmWave 5G, and we are looking forward to the benefits of its ubiquitous ultra-high speed (up to 10 gigabits per second) and low latency (down to 0.2 milliseconds).

Source: https://uk5g.org/5g-updates/read-articles/disruptive-beamforming-trends-improving-millimeter/ 12 06 20

Beamforming, Beam Steering & Beam Switching with Massive MIMO for 5G compared

7 Jun

Given the high propagation loss of the millimeter wavelengths (mmWaves) employed in 5G New Radio (5G NR) systems and the higher bandwidth demands of users, combinations of beamforming techniques and massive Multiple Input and Multiple Output (MIMO) are essential for increasing spectral efficiency and providing cost-effective, reliable wireless network coverage.

Beamforming

Beamforming is the application of multiple radiating elements transmitting the same signal at an identical wavelength and phase. The elements combine to act as a single antenna with a longer, more targeted beam, formed by reinforcing the waves in a specific direction. The general concept was first employed in 1906 for trans-oceanic radio communications.

The more radiating elements that make up the antenna, the narrower the beam. An artifact of beamforming is side lobes: essentially unwanted radiation of the signal that forms the main lobe in different directions. Poor engineering of antenna arrays would result in excessive interference from a beamformed signal’s side lobes. The more radiating elements that make up the antenna, the more focused the main beam is and the weaker the side lobes are.
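
This sketch quantifies the element-count effect (see Figure 1 below for the two- and four-element case); half-wavelength spacing and in-phase feeding are assumed:

    import numpy as np

    def beamwidth_deg(n_elements):
        # Broadside array factor of n half-wavelength-spaced, in-phase elements
        angles = np.linspace(-90, 90, 1801)
        psi = np.pi * np.sin(np.radians(angles))  # per-element phase step, d = lambda/2
        af = np.abs(np.exp(1j * np.outer(psi, np.arange(n_elements))).sum(axis=1))
        strong = angles[af >= af.max() / np.sqrt(2)]  # the -3 dB (half-power) region
        return strong.max() - strong.min()

    for n in (2, 4, 8):
        print(f"{n} elements: ~{beamwidth_deg(n):.0f} deg half-power beamwidth")

Doubling the element count roughly halves the beamwidth, concentrating more of the radiated power in the main lobe.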


Figure 1: Beamforming with two and four radiating elements

While digital beamforming at the baseband processor is most commonly used today, analog beamforming in the RF domain can provide antenna gains that mitigate the lossy nature of 5G millimeter waves.

Beam steering and beam switching

Beam steering is achieved by changing the phase of the input signal on all radiating elements. Phase shifting allows the signal to be targeted at a specific receiver. An antenna can employ radiating elements with a common frequency to steer a single beam in a specific direction. Different frequency beams can also be steered in different directions to serve different users. The direction a signal is sent in is calculated dynamically by the base station as the endpoint moves, effectively tracking the user. If a beam cannot track a user, the endpoint may switch to a different beam.
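
The phase shifts themselves are simple to compute. A minimal sketch for a half-wavelength-spaced array (the progressive-phase rule is standard; the element count and target angle here are arbitrary):

    import numpy as np

    def steering_phases_deg(n_elements, target_deg, spacing_wl=0.5):
        # Progressive phase shift that points the main lobe at target_deg
        n = np.arange(n_elements)
        phase = -2 * np.pi * spacing_wl * n * np.sin(np.radians(target_deg))
        return np.degrees(np.mod(phase, 2 * np.pi))

    # Phase (deg) each of 4 elements needs so the beam points at +30 degrees
    print(np.round(steering_phases_deg(4, target_deg=30), 1))  # [0. 270. 180. 90.]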


Figure 2: Beam steering and beam switching

This granular degree of tracking is made possible by the fact that 5G base stations must be significantly closer to users than previous generations of mobile infrastructures.

Massive MIMO

Multiple input and multiple output (MIMO) antennas have long been a feature of commercial public wireless and Wi-Fi systems, but 5G demands the application of massive MIMO. To increase the resiliency (signal-to-noise ratio, SNR) of a transmitted signal and the channel capacity without increasing spectrum usage, beams on a common frequency can be steered simultaneously in multiple directions.

The successful operation of MIMO systems requires the implementation of powerful digital signal processors and an environment with lots of signal scattering, or “spatial diversity”: that is, a rich diversity of signal paths between the transmitter and the receiver.


Figure 3: Multiple input and multiple output (MIMO)

Diversity of arrival times, as the signal bounces off different obstacles, forms multiple spatial channels that can deliver path redundancy for duplicate signals or increase the channel capacity by carrying different parts of the modulated data. First conceived of in the 1980s, there are a few differences between classic multi-user MIMO (MU-MIMO) and massive MIMO, but the fundamental distinction remains the large number of antennas employed and the large number of users supported. The degree of MIMO is indicated by the number of transmitters and the number of receivers, e.g. 4×4.
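
A rough capacity sketch shows why those extra spatial paths matter: the ergodic capacity of an idealized 4×4 Rayleigh-fading link versus a single-antenna link at the same SNR (10 dB here, an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(2)
    snr = 10.0  # linear SNR, i.e. 10 dB

    def avg_capacity_bps_hz(nt, nr, trials=2000):
        total = 0.0
        for _ in range(trials):
            # Rayleigh channel: rich scattering between nt transmit, nr receive antennas
            H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
            total += np.log2(np.linalg.det(np.eye(nr) + (snr / nt) * H @ H.conj().T)).real
        return total / trials

    print(f"1x1 link: {avg_capacity_bps_hz(1, 1):.1f} bps/Hz")
    print(f"4x4 MIMO: {avg_capacity_bps_hz(4, 4):.1f} bps/Hz")

The multi-antenna link carries several parallel spatial streams over the same spectrum, which is the gain massive MIMO scales up.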

Conclusion

Beamforming, Beam Steering, Beam Switching and Massive MIMO are key ingredients for 5G base stations.

Source: https://www.5g-networks.net/category/5g-technology/ 07 06 20

Employing AI to Enhance Returns on 5G Network Investments

13 Sep

Wireless 20/20 believes that AI will play a crucial role in helping operators to maximize returns on their 5G network investments. AI will open exciting opportunities for the mobile communications sector to proactively manage the costs of deploying and maintaining new 5G networks while helping to create a more personal approach for customers.

In October and November of 2018, Ericsson conducted an AI survey showcasing service providers that have adopted AI to manage the costs of deploying and maintaining mobile communications networks while creating a more personal approach to managing customer relationships. The Ericsson Report concludes that more than half of mobile operators (a total of 53%) expect to have adopted AI within their 5G networks by the end of 2020.

AI in 5G Network Planning and Performance Management

5G is expected to cover more than 40% of the world's population, and total mobile data traffic is predicted to increase fivefold by 2024. With the advent of 5G, service providers are making huge investments in their networks to enable the new use cases 5G offers. Ericsson research reveals that wireless service providers around the world are at various stages of their journey with AI and 5G. Early adopters of AI among service providers will undoubtedly gain an advantage, as they will be well placed to deal with the challenges that follow from the proliferation of devices 5G brings. The advent of 5G will make network topologies markedly more complex: small cells and new antennas make usage patterns harder for humans alone to predict, and radio propagation models become more expensive to compute as a result of new radio spectrum bands, denser topologies, massive MIMO, and beamforming.

The Ericsson Report also reveals that AI is presently facilitating improvements ranging from simplifying network evolution to improving performance across existing networks. Most service providers are at the stage of testing AI, with 48% focusing on AI to reduce capital expenditures. Service providers believe the highest potential return from AI adoption will be in network planning (70%) while 64% intend to maximize their returns by focusing their AI adoption efforts on network performance management. The highest current focus of AI initiatives among wireless service providers worldwide is in service quality management (17%) and operational cost savings (16%). A further 41% are focusing on using AI for optimizing network performance, and 35% for new revenue streams. AI and machine learning will enable wireless network vendors to quickly process raw data and deliver analytical outcomes to help operators more rapidly recoup their 5G network investments.

AI Will Be Vital to 5G Customer Service

For service providers, AI offers opportunities to build solutions jointly with infrastructure providers, with a common goal to more effectively manage complexity and optimize network performance. Service providers around the world are observing improved reliability for customers as the area in which AI is currently having the greatest impact upon core network activities. The Ericsson survey demonstrates that AI is creating both benefits and challenges for service providers at the advent of 5G. Enhancing customer experience was identified by 55% of service providers as a key area where AI is presently having the greatest impact within core network activities. In addition, 68% of survey respondents highlighted enhancing customer service as a business and operational objective over the next 3 years. A further 72% agreed strongly that AI will be important in enabling monetization of new network technologies and providing a better service to customers. A wide range of operators are already beginning to enhance big data with AI to automate customer service with intelligent assistants and chatbots.

Many service providers are already concluding successful trials of AI in their networks. Only 12% feel they have a detailed knowledge of AI's application, though a further 49% consider themselves to have a fairly detailed knowledge. More than half expect AI to be adopted in their networks before the end of 2020 (a total of 53% globally), and there is a general expectation (55% globally) that the benefits will be evident within one to two years.

AI to Deliver Optimal 5G User Experience

Ericsson is convinced that AI offers the best opportunity to achieve the high levels of automation needed to optimize and manage the complexity of 5G system performance, allowing operators to shift from managing networks to managing services. As 5G-enabled technologies develop, operators will need AI to augment human capabilities, improve efficiency, and manage their OPEX. Ericsson has introduced engineering solutions that combine AI, machine learning, and human ingenuity to enable networks to self-learn, self-optimize, and deliver an optimal user experience, allowing operator customers to capitalize on the opportunities of 5G. Ericsson believes that AI and machine learning will be crucial to the evolution of 5G network automation, IoT, and industrial digitalization.

AI and 5G in Europe

Vodafone is one Ericsson customer leading the industry in using AI in radio networks, thanks to pioneering work between Ericsson and the operator's Networks Centre of Excellence. Vodafone and Ericsson are collaborating to develop advanced AI and machine learning algorithms. One use case improves MIMO energy management at radio sites by putting radio transmitters into a power-saving sleep mode when traffic falls below certain levels and re-activating them automatically when traffic surges. Vodafone has deployed 5G in seven UK cities, including London. Backed by its largest-ever capital investment in partnership with Ericsson, Vodafone is enabling Londoners to access the new ultra-fast 5G network with unlimited data plans. Vodafone will provide comprehensive 5G coverage in London, leveraging the latest Ericsson Radio System portfolio, including the latest Baseband 6630 and Massive MIMO 6488 products, to enable 5G in the 3.5GHz band. Combined with LTE, this will deliver speeds up to 10 times faster than 4G for 5G users, with much lower latency. Vodafone Spain recently launched 5G services in three cities in the 3.7GHz band using Ericsson products and solutions. Vodafone and Ericsson have also launched a commercial 5G network in Germany with the goal of bringing 5G to 20 million people in over 20 cities by the end of 2021. Vodafone Business and IBM will also supply enterprise customers with managed services in cloud and hosting, and will work together to build and deliver solutions in areas like AI, cloud, 5G, IoT, and software-defined networking.

AI and 5G in the US

AT&T and Tech Mahindra are collaborating to build an open source artificial intelligence (AI) platform called Acumos, which will make it easy to build, share, and deploy AI applications. The Acumos AI Marketplace is an extensible framework for machine learning solutions that provides the ability to edit, integrate, compose, package, train, and deploy AI microservices. By getting developers and businesses to collaborate effectively, AT&T aims to industrialize the deployment of AI at enterprises to deliver tangible value and solve real business problems. AT&T has used the model of moving its own technology into the open source community to engage developers and accelerate the platform's development. AT&T is collaborating with Tech Mahindra to make AI simpler to adopt and to help enterprises apply AI to reimagine business models, unlock the potential of data, and drive business outcomes. Tech Mahindra has now expanded this strategic collaboration to assume management of many of the applications that support AT&T's network and shared systems. The goal is to accelerate AT&T's IT network applications, shared systems modernization, and movement to the cloud. This partnership should significantly shorten AT&T's 5G time-to-market while reducing its cost of ownership by automating aspects of its network lifecycle. Manish Vyas, President of Communications, Media and Entertainment Business and CEO of Network at Tech Mahindra, will present the Acumos Project on October 25 in Track 12 at AI World.

T-Mobile is also leveraging the predictive capabilities of AI and machine learning to augment human abilities and to overhaul and accelerate the automation of its customer service operations in the US. T-Mobile customers immediately connect with a live customer service agent who knows them, rather than talking to an IVR or chatbot. With the help of AI, these agents can quickly access the information most relevant to customer needs. These AI-driven customer care initiatives will be critical as T-Mobile prepares to deliver nationwide 5G using a mix of wireless spectrum. T-Mobile plans to introduce standalone 5G in 2020 and recently completed the world's first standalone 5G data session on a multi-vendor 5G next-generation radio access and core network, which was also the first standalone 5G data session of any kind in North America. However, T-Mobile has placed some of its 5G network deployment efforts on hold amid regulatory delays in its pending Sprint merger.

Verizon plans to invest between $17 billion and $18 billion in capital expenditure as it builds out 5G networks and launches 5G services in 30 markets based on millimeter wave spectrum in 2019. During Track 12 at AI World, Verizon speakers will discuss efforts to leverage 5G, AI and Mobile Edge Computing, with the aim to have some commercial services on this infrastructure by late 2019. By installing IT and network-processing resources in data centers at the network edge, instead of in the centralized facilities where they are normally found, operators could shorten the journey for a data signal and reduce latency. Verizon is confident it will be able to cut latency by at least 80% through investment in 5G technology and the rollout of new “edge” architecture. Verizon is testing a cloud gaming platform and this latency reduction could lead to new service opportunities in areas such as virtual-reality gaming services and self-driving cars.

Verizon is also enhancing its portfolio of managed services with an AI-powered toolkit for improving 5G customer experience outcomes. Verizon has made a large investment in AI and machine learning technologies and uses advanced predictive analytics algorithms to deliver “Digital Customer Experience” offerings for businesses. Verizon’s new Digital Customer Experience platform combines four AI-powered components to improve customer support outcomes: virtual agent, live agent, knowledge assist, and social engagement. Verizon believes the use of AI in customer service is likely to increase in the near future and is integrating AI into its existing customer support pipeline, providing virtual assistance 24/7 via social media, chat services, email, text message, or phone, with support experiences based on past interactions. Verizon’s Virtual Agent platform incorporates AI to solve customer challenges on the spot and escalates users to human support agents when presented with a situation in which it is unable to help. The Knowledge Assist component combines authoring tools with machine learning to provide relevant answers and guidance for agents.

Ericsson Survey Methodology

This Ericsson report provides unique insights into how service providers plan to integrate AI into their 5G networks, based on a global comparison of high-level business objectives. The survey was based on telephone interviews conducted by Coleman Parkes Research with 165 senior executives from 132 mobile communications service providers worldwide. Respondents were segmented into six categories by function: Chief Technology Officers (CTOs), Chief Operating Officers (COOs), Chief Information Officers (CIOs), Chief Marketing Officers (CMOs), Chief Financial Officers (CFOs), and Line of Business (LOB) managers. Mapping the responses of these 165 executives, the report provides valuable insight into the reasoning and expectations behind AI applications across 5G networks.

Source: https://www.aitrends.com/ai-and-5g/employing-ai-to-enhance-returns-on-5g-network-investments/
13.09.19

Wireless Routers 101

14 Feb

A wireless router is the central piece of gear for a residential network. It manages network traffic between the Internet (via the modem) and a wide variety of client devices, both wired and wireless. Many of today's consumer routers are loaded with features, incorporating wireless connectivity, switching, I/O for external storage devices, and comprehensive security functionality. A wired switch, often taking the form of four gigabit Ethernet ports on the back of most routers, is largely standard these days. A network switch negotiates network traffic, sending data to a specific device, whereas a network hub simply retransmits data to all recipients. Although dedicated switches can be added to your network, most home networks don't incorporate them as standalone appliances. Then there's the wireless access point capability. Most wireless router models support dual bands, communicating over both 2.4GHz and 5GHz, and many can also run several networks simultaneously.

Part of trusting our always-on Internet connections is the belief that private information is protected at the router, which incorporates features to limit home network access. These security features can include a firewall, parental controls, access scheduling, guest networks, and even a demilitarized zone (DMZ, a term borrowed from the military concept of a buffer zone between neighboring countries). The DMZ, also called a perimeter network, is a subnetwork where vulnerable services like mail, Web, and FTP servers can be placed so that, if it is breached, the rest of the network isn't compromised. The firewall is a core component in today's story; in fact, what differentiates a wireless router from a dedicated switch or wireless access point is the firewall. Although Windows has its own software-based firewall, the router's hardware firewall forms the first line of defense in keeping malicious content off the home network. The router's firewall works by making sure packets were actually requested by the user before allowing them to pass through to the local network.

Finally, you have peripheral connectivity like USB and eSATA. These ports make it possible to share external hard drives or even printers. They offer a convenient way to access networked storage without the need for a dedicated PC with a shared disk or NAS running 24/7.

Some Internet service providers (ISPs) integrate routers into their modems, yielding an “all-in-one” device. This is done to simplify setup, so the ISP has less hardware to support. It can also be advantageous to space-constrained customers. However, in general, these integrated routers do not get firmware updates as frequently, and they’re often not as robust as stand-alone routers. An example of a combo modem/router is Netgear’s Nighthawk AC1900 Wi-Fi cable modem router. In addition to its 802.11ac wireless connectivity, it offers a DOCSIS 3.0 24 x 8 broadband cable modem.

DOCSIS stands for "Data Over Cable Service Interface Specification," and version 3.0 is the current cable modem spec. DOCSIS 1.0 and 2.0 defined a single channel for data transfers, while DOCSIS 3.0 specifies the use of multiple bonded channels to allow for faster speeds. Current DOCSIS 3.0 modems commonly use 8, 12, or 16 channels, with 24-channel modems also available. Each channel offers a theoretical maximum download speed of 38 Mb/s and a maximum upload speed of 27 Mb/s. The standard's next update, DOCSIS 3.1, promises download speeds of up to 10 Gb/s and upload speeds of up to 1 Gb/s.
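
The bonding arithmetic is easy to check; this back-of-the-envelope sketch simply takes the per-channel maxima quoted above at face value:

```python
# Illustration only, using the article's per-channel figures.
PER_CHANNEL_DOWN_MBPS = 38
PER_CHANNEL_UP_MBPS = 27

for channels in (8, 12, 16, 24):
    print(f"{channels} bonded downstream channels: "
          f"{channels * PER_CHANNEL_DOWN_MBPS} Mb/s theoretical max")

# The 24 x 8 Nighthawk modem mentioned above:
print(f"24 x 8 modem: {24 * PER_CHANNEL_DOWN_MBPS} Mb/s down, "
      f"{8 * PER_CHANNEL_UP_MBPS} Mb/s up")
```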

Wi-Fi Standards

The oldest wireless routers supported 802.11b, which worked on the 2.4GHz band and topped out at 11 Mb/s. This original Wi-Fi standard was approved in 1999, hence the name 802.11b-1999 (later it was shortened to 802.11b).

Another early Wi-Fi standard was 802.11a, also ratified by the IEEE in 1999. It operated on the less congested 5GHz band and maxed out at 54 Mb/s, although real-world throughput was closer to half that number. Because of the shorter wavelength at 5GHz, the range of 802.11a was shorter, which may have contributed to its lower uptake. While 802.11a enjoyed popularity in some enterprise applications, it was largely eclipsed by the more pervasive 802.11b in homes and small businesses. Notably, 802.11a's 5GHz band became part of later standards.

Eventually, 802.11b was replaced by 802.11g on the 2.4GHz band, upping throughput to 54 Mb/s. It all makes for an interesting history lesson, but if your wireless equipment is old enough for that information to be relevant, it’s time to consider an upgrade.

802.11n

In the fall of 2009, 802.11n was ratified, paving the way for one device to operate on both the 2.4GHz and 5GHz bands. Speeds topped out at 600 Mb/s. With N600 and N900 gear, two separate service set identifiers (SSIDs) were transmitted—one on 2.4GHz and the other on 5GHz—while less expensive N150 and N300 routers cut costs by transmitting only on the 2.4GHz band.

Wireless N networking introduced an important advancement called MIMO, an acronym for “multiple input/multiple output.” This technology divides the data stream between multiple antennas. We’ll go into more depth on MIMO shortly.

If you’re satisfied with the performance of your N wireless gear, then hold onto it for now. After all, it does still exceed the maximum throughput offered by most ISPs. Here are some examples of available 802.11n product speeds:

Type    2.4GHz (Mb/s)    5GHz (Mb/s)
N150    150              N/A
N300    300              N/A
N600    300              300
N900    450              450

802.11ac

The 802.11ac standard, also known as Wireless AC, was released in January 2014. It broadcasts and receives on both the 2.4GHz and 5GHz bands, but the 2.4GHz frequency on an 802.11ac router is really a carryover of 802.11n. That older standard maxed out at 150 Mb/s on each spatial stream, with up to four simultaneous streams, for a total throughput of 600 Mb/s.

In 802.11ac, MIMO was also refined with increased channel bandwidth and support for up to eight spatial streams. Beamforming was introduced with Wireless N gear, but it was proprietary; with AC, it was standardized to work across different manufacturers' products. Beamforming is a technology designed to optimize Wi-Fi transmission around obstacles by using the antennas to direct and focus the signal where it is needed.

With 802.11ac firmly established as the current Wi-Fi standard, enthusiasts shopping for routers should consider one of these devices, as they offer a host of improvements over N gear. Here are some examples of available 802.11ac product speeds:

Type      2.4GHz (Mb/s)    5GHz (Mb/s)
AC600     150              433
AC750     300              433
AC1000    300              650
AC1200    300              867
AC1600    300              1300
AC1750    450              1300
AC1900    600              1300
AC3200    600              1300, 1300

The maximum per-band throughput is the same on AC1900 and AC3200 for both the 2.4GHz and 5GHz bands; the difference is that AC3200 can transmit two simultaneous 5GHz networks to achieve its higher total.
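
The class names themselves are just arithmetic: each ACxxxx rating is the rounded sum of the maximum PHY rate of every radio the router can run at once. A quick sketch using the table's values:

```python
# Illustration only: ACxxxx ratings as rounded sums of per-band PHY rates.
ratings = {
    "AC1200": (300, [867]),          # 300 + 867 = 1167, rounded up by marketing
    "AC1900": (600, [1300]),
    "AC3200": (600, [1300, 1300]),   # two simultaneous 5GHz networks
}
for name, (band_24, bands_5) in ratings.items():
    parts = " + ".join(str(r) for r in [band_24] + bands_5)
    print(f"{name}: {parts} = {band_24 + sum(bands_5)} Mb/s combined")
```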

The latest wireless standard with products currently hitting the market is 802.11ac Wave 2. It implements multiple-user, multiple-input, multiple-output, popularly referred to as MU-MIMO. In broad terms, this technology provides dedicated bandwidth to more devices than was previously possible.

Wi-Fi Features

SU-MIMO And MU-MIMO

Multiple-input and multiple-output (MIMO), first seen on 802.11n devices, takes advantage of a radio phenomenon known as multipath propagation, which increases the range and speed of Wi-Fi. Multipath propagation exploits a radio signal's ability to take slightly different paths between the router and client, including bouncing off intervening objects as well as floors and ceilings. With multiple antennas on both the router and the client, and provided both support MIMO, antenna diversity can combine simultaneous data streams to increase throughput.

When MIMO was originally implemented, it was SU-MIMO, designed for a single user. In SU-MIMO, all of the router's bandwidth is devoted to a single client, maximizing throughput to that one device. While this is certainly useful, today's routers communicate with multiple clients at one time, limiting SU-MIMO's utility.

The next step in MIMO's evolution is MU-MIMO, which stands for multiple-user MIMO. Whereas SU-MIMO was restricted to a single client, MU-MIMO extends the benefit to up to four. The first MU-MIMO router released, the Linksys EA8500, features four external antennas that allow the router to provide four simultaneous, continuous data streams to clients.

Before MU-MIMO, a Wi-Fi network was the equivalent of a wired network connected through a hub. This was inefficient: a lot of bandwidth is wasted when data is sent to clients that don't need it. With MU-MIMO, the wireless network becomes the equivalent of a wired network controlled by a switch. With data transmission able to occur simultaneously across multiple channels, it is significantly faster, and the next client can "talk" sooner. Just as the transition from hub to switch was a huge leap forward for wired networks, so will MU-MIMO be for wireless technology.
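
The hub-versus-switch point can be made concrete with a toy airtime model (my own illustration; real schedulers are far more sophisticated). If every client needs the same amount of airtime, serving them in groups of four cuts the total roughly fourfold:

```python
# Illustration only: idealized airtime model, uniform per-client demand (assumed).
import math

def total_airtime(n_clients, secs_per_client, group_size):
    """Clients in a group share one transmit opportunity."""
    return math.ceil(n_clients / group_size) * secs_per_client

clients, t = 12, 1.0   # 12 clients, 1 second of traffic each
print(f"SU-MIMO, one client at a time:  {total_airtime(clients, t, 1):.0f} s")
print(f"MU-MIMO, groups of up to four:  {total_airtime(clients, t, 4):.0f} s")
```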

Beamforming

Beamforming was originally implemented in 802.11n, but was not standardized between routers and clients; it essentially did not work between different manufacturers’ products. This was rectified with 802.11ac, and now beamforming works across different manufacturers’ gear.

Rather than having the router transmit its Wi-Fi signal in all directions, beamforming lets the router focus the signal where it is needed to increase its strength. Using light as an analogy, beamforming takes a camping lantern and turns it into a flashlight that focuses its beam. In some cases, the Wi-Fi client can also support beamforming, focusing its own signal back toward the router.

While beamforming is implemented in 802.11ac, manufacturers are still allowed to innovate in their own way. For example, Netgear offers Beamforming+ in some of its devices, which enhances throughput and range between the router and client when they are both Netgear products and support Beamforming+.

Other Wi-Fi Features

When folks visit your house, they often want to jump on your wireless network, whether to save on cellular data costs or to connect a notebook or tablet. Rather than hand out your Wi-Fi password, try configuring a guest network. This grants access to network bandwidth while keeping guests off other networked resources. In that sense, the guest network is a security feature, and feature-rich routers offer this option.

Another feature to look for is QoS, which stands for Quality of Service. This capability prioritizes network traffic from the router to a client. It's particularly useful where a continuous data stream is required, for example with services like Netflix or multiplayer games. Routers advertised as gaming-optimized typically include provisions for QoS, though you can find the functionality on non-gaming routers as well.
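
Conceptually, QoS boils down to a priority queue in the router's packet scheduler. Here is a minimal sketch; the traffic classes and priority values are assumptions for illustration, not any vendor's actual scheme:

```python
# Illustration only: assumed traffic classes and priorities.
import heapq
from itertools import count

PRIORITY = {"gaming": 0, "streaming": 1, "web": 2, "bulk": 3}
queue, seq = [], count()

def enqueue(traffic_class, payload):
    # Lower number = higher priority; seq keeps FIFO order within a class
    heapq.heappush(queue, (PRIORITY[traffic_class], next(seq), payload))

for cls, pkt in [("bulk", "backup chunk"), ("gaming", "player input"),
                 ("streaming", "video frame"), ("web", "page request")]:
    enqueue(cls, pkt)

while queue:
    _, _, payload = heapq.heappop(queue)
    print("sending:", payload)
```

Run it and the player input and video frame go out ahead of the backup chunk, which is exactly the behavior a gaming router advertises.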

Another option is Parental Control, which allows you to act as an administrator for the network, controlling your child’s Internet access. The limits can include blocking certain websites, as well as shutting down network access at bedtime.

Wireless Router Security

There are two types of firewalls: hardware and software. Microsoft’s Windows operating system has a software firewall built into it. Third-party firewalls can be installed as well. Unfortunately, these only protect the device they’re installed on. While they’re an essential part of a Windows-based PC, the rest of your network is otherwise exposed.

An essential function of the router is its hardware firewall, known as a network perimeter firewall. The router serves to block incoming traffic that was not requested, thereby operating as an initial line of defense. In an enterprise setup, the hardware firewall is a dedicated box; in a residential router, it’s integrated.

A router also inspects the source address of packets traveling over the network, matching them against the requests local devices have made. When a packet wasn't requested, the firewall rejects it. In addition, a router can apply filtering policies, using rules to allow or restrict packets before they traverse the home network. The rules consider a packet's source IP address and its destination, and packets are matched to the port they should arrive on. All of this is done at the router to keep unwanted data off the home network.
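
The "was it requested?" check is essentially a lookup against a table of connections opened from the inside. A simplified sketch (my own illustration; a real stateful firewall tracks ports on both ends, protocols, and timeouts):

```python
# Illustration only: drastically simplified stateful firewall.
outbound_state = set()

def on_outbound(src, dst, dport):
    """Remember that a local host initiated a conversation with dst:dport."""
    outbound_state.add((src, dst, dport))

def on_inbound(src, dst, sport):
    """Accept inbound traffic only if it answers a remembered request."""
    return "ACCEPT" if (dst, src, sport) in outbound_state else "DROP"

on_outbound("192.168.1.10", "93.184.216.34", 443)        # PC opens an HTTPS session
print(on_inbound("93.184.216.34", "192.168.1.10", 443))  # reply -> ACCEPT
print(on_inbound("203.0.113.7", "192.168.1.10", 22))     # unsolicited -> DROP
```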

The wireless router is responsible for the Wi-Fi signal's security, too. There are various protocols for this, including WEP, WPA, and WPA2. WEP, which stands for Wired Equivalent Privacy, is the oldest standard, dating back to 1999. It uses 64-bit, and subsequently 128-bit, encryption. Because of its fixed key, WEP is widely considered insecure; back in 2005, the FBI showed how WEP could be broken in minutes using publicly available software.

WEP was supplanted by WPA (Wi-Fi Protected Access) featuring 256-bit encryption. Addressing WEP's significant shortcoming, the fixed key, WPA's improvement was based on the Temporal Key Integrity Protocol (TKIP). This security protocol uses a per-packet key system that offers a significant upgrade over WEP. WPA for home routers is implemented as WPA-PSK, which uses a pre-shared key (PSK, better known as the Wi-Fi password that folks tend to lose and forget). While the security of WPA-PSK via TKIP was definitely better than WEP, it also proved vulnerable to attack and is not considered secure.

Introduced in 2006, WPA2 (Wi-Fi Protected Access 2) is the more robust security specification. Like its predecessor, WPA2 uses a pre-shared key. However, unlike WPA’s TKIP, WPA2 utilizes AES (Advanced Encryption Standard), a standard approved by the NSA for use with top secret information.

Any modern router will support all of these security standards for the purpose of compatibility, as none of them are new, but ideally, you want to configure your router to employ WPA2/AES. There is no WPA3 on the horizon because WPA2 is still considered secure. However, there are published methods for compromising it, so accept that no network is impenetrable.

All of these Wi-Fi security standards rely on your choice of a strong password. It used to be that an eight-character sequence was considered sufficient, but given the compute power available today (particularly from GPUs), longer passwords are now recommended. Use a combination of numbers, uppercase and lowercase letters, and special characters. The password should also avoid dictionary words and easy substitutions, such as "p@$$word," or simple additions like "password123" or "passwordabc."
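
The math behind that advice is straightforward. A naive brute-force attacker must try every combination in the character pool, so estimated entropy is length times log2(pool size). The sketch below uses that formula; the sample passwords are hypothetical, and the caveat in the comments matters, since dictionary attacks make patterned passwords like "p@$$word" far weaker than their bit count suggests:

```python
# Illustration only: naive brute-force entropy; ignores dictionary attacks.
import math
import string

POOLS = (string.ascii_lowercase, string.ascii_uppercase,
         string.digits, string.punctuation)

def entropy_bits(password):
    """length * log2(size of the character pools the password draws from)."""
    pool = sum(len(p) for p in POOLS if any(c in p for c in password))
    return len(password) * math.log2(pool)

for pw in ("password", "p@$$word", "Tr4vel!ng-Heron-91"):
    print(f"{pw!r}: ~{entropy_bits(pw):.0f} bits against naive brute force")
```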

While most enthusiasts know to change the router's Wi-Fi password from its factory default, not everyone knows to change the router's admin password, which invites anyone to come along and manipulate the router's settings. Use different passwords for the Wi-Fi network and the router's log-in page.

In the event that you lose your password, don’t fret. Simply reset the router to its factory state, reverting the log-in information to its default. Manufacturers have different methods for doing this, but many routers have a physical reset button, usually located on the rear of the device. After resetting, all custom settings are lost, and you’ll need to set a new password.

Wi-Fi Protected Setup (WPS) is another popular feature on higher-end routers. Rather than manually typing in a password, WPS lets you press a button on the router and adapter, triggering a brief discovery period. Another approach is the WPS PIN method, which facilitates discovery through the entry of a short code on either the router or client. It’s vulnerable to brute-force attack, though, so many enthusiasts recommend simply disabling WPS altogether.

Software

Web And Mobile Interfaces

Wireless routers are typically controlled through a software interface built into their firmware, accessed through the router's network address. Through this interface you can enable the router's features, define parameters, and configure security settings. Routers employ a variety of custom operating environments, though most are Web-based, and some manufacturers also offer smartphone apps for iOS and Android. Here is an example of a software interface for the Netis WF2780, seen on a Windows desktop. While not easy for amateurs to use, it does allow control over all the settings; here we can see the Bandwidth Control Configuration in the Advanced Settings.

Routers offer a wide range of features, and each vendor has its own set of unique capabilities. Overall, though, they do share generally similar feature sets, including:

  • Quick Setup: For the less experienced user, Quick Setup is quite useful. This gets the device up and running with pre-configured settings, and does not require advanced networking knowledge. Of course, experienced users will want more control.
  • Wireless Configuration: This setting allows channel configuration. In some cases, the router’s power can be adjusted, depending on the application. Finally, the RF bandwidth can be selected as well. Analogous settings for 5GHz are available on a separate page.
  • Guest Network: The router software will provide the option to set up a separate Guest Network. This has the advantage of allowing visitors to use your Internet, without getting access to the entire network.
  • Security: This is where the SSIDs for each of the configured networks, as well as their passwords, can be configured.
  • Bandwidth Control: Since there is limited bandwidth, it can be controlled to provide the best experience for all (or at least the one who pays the bills). The amount of bandwidth that any user has, both on the download and upload sides, can be limited so one user does not monopolize all the bandwidth.
  • System Tools: Using this collection of tools, the router’s firmware can be upgraded and the time settings specified. This also provides a log of sites visited and stats on bandwidth used.

Here is a screenshot of a mobile app called QRSMobile for Android, which can simplify the setup of a wireless router, in this case the D-Link 820L.

This screenshot shows the smartphone app for the Google OnHub.

Open-Source Firmware

Historically, some vendor-provided software interfaces did not allow full control of all possible settings. Out of that frustration, an open source router firmware community took shape. One popular example of its work is DD-WRT, which can be installed on a significant number of routers, letting you tinker with options in a granular fashion. In fact, some manufacturers even sell routers with DD-WRT installed; the AirStation Extreme AC 1750 is one such model.

Another advantage of open firmware is that you’re not at the mercy of a vendor in between updates. Older products don’t receive much attention, but DD-WRT is a constant work in progress. Other open source firmware projects in this space include OpenWRT and Tomato, but be mindful that not all routers support open firmware.

Hardware

System Board Components

Inside a wireless router is a purpose-built system, complete with a processor, memory, power circuitry and a printed circuit board. These are all proprietary components, with closed specifications, and are not upgradeable.

The above image shows the internals of Netis’ N300 Gaming Router (WF2631). We see the following components:

  1. Status LEDs that indicate network/router activity
  2. Heat sink for the processor—these CPUs don’t use much power, and are cooled without a fan
  3. Antenna leads for the three external antennas to connect to the PCB
  4. Four Ethernet LAN ports for the home network
  5. WPS Button
  6. Ethernet WAN port that connects to a provider’s modem
  7. Power jack
  8. Factory reset button
  9. 10/100BASE-TX transformer modules, which support the RJ45 connectors (the Ethernet ports)
  10. 100BASE-T dual-port through-hole magnetics, designed for IEEE 802.3u (Ethernet ports)
  11. Memory chip (DRAM)

Antenna Types

As routers send and receive data across the 2.4GHz and 5GHz bands, they need antennas. There are multiple antenna choices: external versus internal designs, and routers with one antenna or several. If a single antenna is good, then more must be better, right? That is the current trend, with flagship routers like the Nighthawk X6 Tri-Band Wi-Fi Router featuring as many as six antennas, each of which can be repositioned to optimize performance. A setup like that facilitates three simultaneous network signals: one 2.4GHz and two 5GHz.

While a router with an internal antenna might look sleeker, those designs are built to blend into a living area; the range and throughput of external antennas are typically superior. External antennas also reach a higher position, operate at a greater distance from the router's electronics (reducing interference), and offer some degree of configurability to tune signal transmission. It's a strong argument for function over form.

The more antennas you see on a router, the more transmit and receive radios there are, corresponding to the number of supported spatial streams. For example, a 3x3 router employs three antennas and handles three simultaneous spatial streams. Under current standards, these additional spatial streams account for much of how performance is multiplied. The Netis N300 router, pictured on the left, features three external antennas for better signal strength.

Ethernet Ports

While the wireless aspect of a wireless router gets most of the attention, a majority also enable wired connectivity. A popular configuration is one WAN port for connecting to an externally-facing modem and four LAN ports for attaching local devices.

The LAN ports top out at either 100 Mb/s or 1 Gb/s, also referred to as gigabit Ethernet or GbE. While older hardware can still be found with 10/100 ports, the faster 10/100/1000 ports are preferred to avoid bottlenecking wired transfer speeds over category 5e or 6 cables. If you have the choice between a physical or wireless connection, go the wired route. It’s more secure and frees up wireless bandwidth for other devices.

While four Ethernet ports is standard on consumer-oriented routers, certain manufacturers are changing things up. For example, the TP-Link/Google OnHub router has only one Ethernet LAN port. This could be the start of a trend toward slimmer profiles at the expense of expansion: the OnHub, pictured on the right, features a profile designed to be displayed rather than hidden in a closet, but that comes at the expense of external antennas and extra Ethernet ports. Asus' RT-AC88U goes the other direction, incorporating eight Ethernet ports.

USB Ports

Some routers come with one or two USB ports. It is still common to find USB 2.0 ports capable of speeds up to 480 Mb/s (60 MB/s). Higher-end models implement USB 3.0, which costs more but is capable of 5 Gb/s (about 625 MB/s). The D-Link DIR-820L features a rear-mounted USB port; also visible are the four LAN ports, as well as the Internet connection input (WAN).

One intended use of USB ports is to connect storage. All of them support flash drives; however, some routers output enough current for external enclosures with mechanical disks. If you don’t need a ton of capacity, you can use a feature like that to create an integrated NAS appliance. In some models, the storage is only accessible over a home network. In other cases, you can reach it remotely.

The other application of USB on a router is shared printing. Networked printers make it easy to consolidate to just one peripheral. Many new printers do come with Wi-Fi controllers built-in. But for those that don’t, it’s easy to run a USB cable from the device to your router and share it across the network. Just keep in mind that you might lose certain features if you hook your printer up to a router. For instance, you might not see warnings about low ink levels or paper jams.

Conclusion

The Future Of Wi-Fi

Wireless routers continue to evolve as Wi-Fi standards get ratified and implemented. One rapidly expanding area is the Connected Home space, with devices like thermostats, fire alarms, front door locks, lights and security cameras all piping in to the Internet. Some of these devices connect directly to the router, while others connect to a hub device—for example, the SmartThings Hub, which then connects to the router.

One upcoming standard is known as 802.11ad, also referred to as WiGig, and actual products based on the technology are just starting to appear. It operates on the 60GHz spectrum, which promises high bandwidth across short distances. Think of it as akin to Bluetooth's roughly 10-meter range, but with performance on steroids. Look for wireless docking stations, with 802.11ad serving as a protocol for linking our smartphones and desktops.

Already used in the enterprise segment, 802.11k and 802.11r are now being brought to the consumer market. The home networking industry plans to address Wi-Fi dead spots with multiple access points, along with the trouble client devices have handing off between multiple APs. 802.11k allows client devices to track where AP signals weaken, and 802.11r brings Fast Basic Service Set Transition (FT) to speed up authentication with APs. Combined, 802.11k and 802.11r enable a technology known as Seamless Roaming, facilitating client handoffs between routers and access points.

Beyond that will be 802.11ah, which is being developed for use on the 900MHz band. That is a low-bandwidth frequency, but it is expected to double the range of 2.4GHz transmissions with the added benefit of low power. Its envisioned application is connecting Internet of Things (IoT) devices.

Out on the distant horizon is 802.11ax, tentatively expected to roll out in 2019 (although remember that 802.11n and 802.11ac were years late). While the standard is still being worked on, its goal is 10 Gb/s throughput. 802.11ax will focus on increasing speeds to individual devices by slicing channels into smaller segments via OFDMA (orthogonal frequency-division multiple access) combined with MIMO, packing additional data into the 5GHz band.

What To Look For In A Router

Choosing a router can get complicated; you have tons of choices across a range of price points. Evaluate your needs and consider variables like the speed of your Internet connection, the devices you intend to connect, and the features you anticipate using. My personal recommendation would be to look for a minimum wireless rating of AC1200, USB connectivity, and management through a smartphone app.

Netis’ WF2780 Wireless AC1200 offers an inexpensive way to get plenty of wireless performance at an extremely low price. While it lacks USB, you do get four external antennas (two for 2.4GHz and two for 5GHz), four gigabit Ethernet ports and the flexibility to use this device as a router, access point or repeater. Certain features are notably missing, but at under $60, this is an entry-level upgrade that most can afford.

Moving up to the mid-range, we find the TP-Link Archer C9. It features AC1900 wireless capable of 600 Mb/s on the 2.4GHz band and 1300 Mb/s on the 5GHz band. It has three antennas and a pair of USB ports, one of which is USB 3.0. There’s a 1GHz dual-core processor at the router’s heart and a TP-Link Tether smartphone app to ease setup and management. You’ll find the device for $130.

At the top end of the market is AC3200 wireless. There are several routers in this tier, including D-Link's AC3200 Ultra Wi-Fi Router (DIR-890L/R). It features Tri-Band technology, which supports one 2.4GHz network at 600 Mb/s and two 5GHz networks at 1300 Mb/s each. To accomplish this, it has a dual-core processor and no fewer than six antennas. There's also an app for network management, dual USB ports, and GbE wired connectivity. The Smart Connect feature dynamically balances wireless clients among the available bands to optimize performance and prevent older devices from slowing down the rest of the network. Plus, this router has the aesthetics of a stealth destroyer and the red metallic paint job of a sports car! Such specs do not come cheap; expect to pay $300.

Conclusion

Wireless routers are assuming an ever more important role as the centerpiece of the residential network. With the increasing need for multiple simultaneous, continuous data streams, robust throughput is no longer a nice-to-have feature but a necessity, and it becomes even more imperative as streaming 4K video moves from a high-end niche into the mainstream. By weighing factors such as data load and the number of simultaneous users, enthusiasts shopping for wireless routers can choose the router that best fits their needs and budget.

Source: http://www.tomshardware.com/reviews/wireless-routers-101,4456.html