
An Empirical Study of the Transmission Power Setting for Bluetooth-Based Indoor Localization Mechanisms

7 Jun

Nowadays, there is great interest in developing accurate wireless indoor localization mechanisms enabling the implementation of many consumer-oriented services. Among the many proposals, wireless indoor localization mechanisms based on the Received Signal Strength Indication (RSSI) are being widely explored. Most studies have focused on the evaluation of the capabilities of different mobile device brands and wireless network technologies. Furthermore, different parameters and algorithms have been proposed as a means of improving the accuracy of wireless-based localization mechanisms. In this paper, we focus on the tuning of the RSSI fingerprint to be used in the implementation of a Bluetooth Low Energy 4.0 (BLE4.0) localization mechanism. Following a holistic approach, we start by assessing the capabilities of two Bluetooth sensor/receiver devices. We then evaluate the relevance of the RSSI fingerprint reported by each BLE4.0 beacon operating at various transmission power levels using feature selection techniques. Based on our findings, we use two classification algorithms in order to improve the setting of the transmission power level of each of the BLE4.0 beacons. Our main findings show that our proposal can greatly improve the localization accuracy by setting a custom transmission power level for each BLE4.0 beacon.

1. Introduction

Nowadays, there is great interest in developing indoor localization algorithms making use of the latest developments in low-power wireless technologies. Among these, Bluetooth technologies are attracting the attention of many researchers. Their wide availability (practically all smartphones incorporate a Bluetooth interface) is behind the increasing interest in developing indoor localization-based services.
Most recent Bluetooth indoor localization systems are based on the Received Signal Strength Indication (RSSI) metric [1,2]. Recent studies have shown that Bluetooth Low Energy 4.0 (BLE4.0) signals are very susceptible [3] to fast fading impairments. This fact makes it difficult to apply the RSSI-distance models commonly used in the development of Wi-Fi-based localization mechanisms [4,5]. Recent studies have proposed alternative methods, such as the use of Voronoi diagrams [6] or of a probability distribution to match the best solution to the localization problem [7]. In the context of BLE4.0 beacons, in [8], a proposal based on the use of an Isomap and a Weighted k-Nearest Neighbor (WKNN) is presented. As in previous related works [9,10], we explore the use of two supervised learning algorithms: the k-Nearest Neighbour (k-NN) and the Support Vector Machine (SVM) algorithms [11]. We go a step further by exploring the benefits of individually setting the transmission power as a means to improve the quality of the RSSI fingerprint to be used by the learning algorithms.
Figure 1 shows the overall proposed methodology. First, we analyse the capabilities of two mobile devices: a smartphone and a Raspberry Pi with a BLE4.0 antenna. Once the best device (in terms of accuracy performance) has been selected, we study the relevance of every BLE4.0 beacon in our experimental environment. From this analysis, we conclude that an ad hoc setting of the transmission power level of the BLE4.0 beacons plays a major role in the quality of the signal fingerprint. In order to get a better insight into our findings, we pay particular attention to describing the floor plan of the lab premises. In fact, recent results show that the use of the floor plan as a basis to identify the multipath components may be exploited to enhance the accuracy of wireless indoor localization schemes [12]. Although the use of such schemes is still in its infancy and limited to wideband communications, they have revealed some insight into the impact of structural features on the RSSI metric. In [12], Leit et al. have conducted several trials making use of ultra-wide band communications transceivers. Our main aim regarding this latter issue is to provide some insight into the impact of architectural features on the transmission power setting of the BLE4.0 beacons. To the best of our knowledge, this is the first study proposing an asymmetric transmission power setting of the BLE4.0 beacons. We then make use of two supervised learning algorithms to characterize the BLE4.0 beacon signal propagation. These algorithms will then be used for developing indoor localization mechanisms. The results obtained in a real-world scenario validate the proposal.

Figure 1. Overall schema proposal.
The rest of the paper is organized as follows. Section 2 reviews the related work and describes the main contribution of our work. Section 3 describes the experimental set-up, including the challenges faced when developing a BLE4.0 fingerprint-based localization mechanism. We also include a brief description of the two classification algorithms used in our proposal. In Section 4, we examine the adequacy of the experimental set-up for developing the localization scheme. Two main parameters are studied: (i) the contribution of each BLE4.0 beacon deployed in the environment; and (ii) the transmission power level of each BLE4.0 beacon. From this preliminary analysis, we conclude, in Section 5, that the accuracy of the localization mechanism can be improved by setting the transmission power of each BLE4.0 beacon at an appropriate level.

2. Related Work

Nowadays, the design of robust wireless indoor localization mechanisms is a very active research area. Among the many technologies available in the market, BLE4.0 beacons have spurred the interest of many practitioners and researchers. The main benefits of the technology lie in the low installation and maintenance cost of the battery-operated BLE4.0 beacons. BLE-based indoor localization mechanisms make use of the RSSI reported by the mobile devices, following one of two main approaches: triangulation [13] and fingerprinting [14,15,16]. Lately, other approaches, such as context [17] and crowdsensing [18], are also being actively explored. Despite the efforts being carried out by the research community, the robust development of wireless indoor localization mechanisms remains a major challenge. In this work, we are interested in improving the information obtained from the fingerprint of each individual BLE4.0 beacon. Since our goal is to develop the localization scheme based on a classification algorithm, we explore the benefits of setting the transmission power of each individual BLE4.0 beacon to improve the quality of the radio map (fingerprint). As in previous related works [10,15], we explore the use of two supervised learning algorithms: the k-Nearest Neighbour (k-NN) and the Support Vector Machine (SVM) algorithms [11]. In the sequel, we briefly review the most relevant works recently reported in the literature and point out the main aim of our work.
In [14], Kriz et al. have developed a localization system comprising a set of Wi-Fi Access Points (AP) supplemented by BLE4.0 devices. The localization mechanism was based on the Weighted-Nearest Neighbours in Signal Space algorithm. Two of the main goals of this study were to enhance the accuracy of wireless indoor localization by introducing the use of the BLE4.0 devices, and to deploy an information system continuously updated with the RSSI levels reported by the mobile devices. Two main system parameters related to the BLE4.0 devices were varied to verify the performance of the indoor localization mechanism, namely, the scanning duration and the BLE4.0 beacon density. However, the transmission power was set to its maximum value throughout the experimental trials.
In [15], the authors conduct an experimental study using 19 BLE4.0 beacons. Their study includes an analysis of the impact of the transmission power used by the BLE4.0 beacons on the accuracy of a BLE-based indoor localization scheme. Their results show that their initial power setting, set at the highest available level, was unnecessarily high for their deployment, and that an attenuation of up to 25 dB would have had a low impact on the positioning accuracy. Different from our main aim, they were interested in identifying the attenuation bounds ensuring 100% availability of positioning, while avoiding a configuration consisting of proximity “islands”. Throughout their experimental field trials, all BLE4.0 beacons were configured with the same transmission power setting. Their results also provide some insights into the tradeoffs between the number of BLE4.0 beacons required and the transmission power settings.
In [16], Paek et al. evaluate the accuracy in proximity and distance estimation of three different Bluetooth devices. Towards this end, they explore the setting of various transmission power levels. Besides finding that the three device brands vary substantially in their transmission power configuration, they conclude that the best power setting depends on the actual aim of the localization mechanism: higher transmission power is better suited to cover larger areas, while low transmission power should be used to detect the proximity of the target to a given area (BLE4.0 beacon). They also conclude that the accuracy and efficiency of location estimation heavily depend on the accuracy of the RSSI measurements, the model used to estimate the distance, and other environmental characteristics. In fact, one of their main claims is the need for a novel approach to overcome some of the main challenges posed by RSSI dynamics. In this work, we first examine the RSSI dynamics using two different devices: a commercial Android smartphone and a Raspberry Pi equipped with a BLE4.0 antenna. From a preliminary analysis, and having identified the benefits of using the BLE4.0 antenna, we introduce a novel approach based on the use of an asymmetric transmission power setting of the BLE4.0 beacons. Our main aim is to improve the quality of the information fed to the classification algorithms. To the authors' knowledge, the use of an asymmetric transmission power setting has not been explored for improving the accuracy of a BLE-based indoor localization algorithm.

3. BLE4.0 Indoor Localization: Set-Up, Tools and Algorithms

In this section, we introduce the specifications and technical details of our experimental setting. First, we describe the physical layout of the testbed that we have used to carry out all indoor localization experiments. Next, the capabilities of two different mobile devices are experimentally assessed. Finally, the two classification algorithms used in our experiments are described.

3.1. Experimental Indoor Set-Up

Our experiments were conducted in a lab of our research institute. We placed four BLE4.0 beacons at the four corners of a 9.3 m by 6.3 m rectangular area. A fifth BLE4.0 beacon was placed in the middle of one of the longest edges of the room. Figure 2 depicts the experimental area, where the five BLE4.0 beacons have been labelled ’Be07’, ’Be08’, ’Be09’, ’Be10’ and ’Be11’. We divided the experimental area into 15 sectors of 1 m² each, separated by guard sectors of 0.5 m². A 1.15 m-wide strip was left around the experimental area. This arrangement allows us to better differentiate the RSSI levels of adjacent sectors when reporting our results. Measurements were taken by placing the mobile device at the centre of each one of the 15 sectors, as described below. The shortest distance between a BLE4.0 beacon and the receiver was thus limited to 1.5 m. Figure 3 shows four views taken from each one of the four corners of the lab. As seen from the figure, we placed BLE4.0 beacons ’Be10’ and ’Be11’ in front of a window (Figure 3a,b), while all of the other BLE4.0 beacons were placed in front of the opposite plasterboard wall. We further notice that BLE4.0 beacon ’Be08’ was placed by the left edge of the entrance door, close to a corridor with a glass wall (Figure 3d). Our choice is based on recent results reported in the literature claiming that knowledge of the geometry of the experimental environment may be exploited to develop more accurate indoor localization mechanisms [12].

Figure 2. BLE4.0 beacon indoor localization set-up.
Figure 3. Pictures from each one of the four corners of the lab. (a) from Be07; (b) from Be08; (c) from Be10; (d) from Be11.
According to the specifications of the five BLE4.0 beacons used in our experiments, they may operate at one of eight different transmission power (Tx) levels. Following the specifications, the transmission power levels are labelled in consecutive order from the highest to the lowest level as Tx=0x01, Tx=0x02, …, Tx=0x08 [19]. During our experiments, we conducted various measurement campaigns by fixing the transmission power level of all of the BLE4.0 beacons at the beginning of each campaign. Furthermore, all measurements were taken under line-of-sight conditions.

3.2. Bluetooth Receiver’s Characteristics

The choice of the receiver device is a sensitive issue in indoor localization [20]. We start by assessing the capabilities of the two mobile devices: a smartphone running the Android 5.1 operating system, and a Raspberry Pi 2 equipped with a USB BLE4.0 antenna [21], from now on referred to as the smartphone and the BLE4.0 antenna, respectively. Furthermore, we will refer to each one of the fifteen 1 m² sectors by a number from 1 to 15, where the sectors are numbered from left to right starting from the upper left corner.
We carried out a survey campaign as follows:

  • We fixed the transmission power of all BLE4.0 beacons to the same level.
  • We placed the mobile device at the centre of each one of the fifteen 1 m² sectors and measured the RSSI of each one of the five BLE4.0 beacons for a time period of one minute.
  • We evaluated the mean and standard deviation of the RSSI for each one of the five BLE4.0 beacons.
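The last step of the survey above can be sketched as follows. This is an illustrative snippet, not the paper's actual code: the sector dictionary and the RSSI readings are made-up values standing in for one minute of samples collected at a sector centre.

```python
# Per-sector survey statistics: mean and standard deviation of the RSSI
# reported for each BLE4.0 beacon (values below are illustrative only).
from statistics import mean, stdev

def summarize_rssi(samples_by_beacon):
    """Return {beacon: (mean, standard deviation)} of the RSSI samples."""
    return {beacon: (mean(rssi), stdev(rssi))
            for beacon, rssi in samples_by_beacon.items()}

# Hypothetical one-minute captures (dBm) at the centre of one sector.
sector_samples = {
    "Be07": [-78, -80, -77, -79, -81],
    "Be10": [-70, -69, -72, -71, -70],
}
stats = summarize_rssi(sector_samples)
```

In the actual campaign, one such summary is produced per sector, per beacon and per transmission power level.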
The survey was carried out over a time period of five days, evenly distributed between the morning and evening hours. The lab occupancy was limited to six people: two of them were in charge of collecting the data, two other scientists were working in the room located at one end of the lab, and the other two scientists were in a different area connected to our scenario by means of a corridor. Sporadically, these people passed through the lab during the measurement campaign. This survey campaign was repeated three times in a time span of one month in order to provide different real life conditions and variability to the data gathering process.
It is worth mentioning that the sampling rate of the smartphone is limited to 15 samples/second, while we set the sampling rate of the BLE4.0 antenna to 86.6 samples/second. In fact, we were unable to match the sampling rates of both devices. Figure 4a,b show the average and standard deviation of the RSSI values for BLE4.0 beacons ’Be07’ and ’Be09’, respectively, using Tx=0x04. Since the purpose of this first experiment was to evaluate the capabilities of both mobile devices, the use of mid-power seemed to be the best choice. The figures show that the BLE4.0 antenna offers better results than the smartphone: higher RSSI levels and a lower standard deviation.

Figure 4. RSSI (dBm) for BLE4.0 Antenna and smartphone with transmission power Tx=0x04 for each sector (1-15) of our environment. (a) for Be07; (b) for Be09.

3.3. Bluetooth Signal Attenuation

In the previous section, we found that the mean and standard deviation of the RSSI do not provide us with the means to determine the distance of a target device from a reference beacon. In this section, we further analyse the challenges faced when developing a localization scheme using the BLE4.0 RSSI levels as the main source of information. This analysis will allow us to motivate the use of supervised learning algorithms as a basis to develop wireless indoor localization mechanisms.
We focus now on the analysis of the traces of the RSSI data collected for BLE4.0 beacon ’Be07’ and ’Be10’. Our choice has been based on the fact that BLE4.0 beacon ’Be07’ and ’Be10’ have been placed at the two opposite corners of the lab. As seen in Figure 3c, BLE4.0 beacon ’Be07’ was placed close to the entrance of two office spaces, while ’Be10’ was placed by the window (see Figure 3a).
In the following, we analyse two different snapshots of the three data captures, denoted, from now on, as Take 1 and Take 2. The traces correspond to the data collected at Sectors 4, 8 and 15. Note that, since we had only one BLE4.0 antenna, all traces were taken at different times of the day and on different dates. For simplicity, we will refer by Take 1 to the traces corresponding to the first data capture campaign, and by Take 2 to the traces resulting from the second data gathering campaign.

Case 1: Sector 8

We start our analysis by examining the RSSI traces taken at Sector 8, the sector located at the centre of the experimental area. Figure 5a,b show the two RSSI traces for each one of the two BLE4.0 beacons. We notice that, for a given BLE4.0 beacon, both traces show similar RSSI mean values (dashed lines). Since both BLE4.0 beacons were located at the same distance from the centre of the experimental area, we might expect similar average RSSI values for both BLE4.0 beacons. However, as seen from the figure, the RSSI average reported for BLE4.0 beacon ’Be10’ is higher than the one reported for BLE4.0 beacon ’Be07’. This discrepancy may be explained by the fact that BLE4.0 signals are highly sensitive to fast fading impairments: an issue that we will address in the following sections. This result is highly relevant since it clearly shows that we were quite successful in replicating our experiments: a prerequisite for setting up a baseline scenario aiming to explore the impact of a given parameter on the performance of our proposal. It is also an important source of information to be exploited by the classification process.

Figure 5. Sector 8: Comparison of the RSSI from different BLE4.0 beacons for Tx=0x04. (a) for Be07; (b) for Be10.

Case 2: Sector 4

Figure 6a,b show the traces for both BLE4.0 beacons at Sector 4. In this case, BLE4.0 beacon ’Be07’ is closer to this sector than BLE4.0 beacon ’Be10’. However, as seen in the figures, the RSSI traces for BLE4.0 beacon ’Be07’ exhibit lower values than those reported for BLE4.0 beacon ’Be10’. It is also important to mention that, despite the captures for both beacons having been taken at different times, the average RSSI levels (dashed lines) of BLE4.0 beacon ’Be07’ for both traces were lower than those of BLE4.0 beacon ’Be10’. Nevertheless, a more in-depth analysis of the impact of external actors on the signal should be conducted. For instance, the impact of the room occupancy, and more importantly how to integrate this parameter into the information fed to the classification algorithms, should be studied.

Figure 6. Sector 4: Comparison of the RSSI from different BLE4.0 beacons for Tx=0x04. (a) for Be07; (b) for Be10.

Case 3: Sector 15

In this case, we analyse the traces collected at Sector 15, the closest sector to BLE4.0 beacon ’Be10’. As can be seen in Figure 7a,b, it is surprising that the average signal level (dashed lines) of BLE4.0 beacon ’Be07’ is higher than that of BLE4.0 beacon ’Be10’. This confirms once again that the signal is highly sensitive to the fast fading impairment. We also notice that the Take 1 traces for both BLE4.0 beacons are smoother than the traces obtained during the second campaign, Take 2. The high variance of Take 2 of BLE4.0 beacon ’Be07’ can be explained by the fact that the way from the main door of the lab into the offices passes between the location of BLE4.0 beacon ’Be07’ and Sector 15. This shows the importance of having an estimate of the room occupancy as a key parameter to develop accurate wireless indoor localization mechanisms. It also shows the benefits of having a baseline scenario to guide the classification task and identify the relevance of other key parameters. In our case, we are interested here in exploring the proper setting of the transmission power of the BLE4.0 beacons.

Figure 7. Sector 15: Comparison of the RSSI from different BLE4.0 beacons for Tx=0x04. (a) for Be07; (b) for Be10.
The above analysis of the statistics of the collected data reveals that Bluetooth signals are very susceptible to fast fading impairments. It also shows, up to a certain extent, the impact of the occupancy on the signal level: a well-known fact, but one still difficult to characterize and, more importantly, to mitigate. Studies are currently being carried out by several groups on developing efficient methods to generate RSSI fingerprint databases. In this work, we focus on fusing the fingerprints of the beacons by varying the power settings as a means to mitigate the fast fading impairment. We then evaluate the performance of two supervised learning algorithms as a basis to develop an indoor localization mechanism.

3.4. Supervised Learning Algorithms

As already stated, the statistics of the Bluetooth signal, mean and standard deviation, show the need of exploring alternative data processing mechanisms towards the development of an RSSI-based localization solution. We base our proposal on the use of the two following classification algorithms [22]:

  • k-NN: Given a test instance, this algorithm selects the k nearest neighbours in the training set, based on a pre-defined distance metric. In our case, we use the Euclidean distance, since our predictor variables (features) share the same type, i.e., the RSSI values, properly fitting the indoor localization problem [22]. Although k-NN normally classifies a test instance by the most common category among the k neighbours (that is, the mode of the category), some variations (e.g., weighted distances) are used to avoid discarding relevant information. In this paper, we set the hyperparameter to k = 5 as the best solution, based on our preliminary numerical analysis. We use both versions of the algorithm: weighted distance (WD) and mode (MD).
  • SVM: Given the training data, a hyperplane is defined to optimally discriminate between the different categories. If a linear classifier is used, SVM constructs a hyperplane that performs an optimal linear discrimination. For non-linear classification, kernel functions are used, which maximize the margin between categories. In this paper, we have explored the use of a linear classifier and of polynomial kernels of two different degrees, namely, 2 and 3. We present only the best results, which were obtained with the quadratic polynomial kernel [22].
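The two k-NN variants above can be sketched as follows. This is an illustrative snippet, not the paper's implementation: the fingerprints are 5-dimensional RSSI vectors (one value per beacon), and the training pairs below are invented for the example.

```python
# k-NN over RSSI fingerprints: majority vote (MD) and weighted distance (WD).
import math
from collections import Counter

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, point, k=5, weighted=False):
    """train: list of (rssi_vector, sector_label) pairs."""
    neighbours = sorted(train, key=lambda s: euclidean(s[0], point))[:k]
    if not weighted:                      # MD: mode of the k labels
        return Counter(lab for _, lab in neighbours).most_common(1)[0][0]
    votes = Counter()                     # WD: each vote weighted by 1/distance
    for vec, lab in neighbours:
        votes[lab] += 1.0 / (euclidean(vec, point) + 1e-9)
    return votes.most_common(1)[0][0]

# Made-up fingerprints for two sectors (5 RSSI values, dBm, per sample).
train = [
    ((-70, -80, -75, -72, -78), 4), ((-69, -81, -74, -73, -77), 4),
    ((-71, -79, -76, -71, -79), 4),
    ((-60, -85, -79, -68, -74), 8), ((-61, -84, -78, -69, -73), 8),
    ((-59, -86, -80, -67, -75), 8),
]
sector = knn_predict(train, (-70, -80, -76, -72, -78))
```

With k = 5 and only two sectors in this toy training set, the vote is decided by whichever sector contributes the three closest fingerprints.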
In order to properly justify which of the two mobile devices best fits our needs, we evaluate the accuracy of our proposal using the two classification algorithms. Both devices, the BLE4.0 antenna and the smartphone, were tested using k-NN and SVM. k-NN proved to be the most efficient algorithm for this type of problem because it works well in a low-dimensional space (in this case, five features), avoiding the curse of dimensionality (training time grows rapidly with the dimensionality of the input). Although SVM achieves a precision similar to k-NN, its runtime is higher, since a sufficiently high-dimensional input space is needed to obtain a well-separated hyperplane [11,23]. We used the data collected during the previously described experimental campaign. For each trial, the training set consisted of two-thirds and the validation set of one-third of the vectors, randomly selected for each experiment. The results show the mean error of the algorithm executed 50 times.
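The evaluation protocol just described (a random two-thirds/one-third split, repeated 50 times, reporting the mean over the runs) can be sketched as follows. The classifier here is a stand-in 1-NN and the dataset is synthetic; the paper's actual data and its k-NN/SVM models are not reproduced.

```python
# Repeated random holdout: 2/3 training, 1/3 validation, mean over 50 runs.
import random

def nearest_label(train, point):
    # Stand-in 1-NN classifier (squared Euclidean distance).
    return min(train,
               key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], point)))[1]

def mean_accuracy(dataset, runs=50, seed=0):
    rng = random.Random(seed)
    accuracies = []
    for _ in range(runs):
        data = dataset[:]
        rng.shuffle(data)
        cut = (2 * len(data)) // 3          # two-thirds for training
        train, valid = data[:cut], data[cut:]
        hits = sum(nearest_label(train, vec) == lab for vec, lab in valid)
        accuracies.append(hits / len(valid))
    return sum(accuracies) / runs

# Two well-separated synthetic "sectors": a 1-NN classifies them perfectly.
dataset = ([((i, i), "a") for i in range(10)] +
           [((100 + i, 100 + i), "b") for i in range(10)])
acc = mean_accuracy(dataset)
```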
Table 1 shows that the use of the device equipped with the BLE4.0 antenna provides much better results: the accuracy reported for the BLE4.0 antenna device is almost three times better than the one reported for the smartphone. Based on these results, we decided to use the BLE4.0 antenna device as our experimental tool.

Table 1. Global accuracy for k-NN using mode (with k = 5) and SVM (with a quadratic polynomial kernel function) algorithms for transmission power Tx=0x04. Best results are shown in bold.

4. On the Adequacy of the Bluetooth-Based Localization Platform

This section is devoted to analysing the adequacy of our experimental platform. To do so, we first performed a preliminary analysis to assess the relevance of each of the five BLE4.0 beacons with respect to a classification model. This analysis was done using the RSSI reported at different transmission power levels. Furthermore, this study sets the basis for exploring an asymmetric setting of the transmission power. In other words, it is worth exploring whether the precision of the localization mechanism may be improved using different transmission power levels. Obviously, the resulting configuration should be derived from the signal fingerprint of each BLE4.0 beacon.

4.1. Relevance of BLE4.0 Beacons

We propose the use of feature selection techniques in order to assess the relevance of each BLE4.0 beacon in the classification model [24]. Although these techniques are mainly used to improve a model, they are also used to identify the importance of the features with respect to the response variable [25]. Here, we use two well-known techniques: the ExtraTrees [26] and Gradient Boosting Classifier [27] algorithms. Our choice is based on the fact that both algorithms are robust and accurate. In addition, unlike the Principal Component Analysis and SelectKBest algorithms [28], they do not require any prior parameter tuning. In the following, a brief description of these two algorithms is presented:

  • ExtraTrees stands for Extremely Randomized Trees, an ensemble method that builds multiple models (random trees) over samples of the training data and then averages all of their predictions. Default sklearn Python library hyperparameters were used.
  • The Gradient Boosting Classifier is also an ensemble method, using decision trees as base models together with a weighted voting selection method. Models are built sequentially, with each new model correcting the errors of the previous one. Default sklearn Python library hyperparameters were used.
Both algorithms compute a score associated to each feature, which represents the relevance, in percentage, of this feature to the classification process [29].
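The scoring step can be sketched with sklearn (the library the paper itself mentions). This is a hedged sketch: the RSSI matrix and sector labels below are synthetic, and only the overall procedure (one feature per BLE4.0 beacon, sector number as the class, per-feature importance read from the fitted model) mirrors the paper.

```python
# Per-beacon relevance scores from ExtraTrees and Gradient Boosting.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier, GradientBoostingClassifier

rng = np.random.default_rng(0)
X = rng.normal(-75.0, 5.0, size=(90, 5))   # 90 samples x 5 beacons (RSSI, dBm)
y = rng.integers(1, 16, size=90)           # sector labels 1..15 (synthetic)

beacons = ["Be07", "Be08", "Be09", "Be10", "Be11"]
relevance = {}
for model in (ExtraTreesClassifier(random_state=0),
              GradientBoostingClassifier(random_state=0)):
    scores = model.fit(X, y).feature_importances_   # normalized, sums to 1.0
    relevance[type(model).__name__] = dict(zip(beacons, 100.0 * scores))
```

Each entry of `relevance` then gives the percentage contribution of every beacon to the classification, which is what Figures 8 and 9 plot.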
Table 2 shows the number of samples per BLE4.0 beacon captured at different transmission power levels using the BLE4.0 antenna device. Although the BLE4.0 beacons may operate at eight different transmission power levels, we have not made use of the two lowest levels, namely, Tx=0x07 and Tx=0x08, since they were not detected over the whole experimental area.

Table 2. Sample sizes of the RSSI captured using the BLE4.0 antenna at various transmission power (Tx) levels.
The ideal situation would be for all BLE4.0 beacons to have the same relevance to the classification model, or, similarly, to find a uniform distribution of the relevance scores. Figure 8 and Figure 9 show the scores for the five BLE4.0 beacons over the six different transmission power levels under study. An analysis of the results clearly shows that the transmission power plays a significant role. For instance, Figure 8a shows that, when Tx=0x01, the BLE4.0 beacon ’Be11’ is more relevant to the classification model than all of the other BLE4.0 beacons. However, when Tx=0x02, the BLE4.0 beacon ’Be10’ becomes more relevant. Moreover, Figure 8d and Figure 9d, with Tx=0x04, exhibit a more uniform distribution, with all BLE4.0 beacons having a similar relevance in the classification model.

Figure 8. Relevance score of each BLE4.0 beacon for ExtraTrees algorithm for different transmission power (Tx) levels. (a) Tx=0x01; (b) Tx=0x02; (c) Tx=0x03; (d) Tx=0x04; (e) Tx=0x05; (f) Tx=0x06.
Figure 9. Relevance score of each BLE4.0 beacon for Gradient Boosting Classifier algorithm for different transmission power (Tx) levels. (a) Tx=0x01; (b) Tx=0x02; (c) Tx=0x03; (d) Tx=0x04; (e) Tx=0x05; (f) Tx=0x06.
Overall, these results show that all BLE4.0 beacons exhibit broadly similar relevance scores: no beacon deviates by more than 5% from the others, and none exceeds 30% of the total relevance. These figures allow us to confirm that the experimental set-up is balanced and therefore suitable for exploring the performance of our proposed indoor localization mechanism.

4.2. Baseline Evaluation

In this section, we evaluate the accuracy of the two classification algorithms for each one of the six different transmission power levels, i.e., all BLE4.0 beacons operate at the same transmission power level. Table 3 shows that the best accuracy for the k-NN and the SVM algorithms are 65% for Tx=0x03 and 61.7% for Tx=0x06, respectively.

Table 3. Global accuracy using BLE4.0 antenna for k-NN (with k = 5) using mode and SVM (with a quadratic polynomial kernel function) algorithms for different transmission power (Tx) levels. Best results are shown in bold.
Figure 10 shows the RSSI values for BLE4.0 beacons ’Be07’, ’Be09’ and ’Be10’ when operating at Tx=0x03 and Tx=0x05, i.e., the transmission power levels reporting the best and worst results for the k-NN algorithm.

Figure 10. RSSI values for the best (top) and worst (bottom) transmission power (Tx) level for BLE4.0 beacons ’Be07’, ’Be09’ and ’Be10’ throughout the area captured by the BLE4.0 antenna. (a) Be07 with Tx=0x03; (b) Be09 with Tx=0x03; (c) Be10 with Tx=0x03; (d) Be07 with Tx=0x05; (e) Be09 with Tx=0x05; (f) Be10 with Tx=0x05.
From the figures, it is clear that better results are obtained when the RSSI values reported for the various sectors are clearly differentiated. In particular, Figure 10a–c allow us to properly identify the actual location of the BLE4.0 beacons: the highest RSSI value of the footprint is located close to the BLE4.0 beacon. However, Figure 10d–f do not exhibit this feature: some of the highest RSSI values are reported far away from the actual BLE4.0 beacon physical location. More specifically, in all of these latter cases, the highest RSSI values are reported at two different points. For instance, in the case of BLE4.0 beacon ’Be10’ operating at Tx=0x05 (see Figure 10f), the highest RSSI values are reported at two opposite corners of the experimental area. This signal impairment, known as deep multipath fading, is one of the main obstacles towards the development of robust and accurate BLE-based location mechanisms [7]. In the presence of multipath fading, the information derived from the RSSI values of each individual BLE4.0 beacon will definitely mislead the classification process.
Among the various proposals reported in the literature, transmission power control is theoretically one of the most effective approaches for mitigating the multipath effect [30]. However, this process is not as straightforward as it seems. For instance, the results for BLE4.0 beacon ’Be10’ show that the use of Tx=0x02 may provide some of the best results (see Figure 8b and Figure 9b). However, setting the transmission power of all of the BLE4.0 beacons to Tx=0x02 results in the second-lowest ranked transmission power configuration (see Table 3). This clearly shows that the settings of the other BLE4.0 beacons play a major role in the overall outcome.
From the previous analysis, it is worth exploring whether an asymmetric transmission power setting has a positive impact on the classification. As seen from Figure 10, the different settings of the transmission power of the BLE4.0 beacons may provide lower or higher relevance to the classification process. In the next section, we undertake an in-depth study of this issue.

5. Asymmetric Transmission Power

In this section, we start by motivating the study of the impact of an asymmetric transmission power setting of the BLE4.0 beacons on the accuracy of the classification model. We then find the best setting by examining all of the transmission power setting/BLE4.0-beacon combinations. Our results are reported in terms of the local and global accuracy. The former provides the accuracy of the classification model for each of the 15 sectors, while the latter refers to the accuracy over the whole experimental area.
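As a minimal illustration of these two metrics, the following sketch (a hypothetical helper of our own, not the authors' code) computes the local accuracy per sector and the global accuracy from paired lists of true and predicted sector labels:

```python
from collections import defaultdict

def local_and_global_accuracy(true_sectors, predicted_sectors):
    # Count, per sector, how many samples were classified correctly.
    hits = defaultdict(int)
    totals = defaultdict(int)
    for truth, predicted in zip(true_sectors, predicted_sectors):
        totals[truth] += 1
        if truth == predicted:
            hits[truth] += 1
    # Local accuracy: per-sector hit rate; global accuracy: overall hit rate.
    local = {sector: hits[sector] / totals[sector] for sector in totals}
    global_acc = sum(hits.values()) / len(true_sectors)
    return local, global_acc
```

A sector that is rarely confused with its neighbours thus scores a high local accuracy even when the global figure is dragged down by poorly characterized sectors.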

5.1. Fingerprint as a Function of the Transmission Power

In the previous section, we found that the accuracy of the classification process heavily depends on the transmission power of the BLE4.0 beacons. More specifically, we noticed that, in the presence of multipath fading, the classification process is heavily penalized. It is therefore worth exploring an asymmetric transmission power setting of the BLE4.0 beacons. Such a setting should allow us to exploit the characteristics of the fingerprint as a means to improve the accuracy of the identification process.
In order to further motivate our work, we start by visually examining the RSSI values associated with the fingerprints of three of the five BLE4.0 beacons used in our testbed, namely, BLE4.0 beacons ’Be11’, ’Be07’ and ’Be08’ (see Figure 11). Figure 11a,d show the RSSI values for BLE4.0 beacon ’Be11’ when operating at two different transmission power levels. The values shown in Figure 11d exhibit better characteristics: the highest RSSI value is closely located and delimited around the area where BLE4.0 beacon ’Be11’ is placed, i.e., the upper right corner of the figure. On the contrary, the values shown in Figure 11a do not allow us to easily identify the location of BLE4.0 beacon ’Be11’. The results for the other two BLE4.0 beacons exhibit similar characteristics. We further notice that the most useful fingerprints for BLE4.0 beacons ’Be07’ and ’Be11’ share the same transmission power level, Tx=0x04. However, in the case of BLE4.0 beacon ’Be08’, the transmission power setting that provides better results is Tx=0x01. Therefore, it is worth exploring the per-beacon transmission power setting as a way to improve the accuracy of the identification algorithms.

Figure 11. RSSI values for different transmission power levels (Tx) for BLE4.0 beacons ’Be11’, ’Be07’ and ’Be08’. (a) ’Be11’ with Tx=0x03; (b) ’Be07’ with Tx=0x01; (c) ’Be08’ with Tx=0x05; (d) ’Be11’ with Tx=0x04; (e) ’Be07’ with Tx=0x04; (f) ’Be08’ with Tx=0x01.

5.2. On Deriving the Best Asymmetric Transmission Power Setting

In this section, we conduct an ad hoc process to find the best transmission power setting by evaluating all the transmission power setting/BLE4.0-beacon combinations. Each combination is evaluated in terms of its local and global accuracy. In our case, the platform consists of five BLE4.0 beacons, each operating at one of six possible transmission power levels, which gives a total of 6^5 = 7776 combinations to be processed.
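The exhaustive search can be sketched as follows. Here, `evaluate_setting` is a stand-in (our own name, not from the paper) for the actual train/test cycle run on the fingerprints recorded under a given setting:

```python
import itertools

BEACONS = ["Be07", "Be08", "Be09", "Be10", "Be11"]
POWER_LEVELS = [1, 2, 3, 4, 5, 6]  # Tx=0x01 .. Tx=0x06

# One entry per assignment of a power level to each beacon.
all_settings = list(itertools.product(POWER_LEVELS, repeat=len(BEACONS)))
assert len(all_settings) == 6 ** 5  # 7776 combinations to evaluate

def rank_settings(evaluate_setting):
    # Sort settings by the global accuracy the classifier achieves
    # on the fingerprints collected under each setting.
    return sorted(all_settings, key=evaluate_setting, reverse=True)
```

The best asymmetric setting reported below, [4,1,2,1,1], is simply the top-ranked element of such an ordering.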

Case 1: Asymmetric Transmission Power for k-NN

Figure 12 shows the overall cumulative positioning error for the three best and the three worst combined transmission power levels for k-NN, using both versions of the classification algorithm, namely, weighted distance (a) and mode (b). The most relevant transmission power level combination is the one with the configuration: BLE4.0 beacon ’Be07’ with Tx=0x04, BLE4.0 beacon ’Be08’ with Tx=0x01, BLE4.0 beacon ’Be09’ with Tx=0x02, BLE4.0 beacon ’Be10’ with Tx=0x01 and BLE4.0 beacon ’Be11’ with Tx=0x01, which, in the following, will be represented by [4,1,2,1,1] for short. This vector contains the transmission power levels assigned to BLE4.0 beacons ’Be07’, ’Be08’, ’Be09’, ’Be10’ and ’Be11’, respectively. The figure shows that this setting limits the positioning error to less than 3 m in 95% of the cases, for both versions of the k-NN classification algorithm. For the worst configurations, a cumulative probability of 95% is only reached at errors of 4 m (WD) and 5.5 m (MD), respectively.
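The two k-NN variants can be sketched as below (a simplified, self-contained implementation under our own naming, not the authors' code): mode (MD) takes a plain majority vote among the k nearest fingerprints, while weighted distance (WD) weights each neighbour's vote by the inverse of its distance:

```python
import math
from collections import Counter

def knn_predict(fingerprints, sectors, query, k=5, weighted=False):
    # Find the k training fingerprints closest to the query RSSI vector.
    nearest = sorted(
        (math.dist(fp, query), sector)
        for fp, sector in zip(fingerprints, sectors)
    )[:k]
    votes = Counter()
    for dist, sector in nearest:
        # MD: one vote per neighbour; WD: vote weighted by 1/distance.
        votes[sector] += 1.0 / (dist + 1e-9) if weighted else 1.0
    return votes.most_common(1)[0][0]
```

With a query surrounded by fingerprints of one sector, both variants agree; they differ when the k nearest neighbours are split across sectors at unequal distances.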

Figure 12. Positioning error for k-NN (with k = 5) using (a) weighted distance; (b) mode. In both plots, the three best and the three worst combined transmission power for each BLE4.0 beacon are shown.
Figure 13 shows the RSSI values for the most relevant transmission power level setting. The results show that the location of each BLE4.0 beacon is properly identified by the RSSI fingerprint. That is, the sectors surrounding each beacon are quite relevant to the classification algorithms.

Figure 13. RSSI values using the most relevant transmission power (Tx) level setting for each BLE4.0 beacon: [4,1,2,1,1]. (a) ’Be07’ with Tx=0x04; (b) ’Be08’ with Tx=0x01; (c) ’Be09’ with Tx=0x02; (d) ’Be10’ with Tx=0x01; (e) ’Be11’ with Tx=0x01.
Table 4 shows the local accuracy for each sector (15 in total) using the most relevant transmission power levels. The results show that the best accuracy is reported for the sectors close to the BLE4.0 beacons, while the accuracy deteriorates with distance.

Table 4. Local accuracy in each sector of our experimental area with the most relevant transmission power level for k-NN using mode (with k = 5). The centre shows the accuracy (in %) of each sector. Corners and middle-left hand are the position of BLE4.0 beacons with BeXY name. The most relevant transmission power level was [4,1,2,1,1].
Comparing the results in Table 4 with those in Figure 13, we notice that the midpoint sector, with an accuracy of 18.10%, does not have a distinctive RSSI fingerprint differentiated from the others, i.e., the RSSI values of all the BLE4.0 beacons are very similar in this sector.
In the case of BLE4.0 beacon ’Be09’ (Figure 13c), we have a representative RSSI totally different from the one reported for the other sectors. This guarantees a good classification at this sector, with 100% local accuracy (see Table 4). Moreover, from Figure 4b, we can observe that sector 7 (the closest to BLE4.0 beacon ’Be09’) has a characteristic RSSI totally different from the others. This result confirms the benefits of having a sector with a distinctive RSSI fingerprint: a substantial improvement, locally and globally, in the positioning accuracy.

Case 2: Asymmetric Transmission Power for SVM

Similarly to the previous section, we carried out an analysis using the SVM algorithm. In this case, we found that the most relevant transmission power levels were exactly the same as for the k-NN algorithm: [4,1,2,1,1]. The global accuracy was 75.57%, and the corresponding RSSI propagation heatmap is the one shown in Figure 13.
Figure 14 shows the positioning error for the three best and worst combined transmission power levels for SVM, which are very similar to the ones obtained with k-NN. The figure shows that, for the three best transmission power level settings, the positioning error is lower than 3 m in 95% of the cases. For the three worst configurations, a positioning error of less than 6 m is obtained with a cumulative probability of 0.95.
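For reference, an SVM with a quadratic polynomial kernel of the kind used here can be configured in scikit-learn (the library cited in the paper's tooling [28]) as sketched below; the toy fingerprints and labels are our own invention, not the paper's data:

```python
import numpy as np
from sklearn.svm import SVC

# Toy RSSI fingerprints (two beacons) for two well-separated sectors.
X = np.array([[-55, -85], [-57, -83], [-85, -55], [-83, -57]])
y = np.array([0, 0, 1, 1])

# SVM with a quadratic polynomial kernel, as in the paper's experiments.
clf = SVC(kernel="poly", degree=2)
clf.fit(X, y)
print(clf.predict([[-56, -84]])[0])  # a fingerprint near the sector-0 points
```

In the real pipeline, `X` would hold one five-value RSSI vector per observation and `y` the sector labels of the 15 sectors.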

Figure 14. Positioning error for SVM (with a quadratic polynomial kernel function). In both plots, the three best and the three worst combined transmission power for each BLE4.0 beacon are shown.
Table 5 shows the local accuracy for each sector (15 in total) using the most relevant transmission power level setting ([4,1,2,1,1]), showing a very similar behaviour to k-NN. We can observe that the areas that are weakly characterized by the RSSI propagation have a worse local accuracy, as observed for the midpoint sector with only 19.83% local accuracy.

Table 5. Local accuracy in each sector of our experimental area with the most relevant transmission power level for SVM (with a quadratic polynomial kernel function). The centre shows the accuracy (in %) of each sector. Corners and middle-left hand are the position of BLE4.0 beacons with BeXY name. The most relevant transmission power level was [4,1,2,1,1].
Our results confirm that a proper setting of the transmission power of each BLE4.0 beacon has a positive impact on the performance of both classification algorithms, SVM and k-NN. By a proper setting, we mean one making use of the RSSI map of each BLE4.0 beacon that allows us to differentiate one sector from another.
Although we do not have conclusive evidence on the nature and extent of the impact of the architecture of our lab premises on the signal, we notice that the highest power levels have been assigned to BLE4.0 beacons ’Be08’, ’Be10’ and ’Be11’, the ones closer to the window and the open corridor, while lower transmission power levels have been assigned to BLE4.0 beacons ’Be07’ and ’Be09’, the ones located at the plasterboard wall. As mentioned in the introduction, recent studies have shown that the use of a priori floor plan information may enable the development of more accurate wireless indoor localization schemes [12].

5.3. Asymmetric Transmission Power Setting

Table 6 and Table 7 show the results for different transmission power settings obtained for both classification algorithms: k-NN and SVM. For each algorithm, two different transmission power settings were used: the best configuration using a symmetric transmission power setting ([3,3,3,3,3] for k-NN and [6,6,6,6,6] for SVM); and the best configuration using an asymmetric transmission power setting ([4,1,2,1,1] for both k-NN and SVM). From the results in Table 6, it is clear that, by properly setting the transmission power of each BLE4.0 beacon, the cumulative positioning error can be substantially reduced. Furthermore, k-NN (MD) reports, in general, slightly better results than k-NN (WD) and SVM. These results are corroborated by the ones presented in Table 7. The results show that k-NN (MD) with the asymmetric transmission power setting exhibits a lower mean error, approximately 0.07 m lower than the one obtained by SVM.
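The error figures quoted at a given cumulative probability can be read directly off the sorted error sample; a minimal helper (our own, hypothetical) illustrates the computation:

```python
import math

def error_at_cumulative_probability(errors, p=0.95):
    # Smallest error value below which a fraction p of the
    # positioning errors fall, i.e., the point at which the
    # cumulative error curve reaches probability p.
    ranked = sorted(errors)
    index = max(math.ceil(p * len(ranked)) - 1, 0)
    return ranked[index]
```

Applied to the per-sample positioning errors of each setting, this yields the "error at 95% cumulative probability" values compared throughout this section.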

Table 6. Cumulative positioning error with different transmission power (Tx) level settings for k-NN (with k = 5) using weighted distance (WD) and mode (MD); and SVM (with a quadratic polynomial kernel function). Best results are shown in bold.
Table 7. Mean error for k-NN (with k = 5) using weighted distance (WD) and mode (MD); and SVM (with a quadratic polynomial kernel function) with the same and the most relevant transmission power level (Tx). Best results are shown in bold.
Finally, Table 8 shows the global accuracy using different asymmetric transmission power level settings (the five worst and the five best results), as well as all symmetric transmission power settings. We can observe that, for SVM, the worst and best asymmetric transmission power settings report accuracy rates of 35.70% and 75.57%, respectively: the latter being substantially better than the 61.70% reported using the best symmetric transmission power setting, i.e., [6-6-6-6-6]. From the results shown in the table, we notice that the k-NN algorithm reports higher scores than SVM in all transmission power settings, for both the five worst and the five best settings. We further notice that both algorithms rank the same transmission power setting, namely [4-1-2-1-1], as the best one.

Table 8. Accuracy results for the k-NN using mode (with k = 5) (right) and SVM localization (with a quadratic polynomial kernel function) (left) algorithms. Worst and best settings using different asymmetric transmission power settings, and the best symmetric transmission power level settings (shown in italic font). Best results are shown in bold.
A further analysis of the results depicted in Table 8 shows that both algorithms clearly identify the transmission power of some of the BLE4.0 beacons as the best choices. This is the case, for instance, of BLE4.0 beacon ’Be08’, whose best transmission power is Tx=0x01 in all five best settings reported by both algorithms. As for BLE4.0 beacons ’Be07’ and ’Be09’, the most recommended values are Tx=0x04 and Tx=0x02, respectively. As previously discussed for the case of BLE4.0 beacon ’Be09’ (see Figure 13c), the classification process greatly benefits when the RSSI provides the means to identify the location of the reference BLE4.0 beacon. Our results seem to confirm the benefits of using the transmission power setting whose RSSI best contributes to the classification process. However, in the case of the SVM algorithm, we notice that the transmission power value used by BLE4.0 beacon ’Be09’ in the fourth best ranked setting is the same as the one used in the worst ranked setting. We should therefore further explore the relevance of the individual transmission power level as a major source of information and, more importantly, the impact of the asymmetric power level setting as a means to overcome the multipath fading impairment.

5.4. On the Relevance of the Individual RSSI Values

With the purpose of evaluating the relevance of the information provided by the RSSI values as a major source of information to guide the classification process, we look at the ranking of the individual transmission power values used by each one of the BLE4.0 beacons. In the previous section, we noticed that, in both the worst and the fourth best transmission power settings reported by the SVM results, the transmission power of BLE4.0 beacon ’Be09’ has been set to Tx=0x04. In order to explore this issue further, we looked, for each one of the BLE4.0 beacons, at the worst ranked settings making use of the transmission power value that the beacon uses in the best setting. We carry out this study only for the k-NN algorithm using mode; similar conclusions may be derived from an analysis of the results reported by SVM. In fact, the aforementioned case of BLE4.0 beacon ’Be09’ provided the basis for our analysis.
Table 9 shows, for each BLE4.0 beacon, where the transmission power value used in the best setting ranks among the worst transmission power settings. As seen from the table, the transmission power used in the best case for every BLE4.0 beacon also appears in a reduced number of the worst settings. For instance, in the case of BLE4.0 beacon ’Be09’, the transmission power value Tx=0x02, having been visually characterized as an excellent source of information, appears among the 0.5% worst settings. These results clearly show that the RSSI derived from the transmission power used by an individual source does not by itself guarantee the best classification process. We should then further explore the use of an asymmetric transmission power setting as a means to mitigate the multipath fading impairment. This analysis should provide us with a basis to identify the approach to be used to improve the classification process.

Table 9. Ranking of the transmission power values used by each BLE4.0 beacon for k-NN using mode (with k = 5) results.

5.5. On Mitigating the Multipath Fading Impairment

In this section, we start by taking a closer look at the transmission power setting [1-1-1-1-1]. Our choice is based on the fact that both classification algorithms ranked this setting as the fourth best symmetric setting (see Table 8). Furthermore, we notice that, in the best setting, the transmission power of three out of the five beacons has been set to Tx=0x01. Our main aim is therefore to provide further insight into the improvement of the quality of the information provided to the classification algorithms.
From Figure 11b,e, we can clearly identify the presence of the multipath fading effect. From the figures, one may think that changing the transmission power of BLE4.0 beacon ’Be07’ to Tx=0x04 will lead to similar or even worse results than the ones reported for Tx=0x01. However, our results show that, by simply changing the setting of BLE4.0 beacon ’Be07’, i.e., using the setting [4-1-1-1-1], the global accuracy reported by the k-NN algorithm considerably improves, from 62.1% to 69.9%. This can be explained by a closer look at the results reported in Figure 13 for BLE4.0 beacons ’Be07’ and ’Be08’. From the figures, it is clear that, by setting the transmission power of BLE4.0 beacon ’Be07’ to Tx=0x04 and that of ’Be08’ to Tx=0x01, the highest RSSI levels of BLE4.0 beacon ’Be08’, located at the bottom part of the figure, help to mitigate the effect of the multipath fading impairment.
Let us now consider the transmission power setting [4-4-4-4-4]. As shown in Table 8, both classification algorithms have ranked this setting as the second best one among the symmetric transmission power settings. Our results show that, by simply changing the power setting to [4-4-2-4-4], the global accuracy increases from 64.7% (see Table 8) to 69.2%, i.e., an improvement of almost 5%. However, if we set the transmission power to [1-4-4-4-3], the global accuracy drops to 62.2%, i.e., a decrease close to 2.5%. In fact, we could have expected a higher drop, since the RSSI values for BLE4.0 beacon ’Be11’ (see Figure 11a) do not allow us to clearly identify the actual location of BLE4.0 beacon ’Be11’. Let us now consider the setting [1-4-4-4-4]. From our previous analysis and the RSSI values of BLE4.0 beacon ’Be07’ when using Tx=0x01 (see Figure 11b), we might not expect a higher drop than the one reported for the previously analysed [1-4-4-4-3] setting. However, our results report a global accuracy of 57.5% for this latter setting. That is to say, the accuracy drops more than 7% with respect to the symmetric setting [4-4-4-4-4].
The above analysis sets the basis towards deriving a methodology allowing us to enhance the performance of the classification algorithms. From the results reported in Table 8, we may start by setting the transmission power of all the BLE4.0 beacons to the same value; all symmetric settings rank around the median. A database of the RSSI values of all the BLE4.0 beacons at different transmission power levels may then be used to derive a setting offering better results. In fact, various recent works in the literature address the creation of such databases [31]. Since finding the best setting depends on the combination and features of the RSSI maps, one of the first approaches is to study different combinatorial optimization algorithms, e.g., genetic algorithms. In other words, one may start with a symmetric transmission power setting and, based on the RSSI levels reported using different transmission power settings, enhance the quality of the information provided to the classification algorithms.
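As a first step in that direction, a simple greedy local search over the settings could be sketched as follows (our own illustration, under our own naming; `evaluate` stands in for retraining and scoring the classifier on the fingerprints of a candidate setting):

```python
import itertools

POWER_LEVELS = range(1, 7)  # Tx=0x01 .. Tx=0x06

def hill_climb(evaluate, n_beacons=5, start_level=3):
    # Start from a symmetric setting, as suggested above.
    setting = [start_level] * n_beacons
    best = evaluate(setting)
    improved = True
    while improved:
        improved = False
        # Try every single-beacon power change; accept any that improves
        # the score, and repeat until no change helps.
        for beacon, level in itertools.product(range(n_beacons), POWER_LEVELS):
            candidate = list(setting)
            candidate[beacon] = level
            score = evaluate(candidate)
            if score > best:
                setting, best, improved = candidate, score, True
    return setting, best
```

Unlike the exhaustive 7776-combination sweep, such a search evaluates only a few dozen settings per pass, at the cost of possibly stopping at a local optimum; genetic algorithms are one way to trade off these two extremes.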
From this analysis, we can conclude that:

  • Although it is important to characterize the sectors with a distinctive RSSI, the best-ranked settings obtained by the two classification algorithms only partially coincide.
  • The RSSI value of a given BLE4.0 beacon proves to be a useful, but not the definitive, source of information when deriving the best transmission power setting.
  • An asymmetric transmission power setting may prove useful in mitigating the degradation, due to the multipath fading effect, of the information provided to the classification algorithms.

6. Conclusions

This study has revealed some useful insights into the tool characteristics required to calibrate an accurate BLE4.0-assisted indoor localization mechanism. Based on the constraints imposed by smartphones, mainly their limited sampling rate and antennas, the basic requirements of the calibration platform can be simply stated as: (i) the use of a hardware transmitter supporting different transmission power levels; (ii) the use of a BLE4.0 antenna; and (iii) an evaluation of the relevance of the RSSI of each BLE4.0 beacon to the classification models, taking into account their placement and transmission powers.
Although we have not been able to fully explain the extent and nature of the impact of the architectural features on the RSSI metric, we have paid attention to describing the lab floor. Our results provide some insight into the relevance of knowing the placement of the BLE4.0 beacons with respect to reflective surfaces, e.g., windows and plasterboard walls.
In this work, we have shown the importance of using a good BLE4.0 receiver, in this case a BLE4.0 antenna, for indoor localization, significantly improving the accuracy over the one obtained using a smartphone.
Our approach integrates the study of a balanced Bluetooth sensor topology, analysing the relevance of the BLE4.0 beacons to the classification algorithms, with the Gradient Boosting Classifier and Extra Trees proving to be robust and accurate solutions.
Our immediate research efforts will be focused on improving the experimental set-up to further evaluate the use of different transmission power levels using the classification algorithms. Our main goal is to develop a methodology allowing us to find the optimal setting of the transmission power levels and placement of the BLE4.0 beacons. We believe that these two parameters should greatly improve the local and global accuracy of our proposal.
Moreover, we also plan to extend this research with a study of different Bluetooth network topologies, trying to improve the local and global accuracy. The use of other machine learning algorithms is quite important to improve the accuracy, as is the use of different filters to identify the outliers.
Another major task in our immediate research plans is to study different combinatorial optimization algorithms (e.g., genetic algorithms) to perform the asymmetric assignment optimally and automatically.

Acknowledgments

This work has been partially funded by the “Programa Nacional de Innovación para la Competitividad y Productividad, Innóvate – Perú” of the Peruvian government, under Grant No. FINCyT 363-PNICP-PIAP-2014, and by the Spanish Ministry of Economy and Competitiveness under Grant Nos. TIN2015-66972-C5-2-R and TIN2015-65686-C5-3-R.

Author Contributions

Manuel Castillo-Cara and Jesús Lovón-Melgarejo conceived and designed the experiments; Manuel Castillo-Cara and Jesús Lovón-Melgarejo performed the experiments; Luis Orozco-Barbosa and Ismael García-Varea analyzed the data; and Gusseppe Bravo-Rocca contributed with reagents/materials/analysis tools. All authors wrote and revised the document.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:

RSSI Received Signal Strength Indication
BLE4.0 Bluetooth Low Energy 4.0
k-NN k-Nearest Neighbour
SVM Support Vector Machine
AP Access Point
Tx Transmission Power
dB Decibel
dBm Decibel-milliwatts
MD Mode
WD Weighted Distance

References

  1. Shuo, S.; Hao, S.; Yang, S. Design of an experimental indoor position system based on RSSI. In Proceedings of the 2nd International Conference on Information Science and Engineering, Hangzhou, China, 4–6 December 2010; pp. 1989–1992. [Google Scholar]
  2. Feldmann, S.; Kyamakya, K.; Zapater, A.; Lue, Z. An indoor bluetooth-based positioning system: Concept, implementation and experimental evaluation. In Proceedings of the International Conference on Wireless Networks, Las Vegas, NV, USA, 23–26 June 2003; pp. 109–113. [Google Scholar]
  3. Shukri, S.; Kamarudin, L.; Cheik, G.C.; Gunasagaran, R.; Zakaria, A.; Kamarudin, K.; Zakaria, S.S.; Harun, A.; Azemi, S. Analysis of RSSI-based DFL for human detection in indoor environment using IRIS mote. In Proceedings of the 3rd IEEE International Conference on Electronic Design (ICED), Phuket, Thailand, 11–12 August 2016; pp. 216–221. [Google Scholar]
  4. Rappaport, T. Wireless Communications: Principles and Practice, 2nd ed.; Prentice Hall PTR: Upper Saddle River, NJ, USA, 2001. [Google Scholar]
  5. Martínez-Gómez, J.; del Horno, M.M.; Castillo-Cara, M.; Luján, V.M.B.; Barbosa, L.O.; García-Varea, I. Spatial statistical analysis for the design of indoor particle-filter-based localization mechanisms. Int. J. Distrib. Sens. Netw. 2016, 12. [Google Scholar] [CrossRef]
  6. Onishi, K. Indoor position detection using BLE signals based on voronoi diagram. In Proceedings of the International Conference on Intelligent Software Methodologies, Tools, and Techniques, Langkawi, Malaysia, 22–24 September 2014; pp. 18–29. [Google Scholar]
  7. Palumbo, F.; Barsocchi, P.; Chessa, S.; Augusto, J.C. A stigmergic approach to indoor localization using bluetooth low energy beacons. In Proceedings of the 12th IEEE International Conference on Advanced Video and Signal Based Surveillance, Karlsruhe, Germany, 25–28 August 2015; pp. 1–6. [Google Scholar]
  8. Wang, Q.; Feng, Y.; Zhang, X.; Su, Y.; Lu, X. IWKNN: An effective bluetooth positioning method based on isomap and WKNN. Mob. Inf. Syst. 2016, 2016, 8765874:1–8765874:11. [Google Scholar] [CrossRef]
  9. Faragher, R.; Harle, R. An analysis of the accuracy of bluetooth low energy for indoor positioning applications. In Proceedings of the 27th International Technical Meeting of The Satellite Division of the Institute of Navigation, Tampa, FL, USA, 8–12 September 2014; Volume 812, pp. 201–210. [Google Scholar]
  10. Peng, Y.; Fan, W.; Dong, X.; Zhang, X. An Iterative Weighted KNN (IW-KNN) Based Indoor Localization Method in Bluetooth Low Energy (BLE) Environment. In Proceedings of the 2016 International IEEE Conferences on Ubiquitous Intelligence & Computing, Advanced and Trusted Computing, Scalable Computing and Communications, Cloud and Big Data Computing, Internet of People, and Smart World Congress, Toulouse, France, 18–21 July 2016; pp. 794–800. [Google Scholar]
  11. Zhang, L.; Liu, X.; Song, J.; Gurrin, C.; Zhu, Z. A comprehensive study of bluetooth fingerprinting-based algorithms for localization. In Proceedings of the 27th IEEE International Conference on Advanced Information Networking and Applications Workshops (WAINA), Barcelona, Spain, 25–28 March 2013; pp. 300–305. [Google Scholar]
  12. Leitinger, E.; Meissner, P.; Rüdisser, C.; Dumphart, G.; Witrisal, K. Evaluation of position-related information in multipath components for indoor positioning. IEEE J. Sel. Areas Commun. 2015, 33, 2313–2328. [Google Scholar] [CrossRef]
  13. Wang, Q.; Guo, Y.; Yang, L.; Tian, M. An indoor positioning system based on ibeacon. In Transactions on Edutainment XIII; Pan, Z., Cheok, A.D., Müller, W., Zhang, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2017; pp. 262–272. [Google Scholar]
  14. Kriz, P.; Maly, F.; Kozel, T. Improving indoor localization using bluetooth low energy beacons. Mob. Inf. Syst. 2016, 2016, 2083094:1–2083094:11. [Google Scholar] [CrossRef]
  15. Faragher, R.; Harle, R. Location fingerprinting with bluetooth low energy beacons. IEEE J. Sel. Areas Commun. 2015, 33, 2418–2428. [Google Scholar] [CrossRef]
  16. Paek, J.; Ko, J.; Shin, H. A measurement study of ble ibeacon and geometric adjustment scheme for indoor location-based mobile applications. Mob. Inf. Syst. 2016, 2016, 1–13. [Google Scholar] [CrossRef]
  17. Perera, C.; Aghaee, S.; Faragher, R.; Harle, R.; Blackwell, A. A contextual investigation of location in the home using bluetooth low energy beacons. arXiv, 2017; arXiv:cs.HC/1703.04150. [Google Scholar]
  18. Pei, L.; Zhang, M.; Zou, D.; Chen, R.; Chen, Y. A survey of crowd sensing opportunistic signals for indoor localization. Mob. Inf. Syst. 2016, 2016, 1–16. [Google Scholar] [CrossRef]
  19. Jaalee. Beacon IB0004-N Plus. Available online: https://www.jaalee.com/ (accessed on 6 March 2017).
  20. Anagnostopoulos, G.G.; Deriaz, M.; Konstantas, D. Online self-calibration of the propagation model for indoor positioning ranging methods. In Proceedings of the International Conference on Indoor Positioning and Indoor Navigation (IPIN), Alcala de Henares, Spain, 4–7 October 2016; pp. 1–6. [Google Scholar]
  21. Trendnet. Micro Bluetooth USB Adapter. Available online: https://www.trendnet.com/products/USB-adapters/TBW-107UB/ (accessed on 6 March 2017).
  22. Brownlee, J. Spot-check classification algorithms. In Machine Learning Mastery with Python; Machine Learning Mastery Pty Ltd.: Vermont Victoria, Australia, 2016; pp. 100–120. [Google Scholar]
  23. Breiman, L. Statistical modeling: The two cultures (with comments and a rejoinder by the author). Stat. Sci. 2001, 16, 199–231. [Google Scholar] [CrossRef]
  24. Brownlee, J. Feature selection. In Machine Learning Mastery with Python; Machine Learning Mastery Pty Ltd.: Vermont Victoria, Australia, 2016; pp. 52–56. [Google Scholar]
  25. Rivas, T.; Paz, M.; Martín, J.; Matías, J.M.; García, J.; Taboada, J. Explaining and predicting workplace accidents using data-mining techniques. Reliab. Eng. Syst. Saf. 2011, 96, 739–747. [Google Scholar] [CrossRef]
  26. Geurts, P.; Ernst, D.; Wehenkel, L. Extremely randomized trees. Mach. Learn. 2006, 63, 3–42. [Google Scholar] [CrossRef]
  27. Brownlee, J. Ensemble methods. In Machine Learning Mastery with Python; Machine Learning Mastery Pty Ltd.: Vermont Victoria, Australia, 2016; pp. 91–95. [Google Scholar]
  28. Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; et al. Scikit-learn: Machine learning in python. J. Mach. Learn. Res. 2011, 12, 2825–2830. [Google Scholar]
  29. Li, J.; Cheng, K.; Wang, S.; Morstatter, F.; Trevino, R.P.; Tang, J.; Liu, H. Feature selection: A data perspective. arXiv, 2016; arXiv:1601.07996. [Google Scholar]
  30. Rahim, A.; Dimitrova, R.; Finger, A. Techniques for Bluetooth Performance Improvement. Available online: https://pdfs.semanticscholar.org/3205/2262d3c152a3cc947acbc7b325debe9cbeef.pdf (accessed on 7 June 2017).
  31. Chen, L.; Li, B.; Zhao, K.; Rizos, C.; Zheng, Z. An improved algorithm to generate a Wi-Fi fingerprint database for indoor positioning. Sensors 2013, 13, 11085–11096. [Google Scholar] [CrossRef] [PubMed]

Source: http://www.mdpi.com/1424-8220/17/6/1318/htm


Comparative Study WIFI vs. WIMAX

5 Sep

Wireless networking has become an important area of research in academia and industry. The main objectives of this paper are to gain in-depth knowledge of Wi-Fi and WiMAX technology, how they work, and to understand the problems involved in deploying and maintaining them. The challenges in wireless networks include issues like security, seamless handover, location and emergency services, cooperation, and QoS. The performance of WiMAX is better than that of Wi-Fi, and it also provides a good response in terms of access. The paper evaluates the Quality of Service (QoS) of Wi-Fi compared with WiMAX and describes various kinds of security mechanisms: authentication, to verify the identity of the authorized communicating client stations, and confidentiality (privacy), to ensure that the wirelessly conveyed information remains private and protected. It also covers the necessary actions and configurations needed in order to deploy Wi-Fi and WiMAX with increased levels of security and privacy.

Download: ART20161474

Source: https://www.ijsr.net/archive/v5i9/ART20161474.pdf

A total of 192 telcos are deploying advanced LTE technologies

15 Aug

A total of 521 operators have commercially launched LTE, LTE-Advanced or LTE-Advanced Pro networks in 170 countries, according to a recent report focused on the state of LTE network reach released by the Global mobile Suppliers Association.

In 2015, 74 mobile operators globally launched 4G LTE networks, GSA said. Bermuda, Gibraltar, Jamaica, Liberia, Myanmar, Samoa and Sudan are amongst the latest countries to launch 4G LTE technology.

The report also reveals that 738 operators are currently investing in LTE networks across 194 countries. This figure comprises 708 firm network deployment commitments in 188 countries – of which 521 networks have launched – and 30 precommitment trials in another 6 countries.

According to the GSA, active LTE network deployments will reach 560 by the end of this year.

A total of 192 telcos, which currently offer standard LTE services, are deploying LTE-A or LTE-A Pro technologies in 84 countries, of which 147 operators have commercially launched superfast LTE-A or LTE-A Pro wireless broadband services in 69 countries.

“LTE-Advanced is mainstream. Over 100 LTE-Advanced networks today are compatible with Category 6 (151-300 Mbps downlink) smartphones and other user devices. The number of Category 9 capable networks (301-450 Mbps) is significant and expanding. Category 11 systems (up to 600 Mbps) are commercially launched, leading the way to Gigabit service being introduced by year-end,” GSA Research VP Alan Hadden said.

The GSA study also showed that the 1800 MHz band continues to be the most widely used spectrum for LTE deployments. This frequency is used in 246 commercial LTE deployments in 110 countries, representing 47% of total LTE deployments. The next most popular band for LTE systems is 2.6 GHz, which is used in 121 networks. Also, the 800 MHz band is being used by 119 LTE operators.

A total of 146 operators are currently investing in Voice over LTE deployments, trials or studies in 68 countries, according to the study. GSA forecasts there will be over 100 LTE network operators offering VoLTE service by the end of this year.

Unlicensed spectrum technologies boost global indoor small cell market

In related news, a recent study by ABI Research forecasts that the global indoor small cell market will reach revenue of $1.8 billion in 2021, mainly fueled by increasing support for unlicensed spectrum technologies, including LTE-License Assisted Access and Wi-Fi.

The research firm predicts support for LTE-based and Wi-Fi technologies using unlicensed spectrum within small cell equipment will expand to comprise 51% of total annual shipments by 2021, at a compound annual growth rate of 47%.

“Unlicensed LTE (LTE-U) had a rough start, meeting negative and skeptic reactions to its possible conflict with Wi-Fi operations in the 5 GHz bands. But the ongoing standardization and coexistence efforts increased the support in the technology ecosystem,” said Ahmed Ali, senior analyst at ABI Research.

“The dynamic and diverse nature of indoor venues calls for an all-inclusive small cell network that intelligently adapts to different user requirements,” the analyst added. “Support for multioperation features like 3G/4G and Wi-Fi/LAA access is necessary for the enterprise market.”

Source: http://www.rcrwireless.com/20160815/asia-pacific/gsa-reports-521-lte-deployments-170-countries-tag23

Critical (Outdoor) IoT Applications Need Robust Connectivity

14 Apr

It’s safe to assume that the majority of all Internet of Things (IoT) devices operate near large populations of people. Of course, right? This is where the action happens – smart devices, smart cars, smart infrastructure, smart cities, etc. Plus, the cost of getting “internet-connected” in these areas is relatively low – public access to Wi-Fi is becoming widely available, cellular coverage is blanketed over cities, etc.

But what about the devices out in the middle of nowhere? The industrial technology that integrates and communicates with heavy machinery isn’t always “IP connected,” and it operates in locations that are not only hard to reach, but often exposed to harsh weather. The fact remains: this is where IoT connectivity is potentially most challenging to enable, but also perhaps the most important to have. Why? Because these numerous assets help deliver the lifeblood of our critical infrastructures – electricity, water, energy, etc. Without these legacy and geographically dispersed machines, a smart world may never exist.

But let’s back up for a second and squash any misconceptions about the “industrial” connectivity picture we’re painting above. Take this excerpt from Varun Nagaraj in a past O’Reilly Radar article:

“… unlike most consumer IoT scenarios, which involve digital devices that already have IP support built in or that can be IP enabled easily, typical IIoT scenarios involve pre-IP legacy devices. And unfortunately, IP enablement isn’t free. Industrial device owners need a direct economic benefit to justify IP enabling their non-IP devices. Alternatively, they need a way to gain the benefits of IP without giving up their investments in their existing industrial devices – that is, without stranding these valuable industrial assets.

Rather than seeing industrial device owners as barriers to progress, we should be looking for ways to help industrial devices become as connected as appropriate – for example, for improved peer-to-peer operation and to contribute their important small data to the larger big-data picture of the IoT.”

It sounds like the opportunity ahead for the industrial IoT is to provide industrial devices and machines with an easy migration path to internet connectivity by creatively addressing its constraints (outdated protocols, legacy equipment, the need for both wired and wireless connections, etc.) and enabling new abilities for the organization.

Let’s look at an example of how this industrial IoT transformation is happening.

Voice, Video, Data & Sensors
Imagine you are a technician at a power plant in a developing part of the world with lots of desert terrain. The company you work for provides power to an entire region, which is difficult considering the plant sits in an extremely remote location facing constant sand blasts and extreme temperatures. The reliance your company places on the industrial devices used to monitor and control all facets of the power plant is paramount. If they fail, the plant fails and your customers are without power. This is where reliable, outdoor IoT connectivity is a must:

  • With a plethora of machinery and personnel onsite, you need a self-healing Wi-Fi mesh network over the entire power plant so that internet connections aren’t lost mid-operation.
  • Because the traditional phone-line system doesn’t extend to the remote location of the power plant, and cell coverage is weak, the company requires Voice over IP (VoIP) communications. Also, because there’s no physical phone-line hardware involved, personnel never need to worry about maintenance, repairs or upgrades.
  • The company wants to ensure no malfeasance takes place onsite, especially due to the mission-critical nature of the power plant. Therefore, security camera control and video transport is required back to a central monitoring center.
  • Power plants require cooling applications to ensure the integrity and safety of the power generation taking place. The company requires Supervisory Control and Data Acquisition (SCADA) networking for monitoring the quality of the inbound water being used to cool the equipment.
  • The company wants to provide visibility to its customers in how much energy they are consuming. This requires Advanced Metering Infrastructure (AMI) backhaul networking to help manage the energy consumption taking place within the smart grid.
  • Since the power plant is in a remote location, there is only a tiny village nearby, home to the workers at the power plant and their families. The company wants to provide a Wi-Fi hotspot for the residents.

From the outline above, it sounds like a lot of different IoT networking devices will need to be used to address all of these applications at the power plant. If the opportunity ahead for the industrial IoT is to provide industrial devices and machines with an easy migration path to IP connectivity, what solutions are available to make this a reality for the power plant situation above? Not just that, but a solution with proven reliability in extreme environmental conditions? We might know one.

Source: http://bigdata.sys-con.com/node/3766382

The Future of Wireless – In a nutshell: More wireless IS the future.

10 Mar

Electronics is all about communications. It all started with the telegraph in 1845, followed by the telephone in 1876, but communications really took off at the turn of the century with wireless and the vacuum tube. Today it dominates the electronics industry, and wireless is the largest part of it. And you can expect the wireless sector to continue its growth thanks to the evolving cellular infrastructure and movements like the Internet of Things (IoT). Here is a snapshot of what to expect in the years to come.

The State of 4G

4G means Long Term Evolution (LTE), the OFDM technology that is the dominant framework of the cellular system today. 2G and 3G systems are still around, but 4G was initially implemented in the 2011-2012 timeframe. LTE became a competitive race among the carriers to see who could expand 4G the fastest. Today, LTE is mostly implemented by the major carriers in the U.S., Asia, and Europe. Its rollout is not yet complete, varying considerably by carrier, but it is nearing that point. LTE has been wildly successful, with most smartphone owners relying on it for fast downloads and video streaming. Still, all is not perfect.

Fig. 1

1. The Ceragon FibeAir IP-20C operates in the 6 to 42 GHz range and is typical of the backhaul to be used in 5G small cell networks.

While LTE promised download speeds up to 100 Mb/s, that has not been achieved in practice. Rates of up to 40 or 50 Mb/s can be achieved, but only under special circumstances. With a full five-bar connection and minimal traffic, such speeds can be seen occasionally. A more normal rate is probably in the 10 to 15 Mb/s range. At peak business hours during the day, you are probably lucky to get more than a few megabits per second. That hardly makes LTE a failure, but it does mean that it has yet to live up to its potential.

One reason why LTE is not delivering the promised performance is too many subscribers. LTE has been oversold, and today everyone has a smartphone and expects fast access. But with such heavy use, download speeds decrease in order to serve the many.

There is hope for LTE, though. Most carriers have not yet implemented LTE-Advanced, an enhancement that promises greater speeds. LTE-A uses carrier aggregation (CA) to boost speed. CA combines LTE’s standard 20 MHz bandwidths into 40, 80, or 100 MHz chunks, either contiguous or not, to enable higher data rates. LTE-A also specifies MIMO configurations to 8 x 8. Most carriers have not implemented the 4 x 4 MIMO configurations specified by plain-old LTE. So as carriers enable these advanced features, there is potential for download speeds up to 1 Gb/s. Market data firm ABI Research forecasts that LTE carrier aggregation will power 61% of smartphones in 2020.
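
As a rough illustration of the carrier-aggregation arithmetic above, here is a first-order sketch (an assumption, not a standards calculation): peak rate scales roughly linearly with aggregated bandwidth and with the number of MIMO spatial streams, taking plain LTE's promised 100 Mb/s in one 20 MHz carrier with 2x2 MIMO as the baseline.

```python
# First-order model of LTE-A peak downlink scaling (illustrative only).
# Baseline assumption: ~100 Mb/s for a single 20 MHz carrier with 2x2 MIMO,
# the figure plain LTE promised; real rates depend on modulation, coding,
# and overhead, so treat this as a back-of-envelope estimate.

def lte_peak_mbps(total_bw_mhz, mimo_streams, base_mbps=100.0,
                  base_bw_mhz=20.0, base_streams=2):
    """Rough peak rate under linear bandwidth/stream scaling."""
    return base_mbps * (total_bw_mhz / base_bw_mhz) * (mimo_streams / base_streams)

# Plain LTE: one 20 MHz carrier, 2x2 MIMO
print(lte_peak_mbps(20, 2))    # 100.0
# LTE-A: 100 MHz aggregated (five carriers), 4x4 MIMO -> around 1 Gb/s
print(lte_peak_mbps(100, 4))   # 1000.0
```

This is why the article's "up to 1 Gb/s" figure follows directly from combining 100 MHz aggregation with higher-order MIMO.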

This LTE-CA effort is generally known as LTE-Advanced Pro or 4.5G LTE. It is a mix of technologies defined by the 3GPP standards development group as Release 13. It includes carrier aggregation as well as Licensed Assisted Access (LAA), a technique that uses LTE within the 5 GHz unlicensed Wi-Fi spectrum. It also deploys LTE-Wi-Fi Link Aggregation (LWA) and dual connectivity, allowing a smartphone to talk simultaneously with a small cell site and a Wi-Fi access point. Other features are too numerous to detail here, but the overall goal is to extend the life of LTE by lowering latency and boosting data rates to 1 Gb/s.

But that’s not all. LTE will be able to deliver greater performance as carriers begin to facilitate their small-cell strategy, delivering higher data rates to more subscribers. Small cells are simply miniature cellular basestations that can be installed anywhere to fill in the gaps of macro cell site coverage, adding capacity where needed.

Another method of boosting performance is to use Wi-Fi offload. This technique transfers a fast download to a nearby Wi-Fi access point (AP) when available. Only a few carriers have made this available, but most are considering an LTE improvement called LTE-U (U for unlicensed). This is a technique similar to LAA that uses the 5 GHz unlicensed band for fast downloads when the network cannot handle it. This presents a spectrum conflict with the latest version of Wi-Fi 802.11ac that uses the 5 GHz band. Compromises have been worked out to make this happen.

So yes, there is plenty of life left in 4G. Carriers will eventually put into service all or some of these improvements over the next few years. For example, we have yet to see voice-over-LTE (VoLTE) deployed extensively. Just remember that the smartphone manufacturers will also make hardware and/or software upgrades to make these advanced LTE improvements work. These improvements will probably finally occur just about the time we begin to see 5G systems come on line.

5G Revealed

5G is so not here yet. What you are seeing and hearing at this time is premature hype. The carriers and suppliers are already doing battle to see who can be first with 5G. Remember the 4G war of the past years? And the real 4G (LTE-A) is not even here yet. Nevertheless, work on 5G is well underway. It is still a dream in the eyes of the carriers that are endlessly seeking new applications, more subscribers, and higher profits.

Fig. 2a

2a. This is a model of the typical IoT device electronics. Many different input sensors are available. The usual partition is the MCU and radio (TX) in one chip and the sensor and its circuitry in another. One chip solutions are possible.

The Third Generation Partnership Project (3GPP) is working on the 5G standard, which is still a few years away. The International Telecommunications Union (ITU), which will bless and administer the standard—called IMT-2020—says that the final standard should be available by 2020. Yet we will probably see some early pre-standard versions of 5G as the competitors try to out-market one another. Some claim 5G will come on line by 2017 or 2018 in some form. We shall see, as 5G will not be easy. It is clearly going to be one of the most, if not the most, complex wireless system ever.  Full deployment is not expected until after 2022. Asia is expected to lead the U.S. and Europe in implementation.

The rationale for 5G is to overcome the limitations of 4G and to add capability for new applications. The limitations of 4G are essentially subscriber capacity and limited data rates. The cellular networks have already transitioned from voice-centric to data-centric, but further performance improvements are needed for the future.

Fig. 2b

2b. This block diagram shows another possible IoT device configuration with an output actuator and RX.

Furthermore, new applications are expected. These include carrying ultra HD 4K video, virtual reality content, Internet of Things (IoT) and machine-to-machine (M2M) use cases, and connected cars. Many are still forecasting 20 to 50 billion devices online, many of which will use the cellular network. While most IoT and M2M devices operate at low speed, higher network rates are needed to handle the volume. Other potential applications include smart cities and automotive safety communications.

5G will probably be more revolutionary than evolutionary. It will involve creating a new network architecture that will overlay the 4G network. This new network will use distributed small cells with fiber or millimeter wave backhaul (Fig. 1), be cost- and power consumption-conscious, and be easily scalable. In addition, the 5G network will be more software than hardware. 5G will use software-defined networking (SDN), network function virtualization (NFV), and self-organizing network (SON) techniques. Here are some other key features to expect:

  • Use of millimeter-wave (mm-wave) bands. Early 5G may also use the 3.5- and 5-GHz bands. Frequencies from about 14 GHz to 79 GHz are being considered. No final assignments have been made, but the FCC says it will expedite allocations as soon as possible. Testing is being done at 24, 28, 37, and 73 GHz.
  • New modulation schemes are being considered. Most are some variant of OFDM. Two or more may be defined in the standard for different applications.
  • Multiple-input multiple-output (MIMO) will be incorporated in some form to extend range, data rate, and link reliability.
  • Antennas will be phased arrays at the chip level, with adaptive beam forming and steering.
  • Lower latency is a major goal. Less than 5 ms is probably a given, but less than 1 ms is the target.
  • Data rates of 1 Gb/s to 10 Gb/s are anticipated in bandwidths of 500 MHz or 1 GHz.
  • Chips will be made of GaAs, SiGe, and some CMOS.
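
A quick back-of-envelope check on the rate and bandwidth targets in the list above: dividing the target data rate by the channel bandwidth gives the aggregate spectral efficiency the air interface must deliver (this simple ratio ignores how the load is split across MIMO streams).

```python
# Spectral efficiency implied by the 5G targets listed above:
# efficiency (b/s/Hz) = data rate / channel bandwidth.

def spectral_efficiency(rate_bps, bw_hz):
    return rate_bps / bw_hz

# 1 Gb/s in 500 MHz, and 10 Gb/s in 1 GHz
print(spectral_efficiency(1e9, 500e6))   # 2.0 b/s/Hz
print(spectral_efficiency(10e9, 1e9))    # 10.0 b/s/Hz
```

Ten b/s/Hz aggregate is demanding, which is part of why the feature list leans on wide mm-wave channels, dense modulation, and MIMO together.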

One of the biggest challenges will be integrating 5G into the handsets. Our current smartphones are already jam-packed with radios, and 5G radios will be more complex than ever. Some predict that the carriers will be ready way before the phones are sorted out. Can we even call them phones anymore?

So we will eventually get to 5G, but in the meantime, we’ll have to make do with LTE. And really, do you honestly feel that you need 5G?

What’s Next for Wi-Fi?

Next to cellular, Wi-Fi is our go-to wireless link. Like Ethernet, it is one of our beloved communications “utilities”. We expect to be able to access Wi-Fi anywhere, and for the most part we can. Like most of the popular wireless technologies, it is constantly in a state of development. The latest iteration being rolled out is called 802.11ac, and provides rates up to 1.3 Gb/s in the 5 GHz unlicensed band. Most access points, home routers, and smartphones do not have it yet, but it is working its way into all of them. Also underway is the process of finding applications other than video and docking stations for the ultrafast 60 GHz (57-64 GHz) 802.11ad standard. It is a proven and cost effective technology, but who needs 3 to 7 Gb/s rates up to 10 meters?

At any given time there are multiple 802.11 development projects ongoing. Here are a few of the most significant.

  • 802.11af – This is a version of Wi-Fi for the TV-band white spaces (54 to 695 MHz). Data is transmitted in locally unoccupied 6- (or 8-) MHz bandwidth channels. Cognitive radio methods are required. Data rates up to about 26 Mb/s are possible. Sometimes referred to as White-Fi, the main attraction of 11af is that the possible range at these lower frequencies is many miles, and non-line-of-sight (NLOS) operation through obstacles is possible. This version of Wi-Fi is not in use yet, but has potential for IoT applications.
  • 802.11ah – Designated as HaLow, this standard is another variant of Wi-Fi that uses the unlicensed ISM 902-928 MHz band. It is a low-power, low speed (hundreds of kb/s) service with a range up to a kilometer. The target is IoT applications.
  • 802.11ax – 11ax is an upgrade to 11ac. It can be used in the 2.4- and 5-GHz bands, but most likely will operate in the 5-GHz band exclusively so that it can use 80 or 160 MHz bandwidths. Along with 4 x 4 MIMO and OFDA/OFDMA, peak data rates to 10 Gb/s are expected. Final ratification is not until 2019, although pre-ax versions will probably be complete.
  • 802.11ay – This is an extension of the 11ad standard. It will use the 60-GHz band, and the goal is at least a data rate of 20 Gb/s. Another goal is to extend the range to 100 meters so that it will have greater application such as backhaul for other services. This standard is not expected until 2017.

Wireless Proliferation by IoT and M2M

Wireless is certainly the future for IoT and M2M. Though wired solutions are not being ruled out, look for both to be 99% wireless. While predictions of 20 to 50 billion connected devices still seem unreasonable, by defining IoT in the broadest terms there could already be more connected devices than people on this planet today. By the way, who is really keeping count?

Fig. 3

3. This Monarch module from Sequans Communications implements LTE-M in both 1.4-MHz and 200-kHz bandwidths for IoT and M2M applications.

The typical IoT device is a short range, low power, low data rate, battery operated device with a sensor, as shown in Fig. 2a. Alternately, it could be some remote actuator, as shown in Fig. 2b. Or the device could be a combination of the two. Both usually connect to the Internet through a wireless gateway but could also connect via a smartphone. The link to the gateway is wireless. The question is, what wireless standard will be used?
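
The two device shapes in Figs. 2a and 2b can be sketched in a few lines of code. All names here are hypothetical, and the "radio link" is just an in-memory queue standing in for whatever wireless standard (BLE, ZigBee, LTE-M, etc.) the device actually uses.

```python
# Toy model of the IoT device architectures in Figs. 2a/2b (hypothetical
# names; the deques stand in for the wireless link to the gateway).

from collections import deque

uplink = deque()    # sensor node -> gateway
downlink = deque()  # gateway -> actuator node

class SensorNode:                      # Fig. 2a: sensor + MCU + radio TX
    def __init__(self, read_sensor):
        self.read_sensor = read_sensor
    def sample_and_send(self):
        uplink.append(self.read_sensor())

class ActuatorNode:                    # Fig. 2b: radio RX + MCU + actuator
    def __init__(self, actuate):
        self.actuate = actuate
    def poll(self):
        while downlink:
            self.actuate(downlink.popleft())

node = SensorNode(read_sensor=lambda: 21.5)  # e.g. a temperature reading
node.sample_and_send()
print(uplink[0])  # 21.5
```

A combined device, as the paragraph notes, would simply hold both a sensor callback and an actuator callback behind the same radio link.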

Wi-Fi is an obvious choice because it is so ubiquitous, but it is overkill for some apps and a bit too power-hungry for some. Bluetooth is another good option, especially the Bluetooth Low Energy (BLE) version. Bluetooth’s new mesh and gateway additions make it even more attractive. ZigBee is another ready-and-waiting alternative. So is Z-Wave. Then there are multiple 802.15.4 variants, like 6LoWPAN.

Add to these the newest options that are part of the Low Power Wide Area Network (LPWAN) movement. These new wireless choices offer longer-range networked connections that are usually not possible with the traditional technologies mentioned above. Most operate in unlicensed spectrum below 1 GHz. Some of the newest competitors for IoT apps are:

  • LoRa – An invention of Semtech and supported by Link Labs, this technology uses FM chirp at low data rates to get a range up to 2-15 km.
  • Sigfox – A French development that uses an ultra narrowband modulation scheme at low data rates to send short messages.
  • Weightless – This one uses the TV white spaces with cognitive radio methods for longer ranges and data rates to 16 Mb/s.
  • Nwave – This is similar to Sigfox, but details are minimal at this time.
  • Ingenu – Unlike the others, this one uses the 2.4-GHz band and a unique random phase multiple access scheme.
  • HaLow – This is 802.11ah Wi-Fi, as described earlier.
  • White-Fi – This is 802.11af, as described earlier.

There are lots of choices for any developer. But there are even more options to consider.

Cellular is definitely an alternative for IoT, as it has been the mainstay of M2M for over a decade. M2M uses mostly 2G and 3G wireless data modules for monitoring remote machines or devices and tracking vehicles. While 2G (GSM) will ultimately be phased out (next year by AT&T, but T-Mobile is holding on longer), 3G will still be around.

Now a new option is available: LTE. Specifically, it is called LTE-M and uses a cut-down version of LTE in 1.4-MHz bandwidths. Another version is NB-LTE-M, which uses 200-kHz bandwidths for lower-speed uses. Then there is NB-IoT, which allocates resource blocks (180-kHz chunks of 15-kHz LTE subcarriers) to low-speed data. All of these variations will be able to use the existing LTE networks with software upgrades. Modules and chips for LTE-M are already available, like those from Sequans Communications (Fig. 3).
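
The NB-IoT resource-block figure quoted above can be checked with simple arithmetic: a 180-kHz resource block divided into 15-kHz LTE subcarriers.

```python
# Sanity-check the NB-IoT numbers above: each resource block is 180 kHz
# wide and is built from standard 15 kHz LTE subcarriers.

SUBCARRIER_KHZ = 15
RESOURCE_BLOCK_KHZ = 180

subcarriers_per_rb = RESOURCE_BLOCK_KHZ // SUBCARRIER_KHZ
print(subcarriers_per_rb)  # 12 subcarriers per resource block
```

The 200-kHz NB-LTE-M channel mentioned in the same paragraph is just wide enough to hold one such 180-kHz block with a little guard band on either side.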

One of the greatest worries about the future of IoT is the lack of a single standard. That is probably not going to happen. Fragmentation will be rampant, especially in these early days of adoption. Perhaps there will eventually be only a few standards to emerge, but don’t bet on it. It may not even really be necessary.

3 Things Wireless Must Have to Prosper

  • Spectrum – Like real estate, they are not making any more spectrum. All the “good” spectrum (roughly 50 MHz to 6 GHz) has already been assigned. It is especially critical for the cellular carriers who never have enough to offer greater subscriber capacity or higher data rates.  The FCC will auction off some available spectrum from the TV broadcasters shortly, which will help. In the meantime, look for more spectrum sharing ideas like the white spaces and LTE-U with Wi-Fi.
  • Controlling EMI – Electromagnetic interference of all kinds will continue to get worse as more wireless devices and systems are deployed. Interference will mean more dropped calls and denial of service for some. Regulation now controls EMI at the device level, but does not limit the number of devices in use. No firm solutions are defined, but some will be needed soon.
  • Security – Security measures are necessary to protect data and privacy. Encryption and authentication measures are available now. If only more would use them.

Source: http://electronicdesign.com/4g/future-wireless

IR.51 IMS OVER WI-FI V3.0

3 Mar

The IP Multimedia Subsystem (IMS) Profile for Voice and Video, documented in this Permanent Reference Document (PRD), identifies a minimum mandatory set of features which are defined in 3GPP specifications that a wireless device (the User Equipment (UE)) and network are required to implement in order to guarantee interoperable, high quality IMS-based telephony and conversational video services over Wi-Fi access networks.

Download IMS Profile for Voice, Video and SMS over Wi-Fi – Version 3.0 – 01 March 2016

Source: http://www.gsma.com/newsroom/all-documents/ir-51-ims-over-wi-fi-v/

Wireless Routers 101

14 Feb

A wireless router is the central piece of gear for a residential network. It manages network traffic between the Internet (via the modem) and a wide variety of client devices, both wired and wireless. Many of today’s consumer routers are loaded with features, incorporating wireless connectivity, switching, I/O for external storage devices as well as comprehensive security functionality.

A wired switch, often taking the form of four gigabit Ethernet ports on the back of most routers, is largely standard these days. A network switch negotiates network traffic, sending data to a specific device, whereas a network hub simply retransmits data to all of the recipients. Although dedicated switches can be added to your network, most home networks don’t incorporate them as standalone appliances.

Then there’s the wireless access point capability. Most wireless router models support dual bands, communicating over 2.4 and 5GHz, and many are also able to connect to several networks simultaneously.

Part of trusting our always-on Internet connections is the belief that private information is protected at the router, which incorporates features to limit home network access. These security features can include a firewall, parental controls, access scheduling, guest networks and even a demilitarized zone (DMZ), a term borrowed from the military concept of a buffer zone between neighboring countries. The DMZ, also called a perimeter network, is a subnetwork where vulnerable processes like mail, Web and FTP servers can be placed so that, if it is breached, the rest of the network isn’t compromised. The firewall is a core component in today’s story. In fact, what differentiates a wireless router from a dedicated switch or wireless access point is the firewall. Although Windows has its own software-based firewall, the router’s hardware firewall forms the first line of defense in keeping malicious content off the home network. The router’s firewall works by making sure packets were actually requested by the user before allowing them to pass through to the local network.
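
The "packets were actually requested" behavior described above is stateful inspection, and the core idea fits in a few lines. This is a greatly simplified sketch, not a real firewall: outbound flows are recorded, and an inbound packet is accepted only if it answers a flow a local client initiated.

```python
# Toy sketch of stateful inspection: remember outbound flows, and admit
# inbound traffic only when it matches a remembered flow.

outbound_sessions = set()

def send_outbound(src, dst):
    """A local client opens a connection; record the expected reply flow."""
    outbound_sessions.add((dst, src))  # replies come from dst back to src

def accept_inbound(src, dst):
    """Allow an inbound packet only if it answers a recorded flow."""
    return (src, dst) in outbound_sessions

# A client browses a website; the reply is let through, a stranger is not.
send_outbound(("192.168.1.10", 51000), ("93.184.216.34", 443))
print(accept_inbound(("93.184.216.34", 443), ("192.168.1.10", 51000)))  # True
print(accept_inbound(("203.0.113.5", 80), ("192.168.1.10", 51000)))     # False
```

Real firewalls also track protocol state, time out idle entries, and rewrite addresses (NAT), but the allow-only-replies principle is the same.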

Finally, you have peripheral connectivity like USB and eSATA. These ports make it possible to share external hard drives or even printers. They offer a convenient way to access networked storage without the need for a dedicated PC with a shared disk or NAS running 24/7.

Some Internet service providers (ISPs) integrate routers into their modems, yielding an “all-in-one” device. This is done to simplify setup, so the ISP has less hardware to support. It can also be advantageous to space-constrained customers. However, in general, these integrated routers do not get firmware updates as frequently, and they’re often not as robust as stand-alone routers. An example of a combo modem/router is Netgear’s Nighthawk AC1900 Wi-Fi cable modem router. In addition to its 802.11ac wireless connectivity, it offers a DOCSIS 3.0 24 x 8 broadband cable modem.

DOCSIS stands for “data over cable service interface specifications,” and version 3.0 is the current cable modem spec. DOCSIS 1.0 and 2.0 defined a single channel for data transfers, while DOCSIS 3.0 specifies the use of multiple bonded channels to allow for faster speeds. Current DOCSIS 3.0 modems commonly use 8, 12 or 16 channels, with 24-channel modems also available. Each channel offers a theoretical maximum download speed of 38 Mb/s and a maximum upload speed of 27 Mb/s. The standard’s next update, DOCSIS 3.1, promises to offer download speeds of up to 10 Gb/s and upload speeds of up to 1 Gb/s.
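
The channel-bonding figures above multiply out directly. A short sketch of the arithmetic, using the per-channel rates quoted in the paragraph:

```python
# Theoretical DOCSIS 3.0 throughput from the per-channel figures above:
# 38 Mb/s down and 27 Mb/s up per channel, times the bonded channel count.

DOWN_PER_CHANNEL_MBPS = 38
UP_PER_CHANNEL_MBPS = 27

def docsis3_max(down_channels, up_channels):
    return (down_channels * DOWN_PER_CHANNEL_MBPS,
            up_channels * UP_PER_CHANNEL_MBPS)

# A 24 x 8 modem like the Nighthawk combo unit described above:
print(docsis3_max(24, 8))  # (912, 216) Mb/s down/up, theoretical maximums
```

Actual service tiers run well below these ceilings, since the provisioned plan, not the modem, usually sets the limit.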


Wi-Fi Standards

The oldest wireless routers supported 802.11b, which worked on the 2.4GHz band and topped out at 11 Mb/s. This original Wi-Fi standard was approved in 1999, hence the name 802.11b-1999 (later it was shortened to 802.11b).

Another early Wi-Fi standard was 802.11a, also ratified by the IEEE in 1999. It operated on the less congested 5GHz band and maxed out at 54 Mb/s, although real-world throughput was closer to half that number. Given a shorter wavelength than 2.4GHz, the range of 802.11a was shorter, which may have contributed to less uptake. While 802.11a enjoyed popularity in some enterprise applications, it was largely eclipsed by the more pervasive 802.11b in homes and small businesses. Notably, 802.11a’s 5GHz band became part of later standards.

Eventually, 802.11b was replaced by 802.11g on the 2.4GHz band, upping throughput to 54 Mb/s. It all makes for an interesting history lesson, but if your wireless equipment is old enough for that information to be relevant, it’s time to consider an upgrade.

802.11n

In the fall of 2009, 802.11n was ratified, paving the way for one device to operate on both the 2.4GHz and 5GHz bands. Speeds topped out at 600 Mb/s. With N600 and N900 gear, two separate service set identifiers (SSIDs) were transmitted—one on 2.4GHz and the other on 5GHz—while less expensive N150 and N300 routers cut costs by transmitting only on the 2.4GHz band.

Wireless N networking introduced an important advancement called MIMO, an acronym for “multiple input/multiple output.” This technology divides the data stream between multiple antennas. We’ll go into more depth on MIMO shortly.

If you’re satisfied with the performance of your N wireless gear, then hold onto it for now. After all, it does still exceed the maximum throughput offered by most ISPs. Here are some examples of available 802.11n product speeds:

Type   2.4GHz (Mb/s)   5GHz (Mb/s)
N150   150             N/A
N300   300             N/A
N600   300             300
N900   450             450

802.11ac

The 802.11ac standard, also known as Wireless AC, was released in January 2014. It broadcasts and receives on both the 2.4GHz and 5GHz bands, but the 2.4GHz radio on an 802.11ac router is really a carryover of 802.11n. That older standard maxed out at 150 Mb/s per spatial stream, with up to four simultaneous streams, for a total throughput of 600 Mb/s.
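
The per-stream arithmetic behind those 802.11n figures, and behind the N150/N300 class names in the table above, is simple multiplication:

```python
# 802.11n throughput on the 2.4 GHz band: 150 Mb/s per spatial stream,
# with up to four simultaneous streams.

PER_STREAM_MBPS = 150

def n_throughput(streams):
    return streams * PER_STREAM_MBPS

print(n_throughput(1))  # 150 -> an N150-class device
print(n_throughput(4))  # 600 -> the 802.11n maximum
```
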

With 802.11ac, MIMO was also refined, with increased channel bandwidth and support for up to eight spatial streams. Beamforming was introduced with Wireless N gear, but it was proprietary; with AC, it was standardized to work across different manufacturers’ products. Beamforming is a technology designed to optimize the transmission of Wi-Fi around obstacles by using the antennas to direct and focus the transmission to where it is needed.

With 802.11ac firmly established as the current Wi-Fi standard, enthusiasts shopping for routers should consider one of these devices, as they offer a host of improvements over N gear. Here are some examples of available 802.11ac product speeds:

Type     2.4GHz (Mb/s)   5GHz (Mb/s)
AC600    150             433
AC750    300             433
AC1000   300             650
AC1200   300             867
AC1600   300             1300
AC1750   450             1300
AC1900   600             1300
AC3200   600             1300, 1300

The maximum throughput achieved is the same on AC1900 and AC3200 for both the 2.4GHz and 5GHz bands. The difference is that AC3200 can transmit two simultaneous 5GHz networks to achieve such a high total throughput.
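
The class names in the table above come from summing each band's maximum link rate (with some classes rounded to a tidier marketing number, e.g. AC750 from 300 + 433). A small sketch of the naming scheme:

```python
# The AC marketing number is (roughly) the sum of the per-band maximum
# link rates. Some classes round the sum up (e.g. 300 + 433 -> AC750);
# the two examples below happen to sum exactly.

def ac_class(rates_mbps):
    return "AC" + str(sum(rates_mbps))

print(ac_class([600, 1300]))        # AC1900: 2.4 GHz + one 5 GHz network
print(ac_class([600, 1300, 1300]))  # AC3200: two simultaneous 5 GHz networks
```

This also makes the AC1900-vs-AC3200 distinction in the preceding paragraph concrete: the bands themselves are no faster, AC3200 just counts a second 5GHz network.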

The latest wireless standard with products currently hitting the market is 802.11ac Wave 2. It implements multiple-user, multiple-input, multiple-output, popularly referred to as MU-MIMO. In broad terms, this technology provides dedicated bandwidth to more devices than was previously possible.

Wi-Fi Features

SU-MIMO And MU-MIMO

Multiple-input and multiple-output (MIMO), first seen on 802.11n devices, takes advantage of a radio phenomenon known as multipath propagation, which increases the range and speed of Wi-Fi. Multipath propagation is based on the ability of a radio signal to take slightly different pathways between the router and client, including bouncing off intervening objects as well as floors and ceilings. With multiple antennas on both the router and the client, and provided both support MIMO, antenna diversity can combine simultaneous data streams to increase throughput.

When MIMO was originally implemented, it was SU-MIMO, designed for a Single User. In SU-MIMO, all of the router’s bandwidth is devoted to a single client, maximizing throughput to that one device. While this is certainly useful, today’s routers communicate with multiple clients at one time, limiting SU-MIMO’s utility.

The next step in MIMO’s evolution is MU-MIMO, which stands for Multiple User-MIMO. Whereas SU-MIMO was restricted to a single client, MU-MIMO extends the benefit to up to four. The first MU-MIMO router released, the Linksys EA8500, features four external antennas supporting MU-MIMO, allowing the router to provide four simultaneous, continuous data streams to clients.

Before MU-MIMO, a Wi-Fi network was the equivalent of a wired network connected through a hub. This was inefficient; a lot of bandwidth is wasted when data is sent to clients that don’t need it. With MU-MIMO, the wireless network becomes the equivalent of a wired network controlled by a switch. With data transmission able to occur simultaneously across multiple channels, it is significantly faster, and the next client can “talk” sooner. Therefore, just as the transition from hub to switch was a huge leap forward for wired networks, so will MU-MIMO be for wireless technology.
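The hub-versus-switch analogy can be illustrated with a toy airtime calculation (illustrative numbers only; real Wi-Fi scheduling is far more complex):

```python
import math

def finish_time(per_client_airtime, simultaneous=1):
    """Seconds until every client is served, assuming equal-length
    transfers that are handled in rounds of `simultaneous` clients."""
    rounds = math.ceil(len(per_client_airtime) / simultaneous)
    return rounds * max(per_client_airtime)

# Four clients, each needing 2 seconds of airtime:
print(finish_time([2, 2, 2, 2], simultaneous=1))  # SU-MIMO (serial): 8
print(finish_time([2, 2, 2, 2], simultaneous=4))  # MU-MIMO (parallel): 2
```

With four parallel streams the last client finishes four times sooner, which is the sense in which MU-MIMO lets the next client “talk” earlier.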

Beamforming

Beamforming was originally implemented in 802.11n, but was not standardized between routers and clients; it essentially did not work between different manufacturers’ products. This was rectified with 802.11ac, and now beamforming works across different manufacturers’ gear.

Rather than having the router transmit its Wi-Fi signal in all directions, beamforming allows the router to focus the signal where it is needed to increase its strength. Using light as an analogy, beamforming takes the camping lantern and turns it into a flashlight that focuses its beam. In some cases, the Wi-Fi client can also support beamforming, focusing the client’s signal back to the router.

While beamforming is implemented in 802.11ac, manufacturers are still allowed to innovate in their own way. For example, Netgear offers Beamforming+ in some of its devices, which enhances throughput and range between the router and client when they are both Netgear products and support Beamforming+.

Other Wi-Fi Features

When folks visit your house, they often want to jump on your wireless network, whether to save on cellular data costs or to connect a notebook/tablet. Rather than hand out your Wi-Fi password, try configuring a Guest Network. This facilitates access to network bandwidth, while keeping guests off of other networked resources. In a way, the Guest Network is a security feature, and feature-rich routers offer this option.

Another feature to look for is QoS, which stands for Quality of Service. This capability serves to prioritize network traffic from the router to a client. It’s particularly useful in situations where a continuous data stream is required; for example, with services like Netflix or multi-player games. In fact, routers advertised as gaming-optimized typically include provisions for QoS, though you can find the functionality on non-gaming routers as well.

Another option is Parental Control, which allows you to act as an administrator for the network, controlling your child’s Internet access. The limits can include blocking certain websites, as well as shutting down network access at bedtime.

Wireless Router Security

There are two types of firewalls: hardware and software. Microsoft’s Windows operating system has a software firewall built into it. Third-party firewalls can be installed as well. Unfortunately, these only protect the device they’re installed on. While they’re an essential part of a Windows-based PC, the rest of your network is otherwise exposed.

An essential function of the router is its hardware firewall, known as a network perimeter firewall. The router serves to block incoming traffic that was not requested, thereby operating as an initial line of defense. In an enterprise setup, the hardware firewall is a dedicated box; in a residential router, it’s integrated.

A router is also designed to look for the address source in packets traveling over the network, relating them to address requests. When the packets aren’t requested, the firewall rejects them. In addition, a router can apply filtering policies, using rules to allow and restrict packets before they traverse the home network. The rules consider the source of a packet’s IP address and its destination. Moreover, packets are matched to the port they should be on. This is all done at the router to keep unwanted data off the home network.
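A minimal, hypothetical sketch of the stateless rule matching described above (the rule format and default-deny behavior here are illustrative assumptions, not any vendor’s actual firewall implementation):

```python
# Each rule is (source-IP prefix, destination port or None for any, action).
# First match wins; unmatched traffic is rejected, mirroring the router's
# "block unrequested incoming traffic" default.
RULES = [
    ("192.168.1.", None, "allow"),  # anything originating on the home LAN
    ("",           443,  "allow"),  # toy rule: permit inbound HTTPS
    ("",           23,   "deny"),   # block telnet outright
]

def filter_packet(src_ip, dest_port, rules=RULES):
    """Return the action of the first matching rule; default-deny."""
    for prefix, port, action in rules:
        if src_ip.startswith(prefix) and port in (None, dest_port):
            return action
    return "deny"  # unrequested / unmatched traffic is dropped

print(filter_packet("192.168.1.10", 8080))  # allow (LAN source)
print(filter_packet("8.8.8.8", 23))         # deny  (telnet blocked)
```

Real routers also track connection state (which requests went out) rather than matching on static rules alone.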

The wireless router is responsible for the Wi-Fi signal’s security, too. There are various protocols for this, including WEP, WPA and WPA2. WEP, which stands for Wired Equivalent Privacy, is the oldest standard, dating back to 1999. It uses 64-bit, and subsequently 128-bit encryption. As a result of its fixed key, WEP is widely considered quite insecure. Back in 2005, the FBI showed how WEP could be broken in minutes using publicly available software.

WEP was supplanted by WPA (Wi-Fi Protected Access) featuring 256-bit encryption. Addressing the significant shortcoming of WEP, a fixed key, WPA’s improvement was based on the Temporal Key Integrity Program (TKIP). This security protocol uses a per-packet key system that offers a significant upgrade over WEP. WPA for home routers is implemented as WPA-PSK, which uses a pre-shared key (PSK, better known as the Wi-Fi password that folks tend to lose and forget). While the security of WPA-PSK via TKIP was definitely better than WEP, it also proved vulnerable to attack and is not considered secure.

Introduced in 2006, WPA2 (Wi-Fi Protected Access 2) is the more robust security specification. Like its predecessor, WPA2 uses a pre-shared key. However, unlike WPA’s TKIP, WPA2 utilizes AES (Advanced Encryption Standard), a standard approved by the NSA for use with top secret information.

Any modern router will support all of these security standards for the purpose of compatibility, as none of them are new, but ideally, you want to configure your router to employ WPA2/AES. There is no WPA3 on the horizon because WPA2 is still considered secure. However, there are published methods for compromising it, so accept that no network is impenetrable.

All of these Wi-Fi security standards rely on your choice of a strong password. It used to be that an eight-character sequence was considered sufficient. But given the compute power available today (particularly from GPUs), even longer passwords are sometimes recommended. Use a combination of numbers, uppercase and lowercase letters, and special characters. The password should also avoid dictionary words or easy substitutions, such as “p@$$word,” or simple additions—for example, “password123” or “passwordabc.”
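These guidelines can be expressed as a simple checker. A sketch, assuming a 12-character minimum (the exact threshold is an assumption) and flagging only the substitution patterns mentioned above:

```python
import re

def password_issues(pw):
    """Return a list of violations of the password guidance above."""
    issues = []
    if len(pw) < 12:
        issues.append("shorter than 12 characters")
    if not re.search(r"[a-z]", pw):
        issues.append("no lowercase letter")
    if not re.search(r"[A-Z]", pw):
        issues.append("no uppercase letter")
    if not re.search(r"\d", pw):
        issues.append("no digit")
    if not re.search(r"[^A-Za-z0-9]", pw):
        issues.append("no special character")
    # Only catches the examples from the text; a real checker would use
    # a dictionary and common-substitution list.
    if re.search(r"(password|p@\$\$word)", pw, re.IGNORECASE):
        issues.append("contains a dictionary word or easy substitution")
    return issues

print(password_issues("p@$$word123"))
```

An empty returned list means the password clears these particular checks, not that it is unbreakable.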

While most enthusiasts know to change the router’s Wi-Fi password from its factory default, not everyone knows to change the router’s admin password, thus inviting anyone to come along and manipulate the router’s settings. Use a different password for the Wi-Fi network and router log-in page.

In the event that you lose your password, don’t fret. Simply reset the router to its factory state, reverting the log-in information to its default. Manufacturers have different methods for doing this, but many routers have a physical reset button, usually located on the rear of the device. After resetting, all custom settings are lost, and you’ll need to set a new password.

Wi-Fi Protected Setup (WPS) is another popular feature on higher-end routers. Rather than manually typing in a password, WPS lets you press a button on the router and adapter, triggering a brief discovery period. Another approach is the WPS PIN method, which facilitates discovery through the entry of a short code on either the router or client. It’s vulnerable to brute-force attack, though, so many enthusiasts recommend simply disabling WPS altogether.

Software

Web And Mobile Interfaces

Wireless routers are typically controlled through a software interface built into their firmware, which can be accessed through the router’s network address. Through this interface you can enable the router’s features, define the parameters and configure security settings. Routers employ a variety of custom operating environments, though most are Web-based. Some manufacturers do offer smartphone-enabled apps for iOS and Android, too. Here is an example of a software interface for the Netis WF2780, seen on a Windows desktop. While not easy to use for amateurs, it does allow for control over all the settings. Here we can see the Bandwidth Control Configuration in the Advanced Settings.

Routers offer a wide range of features, and each vendor has its own set of unique capabilities. Overall, though, they do share generally similar feature sets, including:

  • Quick Setup: For the less experienced user, Quick Setup is quite useful. This gets the device up and running with pre-configured settings, and does not require advanced networking knowledge. Of course, experienced users will want more control.
  • Wireless Configuration: This setting allows channel configuration. In some cases, the router’s power can be adjusted, depending on the application. Finally, the RF bandwidth can be selected as well. Analogous settings for 5GHz are available on a separate page.
  • Guest Network: The router software will provide the option to set up a separate Guest Network. This has the advantage of allowing visitors to use your Internet, without getting access to the entire network.
  • Security: This is where the SSIDs for each of the configured networks, as well as their passwords, can be configured.
  • Bandwidth Control: Since there is limited bandwidth, it can be controlled to provide the best experience for all (or at least the one who pays the bills). The amount of bandwidth that any user has, both on the download and upload sides, can be limited so one user does not monopolize all the bandwidth.
  • System Tools: Using this collection of tools, the router’s firmware can be upgraded and the time settings specified. This also provides a log of sites visited and stats on bandwidth used.

Here is a screenshot of a mobile app called QRSMobile for Android, which can simplify the setup of a wireless router, in this case the D-Link 820L.

This screenshot shows the smartphone app for the Google OnHub.

Open-Source Firmware

Historically, some of these vendor-provided software interfaces did not allow full control of all possible settings. Out of frustration, a community for open source router firmware development took shape. One popular example of its work is DD-WRT, which can be applied to a significant number of routers, letting you tinker with options in a granular fashion. In fact, some manufacturers even sell routers with DD-WRT installed. The AirStation Extreme AC 1750 is one such model.

Another advantage of open firmware is that you’re not at the mercy of a vendor in between updates. Older products don’t receive much attention, but DD-WRT is a constant work in progress. Other open source firmware projects in this space include OpenWRT and Tomato, but be mindful that not all routers support open firmware.

Hardware

System Board Components

Inside a wireless router is a purpose-built system, complete with a processor, memory, power circuitry and a printed circuit board. These are all proprietary components, with closed specifications, and are not upgradeable.

The above image shows the internals of Netis’ N300 Gaming Router (WF2631). We see the following components:

  1. Status LEDs that indicate network/router activity
  2. Heat sink for the processor—these CPUs don’t use much power, and are cooled without a fan
  3. Antenna leads for the three external antennas to connect to the PCB
  4. Four Ethernet LAN ports for the home network
  5. WPS Button
  6. Ethernet WAN port that connects to a provider’s modem
  7. Power jack
  8. Factory reset button
  9. 10/100BASE-TX transformer modules — these support the RJ45 connectors, which are the Ethernet ports.
  10. 100 Base-T dual-port through-hole magnetics. These are designed for IEEE802.3u (Ethernet ports).
  11. Memory chip (DRAM)

Antenna Types

As routers send and receive data across the 2.4 and 5GHz bands, they need antennas. There are multiple antenna choices: external versus internal designs, routers with one antenna and others with several. If a single antenna is good, then more must be better, right? And this is the current trend, with flagship routers like the Nighthawk X6 Tri-Band Wi-Fi Router featuring as many as six antennas, which can each be fine-tuned in terms of positioning to optimize performance. A setup like that facilitates three simultaneous network signals: one 2.4GHz and two 5GHz.

While a router with an internal antenna might look sleeker, these designs are built to blend into a living area. The range and throughput of external antennas are typically superior. They also have the advantages of reaching up to a higher position, operating at a greater distance from the router’s electronics, reducing interference, and offering some degree of configurability to tune signal transmission. This makes a better argument for function over form.

The more antennas you see on a router, the more transmit and receive radios there are, corresponding to the number of supported spatial streams. For example, a 3×3 router employs three antennas and handles three simultaneous spatial streams. Using current standards, these additional spatial streams account for much of how performance is multiplied. The Netis N300 router, pictured on the left, features three external antennas for better signal strength.

Ethernet Ports

While the wireless aspect of a wireless router gets most of the attention, a majority also enable wired connectivity. A popular configuration is one WAN port for connecting to an externally-facing modem and four LAN ports for attaching local devices.

The LAN ports top out at either 100 Mb/s or 1 Gb/s, also referred to as gigabit Ethernet or GbE. While older hardware can still be found with 10/100 ports, the faster 10/100/1000 ports are preferred to avoid bottlenecking wired transfer speeds over category 5e or 6 cables. If you have the choice between a physical or wireless connection, go the wired route. It’s more secure and frees up wireless bandwidth for other devices.

While four Ethernet ports on consumer-oriented routers is standard, certain manufacturers are changing things up. For example, the TP-Link/Google OnHub router only has one Ethernet port. This could be the start of a trend toward slimmer profiles at the expense of expansion. The OnHub router, pictured on the right, features a profile designed to be displayed, and not hidden in a closet, but this comes at the expense of external antennas, and the router has only a single Ethernet port. Asus’ RT-AC88U goes the other direction, incorporating eight Ethernet ports.

USB Ports

Some routers come with one or two USB ports. It is still common to find second-gen ports capable of speeds of up to 480 Mb/s (60 MB/s). Higher-end models implement USB 3.0, though; while it costs more, the third-gen spec is capable of 5 Gb/s (625 MB/s). The D-Link DIR-820L features a rear-mounted USB port. Also seen are the four LAN ports, as well as the Internet connection input (WAN).
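The Mb/s-to-MB/s figures above are simple line-rate conversions (8 bits per byte); actual USB throughput is lower once protocol overhead is counted:

```python
def mbps_to_MBps(megabits_per_second):
    """Convert a line rate in megabits/s to megabytes/s (8 bits per byte).
    This is the raw signaling rate, not real-world throughput."""
    return megabits_per_second / 8

print(mbps_to_MBps(480))   # USB 2.0 -> 60.0 MB/s
print(mbps_to_MBps(5000))  # USB 3.0 -> 625.0 MB/s
```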

One intended use of USB ports is to connect storage. All of them support flash drives; however, some routers output enough current for external enclosures with mechanical disks. If you don’t need a ton of capacity, you can use a feature like that to create an integrated NAS appliance. In some models, the storage is only accessible over a home network. In other cases, you can reach it remotely.

The other application of USB on a router is shared printing. Networked printers make it easy to consolidate to just one peripheral. Many new printers do come with Wi-Fi controllers built-in. But for those that don’t, it’s easy to run a USB cable from the device to your router and share it across the network. Just keep in mind that you might lose certain features if you hook your printer up to a router. For instance, you might not see warnings about low ink levels or paper jams.

Conclusion

The Future Of Wi-Fi

Wireless routers continue to evolve as Wi-Fi standards get ratified and implemented. One rapidly expanding area is the Connected Home space, with devices like thermostats, fire alarms, front door locks, lights and security cameras all piping in to the Internet. Some of these devices connect directly to the router, while others connect to a hub device—for example, the SmartThings Hub, which then connects to the router.

One upcoming standard is known as 802.11ad, also referred to as WiGig. Actual products based on the technology are just starting to appear. It operates on the 60GHz spectrum, which promises high bandwidth across short distances. Think of it akin to Bluetooth with a roughly 10 meter range, but performance on steroids. Look for docking stations without wires and 802.11ad as a protocol for linking our smartphones and desktops.

Already used in the enterprise segment, 802.11k and 802.11r are now being developed for the consumer market. The home networking industry plans to address the problem of using multiple access points to deal with Wi-Fi dead spots, and the trouble client devices have with hand-offs between multiple APs. 802.11k allows client devices to track where AP signals weaken, and 802.11r brings Fast Basic Service Set Transition (also known as Fast BSS Transition) to speed up authentication with APs. Combined, 802.11k and 802.11r will enable a technology known as Seamless Roaming, facilitating client handoffs between routers and access points.

Beyond that will be 802.11ah, which is being developed to use on the 900MHz band. It is a low-bandwidth frequency, but is expected to double the range of 2.4GHz transmissions with the added benefit of low power. The envisioned application of it is connecting Internet of Things (IoT) devices.

Out on the distant horizon is 802.11ax, which is tentatively expected to roll out in 2019 (although remember that 802.11n and 802.11ac were years late). While the standard is still being worked on, its goal is 10 Gb/s throughput. The 802.11ax standard will focus on increasing speeds to individual devices by slicing up the frequency into smaller segments. This will be done via MIMO combined with OFDMA (orthogonal frequency-division multiple access), which packs additional data into the 5GHz data stream.

What To Look For In A Router

Choosing a router can get complicated. You have tons of choices across a range of price points. You’ll want to evaluate your needs and consider variables like the speed of your Internet connection, the devices you intend to connect and the features you anticipate using. My own personal recommendation would be to look for a minimum wireless rating of AC1200, USB connectivity and management through a smartphone app.

Netis’ WF2780 Wireless AC1200 offers plenty of wireless performance at an extremely low price. While it lacks USB, you do get four external antennas (two for 2.4GHz and two for 5GHz), four gigabit Ethernet ports and the flexibility to use this device as a router, access point or repeater. Certain features are notably missing, but at under $60, this is an entry-level upgrade that most can afford.

Moving up to the mid-range, we find the TP-Link Archer C9. It features AC1900 wireless capable of 600 Mb/s on the 2.4GHz band and 1300 Mb/s on the 5GHz band. It has three antennas and a pair of USB ports, one of which is USB 3.0. There’s a 1GHz dual-core processor at the router’s heart and a TP-Link Tether smartphone app to ease setup and management. You’ll find the device for $130.

At the top end of the market is AC3200 wireless. There are several routers in this tier, including D-Link’s AC3200 Ultra Wi-Fi Router (DIR-890L/R). It features Tri-Band technology, which supports a 2.4GHz network at 600 Mb/s and two 5GHz networks at 1300 Mb/s. To accomplish this, it has a dual-core processor and no less than six antennas. There’s also an available app for network management, dual USB ports and GbE wired connectivity. The Smart Connect feature can dynamically balance the wireless clients among the available bands to optimize performance and prevent older devices from slowing down the rest of the network. Plus, this router has the aesthetics of a stealth destroyer and the red metallic paint job of a sports car! Such specs do not come cheap; expect to pay $300.

Conclusion

Wireless routers are assuming an increasingly important role as the centerpiece of the residential home network. With the increasing need for multiple, simultaneous continuous data streams, robust throughput is no longer a nice feature, but rather a necessity. This becomes even more imperative as streaming 4K video moves from a high-end niche into the mainstream. By taking into consideration factors such as the data load and the number of simultaneous users, enthusiasts shopping for wireless routers can choose the router that best fits their needs and budget.

MORE: All Networking Content
MORE: Networking in the Forums

Source: http://www.tomshardware.com/reviews/wireless-routers-101,4456.html

LTE-U v. Wi-Fi Battle Set to Escalate

4 Feb

The battle between LTE-U and Wi-Fi will continue, even escalate – there is a lot at stake. LTE-U is designed to let cellular networks boost data speeds over short distances. Additionally, because no added spectrum rights have to be purchased, LTE-U would allow carriers to extend their core networks at a fraction of the cost of their existing systems.

But LTE-U stomps on Wi-Fi signals. Because the upper U-NII bands can carry a watt or more of transmit power in outdoor usage, they can overpower the shared Wi-Fi bands. Testing has shown that to be the case: an LTE-U network can override any Wi-Fi signal in the area, creating enough interference to block nearby corporate networks and public Wi-Fi hotspots – not good!

Proponents of LTE-U argue that it is a legitimate competitor to Wi-Fi technology, and should therefore be allowed to operate in the same spectrum. That is not the argument. The argument is that if it is going to share, then it has to be a good neighbor, and it is turning out that such is not the case.

Wi-Fi currently uses an 802.11 listen-before-talk (LBT) contention-based protocol. LTE-U relies on an arbitrary duty cycle mechanism. LTE-U needs to adopt the same LBT protocol so everyone can just get along and share the medium. In the United Kingdom, they have acknowledged the problem and have regulated the 5 GHz spectrum. Is that what has to happen here?

Carriers are rushing LTE-U into the market because it is a cash cow. They want to get it out before the FCC has a chance to rule, because they know LTE-U, as it stands today, is a flawed platform, and if they end up having to re-engineer the access protocols, it will cost them a lot of money. If the carriers succeed, traditional Wi-Fi vendors will be forced to look for clean spectrum. The FCC and industry leaders need to stop the 800-pound gorillas from bullying their way into the spectrum, and regulate the 5 GHz band.

Source: http://www.aglmediagroup.com/category/small-cells/

Viavi Solutions sees an evolution of network monitoring to meet demand from 5G, VoLTE, NFV

18 Jan

As 2016 dawns on the wireless industry and operators continue coping with the challenge of improving customer experience and reducing costs, four aiding technologies will take center stage: network functions virtualization; voice over LTE and Wi-Fi calling; self-organizing networks; and the rise of “5G” networks. While we’ve been hearing about these next-generation technologies for some time, the challenge in the next year will be ensuring they are all working to maximize profitability. And this will require granular, end-to-end, real-time visibility across all devices and parts of the network.

Today we are poised to see a real revolution in networking over the next year where network operators now have the potential to intelligently and efficiently manage the ebb and flow of traffic and exploit under-utilized resources without compromising infrastructure or the customer experience. But it will take advancements in real-time visibility to do so. As end users come to expect flawlessness from their providers, assuring service will become much more detailed than simply checking to make sure everything’s plugged in.

Network functions virtualization
NFV can significantly lower network operating costs and increase flexibility and service velocity. Today, industry guidelines are for the most part in place to allow introducing the virtualized functions themselves, but management and orchestration standards for the self-configuration required to truly enable NFV are still in their infancy.
While 2016 will see a significant increase in NFV deployments, these will primarily revolve around semi-automatic configuration – in other words, not the full-blown automation required to realize 100% of NFV’s benefit. The NFV industry is therefore likely to put a great deal of effort into developing guidelines for the management and orchestration side of NFV deployments.

The benefits of NFV will only be realized if network performance management tools can access these new, virtual network interfaces. Operators will need to invest in solutions that ensure they can satisfy quality-of-service needs, including resiliency and latency in initial virtualization deployments. This next year should show a major ramp-up in the availability of test and assurance solutions able to provide truly actionable performance insights for virtualized network environments.

Voice over LTE and Wi-Fi
The fast growth in VoLTE rollouts will continue in 2016, as it becomes the de facto voice service over the legacy voice service. But VoLTE cannot exist as an island. It needs to evolve to reflect the way people communicate today, which comprises not just voice but also data, messaging, social media, video and other multimedia-rich services. This implies that assurance systems must empower more granular and flexible control over performance parameters and thresholds to meet the needs of these different applications, alongside the visibility to react in real time to unpredictable user behaviors.

The interaction between VoLTE and VoWi-Fi will mature, characterized by soft and seamless handoffs between the access methods. Managing VoLTE end to end – meaning understanding service quality from handset to the radio access network to backhaul to core – will be a key operator goal as they ensure that their services deliver high customer quality of experience. This means deploying sophisticated assurance platforms to know in real time where VoLTE services are performing poorly and where there is a stress in the network.

Self-organizing networks
Self-organizing networks are essentially the key to a connected future. Automating configuration, optimization and healing of the network frees up operational resources to focus on what’s truly important – better quality of experience and aligning revenue to network optimization. And, with the number of connected “things” positively exploding, managing and keeping up with the sheer number of devices requires an automated approach – one that also yields a new set of network-assurance challenges operators will have to deal with in 2016.

Today, many SON techniques simply baseline a network. In 2016, as the extreme non-uniformity in the network becomes more apparent, it will take a new, end-to-end approach to SON to keep these benefits coming.

The network will become more sporadic and this will manifest in several forms: time, subscriber, location and application. For example, take subscriber and location: a recent Viavi Solutions customer study found just 1% of users consume more than half of all data on a network. The study also found 50% of all data is consumed in less than 0.35% of the network area. To achieve significant performance gains via SON, operators can apply predictive approaches using analytics that reveal exactly which users are consuming how much bandwidth – and where they are located. This level of foresight is key to not only unlocking the full potential of SON in the RAN, but also to maximizing ROI for software-defined networking and NFV in the core.
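The kind of concentration described in the Viavi study can be computed from per-subscriber usage data. A sketch with made-up numbers (not the study’s actual data):

```python
def top_share(usage_mb, fraction):
    """Fraction of total data consumed by the heaviest `fraction` of users."""
    ranked = sorted(usage_mb, reverse=True)
    k = max(1, int(len(ranked) * fraction))  # size of the top group
    return sum(ranked[:k]) / sum(ranked)

# Toy population: one very heavy user among 100 subscribers.
usage = [5000] + [50] * 99
print(round(top_share(usage, 0.01), 2))  # top 1% of users -> ~0.5 of data
```

Analytics like this, run per cell and per time window, are what let an operator target SON optimization where the load actually is.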

5G
2016 will be the year that at least the term “5G” proliferates, but we’re still a ways off from actual implementations. A future filled with driverless cars, drones that can deliver packages and location-based IoT products will require always-on networks with less than 1 millisecond latency – and that’s what 5G promises on paper. Even so, 2016 will reveal many advances toward building and delivering 5G to end users and their applications.

The race to 5G is bringing with it advancements in the network that inch us closer to always-on, always-fast and always improving networks. This work is pushing the industry to develop new tools and solutions that offer real-time troubleshooting and network healing, faster turn-up times and the ability to instantaneously respond to traffic spikes driven by external events. These new solutions may, at the same time, encourage new revenue streams by supporting the delivery of location- and contextually-relevant applications and services. Examples of these include mobile payment support and security as well as smart city applications for public services and emergency support.

The move to 5G is not an evolution, but a revolution – and major challenges exist across every stage of the technology deployment lifecycle and every part of the end-to-end network.

To move the needle on 5G development in 2016, operators need a partner with a wide breadth of expertise and solutions to collaborate on strategic planning and development in consideration of the significant dependencies and coordination needed for successful deployment.

Edge network configuration must change and move towards ultra-dense heterogeneous networks. Front- and backhaul transport require lower latency. These and other factors present significant challenges for commercial 5G evolution; however, the train has clearly left the station. And it will gain substantial momentum in 2016.

To 2016 and beyond

It’s exciting to watch the networking revolution – with myriad new capabilities and services surfacing thanks to evolving end-user habits and demands, the network simply cannot remain stagnant. And as new approaches – from hyped technologies like SDN/NFV or 5G – come about, operators need more sophisticated ways of ensuring it’s all working. In 2016, expect not only to see the network evolve, but also ways organizations capture and leverage analytics for assurance and optimization.

Photo copyright: wisiel / 123RF Stock Photo

Source: http://www.rcrwireless.com/20160118/opinion/2016-predictions-network-revolutions-require-new-monitoring-approaches-in-2016-tag10

The Introduction of Wi-Fi Technology

8 Jan

The term Wi-Fi is very common today. When you are at an airport, a restaurant or any other public place, chances are you are within range of a Wi-Fi signal. It is the most convenient way to connect to the Internet wirelessly. Let’s look at its features and applications.

Wi-Fi is the standard technology for connecting devices to the Internet, and also for networking devices such as printers, scanners and mobile phones with one another, removing the need for connecting wires. Nowadays, almost all devices are Wi-Fi compatible. The governments of many countries are setting up Wi-Fi networks to provide free basic Internet services to their people. Wi-Fi can be deployed as a small home network or as large-scale city infrastructure, and it has a lot of advantages: wireless networks are easy to set up and inexpensive.
A wireless network uses radio waves, just like cell phones, televisions and radios do. In fact, communication across a wireless network is a lot like two-way radio communication. Here’s what happens:

  1. A computer’s wireless adapter translates data into a radio signal and transmits it using an antenna.
  2. A wireless router receives the signal and decodes it. The router sends the information to the Internet using a physical, wired Ethernet connection.

The process also works in reverse, with the router receiving information from the Internet, translating it into a radio signal and sending it to the computer’s wireless adapter.
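The encode/decode round trip in steps 1 and 2 can be sketched as a toy model in Python. This is purely illustrative (real 802.11 radios use modulation schemes like OFDM, not literal bit strings), but it captures the idea that the adapter turns bytes into a signal and the router turns the signal back into bytes:

```python
def adapter_transmit(data: bytes) -> str:
    """Step 1: the wireless adapter translates data into a radio signal.
    Here the 'signal' is modeled as a plain string of bits."""
    return "".join(f"{byte:08b}" for byte in data)

def router_receive(signal: str) -> bytes:
    """Step 2: the router decodes the radio signal back into data
    before forwarding it to the Internet over wired Ethernet."""
    return bytes(int(signal[i:i + 8], 2) for i in range(0, len(signal), 8))

# The reverse path (Internet -> router -> adapter) works the same way,
# so a round trip recovers the original data:
assert router_receive(adapter_transmit(b"hello")) == b"hello"
```

The point of the sketch is that this translation is invisible to applications: once the adapter and router handle the radio layer, sending data over Wi-Fi looks like any other network I/O.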
The Wi-Fi Alliance defines Wi-Fi as any “wireless local area network” (WLAN) product based on the Institute of Electrical and Electronics Engineers’ (IEEE) 802.11 standards. However, the term “Wi-Fi” is used in general English as a synonym for “WLAN”, since most modern WLANs are based on these standards. “Wi-Fi” is a trademark of the Wi-Fi Alliance. The “Wi-Fi Certified” trademark can only be used by Wi-Fi products that successfully complete Wi-Fi Alliance interoperability certification testing.

  • 802.11a transmits at 5 GHz and can move up to 54 megabits of data per second. It also uses orthogonal frequency-division multiplexing (OFDM), a more efficient coding technique that splits that radio signal into several sub-signals before they reach a receiver. This greatly reduces interference.
  • 802.11b is the slowest and least expensive standard. For a while, its cost made it popular, but now it’s becoming less common as faster standards become less expensive. 802.11b transmits in the 2.4 GHz frequency band of the radio spectrum. It can handle up to 11 megabits of data per second, and it uses complementary code keying (CCK) modulation to improve speeds.
  • 802.11g transmits at 2.4 GHz like 802.11b, but it’s a lot faster — it can handle up to 54 megabits of data per second. 802.11g is faster because it uses the same OFDM coding as 802.11a.
  • 802.11n is the most widely available of the standards and is backward compatible with a, b and g. It significantly improved speed and range over its predecessors. For instance, although 802.11g theoretically moves 54 megabits of data per second, it only achieves real-world speeds of about 24 megabits of data per second because of network congestion. 802.11n, however, reportedly can achieve speeds as high as 140 megabits per second. 802.11n can transmit up to four streams of data, each at a maximum of 150 megabits per second, but most routers only allow for two or three streams.
  • 802.11ac is the newest standard as of early 2013. It has yet to be widely adopted, and is still in draft form at the Institute of Electrical and Electronics Engineers (IEEE), but devices that support it are already on the market. 802.11ac is backward compatible with 802.11n (and therefore the others, too), with n on the 2.4 GHz band and ac on the 5 GHz band. It is less prone to interference and far faster than its predecessors, pushing a maximum of 450 megabits per second on a single stream, although real-world speeds may be lower. Like 802.11n, it allows for transmission on multiple spatial streams — up to eight, optionally. It is sometimes called 5G WiFi because of its frequency band, sometimes Gigabit WiFi because of its potential to exceed a gigabit per second on multiple streams and sometimes Very High Throughput (VHT) for the same reason.
  • Other 802.11 standards focus on specific applications of wireless networks, like wide area networks (WANs) inside vehicles or technology that lets you move from one wireless network to another seamlessly.
  • WiFi radios can transmit on any of three frequency bands. Or, they can “frequency hop” rapidly between the different bands. Frequency hopping helps reduce interference and lets multiple devices use the same wireless connection simultaneously.
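The headline figures from the list above can be collected into a small lookup table. The per-stream rates and stream counts are the nominal maxima quoted in the text; as the text notes for 802.11n and 802.11ac, real-world throughput is far lower:

```python
# 802.11 standards summarized from the list above:
# (frequency bands in GHz, nominal per-stream rate in Mbit/s, max spatial streams)
WIFI_STANDARDS = {
    "802.11a":  ((5.0,),      54,  1),
    "802.11b":  ((2.4,),      11,  1),
    "802.11g":  ((2.4,),      54,  1),
    "802.11n":  ((2.4, 5.0), 150,  4),
    "802.11ac": ((5.0,),     450,  8),
}

def nominal_max_mbps(standard: str) -> int:
    """Theoretical aggregate maximum: per-stream rate times stream count."""
    bands, per_stream, streams = WIFI_STANDARDS[standard]
    return per_stream * streams
```

For example, `nominal_max_mbps("802.11n")` gives 600 Mbit/s (four streams of 150 Mbit/s each), even though the text reports real-world 802.11n speeds closer to 140 Mbit/s.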

A Wi-Fi hotspot broadcasts the Wi-Fi signal over an area, and any device within range must be Wi-Fi compatible to receive it.
Sources: http://computer.howstuffworks.com/wireless-network4.htm – https://en.wikipedia.org/wiki/Wi-Fi

Source: http://techcyclopedia.blogspot.nl/2016/01/the-introduction-of-wi-fi-technology.html
