Faster prediction of wireless downtime

The increasing number of mobile devices makes it important to predict wireless outage capacity quickly and accurately to minimize service disruptions. Credit: Pexels

As the number of mobile devices grows along with demand for faster connections and larger data volumes, demand can easily exceed the capacity of wireless networks, resulting in severe slowdowns and outages. While engineers have developed various sophisticated signal processing methods to accommodate sudden changes in network loads, it has been challenging to evaluate and compare the performance of different approaches in realistic network environments. The difficulty is that network outages due to capacity saturation are such rare events that simulating enough network activity to observe them is computationally intensive and can take considerable time.

Raul Tempone and colleagues from the King Abdullah University of Science and Technology (KAUST) Strategic Research Initiative on Uncertainty Quantification in Science and Engineering (SRI-UQ) have now applied an importance sampling technique that can simulate rare events for the problem of wireless outage capacity.

"The outage capacity is one of the most important performance metrics of ," explained Tempone. "It measures the percentage of time that the communication system undergoes an outage, which is typically in the order of one second per million or more. There are no efficient analytical solutions to this problem, and to simulate this situation using conventional simulation methods might take more than a billion simulation runs."

Motivated by the need for a much faster simulation method, Tempone and his team turned to importance sampling. This is a well-known approach in which a clever problem transformation makes it possible to sample far more frequently from the event of interest, effectively turning rare events in the original problem into non-rare events in the transformed problem. For example, for a typical outage probability of the order of one in 100 million, the importance sampling approach allows the outage capacity to be estimated with roughly 100 million times fewer simulation runs than conventional methods, dramatically reducing the time needed for estimation.
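To illustrate the idea, here is a minimal sketch of importance sampling for a single Rayleigh-fading link, where the signal-to-noise ratio follows an exponential distribution. This is only a toy example, not the unified schemes developed in the paper: the average SNR and target rate below are assumed values chosen purely so that the outage probability lands near the one-in-100-million scale quoted in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy parameters (chosen only to make outage ~1e-8, matching the
# scale quoted in the article; not taken from the paper).
snr_mean = 100.0                 # average SNR on a linear scale
rate_th = 1e-6                   # target rate in bits/s/Hz
gamma_th = 2.0**rate_th - 1.0    # outage occurs when SNR < gamma_th

# Exact outage probability for Rayleigh fading (exponential SNR),
# used here only to check the estimators.
p_exact = 1.0 - np.exp(-gamma_th / snr_mean)

n = 100_000

# Naive Monte Carlo: the outage is so rare that almost every run sees
# zero events, so the estimate is usually exactly zero.
snr_mc = rng.exponential(snr_mean, n)
p_mc = np.mean(snr_mc < gamma_th)

# Importance sampling: draw the SNR from a "tilted" exponential whose mean
# sits near the outage threshold, so outages are no longer rare, then
# reweight every sample by the likelihood ratio f(x)/g(x) of the original
# and proposal densities.
mu_is = gamma_th                        # proposal mean (a common heuristic)
snr_is = rng.exponential(mu_is, n)
log_w = (np.log(mu_is / snr_mean)       # log of the likelihood ratio of the
         + snr_is / mu_is               # two exponential densities
         - snr_is / snr_mean)
p_is = np.mean((snr_is < gamma_th) * np.exp(log_w))

print(f"exact   : {p_exact:.3e}")
print(f"naive MC: {p_mc:.3e}  (usually 0 with only {n} runs)")
print(f"IS      : {p_is:.3e}")
```

With the same 100,000 runs, the naive estimator almost always returns zero, while the reweighted estimator recovers the tiny outage probability with low variance; this is the gap the KAUST approach exploits for realistic fading channels.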

Unlike existing methods for estimating outage capacity that are only applicable to specific scenarios, the importance sampling approach is generic, making it suitable for a wide range of challenging network scenarios.

"Despite continuous advances in the concept of importance sampling in the field of rare events simulations, its popularity among researchers in the field of systems is still quite limited," Tempone said. "Our work is the first to bridge the gap between the framework of rare event algorithms and the evaluation of outage capacity for wireless ."

More information: Nadhir Ben Rached et al. Unified Importance Sampling Schemes for Efficient Simulation of Outage Capacity Over Generalized Fading Channels, IEEE Journal of Selected Topics in Signal Processing (2016). DOI: 10.1109/JSTSP.2015.2500201

