Excessive latency affects the perceived quality of internet services: online games lag, streaming video buffers, and video calls become choppy or are even interrupted. Toke Høiland-Jørgensen is a researcher at Karlstad University in Sweden, working to improve network performance. In his thesis, "On the Bleeding Edge", he describes what causes latency and how smart routers can help reduce delays.

"A number of initiatives have been taken to reduce latency on the internet, and there are several causes of delays. What we have looked at specifically is something called "Bufferbloat" and how it creates latency, primarily in home networks. It is often in home networks that latency arises, because the last-mile capacity to the home is lower than in the rest of the internet. This means there is a risk that the link becomes congested, which leads to queues forming."

Bufferbloat – congested data queues

The internet consists of a number of networks linked by routers, through which data passes on its way to different parts of the internet. A router can be compared to a road junction where data packets, like cars at a crossroads, form queues and are waved on according to predefined rules. It is perfectly natural for temporary queues to form due to short-term variations in traffic. Problems arise when the queues fill up and stay full for a long time. Traffic then comes to a halt and the result is delays, so-called Bufferbloat.
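The core of the problem can be put in one line of arithmetic: the extra latency a full buffer adds is roughly the backlog divided by the link rate. The sketch below (an illustration, not a calculation from the thesis) shows why an oversized buffer on a slow last-mile link hurts so much; the buffer size and link rate are assumed example values.

```python
def queueing_delay_ms(backlog_bytes: int, link_rate_bps: float) -> float:
    """Time for the queued bytes to drain over the bottleneck link, in ms."""
    return backlog_bytes * 8 / link_rate_bps * 1000

# Example: a 256 KB buffer kept full on a 10 Mbit/s last-mile link
# adds roughly 210 ms to every packet that has to wait behind it.
delay = queueing_delay_ms(256 * 1024, 10e6)
print(f"{delay:.0f} ms of added latency")  # prints "210 ms of added latency"
```

The same buffer on a fast core-network link would drain in a fraction of the time, which is why the quote above points at the slower home connection as the place where Bufferbloat bites.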

"The rules regulating how data is let through vary in different types of routers. You could say that there are different types of queueing systems, and we have studied how the flow of data across networks is influenced by these different systems."

Different types of queueing systems

The "first come, first served" principle (also called "first in, first out") is the simplest way to regulate a queue, and the most common one today. It means that the router lets data packets through in the order they arrive. More advanced principles can give better performance, however, for example the so-called fairness principle. This is a queueing system that assigns priority based on a number of factors and mixes data flows from distinct sources, allowing the flows to proceed through the network concurrently instead of having to wait for each other.
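The difference between the two principles can be sketched in a few lines. The following is an illustrative toy model (flow names and packet lists are invented for the example, and this is not the scheduler from the thesis): FIFO serves packets strictly in arrival order, while fairness queueing keeps one queue per flow and serves them round-robin, so a latecomer is interleaved rather than stuck behind a bulk flow.

```python
from collections import deque

def fifo_schedule(packets):
    """First in, first out: dequeue strictly in arrival order."""
    return list(packets)

def fair_schedule(packets):
    """Fairness-queueing sketch: one queue per flow, served round-robin,
    so flows share the link instead of waiting behind one another."""
    flows, order = {}, []
    for flow, seq in packets:
        if flow not in flows:
            flows[flow] = deque()
            order.append(flow)
        flows[flow].append((flow, seq))
    out = []
    while any(flows[f] for f in order):
        for f in order:          # one packet per flow per round
            if flows[f]:
                out.append(flows[f].popleft())
    return out

# A bulk download (flow "A") arrives ahead of a single game packet (flow "B"):
arrivals = [("A", 1), ("A", 2), ("A", 3), ("B", 1)]
print(fifo_schedule(arrivals))  # B leaves last, behind all of A
print(fair_schedule(arrivals))  # B is interleaved: A1, B1, A2, A3
```

Under FIFO the game packet waits behind the entire download; under fairness queueing it is sent after a single packet of the bulk flow, which is exactly the latency reduction described in the next quote.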

"What we have seen is that the traditional "first in, first out" principle is one of the factors causing latency, and that the fairness principle reduces latency considerably. With this "fairness queueing", capacity is shared between different users and applications. We have developed an algorithm based on this principle that works well in WiFi networks, and which automatically prioritises the small data packets that are important for a quick response, such as DNS lookups and game traffic."
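The idea of automatically prioritising small, response-critical packets can be sketched as a "sparse flow" boost, in the spirit of schedulers such as FQ-CoDel. This is a simplified illustration under assumed rules, not the thesis algorithm itself: a flow with no backlog is treated as sparse and served ahead of bulk flows; once it builds a backlog, it drops back into the ordinary round-robin.

```python
from collections import deque

def sparse_priority_schedule(arrivals):
    """Toy scheduler: sparse flows (e.g. a single DNS lookup or game
    packet) jump ahead of backlogged bulk flows."""
    queues = {}
    new_flows, old_flows = deque(), deque()  # sparse vs. bulk service lists
    for flow, seq in arrivals:
        if flow not in queues:
            queues[flow] = deque()
            new_flows.append(flow)           # first packet: flow counts as sparse
        queues[flow].append((flow, seq))
    out = []
    while new_flows or old_flows:
        # Sparse flows are always served first; bulk flows round-robin after.
        flow = new_flows.popleft() if new_flows else old_flows.popleft()
        out.append(queues[flow].popleft())
        if queues[flow]:
            old_flows.append(flow)           # still backlogged: now a bulk flow
    return out

# A DNS lookup arriving behind a bulk transfer is sent almost immediately:
arrivals = [("bulk", 1), ("bulk", 2), ("dns", 1), ("bulk", 3)]
print(sparse_priority_schedule(arrivals))
# -> [('bulk', 1), ('dns', 1), ('bulk', 2), ('bulk', 3)]
```

The DNS packet overtakes the remaining bulk packets, giving the "quick response" behaviour the quote describes, while the bulk flow still receives its full share of the link.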

Software reduces latency in your home network

Together with researchers in the USA and Poland, Toke Høiland-Jørgensen has developed software that anyone can install in their home routers to reduce latency.

"We have worked deliberately to make our research results available for everyone to use. Our algorithm, for instance, is included in software for home routers that anyone can install in their router to reduce latency in their home network."

More information: On the Bleeding Edge: Debloating Internet Access Networks. kau.diva-portal.org/smash/record.jsf?aq2=%5B%5B%5D%5D&c=1&af=%5B%5D&searchType=SIMPLE&query=On+the+bleeding+edge&language=sv&pid=diva2%3A1044014&aq=%5B%5B%5D%5D&sf=all&aqe=%5B%5D&sortOrder=author_sort_asc&onlyFullText=false&noOfRows=50&dswid=-4214

Provided by Karlstad University