Slow websites are a major source of irritation; we simply cannot be bothered to wait. But according to PhD student Marcel Harkema from the University of Twente, there is no longer any excuse for them. "Too many IT systems are built without first considering the quality of the service. A slow website caused by capacity problems, for example, will damage the owner's image." Marcel Harkema investigated which factors are primarily responsible for software performance, and distilled his findings into performance models that can be used to solve such problems before they occur.
"In practice, website designers regularly make a rough estimate of the capacity needed for a website. In simple terms: how many people can use the website before it becomes sluggish? Decisions are often made by looking at similar situations elsewhere, and then waiting to see how it turns out. To save costs, people tend to start off with a small (i.e. too small) configuration. Any problems that arise once the website is up and running are solved by adding extra capacity. By then, however, users have already become dissatisfied, and if the site is introducing a new service, for example, that almost certainly means a bad start."
Marcel Harkema asked himself the following question: "Is there a more accurate way of estimating beforehand how much capacity you will need, so that you can minimize initial problems and keep your users happy?" To Harkema's mind, the challenge is to identify a limited number of factors that determine system performance.
Harkema's PhD research shows how these factors can be used to build mathematical models with which designers can calculate in advance how a specific configuration will perform in practice. Harkema validated the models in his research by showing that the calculated performance adequately predicts actual performance in practice.
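The kind of advance calculation described above can be illustrated with a textbook queueing model. The sketch below is my own illustration, not Harkema's actual models: it uses a simple M/M/1 queue to estimate the mean response time of a single server under a given load, and to find the highest request rate that still meets a response-time target. The parameter names (`arrival_rate`, `service_rate`) are assumptions chosen for clarity.

```python
# Illustrative sketch (not Harkema's models): an M/M/1 queueing model
# for rough, up-front capacity estimation of a single-server website.

def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    """Mean response time (seconds) of an M/M/1 queue: T = 1 / (mu - lambda).

    Valid only while utilization (arrival_rate / service_rate) stays below 1;
    at or above that point the queue grows without bound.
    """
    if arrival_rate >= service_rate:
        raise ValueError("overloaded: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)


def capacity_for_target(service_rate: float, target_seconds: float) -> float:
    """Highest arrival rate that still meets a mean-response-time target.

    Rearranging T = 1 / (mu - lambda) gives lambda = mu - 1 / T.
    """
    return max(0.0, service_rate - 1.0 / target_seconds)


# Example: a server that handles 100 requests/s, loaded at 80 requests/s.
print(mm1_response_time(arrival_rate=80, service_rate=100))       # 0.05 s
print(capacity_for_target(service_rate=100, target_seconds=0.05)) # 80.0 req/s
```

Even a model this crude makes the core point visible: response time does not degrade linearly but blows up as the load approaches the server's capacity, which is why "wait and see, then add capacity" tends to end badly.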