The structural insecurities underlying the internet

Mar 13, 2014 by Alan Woodward, The Conversation
After 25 years, it’s getting a bit dusty in there. Credit: Arrqh, CC BY-SA

Most people would agree with the principle that good foundations are essential to any structure that is intended to last. But what if when you started building, you didn't envisage how large, complex or essential your structure would become? As we celebrate 25 years of the world wide web, the extraordinarily accurate science of hindsight brings to light just such a situation.

We have all become dependent on a network that was never intended to be as large or secure as it is now required to be. The big question is, do we go back and start again or do we simply accept history and ensure that our structure somehow compensates for its weaknesses?

Shaky foundations

To decide, we have to distinguish between two quite distinct entities: the internet, and the world wide web, which sits on top of it. It is the internet, in the form of its underlying network protocol known as IPv4, that provides the weak link being broken by some of the latest high-profile cyber attacks.
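To see why IPv4 is such a weak link, it helps to look at what a packet header actually contains. The Python sketch below (an illustration only: the checksum is left at zero and nothing is sent over a network) packs a minimal IPv4 header and shows that the source address is simply four bytes the sender writes in. Nothing in the protocol itself verifies them.

```python
import socket
import struct

def build_ipv4_header(src_ip: str, dst_ip: str, payload_len: int) -> bytes:
    """Pack a minimal 20-byte IPv4 header (checksum left at zero)."""
    version_ihl = (4 << 4) | 5          # IPv4, header length 5 x 32-bit words
    total_length = 20 + payload_len
    return struct.pack(
        "!BBHHHBBH4s4s",
        version_ihl, 0, total_length,   # version/IHL, type of service, total length
        0, 0,                           # identification, flags/fragment offset
        64, 17, 0,                      # TTL, protocol (17 = UDP), checksum
        socket.inet_aton(src_ip),       # source address: any value the sender likes
        socket.inet_aton(dst_ip),       # destination address
    )

# A sender can simply claim to be someone else's address:
header = build_ipv4_header("203.0.113.9", "198.51.100.1", 0)
claimed_src = socket.inet_ntoa(header[12:16])
print(claimed_src)  # prints the forged source exactly as written: 203.0.113.9
```

The addresses above are documentation-only examples; the point is that the source field is unauthenticated data, which is what makes the spoofing attacks described later possible.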

When the first few computer networks were connected, it was to share resources. Spreading the load between machines meant that those with spare capacity could help out those that needed more.

By the time Tim Berners-Lee and his colleagues at CERN came to think about networking, academics around the world were already using precursors of the internet to share data, from JANET, which still thrives today, to the stranger, more esoteric applications running on the internet such as the long-forgotten Gopher.

The brilliance of what Berners-Lee did was to come up with a mark-up language known as Hypertext Markup Language, or HTML. This allowed us all to write pages that could be universally accessed. Crucially, HTML was made freely available, so people started writing browsers that would let you read HTML-based web pages.

And that, with the benefit of hindsight, was where the problem inherent in the internet was compounded. Neither IPv4 nor HTML were built with security in mind. The entire purpose of the web was to allow academics and other researchers to freely share their work. Indeed, the more people that accessed it and read your work the happier you would be.

It never entered anyone's head that we might wish to restrict access or that we might one day pay for things online or use it to communicate our most intimate thoughts. The web was a victim of its own success. HTML unlocked the potential of connecting people, and since humans just love to share and chat, we all got hooked.

By the mid-1990s, businesses finally found the web and that's when the floodgates opened. It was when money became involved that people really began to realise that security was an issue. Secure HTTP (HTTPS) emerged alongside other secure extensions to the original protocols, making it possible for us to interact over a public network in a secure manner.

Enter the baddies

For a while, these extra layers of security added on top of the web seemed to work well but the shaky foundations on which they were built soon began to cause problems.

As more and more commerce went on over the web, the criminally minded, who should never be underestimated for their ingenuity, began to look at how they could subvert the system. And as criminals always do, they went straight for the weakest link. In this case, that was the basic technology underpinning the web.

They began to impersonate users, sometimes using IP "spoofing" to trick others into giving up information, and to mount distributed denial of service (DDoS) attacks. Initially these DDoS attacks were simplistic. Hacktivists would harness an army of supporters to all request the same web page simultaneously. The site would be unable to cope with the volume of requests and would become unavailable to legitimate users.
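The mechanics of such a flood can be illustrated with a toy Python model. The capacities and request counts here are invented for illustration, but the shape of the result is the point: once attack traffic swamps the queue, almost no legitimate user gets served.

```python
import random

def simulate_flood(capacity: int, legit: int, attackers: int) -> float:
    """Toy model: a server answers `capacity` requests per interval,
    first-come first-served. Returns the fraction of legitimate
    requests that get through."""
    random.seed(1)  # deterministic shuffle for a repeatable illustration
    requests = ["legit"] * legit + ["attack"] * attackers
    random.shuffle(requests)
    served = requests[:capacity]
    return served.count("legit") / legit

print(simulate_flood(capacity=100, legit=50, attackers=0))     # quiet day: 1.0
print(simulate_flood(capacity=100, legit=50, attackers=5000))  # under attack: only a tiny fraction get through
```

Real servers queue, retry and time out in far more complicated ways, but the underlying arithmetic of exhaustion is the same.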

But then criminals, who had always had an eye on those ageing underlying technologies, realised that because IPv4 allowed you to spoof your address, you could ask a question but have the answer sent to someone else. Worse still, they realised that the domain name system (DNS) – the essential component that converts web addresses to internet addresses so that data can actually be routed around networks – could be used to amplify the data being directed at a victim.
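The amplification effect is simple arithmetic. The byte counts below are illustrative assumptions rather than measurements, but they show the principle: a small query with a spoofed source address draws a much larger response onto the victim, multiplying the attacker's bandwidth.

```python
def amplification_factor(query_bytes: int, response_bytes: int) -> float:
    """Bandwidth multiplier gained by reflecting traffic off a resolver."""
    return response_bytes / query_bytes

# Assumed sizes for illustration: a ~64-byte query for a record type
# with a large answer might draw a ~3,000-byte response.
factor = amplification_factor(64, 3000)
print(f"Each byte the attacker sends becomes ~{factor:.0f} bytes aimed at the victim")
```

Combine that multiplier with thousands of open resolvers and a modest connection can generate a very large flood, which is why these reflection attacks became so popular.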

Since DNS was first used in DDoS attacks, the internet's other ageing protocols have been co-opted to mount similar attacks employing ever greater volumes of data, increasingly by people with criminal intent rather than hacktivists. All of this is possible because of the technological foundations upon which the web is built.

The next 25 years

There are those who suggest we should effectively start again but this is probably not practical. The web doesn't run on some ethereal cloud but on real physical networks which have taken considerable investment to produce.

Others suggest that IPv4 should be abandoned and that we should move to IPv6, the most recent version of the internet protocol. IPv6 has the potential to be more secure because it can help prevent spoofing of IP addresses and verify that the sender is who they claim to be. It has other advantages too: IPv4 long since exhausted its pool of addresses, whereas IPv6 has no such limitation – yet another indication of how drastically people underestimated how many devices would eventually be attached to the web and thus require an address. Despite this, network providers seem in no hurry to replace IPv4 as the de facto standard.
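The difference in address space is easy to quantify with Python's standard library: IPv4 addresses are 32 bits, IPv6 addresses are 128 bits, so IPv6 offers 2**96 addresses for every single IPv4 address.

```python
import ipaddress

ipv4_total = 2 ** 32    # every possible IPv4 address: about 4.3 billion
ipv6_total = 2 ** 128   # every possible IPv6 address

print(ipv4_total)                # 4294967296
print(ipv6_total // ipv4_total)  # 2**96 IPv6 addresses per IPv4 address

# The two address families as the standard library sees them
# (addresses from the documentation-reserved ranges):
print(ipaddress.ip_address("192.0.2.1").version)    # 4
print(ipaddress.ip_address("2001:db8::1").version)  # 6
```

That roughly 4.3 billion ceiling is the "dwindling pool" that exhausted years ago, and a large part of why IPv6 exists at all.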

It's not all doom and gloom though. The days of the web are not necessarily numbered. It has a way of evolving, almost organically, as threats emerge. We have solutions to many of the problems that threaten our safety online, particularly those that relate to spoofing IP addresses and misusing the older protocols, and will probably continue to produce more.

The irony is that in such a hyper-connected world we struggle to get the word out about these solutions. People can access the information they need to stay safe online but are not doing so. It is almost as if there is so much communication that important messages are being lost in what is perceived as background noise.
