Future Internet aims to sever links with servers

Oct 30, 2013
A diagram showing how information would be shared on the PURSUIT Internet, compared with the present architecture. Credit: Barney Brown, University of Cambridge.

A revolutionary new architecture aims to make the internet more "social" by eliminating the need to connect to servers and enabling all content to be shared more efficiently.

Researchers have taken the first step towards a radical new architecture for the internet, which they claim will transform the way in which information is shared online, and make it faster and safer to use.

The prototype, which has been developed as part of an EU-funded project called "Pursuit", is being put forward as a proof-of-concept model for overhauling the existing structure of the internet's IP layer, through which isolated networks are connected, or "internetworked".

The Pursuit Internet would, according to its creators, enable a more socially-minded and efficient exchange of content, in which users would be able to obtain information without needing direct access to the servers where it is initially stored.

Instead, individual computers would be able to copy and republish content on receipt, providing other users with the option to access data, or fragments of data, from a wide range of locations rather than the source itself. Essentially, the model would enable all online content to be shared in a manner emulating the "peer-to-peer" approach taken by some file-sharing sites, but on an unprecedented, internet-wide scale.
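
The republish-on-receipt idea can be pictured with a short sketch. The Python below is purely illustrative (the node and network classes are invented for this example, not part of the PURSUIT software), but it shows the core move: a machine that receives content caches it and registers itself as an additional source that others can fetch from.

    # Toy sketch: any node that receives content can republish it for others.
    class Node:
        def __init__(self, name):
            self.name = name
            self.cache = {}                      # content_id -> data held locally

        def receive(self, content_id, data, network):
            """Store content on receipt and announce this node as a new source."""
            self.cache[content_id] = data
            network.register_source(content_id, self)

    class Network:
        def __init__(self):
            self.sources = {}                    # content_id -> nodes holding a copy

        def register_source(self, content_id, node):
            self.sources.setdefault(content_id, set()).add(node)

        def fetch(self, content_id):
            """Return the data from any node that holds it, not from a fixed server."""
            for node in self.sources.get(content_id, ()):
                if content_id in node.cache:
                    return node.cache[content_id]
            raise KeyError(content_id)

    # Once node A has received the item, anyone can fetch it via the network
    # without contacting the original server again.
    net = Network()
    a = Node("A")
    a.receive("video:episode-42", b"...bytes...", net)
    print(net.fetch("video:episode-42"))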

That would potentially make the internet faster, more efficient, and more capable of withstanding rapidly escalating levels of global user demand. It would also make information delivery almost immune to server crashes, and significantly enhance the ability of users to control access to their private information online.

While this would lead to an even wider dispersal of online materials than we experience now, the researchers behind the project argue that by focusing on information itself rather than the web addresses (URLs) where it is stored, digital content would become more secure. They envisage that by making individual bits of data recognisable, that data could be "fingerprinted" to show that it comes from an authorised source.
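
One way to read the "fingerprinting" idea is as a content hash combined with a publisher tag that any receiver can check, whichever machine delivered the bytes. The sketch below only illustrates that general principle; the key, the HMAC scheme and the function names are assumptions made for the example, not the project's actual mechanism.

    import hashlib
    import hmac

    # Hypothetical key held by authorised publishers (illustration only).
    PUBLISHER_KEY = b"secret shared with authorised publishers"

    def fingerprint(data):
        """Identifier derived from the content itself, not from where it lives."""
        return hashlib.sha256(data).hexdigest()

    def sign(data):
        """Tag showing the data was published by a key holder."""
        return hmac.new(PUBLISHER_KEY, data, hashlib.sha256).hexdigest()

    def verify(data, expected_id, tag):
        """Accept a copy from any intermediate node only if both checks pass."""
        return (fingerprint(data) == expected_id
                and hmac.compare_digest(sign(data), tag))

    episode = b"frames of the show"
    cid, tag = fingerprint(episode), sign(episode)
    print(verify(episode, cid, tag))            # True
    print(verify(b"tampered copy", cid, tag))   # False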

Dr Dirk Trossen, a senior researcher at the University of Cambridge Computer Lab, and the technical manager for Pursuit, said: "The current internet architecture is based on the idea that one computer calls another, with packets of information moving between them, from end to end. As users, however, we aren't interested in the storage location or connecting the endpoints. What we want is the stuff that lives there."

"Our system focuses on the way in which society itself uses the internet to get hold of that content. It puts information first. One colleague asked me how, using this , you would get to the server. The answer is: you don't. The only reason we care about web addresses and servers now is because the people who designed the network tell us that we need to. What we are really after is content and information."

In May this year, the Pursuit team won the Future Internet Assembly (FIA) award after successfully demonstrating applications which can, potentially, search for and retrieve information online on this basis. The breakthrough raises the possibility that almost anybody could identify specific pieces of content in fine detail, radically changing the way in which information is stored and held online.

For example, at the moment if a user wants to watch their favourite TV show online, they search for that show using a search engine which retrieves what it thinks is the URL where that show is stored. This content is hosted by a particular server, or, in some cases, a proxy server.

If, however, the user could correctly identify the content itself – in this case the show – then the location where the show is stored becomes less relevant. Technically, the show could be stored anywhere and everywhere. The Pursuit network would be able to map the desired content on to the possible locations at the time of the desired viewing, ultimately providing the user with a list of locations from which that information could be retrieved.

The designers of Pursuit hope that, in the future, this is how the internet will work. Technically, online searches would stop looking for URLs (Uniform Resource Locators) and start looking for URIs (Uniform Resource Identifiers). In simple terms, these would be highly specific identifiers which enable the system to work out what the information or content is.
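
A rough way to picture the shift is a resolver that maps an information identifier to whatever locations currently hold copies, rather than fetching one fixed URL. The identifier scheme and the hard-coded rendezvous table below are invented for illustration.

    # Toy resolver: the viewer names the content; locations are decided later.
    RENDEZVOUS = {
        # information identifier             -> locations known to hold copies
        "pursuit://tv/favourite-show/s01e01": [
            "original-server.example.net",
            "neighbour-laptop.local",
            "campus-cache.example.ac.uk",
        ],
    }

    def resolve(identifier):
        """Return every location currently known to hold this content."""
        return RENDEZVOUS.get(identifier, [])

    # The same request tomorrow might return a different list, because copies
    # appear and disappear as machines republish or drop the content.
    print(resolve("pursuit://tv/favourite-show/s01e01"))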

This has the potential to revolutionise the way in which information is routed and forwarded online. "Under our system, if someone near you had already watched that video or show, then in the course of getting it their computer or platform would republish the content," Trossen explained. "That would enable you to get the content from their network, as well as from the original server."

"Widely used content that millions of people want would end up being widely diffused across the network. Everyone who has republished the could give you some, or all of it. So essentially we are taking dedicated servers out of the equation."

Any such system would have numerous benefits. Most obviously, it would make access to information faster and more efficient, and prevent servers or sources from becoming overloaded. At the moment, if user demand becomes unsustainable, servers go down and have to be restored. Under the Pursuit model, demand would be diffused across the system.

"With a system like the one we are proposing, the whole system becomes sustainable," Trossen added. "The need to do something like this is only going to become more pressing as we record and upload more information."

More information about the PURSUIT project can be found at: www.fp7-pursuit.eu/PursuitWeb/

User comments (8)

tscati
3 / 5 (2) Oct 30, 2013
Oooh, this reminds me of something, what was it??? Ah yes, 'BitTorrent'

And there's a basic problem with many connections - upload speeds are an order of magnitude slower than download speeds. I can download something to iPlayer from the BBC servers and proxies at 8Mb/s - I think my neighbour would be a bit miffed if they had to get it at 450kb/s via upload from my PC. And I'd be miffed if I was trying to upload some data files at the same time. When we all have 200Mb/s symmetric connections then maybe...

And also I prefer to get my data from the horse's mouth rather than a copy from an unknown PC which has possibly added one or two interesting things to it. And do I want a copy of a news story from a neighbouring PC that is half an hour old, when the original has since been updated?

Back to the drawing board I think...
DistortedSignature
5 / 5 (1) Oct 30, 2013
Essentially, the model would enable all online content to be shared in a manner emulating the "peer-to-peer" approach taken by some file-sharing sites

As the article says, this is related to P2P software. tscati, have you ever downloaded a torrent? One person might have an upload of 450 kbps, but instead of being limited to the people seeding the file, you have access to everyone on the internet who holds a piece of it. You'd only need about 20 others to match that speed. I'm not an expert, so there may be a limit on how many users you can download from at once, or the ISP may throttle your connection.

The point about up-to-date information is a good one. What if a source has been updated five times in the past minute? And if you're grabbing bits of data with a certain timestamp and then discover one with a more recent stamp, do you start over?

I also wonder how it would be stored on individual computers. What if I don't want to seed any of this? Regardless, this is an interesting concept.
comendant
1 / 5 (6) Oct 30, 2013
Morons who don't know anything about the subject yet comment on it and post uninformed opinions are pretty annoying. I just hope more people can contribute by actually providing something more interesting to the discussion (like DistortedSignature) than rhetorical nonsense.

I'm interested in how secure information would be handled in this system. It seems this concept would work well for open-source information.
freeiam
1 / 5 (3) Oct 30, 2013
If you're communicating, you're very much interested in the address of the other device (connecting the endpoints). And how do you connect to nearby servers if you don't have an address?
It seems to me that the proposal could (and should) run on the existing internet, like all file-sharing schemes currently do, with the added benefit that it (the internet) can also still be used as a communication medium. It seems that the solution is being sought at too low a level, when it should be implemented at the application level.
One of the big issues with the current internet is that devices on local (home/work) networks cannot be (easily) reached because they have no address, which results in ballooning cloud services where point-to-point communication would suffice in most cases.
When IPv6 is rolled out completely (or far enough), every device can have an IP address, and point-to-point networks and local sharing and syncing will soar.
Also, IPv6 (and v4) have other transmission modes (for example broadcasting) that can reduce the
freeiam
1.8 / 5 (4) Oct 30, 2013
... network load significantly; the fact that it is rarely used doesn't mean it cannot be used.
Further, using other people's bandwidth and storage has all kinds of implications. For example, if some type of incriminating data is found on a machine, the owner is liable (no matter how fragmented or encrypted it is). Computers running day and night cost a lot, data above a certain fair-use limit is far from cheap (and I don't like the sound of spinning and rattling hard disks), not to mention that all kinds of security risks would go unnoticed, because right now I know (at least some of the time) when the network should be silent.
kochevnik
2 / 5 (4) Oct 30, 2013
Isn't this just a big P2P network with a search engine? Or does it use anonymous distribution like the dead Public File project? BitTorrent still uses servers, so where would the metadata reside? Duplicating that would be a waste for small machines.
DonGateley
1 / 5 (3) Oct 30, 2013
The security apparatus in every nation will prohibit this unless a way to penetrate and capture anything going through it is built in at the design level.
rwinners
not rated yet Oct 31, 2013
Actually, peer-to-peer networking was the first networking, way before MS, etc.