Google search serves users from 700 percent more locations than a year ago, study shows

Oct 24, 2013
This is a still from an animation depicting the change in Google's search infrastructure. Credit: Matt Calder / USC

Over the past 10 months, Google search has dramatically increased the number of sites around the world from which it serves client queries, repurposing existing infrastructure to change the physical way that Google processes web searches, according to a new study from USC.

From October 2012 to late July 2013, the number of locations serving Google's search infrastructure increased from a little less than 200 to a little more than 1,400, and the number of ISPs hosting those locations grew from just over 100 to more than 850, according to the study.

Most of this expansion reflects Google taking client networks (such as Time Warner Cable) that it already relied on for hosting content like YouTube videos, and reusing them to relay—and speed up—user requests and responses for search and ads.

"Google already delivered YouTube videos from within these client networks," said USC PhD student Matt Calder, lead author of the study. "But they've abruptly expanded the way they use the networks, turning their content-hosting infrastructure into a search infrastructure as well."

Previously, if you submitted a search to Google, your request would go directly to a Google data center.

Now, your search request will first go to the regional network, which relays it to the Google data center. While this might seem like it would make the search take longer by adding in another step, the process actually speeds up searches.
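The intuition can be sketched with a toy latency model (all numbers hypothetical): a fresh connection pays a handshake round trip before any data flows, and a relay inside the client's ISP only pays that handshake on the short local leg, because its long-haul connection to the data center is already established.

```python
# Toy latency model (hypothetical round-trip times) comparing a direct
# request to a Google data center against a request relayed through a
# front-end inside the client's ISP that keeps a warm connection open.

RTT_CLIENT_TO_DATACENTER = 100.0  # ms, round trip, hypothetical
RTT_CLIENT_TO_RELAY = 10.0        # ms, relay sits inside the client's ISP
RTT_RELAY_TO_DATACENTER = 90.0    # ms, long haul, but connection stays warm

def direct_request_ms() -> float:
    """Handshake plus request/response, both paid over the long path."""
    handshake = RTT_CLIENT_TO_DATACENTER  # one round trip to set up TCP
    transfer = RTT_CLIENT_TO_DATACENTER   # one round trip for the search itself
    return handshake + transfer

def relayed_request_ms() -> float:
    """Handshake only on the short leg; the long leg reuses a warm connection."""
    handshake = RTT_CLIENT_TO_RELAY
    transfer = RTT_CLIENT_TO_RELAY + RTT_RELAY_TO_DATACENTER
    return handshake + transfer

print(direct_request_ms())   # 200.0 ms
print(relayed_request_ms())  # 110.0 ms
```

Even though the relayed path covers the same distance plus an extra hop, it avoids setting up a new long-distance connection for every search.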


Data connections typically need to "warm up" to get to their top speed – the continuous connection between the client network and the Google data center eliminates some of that warming up lag time. In addition, content is split up into tiny packets to be sent over the Internet – and some of the delay that you may experience is due to the occasional loss of some of those packets. By designating the client network as a middleman, lost packets can be spotted and replaced much more quickly.
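The "warm up" behavior described above is TCP slow start: a fresh connection begins with a small congestion window and roughly doubles it each round trip, so small windows cost extra round trips. A rough sketch (a simplified model with hypothetical parameters, not Google's actual configuration) of how a long-lived, already-warm connection skips those rounds:

```python
# Simplified slow-start model (hypothetical parameters): count how many
# round trips it takes to deliver a response of a given size when the
# congestion window starts small versus when it is already warmed up.

def round_trips_to_send(segments: int, initial_cwnd: int) -> int:
    """Round trips to deliver `segments`, doubling the window each round."""
    sent, cwnd, rounds = 0, initial_cwnd, 0
    while sent < segments:
        sent += cwnd
        cwnd *= 2       # exponential window growth during slow start
        rounds += 1
    return rounds

# A 40-segment response: a cold connection (initial window of 4 segments)
# needs four round trips (4 + 8 + 16 + 32); a warmed connection whose
# window has already grown to 64 delivers it in one.
print(round_trips_to_send(40, 4))   # 4
print(round_trips_to_send(40, 64))  # 1
```

The same logic favors the relay for packet loss: a lost packet is detected and retransmitted over the short client-to-relay leg rather than over the full end-to-end path.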

A technical report on the study will be presented at the SIGCOMM Internet Measurement Conference in Spain on October 24. Calder worked with Ramesh Govindan and Ethan Katz-Bassett of USC Viterbi, as well as John Heidemann, Xun Fan, and Zi Hu of USC Viterbi's Information Sciences Institute.

The team developed a new method of tracking down and mapping servers, identifying when servers are in the same datacenter and estimating where that datacenter is. They also identified the relationships between servers and clients, and happened to be using the method when Google made its move.
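One simple constraint that latency measurements place on a server's location can be illustrated with a back-of-envelope bound (an assumption for illustration, not the team's actual method: signals in fiber travel at roughly two-thirds the speed of light, so half the round-trip time caps how far away the server can be):

```python
# Back-of-envelope geolocation bound (illustrative assumption: signals
# propagate in fiber at about 2/3 the speed of light). Half the measured
# round-trip time gives an upper bound on the server's distance.

SPEED_OF_LIGHT_KM_PER_MS = 300.0   # ~300,000 km/s
FIBER_FACTOR = 2.0 / 3.0           # propagation in fiber is slower than in vacuum

def max_distance_km(rtt_ms: float) -> float:
    """Upper bound on server distance implied by a round-trip time."""
    one_way_ms = rtt_ms / 2.0
    return one_way_ms * SPEED_OF_LIGHT_KM_PER_MS * FIBER_FACTOR

# A 10 ms round trip constrains the server to within about 1,000 km;
# measuring from many vantage points narrows the region further.
print(max_distance_km(10.0))  # 1000.0
```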

"Delayed web responses lead to decreased user engagement, fewer searches, and lost revenue," said Katz-Bassett, assistant professor at USC Viterbi. "Google's rapid expansion tackles major causes of slow transfers head-on."

The strategy seems to have benefits for web users, ISPs, and Google, according to the team. Users get a better web browsing experience, ISPs lower their operational costs by keeping more traffic local, and Google is able to deliver its content to web users more quickly.

Xun Fan, graduate student at USC Viterbi, noted that the team had not originally set out to document this growth.

"We had developed techniques to locate the servers, without requiring access to the users they serve, and it just so happened we exposed this rapid expansion," Fan said.

Next, the team will attempt to quantify exactly what the performance gains are for using this strategy, and will try to identify under-served regions.
