Measuring the Google Grid

An article by John Markoff and Saul Hansell in today’s New York Times estimates Google has over 450,000 servers spread over at least 25 locations around the world. One of the latest large data centers is located next to a hydroelectric dam in Oregon and is the size of two football fields with a new permit to grow yet again.

The best quote of the article came from Milo Medin, who called Google “the Borg.”

Business in the age of the Internet is a lot like traditional business: it’s all about location, location, location. In today’s interconnected business world, a good location provides cheap rent and electricity, stays cool, sits close to good transportation (fat fiber pipes), and is geographically close to users. New data centers need to come online to serve the new masses of users and data storage, causing teams to scramble for more and more space to remain competitive.

According to the New York Times, web companies are still tapping into the excess capacity created during the last telecom boom and lighting up fiber that has been dark for years. When will the first towns appear and be entirely supported by a dammed river and a large data center on a now dry river bed? Perhaps buying a city and changing its name is not just a clever marketing stunt anymore, but a reality of a new era in online business.


Commentary on "Measuring the Google Grid":

  1. Guillaume Champeau wrote:

    Very interesting data and thoughts, thanks.

    I was attending a conference on Quaero a few days ago in Paris, and Exalead’s representative said their technology uses far fewer servers for the same results. This might not be of concern for Google (it gives them ways to spend their money), but it may be a strong selling point for intranet searching.

  2. Jesse wrote:

    It’s just like I said last night at Coupa. The next couple years are gonna be a pure arms race for the usual suspects.

  3. Erica Douglass wrote:

    As the owner of a colocation company here in San Jose (Simpli Hosting / ), I can say it’s not always that easy. There are a lot of psychological barriers in the colo market, surprisingly. One we run up against most frequently is the need people and companies have to be “close” to their servers. We get most of our business from the local market here in the Bay Area, but even those in San Francisco must overcome those psychological barriers of “Oh gosh, what happens if my server crashes… I’m an hour away and I’d have to drive all the way to SAN JOSE to fix it!”

    A datacenter in Oregon will work well for Google; they can hire the expertise they need locally. But for some smaller companies, all of the remote reboots, remote KVMs, and even the 24×7 staff we have here in SJ will not help them overcome this strange “need” to be close to their servers. BTW, this need is often worse among techies because they are over-protective of the hardware (often they’ve built it themselves and have funky kernel configs). The non-techies don’t really seem to care where the darn thing is as long as it stays online and is fast to access.

    Just my view from 5 years of selling colo and dedicated servers here in the heart of the Valley.


  4. Doug Marshall wrote:

    People want to be close to their servers because they know how to fix them. The guys unlucky enough to have the night shift at the colo just don’t. What’s more, it costs a fortune to have that guy waddle over to your cage and reboot something, much less know what to do if, god forbid, you garfed something during an install.