Google Envisions 10 Million Servers
miller60 writes "Google never says how many servers are running in its data centers. But a recent presentation by a Google engineer shows that the company is preparing to manage as many as 10 million servers in the future. At this month's ACM conference on large-scale computing, Google's Jeff Dean said he's working on a storage and computation system called Spanner, which will automatically allocate resources across data centers, and be designed for a scale of 1 million to 10 million machines. One goal: to dynamically shift workloads to capture cheaper bandwidth and power. Dean's presentation (PDF) is online."
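For the curious, here's a rough sketch of what "dynamically shift workloads to capture cheaper bandwidth and power" could look like as a greedy, cost-aware placement loop. Everything here is illustrative -- the names, prices, and capacities are made up, and this is not Spanner's actual design or API:

    # Hypothetical sketch: greedily place jobs on the cheapest datacenter
    # with spare capacity. Illustrative only -- not Spanner's design.
    from dataclasses import dataclass

    @dataclass
    class Datacenter:
        name: str
        power_cost: float    # assumed $/kWh spot price
        free_machines: int

    def place(jobs, datacenters):
        """Assign each job (name -> machines needed) to the cheapest DC that fits."""
        placement = {}
        for job, machines in jobs.items():
            # Cheapest power first; a real system would also weigh bandwidth,
            # latency, and replication constraints.
            for dc in sorted(datacenters, key=lambda d: d.power_cost):
                if dc.free_machines >= machines:
                    dc.free_machines -= machines
                    placement[job] = dc.name
                    break
            else:
                placement[job] = None   # no datacenter has room
        return placement

    dcs = [Datacenter("us-east", 0.09, 4000), Datacenter("eu-west", 0.06, 6000)]
    print(place({"index-build": 5000, "log-crunch": 2000}, dcs))
    # {'index-build': 'eu-west', 'log-crunch': 'us-east'}

A real scheduler would fold bandwidth prices and data locality into the cost function rather than sorting on power alone, but the shape of the problem is the same.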
Re:fastest site on the internet gets faster? (Score:5, Funny)
They grind them up and feed them to new servers and then serve you zombie content with those.
Re:Pretty soon... (Score:3, Funny)
At least we aren't going to have to go through the pain of upgrading to IPv6 in that case... 2^32 covers 10 million like a bull covers a rabbit...
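(Checking the headroom, using nothing beyond the numbers in the parent:

    print(2**32 / 10_000_000)   # ~429.5 IPv4 addresses per server

So yes, roughly 429 addresses per machine to spare before anyone has to touch IPv6.)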
Re:Pretty soon... (Score:5, Funny)
That's the plan, I thought:
1. Cache all websites
2. Cache all users
3. Disconnect the meat beings
Oops, said too much!
Re:fastest site on the internet gets faster? (Score:5, Funny)
Soylent Blue?
Boorgle (Score:4, Funny)
It's pronounced Boorgle... and resistance is futile.
Enough? (Score:3, Funny)
1981 [wikiquote.org]: 640K ought to be enough for anybody.
2009: 10 million servers ought to be enough for any company.
Re:The NSA has Google beat... (Score:5, Funny)
Stupid non-standard unit. According to the official Salt Lake City Energy Blueprint, SLC has an annual electricity usage of 3.3 billion kWh, of which 17% is residential. That residential share works out to an average of 64 MW, or about 6 POOTs (Power Output of Togo), which is the accepted standard non-standard unit for power in this order of magnitude.
Assuming that they are referring to area, and not volume -- the Alamodome is about 40,000 square meters... the standard non-standard unit for area of this magnitude is American football fields (NOT random stadia) including endzones, at 5,351 square meters each -- thus this data archive will be approximately 7.5 football fields.
Yes, it would be interesting to know how much data they will be storing in this facility.
But, sheesh, I understand not wanting to use standard units as they may just confuse the scientifically illiterate... but if the NSA or some other source is going to use non-standard units, they should at least use standard non-standard units like POOTs or football fields.
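For what it's worth, the arithmetic above checks out. A quick sketch in Python, using only the figures quoted in this thread (the per-POOT value is inferred from the 6-POOT claim, not an independent measurement of Togo):

    # Back-of-the-envelope check of the figures above.
    annual_kwh = 3.3e9                        # SLC annual electricity use, kWh
    avg_mw = 0.17 * annual_kwh / 8760 / 1000  # residential share -> average MW
    print(round(avg_mw))                      # 64 MW, as claimed
    print(round(avg_mw / 6, 1))               # ~10.7 MW per POOT, inferred
    print(round(40_000 / 5_351, 1))           # Alamodome in football fields: ~7.5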