Google's Chiller-Less Data Center
1sockchuck writes "Google has begun operating a data center in Belgium that has no chillers to support its cooling systems, which will improve energy efficiency but make weather forecasting a larger factor in its network management. With power use climbing, many data centers are using free cooling to reduce their reliance on power-hungry chillers. By forgoing chillers entirely, Google will need to reroute workloads if the weather in Belgium gets too warm. The facility also has its own water treatment plant so it doesn't need to use potable water from a local utility."
Re:Unreliable... (Score:1, Insightful)
Of course, if they have to do the re-routing 10 or so times a year, they will get the kinks worked out. That is a far better management scheme than having a fail-over plan that never really gets tested. Also, when temps rise, they probably won't be off-lining the entire data center, just a fraction of the containers within it.
I also wonder if they might be fibbing a little; air handlers come in different types. Chilled-water units have no compressors: the chilled water runs straight through the heat exchanger. There are also air handlers that use "process water," which is closer to room temperature. These have built-in compressors with a freon (or whatever) loop; the freon goes through the heat exchanger, and the process water is used to remove the heat. I'd bet this data center has some of this type of air handler, and they would be effective even on hot days.
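The weather-driven rerouting described in the summary and above could be sketched as a simple control loop. The thresholds, the 50% shed fraction, and the hysteresis band here are all made up for illustration; nothing in the article says how Google actually decides.

```python
# Hypothetical sketch of weather-driven workload shedding.
# All thresholds and fractions are invented for illustration.
SHED_TEMP_C = 27.0    # start shedding load above this outside-air temp
RESUME_TEMP_C = 24.0  # hysteresis: only resume once it cools well down

def plan_load(outside_temp_c, currently_shedding):
    """Return (fraction_of_containers_online, shedding_flag).

    Note the site is never fully off-lined; only a fraction of
    containers is shed, matching the comment above.
    """
    if outside_temp_c >= SHED_TEMP_C:
        return 0.5, True
    if currently_shedding and outside_temp_c > RESUME_TEMP_C:
        # Still warm-ish: stay shed to avoid flapping.
        return 0.5, True
    return 1.0, False

online, shedding = plan_load(29.0, False)  # hot day: shed half the load
```

The hysteresis band is the interesting part: without it, a temperature hovering around the threshold would bounce workloads back and forth every few minutes.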
Re:Unreliable...Probably not (Score:5, Insightful)
I think the concept is interesting, and it makes me wonder if we'll see more datacenters built in areas of the world more conducive to projects like this in the future.
Re:Worth the tradeoff? (Score:4, Insightful)
If you're Google? Apparently the answer is "yes."
More people can and should do this. 27°C is plenty cool enough for servers. It annoys me to go into a nipple-crinkling datacenter knowing they're burning more juice cooling the darned thing than they are crunching the numbers. A simple exhaust fan and some air filters would be fine almost all of the time, and would be less prone to failure.
Re:No chillers in Belgium (Score:1, Insightful)
Did anyone else think of weed when reading "chiller-less" and "Belgium"?
That's the Netherlands.
Re:Unreliable... (Score:3, Insightful)
Servers may be able to operate at 90-100, but they simply won't last as long being cooked as equipment that lives at cooler temperatures. That probably doesn't matter if you're Google and don't care about burning through hardware, or if you have money to spare and are always installing new equipment, or if you'd rather generate truckloads of electronic waste by replacing servers faster than a cooler facility would, just to get a PUE to brag about. The rest of us will have to settle for air-conditioned server rooms for now.
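For context on the PUE brag: PUE (Power Usage Effectiveness) is just total facility power divided by IT equipment power, so a lower number means less overhead spent on cooling and power distribution. A quick sketch with made-up numbers:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness = total facility power / IT power.

    1.0 would mean zero cooling/distribution overhead; typical
    chiller-cooled rooms historically ran well above that.
    """
    return total_facility_kw / it_equipment_kw

# Made-up numbers: a chiller-heavy room vs. a free-cooled one,
# both serving the same 1000 kW of IT load.
chilled = pue(total_facility_kw=1500.0, it_equipment_kw=1000.0)      # 1.5
free_cooled = pue(total_facility_kw=1120.0, it_equipment_kw=1000.0)  # 1.12
```

The trade-off the comment above is pointing at doesn't show up in this number at all: PUE counts watts, not the replacement cost of hardware cooked by running warm.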
Actually... (Score:3, Insightful)
I did read the PP and I've even replied to it.
And I thought that I was clear enough in my reply, but apparently not.
See... the game is not most power-efficient cooling, or even best cooling.
The game is "most bang per buck invested in the server infrastructure".
Now... saving money by cutting cooling costs with huge passive-cooling farms is a nice idea, but not as easily calculable as simply switching to cheaper electricity.
Sure, if you moved your servers to Siberia you would get a shitload of passive cooling, but unless polar bears start using broadband internet, those servers will never make it above 50% utilization. Because even at 100% usage they would still be in the middle of the f-in desert: no local traffic, and too far from civilization for the global traffic.
Any money you saved by running those "virtualized workloads" on such efficiently cooled servers would be overshadowed by higher infrastructure maintenance costs and higher energy costs.
On the other hand, switching workloads to whichever servers are running on cheaper electricity at the moment is a clear and easily calculable way to save money.
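The "easily calculable" part of that argument can be shown in a few lines. The prices and site names below are invented for illustration; the point is only that picking the cheapest zone is a trivial minimization, unlike modeling passive-cooling savings.

```python
# Hypothetical "zone switching": send a workload wherever electricity
# is cheapest right now. Prices and sites are made up.
def cheapest_site(price_per_kwh_by_site, kwh_needed):
    """Pick the site with the lowest energy cost for this workload."""
    site = min(price_per_kwh_by_site, key=price_per_kwh_by_site.get)
    return site, price_per_kwh_by_site[site] * kwh_needed

prices = {"belgium": 0.12, "oregon": 0.07, "siberia": 0.05}  # $/kWh, invented
site, cost = cheapest_site(prices, kwh_needed=1000)
```

Of course this toy version ignores exactly what the comment above flags: latency to users, bandwidth, and maintenance costs at the remote site, which is why cheap-but-distant locations lose despite winning on the electricity line item.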
There is a compromise solution though. Mountains. Don't go north, go up.
Granted, they are not always readily available, but Europe and the US west coast are really close to both major internet backbones and mountains.
Still... you would probably save more by "zone switching" than with passive cooling.