Google's Chiller-Less Data Center (132 comments)

1sockchuck writes "Google has begun operating a data center in Belgium that has no chillers to support its cooling systems, which will improve energy efficiency but make weather forecasting a larger factor in its network management. With power use climbing, many data centers are using free cooling to reduce their reliance on power-hungry chillers. By foregoing chillers entirely, Google will need to reroute workloads if the weather in Belgium gets too warm. The facility also has its own water treatment plant so it doesn't need to use potable water from a local utility."
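To make the rerouting idea concrete, here is a minimal sketch (in Python) of how a scheduler might shed load from a chiller-less site when the forecast runs hot. This is not Google's actual system; the 27 °C threshold, the site names, and the even split of demand are assumptions made up purely for illustration.

    # Purely hypothetical sketch: shed load from a free-cooled site when the
    # forecast exceeds the inlet temperature the facility can handle passively.
    MAX_INLET_C = 27.0  # assumed threshold for illustration, not a Google figure

    def plan_placement(sites, forecast_c, demand):
        """Split demand evenly across the sites whose forecast stays in bounds."""
        usable = [s for s in sites if forecast_c[s] <= MAX_INLET_C]
        if not usable:
            raise RuntimeError("no site is inside its passive-cooling envelope")
        share = demand / len(usable)
        return {s: (share if s in usable else 0.0) for s in sites}

    # Example: Belgium is forecast at 31 C, so its share moves to the other sites.
    print(plan_placement(["belgium", "ireland", "finland"],
                         {"belgium": 31.0, "ireland": 19.0, "finland": 16.0},
                         900.0))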
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Re:Global warming (Score:1, Insightful)

    by Fenax ( 1094827 ) on Wednesday July 15, 2009 @07:51PM (#28710351) Homepage
    Belgium is already too warm! *a Belgian here, complaining about the weather*
  • Re:Unreliable... (Score:1, Insightful)

    by Anonymous Coward on Wednesday July 15, 2009 @08:02PM (#28710493)

    Of course, if they have to do the re-routing 10 or so times a year, they will get the kinks worked out. That is a far better management scheme than having a fail-over plan that never really gets tested. Also, when temps rise, they probably won't be completely off-lining this data center, just a fraction of the containers within it.

    I also wonder if they might not be fibbing a little; air handlers come in different types. For chilled-water use, they wouldn't have compressors; the chilled water is run through the heat exchanger. There are also air handlers that use "process water," which is closer to room temperature. These have built-in compressors with a freon (or whatever) loop: the freon goes through the heat exchanger and the process water is used to remove the heat. I'd bet this data center has some of this type of air handler, and they would be effective even on hot days.

     

  • by Banzai042 ( 948220 ) on Wednesday July 15, 2009 @08:05PM (#28710521)
    Remember that even on hot days not all of the traffic through the datacenter needs to be rerouted, and I'd imagine a location for a datacenter like this was chosen partly for how rarely rerouting will be required. Do you know how much it costs to cool a datacenter, and how much this will save? I don't, but Google probably does, and they probably wouldn't make a decision like this without weighing the savings against the potential cost of a shorter lifespan for computers running hot and losses due to downtime. I would also imagine that Google will be working to greatly increase stability during rerouting, given the comments at the end of TFA about other power-saving uses, such as routing traffic to datacenters where it's night, so "free cooling" can be used (it's colder outside) and off-peak electricity rates are in effect (a rough sketch of that idea follows below).

    I think the concept is interesting, and it makes me wonder if we'll see more datacenters built in areas of the world more conducive to projects like this in the future.
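    A rough illustration of the "route work to where it's night" idea mentioned above. The site list, the time zones, and the 22:00-06:00 definition of night are invented for this sketch, and a real scheduler would obviously also weigh latency, capacity, and tariffs.

        # Hypothetical "chase the night" sketch: prefer datacenters where it is
        # currently night, on the theory that outside air is cooler and the
        # electricity is off-peak. Sites and time zones are invented examples.
        from datetime import datetime, timezone
        from zoneinfo import ZoneInfo  # Python 3.9+

        SITES = {
            "belgium": "Europe/Brussels",
            "oregon": "America/Los_Angeles",
            "taiwan": "Asia/Taipei",
        }

        def night_sites(now_utc=None):
            """Return the sites where the local clock says night (22:00-06:00)."""
            now_utc = now_utc or datetime.now(timezone.utc)
            picks = []
            for site, tz in SITES.items():
                local_hour = now_utc.astimezone(ZoneInfo(tz)).hour
                if local_hour >= 22 or local_hour < 6:
                    picks.append(site)
            return picks

        print(night_sites())  # send deferrable batch work to these sites first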
  • by symbolset ( 646467 ) on Wednesday July 15, 2009 @08:09PM (#28710555) Journal

    If you're Google? Apparently the answer is "yes."

    More people can and should do this. 27C is plenty cool enough for servers. It annoys me to go into a nipple crinkling datacenter knowing they're burning more juice cooling the darned thing than they are crunching the numbers. A simple exhaust fan and some air filters would be fine almost all of the time, and would be less prone to failure.

  • Re:Global warming (Score:3, Insightful)

    by mister_playboy ( 1474163 ) on Wednesday July 15, 2009 @09:18PM (#28711141)
    Upper 70s??? I'd go for that. I've had about enough of this 100 degree BS here.
  • by Anonymous Coward on Thursday July 16, 2009 @01:59AM (#28713073)

    Did anyone else think of weed when reading "chiller-less" and "Belgium"?

    That's the Netherlands.

  • Re:Unreliable... (Score:3, Insightful)

    by Glendale2x ( 210533 ) on Thursday July 16, 2009 @05:11AM (#28714099) Homepage

    Servers may be able to operate at 90-100 °F, but they simply won't last as long being cooked as equipment that lives at cooler temperatures. This probably doesn't matter if you're Google and don't care about burning hardware, or if you have money to spare and are always installing new equipment, or would rather generate truckloads of electronics waste replacing servers faster than a cooler facility would just to get a PUE to brag about. The rest of us will have to settle for server rooms with air conditioning for now.
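    For reference, PUE (power usage effectiveness) is simply total facility power divided by the power that actually reaches the IT gear, so the bragging-rights arithmetic looks like this (all figures invented for illustration):

        # PUE = total facility power / IT equipment power (1.0 would be ideal).
        it_load_kw = 1000.0             # power reaching the servers themselves
        chiller_facility_kw = 1500.0    # IT load plus chillers, pumps, fans, losses
        free_cooled_facility_kw = 1150.0

        print(chiller_facility_kw / it_load_kw)      # 1.5  - conventional chilled room
        print(free_cooled_facility_kw / it_load_kw)  # 1.15 - free-cooled facility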

  • Actually... (Score:3, Insightful)

    by denzacar ( 181829 ) on Thursday July 16, 2009 @09:45AM (#28715883) Journal

    I did read the PP and I've even replied to it.
    And I thought that I was clear enough in my reply, but apparently not.

    See... the game is not most power-efficient cooling, or even best cooling.
    The game is "most bang per buck invested in the server infrastructure".

    Now... Saving money by cutting cooling costs with huge passive-cooling farms is a nice idea, but it's not as easily calculable as simply switching to cheaper electricity.
    Sure, should you move your servers to Siberia you would get a shitload of passive cooling, but unless polar bears are going to start using broadband internet, the servers will never get above 50% utilization.
    'Cause even at 100% usage they would still be in the middle of the f-in desert. No local traffic, and too far from civilization for the global traffic.
    Any money you would save by running those "virtualized workloads" through such power-efficient servers would be overshadowed by higher maintenance costs for the infrastructure and higher energy costs.

    On the other hand, switching work to whichever servers are running on cheaper electricity at the moment is a clear and easily calculable way to save money.

    There is a compromise solution though. Mountains. Don't go north, go up.
    Granted, they are not always readily available, but Europe and the US west coast are really close to both major internet backbones and mountains.

    Still... You would probably save more by "zone switching" than with passive cooling.
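    A back-of-the-envelope way to compare the two approaches described above; every number here is invented for illustration only.

        # Hypothetical comparison: chasing cheap electricity ("zone switching")
        # versus a remote, passively cooled site with pricier power and upkeep.
        HOURS_PER_YEAR = 24 * 365

        def yearly_cost_usd(it_load_kw, pue, price_usd_per_kwh, extra_overhead_usd=0.0):
            """One year's electricity bill plus any site-specific overhead."""
            return it_load_kw * pue * price_usd_per_kwh * HOURS_PER_YEAR + extra_overhead_usd

        zone_switching = yearly_cost_usd(1000, pue=1.3, price_usd_per_kwh=0.06)
        remote_passive = yearly_cost_usd(1000, pue=1.1, price_usd_per_kwh=0.10,
                                         extra_overhead_usd=200_000)

        print(zone_switching, remote_passive)  # compare before picking a site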

"More software projects have gone awry for lack of calendar time than for all other causes combined." -- Fred Brooks, Jr., _The Mythical Man Month_

Working...