Google's Chiller-Less Data Center

1sockchuck writes "Google has begun operating a data center in Belgium that has no chillers to support its cooling systems, which will improve energy efficiency but make weather forecasting a larger factor in its network management. With power use climbing, many data centers are using free cooling to reduce their reliance on power-hungry chillers. By forgoing chillers entirely, Google will need to reroute workloads if the weather in Belgium gets too warm. The facility also has its own water treatment plant so it doesn't need to use potable water from a local utility."
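The weather-dependent routing described in the summary might look something like this in spirit. This is a hypothetical sketch: the wet-bulb threshold, function names, and site names are assumptions for illustration, not anything Google has published.

```python
# Hypothetical sketch of weather-gated free cooling: if the outdoor
# wet-bulb temperature rises above what the evaporative systems can
# reject, shift workloads to another site. The limit is an assumption.
FREE_COOLING_LIMIT_C = 27.0  # assumed wet-bulb ceiling, degrees C

def can_cool_locally(wet_bulb_c: float) -> bool:
    """Return True if the chiller-less plant can hold its setpoint."""
    return wet_bulb_c <= FREE_COOLING_LIMIT_C

def route_workload(wet_bulb_c: float) -> str:
    """Pick a site based on the local weather forecast."""
    return "belgium" if can_cool_locally(wet_bulb_c) else "failover-site"

print(route_workload(21.5))  # typical Belgian summer day -> belgium
print(route_workload(29.0))  # rare heat wave -> failover-site
```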


  • by LoRdTAW ( 99712 ) on Wednesday July 15, 2009 @08:26PM (#28710723)

    I don't know about natural lakes but man made ponds have been used for just that purpose.

  • Yakhchal (Score:5, Informative)

    by physicsphairy ( 720718 ) on Wednesday July 15, 2009 @08:28PM (#28710745)

    The ancient Persians had a passively cooled refrigerator called the yakhchal [wikipedia.org] which "often contained a system of windcatchers that could easily bring temperatures inside the space down to frigid levels in summer days."

    Perhaps the Google datacenter could employ some variation of their technique.

  • by Five Bucks! ( 769277 ) on Wednesday July 15, 2009 @08:29PM (#28710747)

    They do! Well... not Superior, but Lake Ontario.

    Toronto has a rather large system that uses deep, cool water as a heat sink.

    Enwave [wikipedia.org] is the company that provides this service.

  • Re:Unreliable... (Score:5, Informative)

    by j79zlr ( 930600 ) on Wednesday July 15, 2009 @08:48PM (#28710903) Homepage
    If you have chilled water, you have a chiller, which means you have compressors. Process water or ground-source water usually is not cold enough to be an effective cooling medium; you want a high delta-T between the entering air temperature and the entering water temperature to induce heat transfer. Closed-loop ground-source water is extremely (prohibitively) expensive, and open loop is quite a maintenance hassle due to water treatment. High-efficiency chillers paired with evaporatively cooled water towers with economizer capability are very efficient and reliable. Usually you can get down to around 0.5 kW per ton with high-efficiency chillers at full load, and with multiple staged compressors you can do even better at part-load conditions. The cooling towers are usually pretty low, around 0.05 to 0.15 kW per ton. Use VFDs on the secondary pumps and cooling tower fans, and you can get cooling in at 0.75 kW per ton for the whole plant at peak, and even lower at part-load conditions (95% of the time).

    I just designed a data center for a large Big Ten university and there were no large air handlers involved at all. The system had two 400-ton chillers with the chilled water piped directly to rack-mount APC fan coils. Without "green" being the basis of design, the chiller system still operates right at about 1 kW/ton.
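The kW-per-ton figures in the comment above can be sanity-checked with a little arithmetic. A minimal sketch, using the numbers quoted (0.5 kW/ton chiller, 0.05-0.15 kW/ton towers) plus an assumed allowance for pumps and fans:

```python
# Back-of-the-envelope check of the plant efficiency figures above.
# One ton of cooling = 12,000 BTU/h = 3.517 kW of heat removed.
KW_PER_TON_HEAT = 3.517

chiller = 0.50     # kW/ton, high-efficiency chiller at full load
tower = 0.10       # kW/ton, midpoint of the 0.05-0.15 range quoted
pumps_fans = 0.15  # kW/ton, assumed for secondary pumps + tower fans

plant_kw_per_ton = chiller + tower + pumps_fans
cop = KW_PER_TON_HEAT / plant_kw_per_ton  # coefficient of performance

print(f"plant: {plant_kw_per_ton:.2f} kW/ton, COP about {cop:.1f}")
```

In other words, a 0.75 kW/ton plant removes roughly 4.7 kW of heat per kW of electricity, which is why these plants are considered efficient.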
  • by seifried ( 12921 ) on Wednesday July 15, 2009 @09:27PM (#28711209) Homepage
    You know what water costs in bulk? It adds up pretty quickly. Plus they don't need potable (drinkable) water; they need water that won't clog their system up.
  • by Runaway1956 ( 1322357 ) on Wednesday July 15, 2009 @09:49PM (#28711377) Homepage Journal

    Municipal water (at least here, in the US) means "chlorinated water". Chlorine does terrible things to pipes, coolers, pumps - everything. Having your own water treatment system means the chlorine never gets in, saving bundles in maintenance. To get an idea, find two similar water cooled vehicles - one which has had chlorinated water added to the radiator routinely, and another whose owner has been more choosy. Look down into those radiators. I've actually seen copper radiators corroded out in states that use salt on their roads. (for the sake of argument, read "sodium CHLORIDE" although other salts are used on the roads)

    While chlorine would be the primary reason not to use municipal water, there are other contaminants in their water supplies as well. No boiler technician would willingly use city water, with or without chlorine, in his boiler if he can avoid it. Navy boilers run on distilled water, with desired preservative chemicals added, which translates into very long service lives.

  • Re:Unreliable... (Score:3, Informative)

    by Xiterion ( 809456 ) on Wednesday July 15, 2009 @10:02PM (#28711479)
    A ton is a measure of the amount of heat transferred. See this [wikipedia.org] for more details. It's also worth noting how much of the heat transfer is done by way of allowing the water in the system to evaporate.
  • Re:Unreliable... (Score:2, Informative)

    by cynyr ( 703126 ) on Wednesday July 15, 2009 @10:10PM (#28711541)
    The short answer is that the "ton" mentioned above is, in the HVAC industry, roughly equivalent to the amount of cooling a ton of ice (frozen water) would provide. Some days I wish my industry would just unhitch the horse and burn the buggy it was attached to.
  • Re:Global warming (Score:2, Informative)

    by jonadab ( 583620 ) on Wednesday July 15, 2009 @10:28PM (#28711701) Homepage Journal
    > Guess they'll be in big trouble when global warming strikes Belgium!

    If global warming ever did what the alarmists keep saying it's going to do, chillers would probably become completely irrelevant, since about two thirds of Belgium would be continuously surface-mounted with a very large water-cooling rig and heatsink, sometimes known as the North Sea.
  • Re:Unreliable... (Score:4, Informative)

    by jhw539 ( 982431 ) on Wednesday July 15, 2009 @10:43PM (#28711815)
    "Again using rules of thumb, you can assume that 80% of the electrical power delivered to the computers will be dissipated as heat."

    ? 100% of the electrical power delivered to the computer is dissipated as heat. It's the law. It will be far less than the nameplate power (which the electrical design uses for sizing), and perhaps 80% of what is delivered to the building (after transformers, UPS, and PDUs), but it all ends up as heat (unless you're splitting hairs about the acoustical energy emissions and velocity pressure in the exhaust, which are small and quickly converted to heat).
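The "80% of building power" figure is consistent with losses in the power distribution chain, not with servers somehow dissipating less than they draw. A minimal sketch, with assumed (illustrative) efficiency figures for each stage:

```python
# All electrical power reaching the servers ends up as heat; the gap
# between building power and server heat is distribution losses.
# The per-stage efficiencies below are assumptions for illustration.
building_kw = 1000.0
transformer_eff = 0.98
ups_eff = 0.92
pdu_eff = 0.97

it_kw = building_kw * transformer_eff * ups_eff * pdu_eff
print(f"IT (and heat) load: {it_kw:.0f} kW "
      f"({it_kw / building_kw:.0%} of building power)")
```

The distribution losses themselves also become heat, of course; they just show up in the electrical rooms rather than in the server racks.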
  • Re:Unreliable... (Score:5, Informative)

    by j79zlr ( 930600 ) on Wednesday July 15, 2009 @11:03PM (#28711969) Homepage
    The units were mounted on the roof, but were packaged AAON 2 x LL210 chillers (and a full 400 ton backup) with no exposed exterior piping. Glycol reduces the specific heat of the fluid and increases the specific gravity, so it can move less heat and takes more power to move. I only add glycol to the system if freezing is an issue.
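The glycol penalty the commenter describes falls out of the heat-transport equation q = m·cp·dT: a lower specific heat means more mass flow (and pump power) for the same load. A minimal sketch, with an approximate textbook cp for a 30% propylene glycol mix (an assumption, not a measured value):

```python
# Why glycol moves less heat: mass flow needed for the same cooling
# load at the same delta-T, plain water vs. 30% propylene glycol.
load_kw = 1407.0    # roughly 400 tons of cooling
delta_t = 6.0       # K temperature rise across the coils

cp_water = 4.19     # kJ/(kg*K)
cp_glycol30 = 3.80  # kJ/(kg*K), approximate for 30% propylene glycol

flow_water = load_kw / (cp_water * delta_t)      # kg/s, q = m*cp*dT
flow_glycol = load_kw / (cp_glycol30 * delta_t)  # kg/s

print(f"water: {flow_water:.1f} kg/s, "
      f"30% glycol: {flow_glycol:.1f} kg/s")
```

With these numbers the glycol loop needs roughly 10% more mass flow, before even accounting for the higher density and viscosity the commenter mentions.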
  • Re:Unreliable... (Score:3, Informative)

    by j79zlr ( 930600 ) on Wednesday July 15, 2009 @11:13PM (#28712045) Homepage
    Our design conditions were 75degF. The server manufacturers said they can handle up to 100degF but have much longer life with cooler room temps.

    Primary loop is feeding the chiller. Most chillers don't like variable flow. The secondary loop is feeding the load.
  • by dlevitan ( 132062 ) on Wednesday July 15, 2009 @11:22PM (#28712105)

    Cornell University actually did this exact thing to cool a good chunk of the campus. It's called lake source cooling [cornell.edu]. While there will of course be some environmental impact, the energy usage is 20% of that of conventional chillers and thus is, I'm sure, an environmental net gain.

  • Re:Unreliable... (Score:3, Informative)

    by jhw539 ( 982431 ) on Thursday July 16, 2009 @11:00AM (#28716999)
    "Servers may be able to operate at 90-100, but they simply won't last as long being cooked compared to equipment that lives at cooler temperatures."

    Operating 500 hours a year at 90F (the peak of the allowable range) is unlikely to impact longevity. 100F is outside of the allowable range. Your opinion is contradicted by what IBM, Intel, Dell, Sun, and numerous datacenter owners along with the design professionals at ASHRAE have developed over the course of several years of research and many (mostly dull) hours of debate.

    There are special cases, tape machines being a glaring example, but operating a datacenter at 80-90F does not have any correlation with increased equipment failure beyond old wives' tales. Indeed, a 10F difference in actual component temperature (which is what matters) can occur merely between different manufacturers' case layouts, or from the use of a meshed-back security rack.
