Want to read Slashdot from your mobile device? Point it at m.slashdot.org and keep reading!

 



Forgot your password?
typodupeerror
×
Microsoft Technology

Microsoft Unveils Zero-Water Data Centers To Reduce AI Climate Impact (yahoo.com) 70

Microsoft, trying to mitigate the climate impact of its data center building boom, is starting to roll out a new design that uses zero water to cool the facilities' chips and servers. From a report: Launched in August, the new design will eliminate the more than 125 million liters of water each data center typically uses per year, the company said in a statement. The new system use a "closed loop" to recycle water; liquid is added during construction and continually circulated -- obviating the need for fresh supplies. Data centers will still require fresh water for worker facilities like bathrooms and kitchens.

Microsoft spent more than $50 billion on capital expenditures in the fiscal year ended June 30, the vast majority related to data center construction fueled by demand for artificial intelligence services. It plans to top that figure in the current year, requiring rapidly rising amounts of energy to run the networks and water to cool equipment. Many of latest facilities are going up in hot, dry areas like Arizona and Texas, making it even more critical to find ways to conserve water. Microsoft's existing data centers will continue to use a mix of older technologies, but new projects in Phoenix and Mount Pleasant, Wisconsin, will begin using the zero-water designs in 2026.

Microsoft Unveils Zero-Water Data Centers To Reduce AI Climate Impact

Comments Filter:
  • Say what? (Score:4, Informative)

    by jenningsthecat ( 1525947 ) on Tuesday December 10, 2024 @09:51AM (#65003295)

    ...trying to mitigate the climate impact...

    .

    No! This will mitigate the ecological impact, but will do little or nothing to affect the climatic impact. It will release the same amount of heat, from the same sources of power, regardless of whether the coolant loop is closed or open. In fact, there's potential here for increasing the carbon footprint, depending on whether the water in the closed-cycle system is cooled passively or actively.

    Unless, of course, they normally use municipal drinking water for cooling. I'm pretty sure that's not the case though. Somebody please tell me they're not doing that...

    • Agreed, this is just an attempt to greenwash the project. Mitigating climate impact is good, but not building new data centers at all is much better if you're looking at climate impact. I'd say M$ is looking more at how to make more $$ than anything else.
      The coolant loop will use ethylene glycol, which is a hydrocarbon and is one of the many petroleum products the petrochem industry produces. The main producers of glycol for are S.Korea, Saudi Arabia and Singapore. It's likely that the permanent closed loop

      • by ls671 ( 1122017 )

        Indeed:

        Data centers will still require fresh water for worker facilities like bathrooms and kitchens.

        If they were serious, they would recirculate workers' pee into drinking water like they do in space ships and nuclear submarines! /s

      • by jbengt ( 874751 )

        The coolant loop will use ethylene glycol, which is a hydrocarbon and is one of the many petroleum products the petrochem industry produces.

        Did you get that information from somewhere? There's a good chance that they would use propylene glycol, even though it is less efficient, since that is considered much less acutely toxic than ethylene glycol. In any case, using glycol in a closed loop is not a big impact, even if derived from petroleum; just think of all the propylene glycol-based de-icing and anti

      • Why would you run a chiller loop at a temperature where you need anti-freeze? That would be extremely inefficient, cold water is cold enough for a heatpump chiller. Though I doubt they are even using a heatpump, it's probably just air cooling, with a water loop to get increased surface area where needed.

        • by Lalo ( 10502809 )

          Why would you run a chiller loop at a temperature where you need anti-freeze? That would be extremely inefficient, cold water is cold enough for a heatpump chiller. Though I doubt they are even using a heatpump, it's probably just air cooling, with a water loop to get increased surface area where needed.

          Air-cooled chillers sit outside and are not always turned on. Depending on the design conditions (climate), you can either add glycol to the chilled water loop, or you can apply heat trace to the exposed piping. I'd wager that glycol is rare in a chilled water loop and that heat trace is the dominant solution.

          Water-cooled chillers will sit inside and require a secondary loop for their heat rejection. In the non-evaporative case (a la Microsoft's marketing here) you're using drycoolers. Again, not all drycoo

    • If we could get them to stop using fossil fuels and actually install renewables for their full needs then they would have a neutral or negative effect on heat emissions. Solar effectively reduces albedo by causing more energy to be reradiated into space instead of stored in the land, and wind energy is already dissipated as heat.

      • They are already doing this, and have committed to being carbon negative by 2030.

        It's pretty shocking that despite massive cloud growth they are emitting 6% less CO2 today than 2020 from direct (power) emissions.

        There has been a significant increase in indirect emissions, i.e what it takes to construct facilities. I don't know how you fix that.

        https://blogs.microsoft.com/on... [microsoft.com]

        • by haruchai ( 17472 )

          "There has been a significant increase in indirect emissions, i.e what it takes to construct facilities. I don't know how you fix that"
          Low carbon concrete & steel but no sure if that's any closer today than it was when I 1st heard about it 20 years ago

          • I've read about pilot projects for green concrete, but I can't recall how it worked. I think it was some catalyst that lowered the temperature requirement for calcination down to like 700C.

            Intuition suggests concentrated solar to get rid of the fuel requirement but the CO2 from the limestone has to go somewhere.

    • by Lalo ( 10502809 )

      Unless, of course, they normally use municipal drinking water for cooling. I'm pretty sure that's not the case though. Somebody please tell me they're not doing that...

      Most data centers that use evaporation as their final heat rejection to the environment use municipal water as the water source. Some use grey water - reclaimed water from water treatment plants - but that's not common because the parallel distribution network of piping from water treatment plants is not typically as widespread.

      Cooling data centers without evaporation does increase utility energy consumption, although the gap has tightened over time.

  • Waste heat recovery (Score:5, Interesting)

    by Rauser ( 631244 ) on Tuesday December 10, 2024 @09:58AM (#65003309)
    Current water-cooling typically dumps the waste heat either to the atmosphere (via evaporation of water) or directly into a body of water such as a lake, river, ocean or holding ponds. This is true of most large heat generating facilities such as power plants, as well as data center. Having a closed loop secondary cooling circuit opens up options to recover the heat for other uses like municipal heating or cogeneration of power. Whether Microsoft is interested or positioned to do this is another question.
    • by Rei ( 128717 )

      Temperatures are far too low for economically-realistic generation of power. So low that in most cases it's not even practical for municipal heating. Like a 5-10C rise is typical, maybe 10-20 in extreme cases. You're not boiling the water in contact with GPUs, let alone creating >200C pressurized steam.

      Anyway, the water issue is generally just ragebait for people who see "large numbers of litres / gallons per year" and don't realize how small those numbers are from the perspective of industrial or esp

      • But the ragebait exists, so...

        My town is getting an AWS data center that we don't want, and we weren't allowed to wash cars or water lawns for most of last summer due to water restrictions. Likely the zoning commission they bought will give them priority.

        • by Anonymous Coward

          NOBODY should be allowed to water lawns. It's a horrible waste of treated water.

          If you don't get enough rain for your lawn, rip that shit up and replace it, either with rocks and native plants that don't need to be watered or a vegetable garden that at least gives you some food.

          I guarantee lawn watering is a FAR bigger waste of water than data centers, and we definitely have too many data centers.

          • You guarantee it, eh? So you'll give me my money back? [usgs.gov] That's 2015 but the proportions haven't changed much: industrial and agricultural usage each heavily outweighs residential usage.
      • Temperatures are far too low for economically-realistic generation of power.

        You can use it for other things though. The old coal-fired power stations not too far from where I lived as a kid in the UK used to use their low-grade hot water to warm green houses that were used to grow vegtables in the winter.

        • by jbengt ( 874751 )

          The old coal-fired power stations not too far from where I lived as a kid in the UK used to use their low-grade hot water to warm green houses that were used to grow vegtables in the winter.

          The water used for heat rejection for A/C is usually only 95F (35C) on the warm side. That's not really warm enough to economically heat remote buildings.

          • by Rei ( 128717 )

            Yep. That's like the *outflow* temperature of our geothermal heating systems here.

            35C water isn't useless - you can heat soil or raise farmed fish with it - but it's not exactly a high demand product, and in general, it's just discharged.

      • by jbengt ( 874751 )

        Anyway, the water issue is generally just ragebait for people who see "large numbers of litres / gallons per year" and don't realize how small those numbers are from the perspective of industrial or especially agricultural usage

        The quoted 125,000,000 liters per year only amounts to around 63 gallons per minute. That's not a lot of cooling tower make-up water for a decent-sized data center. Assuming water circulating at 2 gpm per Ton and 3% of that as fresh water to replace evaporation and dilute suspende

  • by chipperdog ( 169552 ) on Tuesday December 10, 2024 @10:00AM (#65003311) Homepage
    Build in Northern MN instead of AZ and TX...9 months out of the year you have ample atmospheric cooling, and abundant cooling tower water supplies for the 3 months where you need them (not called the land of 10,000 lakes for nothing).
    • Exactly. This article is fake news. It states WI. Itâ(TM)s winter and cold now they just opened the windows for free cold air. So easy
      • by jbengt ( 874751 )

        Itâ(TM)s winter and cold now they just opened the windows for free cold air. So easy

        If you do that, you will get very low humidity and static problems. Data equipment isn't as sensitive to those conditions as it used to be, but still, it could be a problem. There are ways to add humidity without adding heat, but those bring their own problems with getting dust and minerals in the air from the dissolved solids in the humidifying water unless you use RO filtered / de-ionized water, which in turn bring

  • Microsoft spent more than $50 billion on capital expenditures in the fiscal year ended June 30, the vast majority related to data center construction fueled by demand for artificial intelligence services.

    Silly humans. Rather than wait to see if "artificial intelligence" provides us any benefit, we rush headlong towards building up the supply of it because it's in such demand. Who exactly is demanding it? The evangelists like Altman? The marketing droids that see the Altman and Altman-adjacent's words as the words of the new creators of gods? The people blindly feeding training data to these machines without realizing they are, in essence, training their replacements? Just where is this demand coming from?

    At

    • These new data centers and new servers aren't just sitting idle, people are in fact using them. I personally find uses for AI multiple times a day, and even my non-technical friends are getting into the game. The demand is absolutely there.

      So what about the environmental impact? Clearly, we need to work on that. But the answer isn't "don't use AI"...that's kind of like suggesting we "stop having babies" in order to avoid overpopulating the earth.

      There are some promising technologies that purportedly reduce

      • by haruchai ( 17472 )

        "Promising" mean nothing until the economics are more favorable

        • I'd say the economics are very favorable. The big AI giants aren't building new data centers and buying nuclear power plants to power them, for chump change. They would absolutely love to save the kind of money they would save with a 90-95% savings in processor usage.

  • Running water through a series of pipes that doesn't get any foreign substance (beyond small things that come walls of pipes over time) and then dumping it back into reservoir to cool off seems like a sensible thing to do.

    What is the benefit of separating cooling into two separate circuits, where primary is closed and secondary is the one doing the "cool the primary and dump" cycle? I understand the benefits if temperature gradients are significant, if water grade in primary is very specific and needs to be

    • Re:But why? (Score:4, Informative)

      by samwichse ( 1056268 ) on Tuesday December 10, 2024 @10:23AM (#65003363)

      Evaporative cooling. Especially in a dry climate, it's super effective and water has historically been cheap.

      • by Luckyo ( 1726890 )

        Sure, but you don't have to do it that way. You can just take on cold water from a reservoir, cycle it through the heat exchangers and dump it back into a different spot in the reservoir.

        And let environment cool the water in the reservoir. Power plants do this all the time. You consume very little water, you add very little if any pollution to this water and you get a lot of cooling efficiency.

        Maybe this is because data centers don't heat water enough for this to be efficient?

        • If anything it would be more efficient because you wouldn't have the heat gradient in the secondary loop reservoir you get from a high heat source like a power plant - you wouldn't have to deal with thermal issues like algae blooms and fish kills.

          • by Luckyo ( 1726890 )

            Probably poor phrasing on my part. I was directly addressing evaporative cooling as an alternative, not primary-secondary mode of operation common to most power plants.

            I.e. heating from data center may not be sufficient to create a high enough differential between temperatures to make cooling sufficiently effective. I don't know, this isn't really my field. I do know that there is remote heating done around google data center near the cost here that takes their waste heat into a heat exchanger to heat remot

        • by Lalo ( 10502809 )

          Sure, but you don't have to do it that way. You can just take on cold water from a reservoir, cycle it through the heat exchangers and dump it back into a different spot in the reservoir.

          There's no such thing as "just." In any case, restricting site selection to being near large water reservoirs would be a real problem.

    • Re: (Score:2, Informative)

      by Junta ( 36770 )

      They like to put crap into the cooling pipes, to make sure nothing lives/grows in the water and such. They also like being more picky about the starting water in the loops that endlessly recycle the same water. So it's standard practice to have separate cooling loops with a heat exchanger.

      Also, if you ran the secondary loop wide open, there's a chance of your pipes inside the more sensitive electronics going below the dew point, which would be a gigantic bad thing. Your secondary loop is generally in a si

      • by Luckyo ( 1726890 )

        Latter hasn't been a problem in many decades because of industrial automation. It has to fail in some spectacular fashion for that scenario to occur.

        As for the first scenario, it's a bit of a "maybe but not really". One of the things that hold undesired growth at bay is constant fresh flow of water itself. It's when you recycle the water when spores get to grow. It's why it's stagnant water that is full of various biological growths, while running water tends to be fairly clean of them.

        • by Pascoea ( 968200 )

          it's a bit of a "maybe but not really".

          It's really not, though. Water touching "sensitive" parts of cooling systems needs to be heavily treated. They have to remove sediments, biologicals, undesirable dissolved minerals, and anything else that will plug up small passages.

          It costs more to install a closed-loop primary, but when your circulating coolant is some engineered fluid it doesn't need all of that constant pre-treatment and you don't have to worry about periodically cleaning your entire cooling system as much. You only have a couple of

          • by Luckyo ( 1726890 )

            Sure, but all you do is move the problems to another heat exchanger.

            • by Pascoea ( 968200 )
              You are 100% correct. And I covered why that's likely an adventageous design in my initial post.
              • by Luckyo ( 1726890 )

                We're still living in 21st century with modern industry automation. Where a handful of sensors and a proper flow control design is going to be way cheaper than building a whole separate circuit with heat exchanger between the two.

                • by Junta ( 36770 )

                  He covered issues that automation doesn't even theoretically cover.

                  Think a pipe running between DIMMs in a server, very easy for stuff to foul the flow there or to otherwise degrade the tiny bit of piping with a tiny channel for flow. With the secondary exchanges, you can have nice big plumbing and not even have to think about the water messing things up over a very long period of time.

                  • by Luckyo ( 1726890 )

                    The answer seems to be "make the channel wider, redirect the flow, add channels as needed" and so on.

                    • by Junta ( 36770 )

                      You have other non-thermal considerations. In the DIMM/CPU scenario, you have trace lengths and board density to consider.

                      If you are talking about liquid cooling of things like CRAHs, sure you can be focused on the needs of the plumbing and demanding tolerances with the water. But nowadays we are running plumbing between DIMMs, around EDSFF drives, and so on.

                    • by Luckyo ( 1726890 )

                      You seem to have in depth knowledge here. Do you run separate circuits per cabinet, or row or something similar? Or is everything just a one giant circuit? Latter seems really unnecessarily complex and risky considering things like minor leaks, shutting racks and cabinets off for maintenance and upgrades and so on.

                • by Pascoea ( 968200 )

                  I don't mean to be a dick, but can you explain what, exactly, you are proposing automating?

                  The problem to solve is really is quite simple, on the face of it: Water has "stuff" in it. In order to use it for cooling you have to clean it. Otherwise all that stuff plugs passages, builds up on surfaces, or just tends to eat any metals it comes in contact with.

                  And, for something like a CPU/GPU-scale cooler, I'm not talking drinking-water clean. That likely still has too many dissolved minerals, and the chemis

                  • by Luckyo ( 1726890 )

                    Flow paths, thermal hotspots, etc. This used to be challenging still in early 2000s for low level heat exchangers, and was mostly a solved problem even for "inside the boiler" grade ones by around 2015 or so. Though admittedly not by all. Memory of a certain French solution on this one still makes me laugh my ass off. But that's beside the point.

                    We're not talking "single GPU cooler" level of cooling here. We're talking facility wide stuff. And that can have self cleaning through a combination of flow contro

                    • by Pascoea ( 968200 )

                      That all sounds well and good, and I'm sure it could be accomplished to a certain extent. At what cost though?

                      Now, in addition to installing a huge expensive water treatment plant, you're installing a far more complicated flow management system? And yes, you still need the giant water treatment plant. I've been inside a raw-water heat exchanger. You don't want any of what you find in there anywhere near a small heat exchanger. You still need to agressively treat every drop of water that enters the primary

                    • by Luckyo ( 1726890 )

                      I don't think I'm allowed to go into the details of how the system handled it, but the short answer is "no". Automation didn't reduce the problem, because just reducing the problem wouldn't be sufficient to meet contractual obligations. It was eliminated. This was novel technology for a specific rather rare kind of a fuel, and getting it to meet EU emission standards.

                      Regardless, the point is that you in fact can keep very high level of cleanliness with modern sensors, computers and control logic. Because yo

                    • by Pascoea ( 968200 )

                      I don't think I'm allowed to go into the details of how the system handled it, but the short answer is "no". Automation didn't reduce the problem, because just reducing the problem wouldn't be sufficient to meet contractual obligations. It was eliminated. This was novel technology for a specific rather rare kind of a fuel, and getting it to meet EU emission standards.

                      Obviously I'm in no position to disagree with you. I don't know you, what kind of work you do, how involved you were in the project, that sort of thing. And I respect that what you're talking about is proprietary, I've been there too. That being said, I've been around industrial systems (coal-fired power plants, primarily) enough to know that there is no such thing "it doesn't need to be manually cleaned, ever.". If you were to tell me you got it to a point that it maintained itself well enough that it di

                    • by Luckyo ( 1726890 )

                      I keep forgetting that "no maintenance" means something different in different fields. Yearly maintenance, which includes people going into the burner and seeing if something is amiss is still necessary, but that's true for all major burners.

                      Mea culpa, I should've been more specific.

        • by Junta ( 36770 )

          Latter hasn't been a problem in many decades because of industrial automation. It has to fail in some spectacular fashion for that scenario to occur.

          I don't understand your position here, it's still an issue if you had no liquid-liquid heat exchanger. A major tool in the industrial automation is the decoupling of liquids in that head exchanger.

          If your water supply is below the dewpoint, and you have plumbing that will be in sensitive electronics, then your only control is to not flow the water at all, which means you can't cool. The components that need the cooling will not heat the plumbing that leads up to it, and even if you could assume that, still

          • by Luckyo ( 1726890 )

            >If your water supply is below the dewpoint

            Are there major data centers that aren't fully climate controlled inside? Is there a reason not to do it considering issues like condensation that may appear on expensive electronics if you don't that would unrelated to cooling solution used (rainy weather on a warm day, going into rainy weather on a cold night)?

            • by Junta ( 36770 )

              There's a fair amount of relaxed air handling toward the end of nicer PUE numbers. Even with climate controlled datacenters, as an example, I see one right now with an air temp of 21.1C and humidity of 59.8% meaning a dewpoint of 13C. The incoming water to that exchanger is 8.5C.

              Even a fairly aggressive dehumidification would land at 10C dewpoint. Very tough to keep sub-10C incoming water from wetting the plumbing.

              Now in an ideal scenario, you plug in downstream of something that happens to be hot, or you

  • Time to research solid state cooling, something with a high COP, and can scale up past the usual water chilling plants, something that sends off heat via both conduction and moving air past the heat sink, as well as something that can do PV radiation to cool things off via another method. Air is nowhere as good as water for moving massive amounts of heat, but it may be something to do.

    Of course, this is still something with an ecological impact... warmer air on a large scale can mean rain drying up.

    • We have done a lot of research on solid state cooling, so far every approach requires spending a lot of energy as they all just have poor efficiency.

      We already have the tech to cool servers without water, it only makes them bigger. This does mean the building has to have more volume, of course, and has to be designed for airflow. But that was how we were cooling servers before they went to liquid, so it's not like we don't know how.

  • WUE (Score:5, Insightful)

    by lazarus ( 2879 ) on Tuesday December 10, 2024 @11:34AM (#65003583) Journal

    Two common measurable gauges of data center efficiency are WUE (Water Usage Effectiveness) and PUE (Power Usage Effectiveness). A typical hyperscale data center design PUE is something like 1.25 where an actual PUE when the DC is fully loaded with servers is 1.4. What this means is that the the total power consumed by the entire data center is 40% more than the servers themselves consume. Obviously 1.0 would be ideal but is unreachable.

    WUE is a bit different but the goal is the same: to measure how effective the data center is at using water. The calculation is Water Usage (L) / Energy Consumed (kWh). To get a data center built there is a lengthy and expensive permitting process and local municipalities want to know the effect that the facility will have on the local water supply (aquifers, municipal water, etc). So data center builders often use air cooled chillers and closed loop chilled water loops. These systems don't use any water for cooling. They aren't new. They work in almost any climate.

    I bring all this up because evaporative cooling is on the decline due to these concerns and Microsoft is already leasing data center space in Phoenix in data centers that do not use evaporative chillers (and has been for years). So I'm at a loss to explain why we have an article about them "investing in a new design" they are already using. This is likely just a feel-good article and isn't anything new.

    Also, for those folks saying "why not just build somewhere cold" etc. For plenty of workloads that is possible (like machine learning, REST-type services and anything that is transactional that way), but for others you still need to build close to the population centers you are serving because of latency. The perfect location for a data center is one where land is reasonably inexpensive, the power is reasonably cheap, and yet is still near large populations centers. It's not easy to find ideal locations and with the DC boom resulting from COVID and now machine learning it has become much more difficult.

    • Re:WUE (Score:4, Informative)

      by timeOday ( 582209 ) on Tuesday December 10, 2024 @03:38PM (#65004127)
      here [microsoft.com] is their actual press release. A quote:

      Replacement of evaporative systems with mechanical cooling will increase our power usage effectiveness (PUE). However, our latest chip-level cooling solutions will allow us to utilize warmer temperatures for cooling than previous generations of IT hardware, which enables us to mitigate the power use with high efficiency economizing chillers with elevated water temperatures The result is a nominal increase in our annual energy usage compared to our evaporative datacenter designs across the global fleet.

      This is a bit hard to parse, but it sounds like increasing WUE often comes at the cost of decreased PUE - it takes more pumps or compressors (or something) to transfer the heat to the atmosphere using dry coolers. But chip-level cooling helps them achieve good PUE and WUE simultaneously because the heat transfer from chip to coolant is more direct (not through air driven by fans) so the temperature delta between the chip and coolant doesn't need to be so large. So you don't need to cool down the coolant so much, which saves energy.

      Well, that's my best guess at what's being described.

      • Re:WUE (Score:5, Informative)

        by lazarus ( 2879 ) on Tuesday December 10, 2024 @04:33PM (#65004209) Journal

        Thanks for the link. You are exactly correct. As usual the media butchered it (in this case Bloomberg) -- the press release makes perfect sense.

        In a typical data centre the cooling cycle is: Chillers on the roof, which either use water or air-based chilling, cool a loop of water that runs to your server rooms. These rooms have devices called CRAHs (Computer Room Air Handlers) or FWUs (Fan Wall Units) that use the chilled water to blow cold air through the room. That air gets heated up by the equipment, it rises to the ceiling and is then sucked back out and into the chilled water loop, heating it up. That is then cooled back down by the chillers on the roof again. It's amazing that we can get a PUE of 1.25 to 1.4 out of such a system but it works pretty well.

        AI is driving much higher densities in the racks. A typical air-cooled rack is something like 8-12kW full of servers but can get as high as 20 or 30kW. To cool a rack that is pushing 80kW+ you need to use liquid cooling. Lots of techniques have been tried but the one the industry is settling on is direct-to-chip which uses a device called a CDU (Coolant Distribution Unit) to take the chilled water from the pipes that run to the CRAHs, and loop that out in smaller lines to the racks where it is distributed directly to cold plates on the CPUs and GPUs. This is almost exactly like what you would find on a higher-end gaming system.

        The wonderful thing about direct-to-chip cooling is that it is much more efficient than air cooling so your PUE goes down. The more your PUE goes down the more energy you can use to power servers and the less you need to use to cool equipment. With direct-to-chip efficiencies in cooling you can also have a higher chilled water loop temperature (because more cooling is getting directly to the equipment).

        So what Microsoft is saying in a nutshell is: "Hey, we're using less water because we're building more data centers with air chillers than evaporative water chillers, but because we're also deploying more direct-to-chip installations in those DCs, it's not increasing our power consumption too much".

        One last thought: You still have to have CRAHs or FWUs in a data hall because ancillary equipment still has to be cooled down, and humans have to work in them. So unfortunately we can't get rid of the necessity to cool down the air.

        • by jbengt ( 874751 )
          Already commented so I can't mod up you or your parent, but these posts are the most informative I've come across on this story.
      • by Lalo ( 10502809 )

        increasing WUE often comes at the cost of decreased PUE - it takes more pumps or compressors (or something) to transfer the heat to the atmosphere using dry coolers.

        The efficiency of a compressor is basically proportional to the square of the difference between its inlet and discharge temperatures (to highly simplify concepts of a refrigerant cycle).

        Let's say your chilled water temperature is 60F (evaporator side).

        Scenario 1, non-evaporative heat rejection: your condenser side is tied to the ambient dry bulb temperature + say 10F. So if it's 70F outside, then the condenser side is 80F.

        Scenario 2: evaporative heat rejection: your condenser side is tied to the ambient we

  • I guess they want to ignore the obscene amount of power this data center will use. All to generate mostly useless stuff.

  • Unless this makes it more efficient, it increases costs, which means one or more of lower quality, lower profits, higher prices.

  • Everything made perfect sense until this sentence: “Many of latest facilities are going up in hot, dry areas like Arizona and Texas”. What the actual fuck? Is a tax thing to build hot thirsty facilities in a desert?
  • That is where the true climate impact lies, oh and manufacturing all those components.

    - Every time you run an AI query, the planet dies a little more.

Do you suffer painful illumination? -- Isaac Newton, "Optics"

Working...