No Stupid Questions
No such thing. Ask away!
!nostupidquestions is a community dedicated to being helpful and answering each other's questions on various topics.
The rules for posting and commenting, besides the rules defined here for lemmy.world, are as follows:
Rule 1- All posts must be legitimate questions. All post titles must include a question.
All posts must be legitimate questions, and all post titles must include a question. Questions that are joke or trolling questions, memes, song lyrics as title, etc. are not allowed here. See Rule 6 for all exceptions.
Rule 2- Your question subject cannot be illegal or NSFW material.
Your question subject cannot be illegal or NSFW material. You will be warned first, banned second.
Rule 3- Do not seek mental, medical and professional help here.
Do not seek mental, medical and professional help here. Breaking this rule will not get you or your post removed, but it will put you at risk, and possibly in danger.
Rule 4- No self promotion or upvote-farming of any kind.
That's it.
Rule 5- No baiting or sealioning or promoting an agenda.
Questions which, instead of being of an innocuous nature, are specifically intended (based on reports and in the opinion of our crack moderation team) to bait users into ideological wars on charged political topics will be removed and the authors warned - or banned - depending on severity.
Rule 6- Regarding META posts and joke questions.
Provided it is about the community itself, you may post non-question posts using the [META] tag on your post title.
On Fridays, you are allowed to post meme and troll questions, on the condition that they're in text format only and conform with our other rules. These posts MUST include the [NSQ Friday] tag in their title.
If you post a serious question on Friday and are looking only for legitimate answers, then please include the [Serious] tag on your post. Irrelevant replies will then be removed by moderators.
Rule 7- You can't intentionally annoy, mock, or harass other members.
If you intentionally annoy, mock, harass, or discriminate against any individual member, you will be removed.
Likewise, if you are a member, sympathiser, or supporter of a movement that is known to largely hate, mock, discriminate against, and/or want to take the lives of a group of people, and you were provably vocal about your hate, then you will be banned on sight.
Rule 8- All comments should try to stay relevant to their parent content.
Rule 9- Reposts from other platforms are not allowed.
Let everyone have their own content.
Rule 10- The majority of bots aren't allowed to participate here. This includes using AI responses and summaries.
Credits
Our breathtaking icon was bestowed upon us by @Cevilia!
The greatest banner of all time: by @TheOneWithTheHair!
Other commenters correctly describe the cost analysis for using evaporative cooling, but I'll add one more reason why it's the preferred method when water is available: evaporating water can dissipate truly outlandish amounts of heat with very few moving parts.
Harkening back to high school physics class, water -- like all other substances -- has a specific heat capacity: the energy needed to increase the temperature of 1 kg of the substance by 1 degree C. The specific heat of water is already quite high, at 4184 J/(kg*C), besting all the common metals and only losing to the likes of lithium, hydrogen, and ammonia. In nature, this means that large bodies of water are natural moderators of temperature, because water can absorb an entire day's worth of sunlight energy without its temperature changing substantially.
But where water really trounces the competition is its "heat of vaporization". This is the extra energy needed for liquid water to become vapor; simply bringing water to 100 C is not sufficient to make it airborne. For water, this value is 2146 kJ/kg. Simplifying so that 1 kg of water is 1 liter of water, we can convert this into something more familiar: 0.596 kWh/L.
What these two physical properties of water tell us is that if our city water comes out of the pipe at 20 C, then to get it to 100 C to boil, we need the difference (80 C) times the specific heat (4184 J/(kg*C)), which is 334,720 J/kg. Using the same simplification from earlier, that comes out to 0.093 kWh/L. And then to actually make the boiling liquid become a vapor (so that it'll float away), we need another 0.596 kWh/L on top of that.
Let that sink in for a moment: the energy to turn water into vapor (0.596 kWh/L) is over six times higher than the energy (0.093 kWh/L) to raise liquid water from 20 C to 100 C. That's truly incredible for a non-toxic, life-compatible substance that we can (but should we?) safely dump into the environment. Totaling the two values, one liter of water can dissipate 0.69 kWh of energy. Nice!
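The per-liter arithmetic above can be checked in a few lines (a sketch using the figures quoted in this comment, including the 1-kg-per-liter simplification):

```python
# Energy needed to boil away one liter of 20 C tap water,
# using the figures from the comment above (1 kg of water ~= 1 liter).
SPECIFIC_HEAT = 4184           # J/(kg*C), liquid water
HEAT_OF_VAPORIZATION = 2146e3  # J/kg, figure used in this comment
J_PER_KWH = 3.6e6              # 1 kWh = 3.6 million joules

sensible = (100 - 20) * SPECIFIC_HEAT / J_PER_KWH  # heat 20 C -> 100 C
latent = HEAT_OF_VAPORIZATION / J_PER_KWH          # 100 C liquid -> vapor

print(f"{sensible:.3f} kWh/L to reach boiling")  # ~0.093
print(f"{latent:.3f} kWh/L to vaporize")         # ~0.596
print(f"{sensible + latent:.2f} kWh/L total")    # ~0.69
```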
In the context of a 100 megawatt data center (which is apparently the smallest thing the industry considers a "hyperscale data center"), if that facility used only evaporative cooling, the water requirement would be 144,927 L/hour. That is an Olympic-size swimming pool every 6.9 ~~seconds~~ hours. Not nice!
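Scaled to the whole facility, the same numbers give the flow rate. This is a sketch: the 0.69 kWh/L comes from summing the sensible and latent heat per liter, and the round one-million-liter pool volume is the figure implied by the comment's own 6.9-hour arithmetic.

```python
# Water an evaporative-only 100 MW facility would boil off,
# at ~0.69 kWh dissipated per liter (from the comment above).
FACILITY_KW = 100_000    # 100 MW expressed in kW
KWH_PER_LITER = 0.69     # sensible + latent heat per liter
POOL_LITERS = 1_000_000  # assumed pool volume implied by the 6.9 h figure

liters_per_hour = FACILITY_KW / KWH_PER_LITER   # ~145,000 L/h
hours_per_pool = POOL_LITERS / liters_per_hour  # ~6.9 h

print(f"{liters_per_hour:,.0f} L/hour")
print(f"one pool every {hours_per_pool:.1f} hours")
```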
And AI datacenters are only getting larger, with some reaching into the low single digits of gigawatts. But what is the alternative for cooling the more modest data center from earlier? The reality is that the universe provides only three forms of heat transfer: conduction, convection, and radiation. The heat from data centers cannot be concentrated into a laser and radiated into space, and we don't have some sort of underground granite mountain for the data centers to conduct their heat into. Convection is precisely the idea of storing the heat in a substance (e.g., water or air) and then jettisoning the substance.
So if we don't want to use water, then we have to use air. But on the two qualities that make water an excellent substance for evaporative cooling, air doesn't come close: a specific heat of about 1003 J/(kg*C), and no heat of vaporization to exploit, because air is already a gas. That means we need to move ungodly amounts of air to dissipate 100 megawatts. But humanity has already invented the means to do this: a clever structure that naturally encourages air to flow through it.
The only caveat is that the clever structure is a cooling tower, most famously characteristic of nuclear power stations. It's also used for non-nuclear power station cooling, but it's best known in the nuclear context, where generators are well into the gigawatt range. Should AI datacenters use nuclear-sized air cooling towers instead of water evaporation? It would work, but even as someone who's not anti-nuclear, I suspect the optics of raising a cooling tower in rural America just to cool a datacenter would be untenable. And that's probably why no AI datacenter has done it.
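To put "ungodly amounts of air" in numbers, here's a sketch. The 1003 J/(kg*C) figure is from the comment above; the 15 C exhaust temperature rise and 1.2 kg/m^3 air density are my own assumptions for illustration.

```python
# Airflow needed to carry away 100 MW by heating air alone:
# Q = mass_flow * specific_heat * delta_T, solved for mass_flow.
HEAT_LOAD_W = 100e6      # 100 MW of waste heat
AIR_SPECIFIC_HEAT = 1003 # J/(kg*C), from the comment above
DELTA_T = 15             # C, assumed temperature rise of the exhaust air
AIR_DENSITY = 1.2        # kg/m^3, assumed near sea level

mass_flow = HEAT_LOAD_W / (AIR_SPECIFIC_HEAT * DELTA_T)  # kg/s
volume_flow = mass_flow / AIR_DENSITY                    # m^3/s

print(f"{mass_flow:,.0f} kg/s of air")    # ~6,600 kg/s
print(f"{volume_flow:,.0f} m^3/s of air") # ~5,500 m^3/s
```

Several cubic kilometers of air per day: hence the tower.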
To be abundantly clear, I'd rather not have AI datacenters at all. But since the question was why water consumption is such a big deal, it might be best to say that it's a physics problem: there isn't any other readily-available way to provide cooling for 100+ megawatts, without building a 100+ meter tower. Water is always going to be cheaper and more on-hand than concrete.
Followup: what are the impediments to using, say, seawater instead?
People mentioned corrosion, which is true of all seawater systems, but in evaporative systems you also have salt forming on all the evaporative surfaces, which can drastically increase corrosion beyond what normal seawater causes, and leads to fouling.
So to do this properly, you would want an RO (reverse osmosis) system making freshwater upstream of the cooler, at which point it would make more sense to just have a separate company doing desalination.
Salt water is a huge pain to work with. The salt would quickly corrode any cooling systems.
And even for fresh water, you have biofouling to worry about, as well as what to do with the water after you've used it; you can't just dump it into the environment untreated.
There are already heat-exchanging systems that do this with brackish water; you don't need to treat water if all you are doing is making it hotter or colder.
While not strictly biofouling, the marine environment can definitely be affected by introducing hotter water where it didn't exist prior, in and around the outflow pipe. Seaside nuclear power stations that use seawater cooling need to be mindful to diffuse the heated water over a large area, to minimize the ecological impact. Citation: https://ui.adsabs.harvard.edu/abs/2025EcInd.17012986J/abstract
I agree that pumping in water at a different temperature can affect the environment. It is just that a lot of people tend to conflate the effluent coming from plants like this as something which needs chemical or other treatment when the issue is thermal only.
Very similar problems arise with desalination plants, which I wrote about here: https://sh.itjust.works/comment/14613302
OK, follow-up question here. Is there cause for concern that releasing tons and tons of steam into the environment, where it wasn't before, will cause other environmental impacts beyond just the reduced water supply? Like, if the ambient air is cooling all that water back into rain or something, will that tangibly impact temperatures, or will average humidity change? Or is that part at least too small an impact to be particularly material?
Not meaningfully, no. In the middle of a dry desert far from other bodies of water you could theoretically form cumulus clouds downwind of your site (I have heard of this happening), but it would be teeny tiny.
The amount of water evaporation is just orders of magnitude too small. The earth receives about 1 kW of solar energy per square meter, so a 9 GW data center produces approximately the same waste heat as 9 million square meters of sunshine, which is 900 hectares.
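As a quick check on that comparison (a sketch using the round 1 kW/m^2 surface-insolation figure from the comment above):

```python
# Waste heat of a 9 GW data center expressed as an equivalent sunlit area.
SOLAR_FLUX_W_M2 = 1000  # ~1 kW of sunlight per square meter at the surface
DATA_CENTER_W = 9e9     # 9 GW of waste heat

area_m2 = DATA_CENTER_W / SOLAR_FLUX_W_M2  # 9 million m^2
hectares = area_m2 / 10_000                # 1 hectare = 10,000 m^2

print(f"{hectares:,.0f} hectares of equivalent sunshine")  # 900
```

A few square kilometers of extra "sunshine" is negligible at climate scale, which is the commenter's point.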
There is almost certainly an impact somewhere, but I don't have the data to know where it is. My conjecture is that a localized mass of steam would cause convection currents and drive microweather phenomena, especially downwind of such an air cooled facility. I'm not sure rain is necessarily the result, unless there's a sizable mountain downwind, since although hot air will rise, it might run out of steam (pun intended) before cooling down enough to fully condense out. So it might just be adding a layer of humidity that floats a few hundred meters above the surface.
But even that could be devastating, if said layer blocks natural convection currents over a downwind town or city. It could act as a thermal cap, making that town warmer at night, because heat rising from the city would meet that humid layer and get absorbed by the water. The thermal capacity of water comes into play again, but this time against the city.
Heat energy is a driver for cyclones, such as when the warm waters of the Caribbean feed energy into moist air as it approaches the southern USA, and only after making landfall does the storm start to slow down, due to drag and the loss of its energy source. I doubt we'll ever have an AI-induced hurricane, but in a situation where there's already an energetic weather event, it cannot possibly help to be adding heat to that situation.
I defer to the meteorologists to say what happens to the local weather and climate, and biologists on what happens to humans and wildlife. But I can't see it being good, no.
I doubt that it has a meaningful impact on climate. Evaporation from plants and oceans is many orders of magnitude greater. The issue is pretty much always about fresh water availability in the given region.
You mean 6.9 hours? You're definitely off by a few orders of magnitude there.
Darn, you're right, the hours fell off in my dimensional analysis. Corrected, although 6.9 hours for a pool isn't much time for swimming at all.
So is air cooling actually feasible but we don't do it cause it would make data centers look like nuclear reactors? Or is it just not feasible?
AFAIK it's feasible for most data centers, except where power density is so huge that you just can't do it with air cooling. That issue is most common in large-scale AI data centers.
A modern CPU consumes ~150 W; a modern AI chip can eat 700 W, and they're packed as densely as possible, with multiple cards slotted into every motherboard.
Air cooling is feasible, as evidenced by existing power stations that use it. A lot of newer nuclear generation uses water cooling, being sited along the ocean and operating in the multi-gigawatt range. But we can also find examples of inland power stations that have no water connection and therefore need some massive cooling towers. Here is one in Germany with a 2.2 GW rating and a 200-meter-tall tower: https://en.wikipedia.org/wiki/Niederaussem_Power_Station
This is, as you can imagine, rather expensive to build, but it's doable. Cooling a coal fire is not substantially different from cooling compute loads in a data center, as it's all just a matter of moving heat around. Will there be differences due to the base temperature of coal versus GPUs? Yes, since the ratio of source to ambient temperature matters. But on the flip side, this should make the data center version easier to construct, as the plumbing for lower temperatures is simpler.
Mechanical engineers can chime in on feasibility for AI data centers, but seeing as it hasn't been done, it's probably still cost related.