* WTTW…
Illinois and Ohio rank fourth and fifth in the nation, behind Virginia, Texas and California, in the number of data centers they house, whether those centers are used for crypto mining, cloud computing or generative artificial intelligence.
In the past year or two, as the AI arms race has heated up, the size and scope of these centers have mushroomed, as has the pace at which these behemoths are being built, [Helena Volzer, a water policy expert at the advocacy organization Alliance for the Great Lakes] said.
A single hyperscale center of the sort operated by tech giants such as Meta or Microsoft — 10,000 square feet or more, with 5,000-plus servers — can consume 1 million to 5 million gallons of water each day. At the low end, that’s 365 million gallons of water a year, Volzer said, or as much as the combined annual use of 12,000 Americans.
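For anyone who wants to check Volzer’s arithmetic, here is a minimal back-of-envelope sketch. The 83-gallon daily per-person figure is an assumption on my part (common U.S. residential estimates run roughly 80 to 100 gallons), not a number from the article:

```python
# Back-of-envelope check of the WTTW figures above.
daily_draw = 1_000_000            # low end of the cited 1M-5M gallons/day range
annual_draw = daily_draw * 365    # 365,000,000 gallons/year, matching Volzer

# Assumed figure, not from the article: typical U.S. residential use
# is commonly estimated at roughly 80-100 gallons per person per day.
per_person_annual = 83 * 365      # ~30,295 gallons per person per year

print(f"Annual draw: {annual_draw:,} gallons")
print(f"People-equivalents: {annual_draw / per_person_annual:,.0f}")  # ~12,000
```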
Not a single Great Lakes state currently has water management mechanisms in place to curb over-extraction, or what could be termed “de-watering,” before it happens, she said. The first step could be revising state groundwater management laws.
* Sun-Times…
Large data centers, many devoted to researching artificial intelligence, are expected to use more than 150 billion gallons of water across the U.S. over the next five years, according to the advocacy organization Alliance for the Great Lakes.
That’s enough water to supply 4.6 million homes.
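Dividing one Alliance figure by the other shows what the comparison implies per household; a minimal sketch, where the per-home rate is derived rather than quoted:

```python
total_gallons = 150e9   # projected five-year U.S. data center water use
homes = 4.6e6           # homes the Alliance says that much water could supply

per_home = total_gallons / homes
print(f"{per_home:,.0f} gallons per home")            # ~32,600
print(f"{per_home / 365:,.1f} gallons per home/day")  # ~89
# ~89 gallons a day is in line with common U.S. residential estimates,
# which suggests the comparison is roughly one year of home water use.
```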
The data centers, which also use large amounts of power, need water for cooling, and because of the scale of the operations — sometimes more than 10,000 square feet — an enormous amount is needed at each site.
But in almost all instances, the amount of water that’s being withdrawn for a single data center development is unknown because secrecy agreements between government bodies and companies keep this information from being publicly disclosed, according to Helena Volzer, water policy expert with the Chicago-based group.
* Inside Climate News…
Non-disclosure agreements that companies ask municipalities to sign when they propose a data center further obscure how much water is needed and where it would come from, making it difficult to determine whether municipalities have enough supply, said Volzer, with Alliance for the Great Lakes.
To help combat that, some states in the region, like Ohio and Indiana, are now conducting regional water-demand studies, which would help communities determine where water is available before approving a data center. Some water managers in Illinois are also conducting those studies, but they are not required.
A bill proposed in February by Illinois state Sen. Steve Stadelman would have required data centers to disclose how much electricity and water they use, but lawmakers failed to vote on it before the legislative session ended May 31. […]
Ordinances in other Great Lakes states could serve as models for how to regulate water diverted to data centers, she added. In Michigan, for example, companies proposing data centers must show that there is enough existing water supply to support the facility in order to get the state tax incentive.
…Adding… More about how data centers use water from Bloomberg…
Many data centers rely on evaporative cooling, or “swamp cooling,” where warm air is drawn through wet pads. Data centers typically evaporate about 80% of the water they draw, discharging 20% back to a wastewater treatment facility, according to Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside. Residential water usage, by comparison, loses just 10% to evaporation, discharging the other 90%, Ren said. (A spokesperson for Google said the company doesn’t have a standard percentage because any data center would see some variation based on factors like location, temperature and humidity.) […]
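To make the consumptive-use distinction concrete, here is a simple sketch using the splits Ren cites; the 1-million-gallon daily draw is an assumed example figure, not one from the article:

```python
daily_draw = 1_000_000  # assumed example draw, in gallons/day

# Ren's cited split for a typical evaporatively cooled data center:
evaporated = daily_draw * 0.80   # consumed: lost to the atmosphere
discharged = daily_draw * 0.20   # returned to a wastewater treatment facility

print(f"Evaporated (consumed): {evaporated:,.0f} gallons/day")  # 800,000
print(f"Discharged (returned): {discharged:,.0f} gallons/day")  # 200,000

# Residential comparison, per Ren: only ~10% of residential water
# evaporates; the other ~90% goes back to treatment for reuse.
print(f"Residential consumption at the same draw: {daily_draw * 0.10:,.0f}")
```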
Microsoft recently said it has developed a closed data center design in which water doesn’t evaporate but is instead continuously circulated between servers and chillers, with no need for refilling. The design will be deployed first at facilities in Wisconsin and Arizona planned for 2026.
Crusoe Energy Systems, a developer behind OpenAI’s Stargate site in Abilene, also plans to use closed-loop cooling systems. But here, too, “there is a tradeoff in energy,” said Ben Kortlang, a partner at G2 Venture Partners, an investor in Crusoe. These systems are more power-hungry than evaporative methods, he said.
* “OpenAI CEO Sam Altman Concedes GPT-5 Was a Misfire, Bets on GPT-6”…
One lesson from GPT-5’s launch is that people form emotional ties with AI, he noted. Some users described the new model as colder, more mechanical, and less supportive than its predecessor. After GPT-4o was deprecated, some Reddit users even said the upgrade “killed” their AI companions.
Despite the outcry on subreddits like r/MyBoyfriendisAI, r/AISoulmates, and r/AIRelationships, Altman estimated that fewer than 1% of ChatGPT users have “unhealthy relationships” with the bot but said the company is paying close attention. […]
While GPT-5 is still rolling out, Altman said that OpenAI is already looking ahead, noting the timeline between GPT-5 and 6 would be much shorter than the one between GPT-4 and 5. However, Altman said GPU capacity may affect that timeline.
“We have better models, and we just can’t offer them because we don’t have the capacity,” Altman admitted, citing a shortage of GPUs, the powerful chips needed to run large AI systems. To solve that, Altman said OpenAI would need to spend “trillions of dollars on data center construction in the not very distant future.”