There’s hope for even the most ardent of climate change observers in the notion that if human innovation and technology got us into this mess, they can get us out of it. And few potential tools on the table are triggering quite the same blend of optimism and concern, promise and confusion, as artificial intelligence.
AI, which involves sophisticated computer systems that can mimic some aspects of human cognition, has vast potential to help humans combat climate change and be better prepared to deal with its effects. Experts are working on ways to use machine learning to help us use resources more efficiently, for example, and to more accurately predict increasingly common extreme weather events.
But before AI can be put to use in those ways, technology companies need to put the programs through intense training sessions and build or expand warehouse-scale data centers to support these systems. It takes lots of water and energy to keep densely packed computer servers for these systems cool and running smoothly.
A new study out of UC Riverside suggests technology companies, so far, aren’t doing enough to ensure the growing environmental impacts of AI are distributed equitably.
Instead, the study says, it appears tech companies are repeating some of the patterns that played out over the last century with fossil fuel companies and so many other industries: They’re opting to save money by making communities that already have strained resources and other added burdens also bear the brunt of environmental impacts associated with AI.
“The current way that we distribute AI computing based on cost is clearly disproportionately affecting certain regions, which are already stressed with resources like water or carbon problems,” said Shaolei Ren, a computer engineering professor at UC Riverside who helped author the paper.
“If AI uses resources in a wasteful manner, then that will reduce the net benefit of AI.”
As news came from the White House on Friday about top tech companies agreeing to voluntary safeguards on AI, organizations around the world — from the United Nations to the AI Now Institute — are calling for policies that also prioritize developing AI in an environmentally sustainable and equitable way.
Case study: Phoenix
An example of what can happen when environmental equity isn’t factored into the AI equation is unfolding a few hundred miles east of Ren’s university.
Phoenix and its neighboring communities have become go-to destinations for technology companies to build data centers. Land and electricity are cheaper there than in many other areas, Ren noted, while Arizona also offers attractive tax incentives for businesses.
Google broke ground earlier this month on a $1 billion data center in Mesa, just outside Phoenix. The campus, which will eventually cover 750,000 square feet, will help power Google’s existing tools and “ongoing artificial intelligence innovation,” the company said in a statement.
Microsoft opened one data center near Phoenix in 2021 and continues adding to the complex. Meta, the parent of Facebook, is building out a data center there, too, while a slew of other technology companies are also getting in on the trend.
AI requires mind-boggling levels of rapid computation, particularly in the training phase. That level of computation means big energy demands. And since electricity in much of the country still is generated by coal and other fossil fuels, that new AI is triggering substantial carbon emissions and adding to stresses on the nation’s energy grids.
Training GPT-3, a cousin to the better-known AI system ChatGPT, consumed more than 1,000 megawatt hours of electricity, Ren said. That’s the same amount of energy it takes to run more than 100 typical households for a year.
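The arithmetic behind that comparison is simple to check. The household figure below is an assumption (roughly 10 megawatt hours per year for a typical U.S. home), not a number from the article:

```python
# Back-of-the-envelope check of the training-energy comparison.
# The 1,000 MWh figure is from Ren; the per-household figure is an
# assumed rough U.S. average (~10 MWh per year), not from the article.
TRAINING_MWH = 1_000
HOUSEHOLD_MWH_PER_YEAR = 10

households = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
print(f"Equivalent to about {households:.0f} households for a year")
```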
These data centers also need reliable cooling systems to prevent their rows of servers from becoming overheated. Companies typically use liquid cooling systems, where water is pumped through a closed loop to draw off the heat and keep things running smoothly.
Citing a commitment to transparency, Google released a report last fall that showed its global data centers used more than 4.3 billion gallons of water in 2021 — which, the company pointed out, is roughly equivalent to the water needed to irrigate and maintain 29 golf courses in the Southwest. Ren said the company’s center in The Dalles, Oregon, accounted for roughly a third of the entire city’s annual water consumption.
Meta’s voluntary report showed 1.3 billion gallons used to cool its 17 data centers that same year.
Ren said Microsoft’s U.S. data centers, where GPT-3 was trained, are among the most state-of-the-art facilities available, and even there the training still required 700,000 liters of water. That’s as much water as 2,000-plus people use on average each day. That figure also doesn’t include the water needed to generate the electricity powering the data centers, Ren notes, since coal, nuclear and other types of power plants require lots of water to operate.
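The same kind of check works for the water comparison. The per-person number below is an assumed daily average of about 350 liters, not a figure from the article:

```python
# Rough check of the cooling-water comparison. The 700,000 L figure is
# from Ren; the per-capita figure is an assumed U.S. daily average
# (~350 L per person), not from the article.
TRAINING_WATER_L = 700_000
DAILY_USE_PER_PERSON_L = 350

people = TRAINING_WATER_L / DAILY_USE_PER_PERSON_L
print(f"One day's water use for about {people:.0f} people")
```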
Companies like Intel, which make the chips at the core of these systems, also have Phoenix-based manufacturing, a process that Ren said is likewise very water and energy intensive.
But Phoenix is feeling the strain of global warming like few other places in the United States. The region has been experiencing particularly severe water shortages for years due to ongoing droughts that threaten to cut off supplies from the Colorado River and other sources. Phoenix also is in the middle of a record-setting heat wave, with temperatures peaking at or above 110 degrees every day this month.
Such conditions have some residents and political leaders pushing back on the flood of tech companies looking to build data centers in their backyard.
Water worries, for example, have delayed Microsoft’s plans to build out its data center campus. And early this summer, Arizona Gov. Katie Hobbs spelled out plans to limit new construction around Phoenix after a study by state officials found there isn’t enough groundwater left to meet projected demand over the next century.
Sorting through ‘solutions’
Ren said that at some point, caps on water usage and rising water costs might make places like Phoenix less attractive sites for data centers. But judging by the number of projects now in the works, that day clearly isn’t today.
Instead, technology companies have so far largely responded to resource pressures there and elsewhere by pledging to draw power from renewable sources such as solar projects, to pay for water restoration projects elsewhere, and to pivot to air cooling as the dominant system for keeping their servers from overheating.
Air cooling systems only work if the outside temperature is 85 degrees or less, though, Ren pointed out. The low in Phoenix for the next couple of weeks isn’t projected to fall below 87 degrees.
Also, air cooling systems require about 10% more energy, per Google. Unless companies take steps to ensure they’re getting power from renewable sources, Ren said any switch to air cooling systems will decrease their water footprint onsite while actually increasing both their offsite water footprint and their carbon emissions.
So what can technology companies do to reduce environmental impacts of AI and ensure those effects are more equitably distributed?
First, Ren said, they could choose to build data centers in places where water and heat waves aren’t such serious concerns. They also can commit to building the most advanced systems possible, Ren said, which are tied to renewable energy projects and include mechanical upgrades that allow water to be reused for cooling a couple of times before it’s discharged.
Both of those moves would likely increase data center construction costs. That’s where tax incentives and grants can come in: coupled with the goodwill companies earn from consumers and the lower financial risks they face from climate change, such support can help offset the price differences.
On the flip side, parts of Europe are starting to make companies factor climate costs into their projects through mechanisms such as carbon taxes. That can push them to lower the impacts of their projects, move them elsewhere, or absorb the higher price tag, with that money going to climate-fighting programs.
Such strategies might help when it comes to future data center projects. But Ren and his co-writers argue there are operational changes companies can make at existing centers to reduce environmental impacts and distribute them in smarter ways.
“Because those large companies have lots of data centers throughout the world, they can actually shift the workloads, or shift the AI computing, from one data center to another without it being noticed, without affecting the user experience at all,” Ren said. “And so there’s a clear decision of how do we distribute this AI computing across different data centers so that the environmental costs will be more equitably distributed across different regions?”
The same AI these systems are powering, for example, could be used to figure out which data centers can be operated with the smallest carbon and water footprints at any given time and to shift workloads to those places.
Right now, Ren said, systems are optimized to shift workloads every few minutes to wherever electricity is cheapest. But if they were trained to incorporate climate data into that calculation, they could, say, shift AI training programs from a facility in Phoenix during the summer to one in Washington. Or they could migrate daily workloads from a center in Virginia that’s largely powered by coal to one in Texas that relies largely on solar energy.
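The climate-aware dispatch Ren describes can be sketched as a weighted site-selection problem. Everything below is hypothetical: the site names, footprint numbers, and weights are illustrative placeholders, and a real scheduler would also weigh latency, capacity, and data-residency constraints:

```python
# Hypothetical sketch of carbon- and water-aware workload placement.
# Site figures and weights are illustrative, not real measurements.
SITES = {
    "phoenix":  {"price_usd_kwh": 0.04, "co2_g_kwh": 450, "water_l_kwh": 3.5},
    "oregon":   {"price_usd_kwh": 0.05, "co2_g_kwh": 120, "water_l_kwh": 1.8},
    "virginia": {"price_usd_kwh": 0.06, "co2_g_kwh": 380, "water_l_kwh": 1.2},
}

def score(site, w_price=1.0, w_co2=0.002, w_water=0.1):
    """Lower is better: blends electricity price with carbon and water costs."""
    return (w_price * site["price_usd_kwh"]
            + w_co2 * site["co2_g_kwh"]
            + w_water * site["water_l_kwh"])

def pick_site(sites=SITES):
    """Pick the site with the lowest blended cost/environmental score."""
    return min(sites, key=lambda name: score(sites[name]))

print(pick_site())
```

With these illustrative numbers the blended score favors "oregon", while setting the climate weights to zero (w_co2=0, w_water=0) reduces the same function to today’s cheapest-electricity dispatch, which would pick "phoenix".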
A few companies, including Google and Microsoft, are trying to schedule their workloads based on the real-time availability of renewable energy, Ren said. But so far, those plans are still experimental and not standard practice.
When asked about how climate issues play into their decisions about where to build data centers and the steps they’re taking to make sure advancements related to AI won’t create new environmental injustices, a Google spokesperson deferred to the company’s fall statement touting a “climate-conscious approach” that includes using recycled water for cooling whenever possible. Meta didn’t respond by deadline. And a Microsoft spokesperson said, “We don’t have anything to share at this time.”
Despite his concerns about how companies are building out AI systems today, Ren said he remains optimistic about the future of machine learning as a tool to help fight climate change.
His own team at UC Riverside is using AI to schedule energy-intensive AI development work for times when the carbon footprint will be smallest. He also pointed to machine learning’s ability to, say, help a farmer use far less water by incorporating advanced weather forecasts, or to manage air conditioning systems in buildings more efficiently.
“AI’s usage is a concern,” Ren said. “But it has greater potential to reduce the other sectors’ environmental costs.”