It’s hard to believe that ChatGPT, the hugely popular AI chatbot, was launched less than three years ago. In a startlingly short span of time, OpenAI’s DALL-E, Google’s Gemini, Anthropic’s Claude, and other AI tools have spurred new and pressing conversations about the role of technology in work, education, and daily life. Workers worry that AI will displace certain jobs, even as companies begin hiring for AI-assisted positions. High school and college students now regularly use ChatGPT to take notes, complete problem sets, and write essays. The technology has even sparked debate in the realm of art, as painters and writers contemplate what instant content generation means for their livelihoods.
Not everyone is happy about these developments. According to the law firm Baker & Hostetler, at least 12 U.S. lawsuits have been filed against AI firms for copyright infringement, as plaintiffs from the New York Times to Getty Images claim that large models trained on text and images from the Internet violate federal law. Others raise concerns about misinformation: AI tools are not designed to produce fact-checked, accurate content, and can even fabricate information out of whole cloth.
Climate advocates, meanwhile, have sounded the alarm on AI’s unknown and growing environmental impacts. Utilities from Georgia to Texas are planning to build new natural gas power plants and keep aging coal plants online to feed energy demand from data centers running AI models, threatening state climate targets. New infrastructure could strain water resources in drought-prone areas as banks of servers gobble more and more water for cooling. And while AI as a tool can serve environmental goals, such as optimizing solar power production and supercharging climate modeling, it can also aid oil and gas companies in fossil fuel extraction and production.
Clean energy experts and lawmakers have called for a range of policies to address these impacts, including requiring greater transparency from companies and even setting a moratorium on new data centers. Some researchers have said that much like the general hype around AI, the technology’s environmental implications may be overblown. In a recent report, analysts at the Bipartisan Policy Center wrote that factors like manufacturing capacity limits and the sustainability of generative AI business models could ultimately constrain future growth in the sector. In the near term, however, a major uptick in AI data centers seems certain. The Trump administration has made AI development a key policy priority, and tech companies have pledged hundreds of billions of dollars in AI investments over the next four years.
Yet the public remains largely in the dark when it comes to understanding the consequences of this boom. In the United States, tech companies aren’t required to disclose how much energy and water data centers and the AI models they run actually use, making the total environmental footprint of these new systems difficult to pin down. That has left advocates, academics, and regulators alike scrambling to understand this new technology — and figure out how to minimize its harms on the environment and communities.
Despite its lofty name, there’s no consensus on what the term “artificial intelligence” actually means. In February 2023, an attorney at the Federal Trade Commission wrote that AI is essentially “a marketing term” with “many possible definitions.” Amba Kak, the executive director of the AI Now Institute, describes AI as algorithms that process large amounts of data, such as text and images, to generate predictions, scores, or rankings.
Algorithms vary in size and computational power, and it’s the largest models, like ChatGPT, that have gained the most attention for gobbling up large amounts of power. Chatbots are a type of AI called large language models, which are trained on vast amounts of text data and predict the next word in a sequence based on the input they’ve received. Other such generative models that produce images or software code work similarly.
Developing such a complex model involves huge power inputs. The research institute Epoch AI estimated that one training run for the latest version of ChatGPT consumes enough electricity to power about 20,000 U.S. homes for one day. And once an AI tool goes online, even more power is needed to actually use it. That total is likely significant, given that ChatGPT alone boasts an estimated 300 million users per week.
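To put the Epoch AI figure in more familiar units, here is a rough conversion. The per-household daily electricity use is an assumption on our part (roughly the U.S. residential average), not a number from the estimate itself:

```python
# Rough conversion of "20,000 U.S. homes for one day" into megawatt-hours.
# The per-home figure of ~30 kWh/day is an assumption, not from Epoch AI.
homes = 20_000
kwh_per_home_per_day = 30  # approximate average U.S. household use (assumption)

training_run_mwh = homes * kwh_per_home_per_day / 1_000  # kWh -> MWh
print(training_run_mwh)  # 600.0 MWh for a single training run, under these assumptions
```

Under those assumptions, a single training run lands in the hundreds of megawatt-hours, before counting the ongoing power needed to serve queries.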
For some businesses, AI has already driven a surge in greenhouse gas emissions. Last year, Google’s annual sustainability report revealed that the company’s carbon emissions had climbed nearly 50 percent over the past five years as a result of growing electricity consumption by data centers and supply chain emissions. Google stated that reaching its climate goal of net-zero emissions by 2030 could be difficult given “the uncertainty around the future environmental impact of AI, which is complex and difficult to predict.”
A 2024 study by the Department of Energy’s Lawrence Berkeley National Laboratory projected that data centers will use anywhere from 6.7 percent to 12 percent of total U.S. electricity by 2028, compared with 4.4 percent today. Climate advocates worry that unrestrained AI growth will lead to widespread construction of fossil fuel plants to meet rising power demand from data centers. But the wide range presented by federal scientists underlines how quickly the field is evolving, and how unpredictable the future of AI sector growth remains.
It’s important to understand that AI is only a subset of data centers, the category often used by policymakers when looking at future power demand, said Tanya Das, director of AI and energy technology policy at the Bipartisan Policy Center. Data centers also host websites, cloud storage, media streaming, and cryptocurrency mining, with AI making up only about 10 percent of all facilities, Das said. When we talk about growing electricity use by data centers, “Some of it is driven by AI, but much of it is also driven by our larger reliance on digital services, and on the Internet and the cloud as a whole,” she said.
In the past few years, alarming headlines about AI’s skyrocketing electricity demand have abounded. But in a February report by Das’s organization and the data firm Koomey Analytics, researchers cautioned that empirical data doesn’t support this narrative, and that in fact the future of the AI sector is highly uncertain. One reason this idea has taken root is grid forecasts from “influential management consulting and investment advising firms,” as the report puts it. Last November, for example, Boston Consulting Group said that data centers could use as much electricity as two-thirds of all U.S. households by 2030.
But Jonathan Koomey, one of the report authors, told Canary Media that these forecasts tend to extrapolate recent growth rates into the future, which isn’t necessarily accurate. Consulting firms could also be motivated to create hype in order to attract more business opportunities: “You get attention when you make aggressive forecasts,” he said.
In reality, the future of AI power demand is far less straightforward, Das said. Although the United States is experiencing significant electricity demand growth for the first time in decades, multiple reasons are behind the spike. One is increased electrification in the transportation and building sectors, as consumers purchase electric vehicles and swap out gas furnaces for heat pumps. The growth of domestic manufacturing is another. Policies like the Inflation Reduction Act and the 2022 CHIPS Act created incentives for clean energy and semiconductor companies to move overseas factories back home. Climate change has also driven higher residential power use, as households crank up heating and cooling to cope with more extreme weather.
Das and fellow researchers estimate that data centers will constitute at most a quarter of new electricity demand by 2030, with those other sources making up the rest. For AI in particular, there are many unknown factors that will determine how much the sector will grow. One is the uncertainty of future demand for AI services. “The industry’s current growth projections are aggressive, but whether they materialize depends on businesses realizing positive economic returns from AI investments and on whether users’ concerns about accuracy and reliability can be adequately addressed,” wrote the authors of the Bipartisan Policy Center report.
Another factor is potential supply chain constraints. Data centers require chips, servers, and other equipment like backup power generators. The speed of data center buildout depends on the manufacturing capacity of a handful of companies, Das said. Nvidia, the world’s primary provider of AI hardware, for example, mostly relies on a single semiconductor manufacturer in Taiwan. Yet another unknown is advances in computing efficiency. Historically, computing systems have become steadily more energy efficient as technology progresses, and the same has already begun to happen with AI.
The anxiety around AI reminds Das of outlandish narratives during the dot-com boom that the Internet would eventually use up half of all global electricity. “I think we’re in a similar moment right now, where there are some alarmist projections being made,” she said. “But in all likelihood, I think this is going to be a really manageable level of [electricity] load growth.”
At the local level, however, AI is already having major impacts on the communities living near data centers. The facilities are concentrated in a growing number of hotspots across the country, including Virginia, Texas, and Georgia. Ohio, Iowa, Arizona, Indiana, and Nevada have also seen a spike in data center proposals since January 2023. A 30-square-mile area in Loudoun County, Virginia, known as “Data Center Alley,” is home to the world’s biggest concentration of servers, with more than 200 structures consuming roughly the same amount of electricity as the city of Boston, according to Reuters. In 2023, data centers used around a quarter of Virginia’s electricity.
Some tech companies intend to power new data centers with zero-emission technology. Microsoft, Google, and Amazon, for example, have struck deals to operate some of their data centers with nuclear power. But other firms have embraced natural gas. Last December, Meta announced a $10 billion AI data center in northeast Louisiana that will be powered with new gas generation, although the company pledged to use more renewable energy later on. In Memphis, environmental groups have decried the use of on-site gas turbines to power a data center run by xAI, a company founded by Elon Musk.
Climate advocates worry that utilities preparing for a wave of new data centers will build gas plants that lock in decades of greenhouse gas emissions and air pollution risks—regardless of how much the AI sector ends up expanding. Utilities receiving proposals for data centers have already vastly increased forecasts for how much power they’ll need to provide in the coming years. In January, a report by the consultancy group Grid Strategies found that, in the past two years, utilities have upped their load-growth forecasts for the next five years by nearly five times, in response to new data center proposals as well as manufacturing.
Meeting that power demand will almost certainly require building new gas generation. Across the country, utilities are planning to build or extend the life of nearly 20 gigawatts of gas power plants, according to Canary Media. But tech companies, looking for the best deal, often submit duplicate data center proposals to multiple utilities. Only a fraction of those facilities will ultimately get built, meaning utilities are likely planning for more power capacity than they’ll end up needing. “The data center people are shopping these projects around, and maybe they approach five or more utilities. They’re only going to build one data center,” Koomey told Canary Media. “But if all five utilities think that interest is going to lead to a data center, they’re going to build way more capacity than is needed.”
Despite potentially inflated forecasts, gas expansion to serve data centers is already underway. According to a January report by the Center for Biological Diversity, six major utilities from California to the Carolinas are planning to build at least 22 gas projects through 2033, in part to power new data centers. Georgia is one of several states that have experienced recent growth in electricity use. In 2023, Georgia Power, the state’s largest utility, substantially increased projections of energy demand, forecasting a growth of about 75 percent in total electricity generation by 2033. As a result, Georgia Power is “fast-tracking the construction of three new methane and oil-burning units at Plant Yates and delaying the closure of two of its coal-fired units, Plants Bowen and Scherer, from 2027 to 2035,” according to the CBD.
New gas plants could disproportionately harm low-income communities and communities of color already overburdened with pollution. In South Carolina, reporting by Capital B News revealed that data center proposals would reopen at least two power plants in rural Black communities. The majority of the state’s power plants are located in areas with an above-average percentage of Black residents. Discounted utility rates for data centers also mean that local residents will end up footing the bill for those new power plants, Capital B News found.
Experts warn that an unrestrained buildout of data centers could conflict with statutory climate goals. A Virginia state report from December found that data centers could double the state’s power consumption by 2033 and nearly triple it by 2040. The utility Dominion Energy has proposed building 5.9 gigawatts of new gas plants in Virginia by 2039, which could jeopardize the state’s commitment to achieve 100 percent carbon-free electricity by 2050. Last July, an investigation by the Seattle Times and ProPublica found that Washington state’s tax incentives for data centers had encouraged a spike in energy demand that threatened its goal to reach carbon-neutral electricity by 2030.
Data center servers generate a lot of heat, and one of the most efficient ways to cool them is by using water. Researchers at the Lawrence Berkeley National Laboratory estimate that data centers consumed 66 billion liters of water in 2023. By 2028, that amount could double—or even quadruple. Most of the water used by data centers evaporates, meaning it is effectively removed from the local sources it came from. That poses serious problems in regions that already face water shortages, says Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, who studies the environmental impacts of AI. In places where water is scarce, like Arizona or Chile, data centers end up competing with households and farms for the resource. As companies flock to areas with cheap land, the problems could get worse. A May investigation by Bloomberg News found that roughly two-thirds of new data centers built or in development since 2022 are located in areas experiencing water stress.
AI’s water footprint is just as murky as its energy consumption, and in many instances, residents have a difficult time obtaining accurate information about their local data centers. In 2022, after a legal battle between the Oregonian newspaper and the city of The Dalles, residents there learned that Google’s data centers had grown to the point that they constituted 29 percent of the town’s total water consumption. The Oregon city, with a population of 16,000, had been experiencing drought for years.
In 2023, Ren and fellow researchers estimated that global AI demand would account for 4.2 to 6.6 billion cubic meters of water consumption in 2027—equal to the total annual water use of Denmark four to six times over. He has also examined the water usage of individual AI models. Last year, Ren’s team calculated how much water OpenAI’s GPT-4 model used for a standard query. They found that generating a 100-word email using ChatGPT required a little more than a bottle of water. If 10 percent of working Americans did that once weekly for a year, the total water consumption would equal the amount used by all Rhode Island households for 1.5 days.
Ren’s team also ran the numbers for GPT-4’s power use, finding that generating a 100-word email uses as much electricity as running 14 LED light bulbs for one hour. If 10 percent of all working Americans generated a 100-word email once weekly for a year, that would add up to more than 121,000 megawatt-hours of electricity—equal to the power used by all households in Washington, D.C., for 20 days.
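The email figure can be sanity-checked with simple arithmetic. The workforce size and per-bulb wattage below are our own assumptions, not numbers from Ren’s study:

```python
# Back-of-envelope check of the 100-word-email electricity figure.
# Workforce size and LED wattage are assumptions, not from Ren's research.
workforce = 160_000_000   # approximate U.S. workforce (assumption)
share = 0.10              # 10 percent of workers
emails_per_year = 52      # one email per week
wh_per_email = 14 * 10    # 14 LED bulbs at ~10 W each for one hour = 140 Wh

total_mwh = workforce * share * emails_per_year * wh_per_email / 1e6  # Wh -> MWh
print(round(total_mwh))   # ~116,000 MWh, in the same ballpark as the cited 121,000
```

The rough total lands close to the study’s 121,000 megawatt-hours, which suggests the cited per-email figure and the aggregate figure are mutually consistent.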
Ren is confident in these estimates, in part because his team cross-checked their numbers with research published by Microsoft, a major investor in OpenAI. But in general, research on AI’s energy consumption can still end up far from reality because the public knows little about how exactly models are designed and used in real life, he said. Companies, for example, have to ensure that these models respond to users almost instantaneously, which can drive water and energy requirements even higher.
AI’s environmental harms extend far beyond water and energy use. Data centers also add to air pollution because they rely on diesel-guzzling backup generators, which ensure that they can keep running in the case of a power outage. Although facilities in theory turn these on only rarely, diesel generators still emit a substantial amount of pollution: “A typical diesel generator can release 200 to 600 times more [nitrogen oxide emissions] than a natural gas power plant producing the same amount of electricity,” wrote Ren and Caltech professor Adam Wierman in a recent blog. Gas plants that power some data centers, meanwhile, also release harmful air pollutants, including fine particulate matter and nitrogen oxides. A study by Ren and Wierman found that in 2023, air pollution attributed to data centers accounted for about $6 billion in public health damages in the United States. “This is well recognized in the public health community, but it seems that it’s not being recognized in the tech industry yet,” said Ren.
In recent years, state lawmakers have led a push to require greater transparency from AI companies and establish minimum clean energy requirements for data centers. In February, in response to the Seattle Times and ProPublica investigation mentioned earlier, Washington governor Bob Ferguson signed an executive order to launch a study on the impact of data centers on energy use and state tax revenue. Legislators in Virginia introduced more than a dozen bills to address concerns around data centers’ energy and water use this year, although none succeeded. In Connecticut, lawmakers have put forth a bill that would require data centers to run on at least 50 percent renewable energy.
In New York, a new bill would require data centers to submit annual reports on their energy use and other impacts and use 100 percent renewable energy by 2040. Policymakers have also taken action at the federal level by introducing a bill last year that would mandate a study of AI’s environmental impacts. In Indiana, some advocates have pushed for a moratorium on new data centers until their impacts on residents are better understood. Policy innovators in the United States could also learn from actions taken abroad: Bloomberg News reported in mid-May that the European Union will propose a measure by the end of 2026 to curb water use in data centers.
But even in the absence of strong regulation, gas and electric companies and public utility commissions can still take commonsense measures to address the environmental implications of AI. Facing an influx of duplicate data center proposals, utilities in places like Georgia have started taking steps to more accurately predict power demand, such as assigning probabilities to projects at different stages of development. “We’re seeing utilities start to get smarter on this issue and present more accurate proposals to their state public utility commissions,” Das says. “We’re trying to encourage public utility commissions to take a measured approach to viewing these proposals.”
Tech companies, for their part, can also take charge by scrutinizing their utilities’ decarbonization plans, committing to sourcing clean energy for data centers, and using batteries rather than diesel generators for backup power, Sierra Club advocates wrote in a report last year. They can also work with utilities to push for more aggressive climate action. “Large customers are often influential stakeholders with decisionmakers,” the authors wrote. “Large customers’ voices are needed to push utilities toward a system-wide transition to clean energy.”
Policymakers can also take steps to wield AI as a tool for furthering climate goals while addressing more immediate concerns posed by the impending buildout of data centers. AI has helped scientists manage power grids to conserve energy, provide more accurate weather forecasts, and improve cooling efficiency at data centers. Or as Representative Don Beyer (D-VA) says, “While recognizing the ways AI can help us decrease emissions in other sectors and develop innovative climate solutions, we need to ensure we are being responsible with the adverse impacts it may have on our environment now.”