Intelligence Report
It’s hard to believe that ChatGPT, the hugely popular AI chatbot, was launched less than three years ago. In a startlingly short span of time, OpenAI’s DALL-E, Google’s Gemini, Anthropic’s Claude, and other AI tools have spurred new and pressing conversations around the role of technology in work, education, and daily life. Workers worry that AI will displace certain jobs, even as companies begin hiring for AI-assisted positions. High school and college students now regularly use ChatGPT to take notes, complete problem sets, and write essays. The technology has even sparked debate in the realm of art, as painters and writers contemplate the consequences of instant content generation for their livelihoods.
Not everyone is happy about these developments. According to the law firm Baker & Hostetler, at least 12 U.S. lawsuits have been filed against AI firms for copyright infringement, as plaintiffs from the New York Times to Getty Images claim that large models trained on text and images from the Internet violate federal law. Others raise concerns about misinformation: AI tools are not designed to produce fact-checked, accurate content, and they can even fabricate information outright.
Climate advocates, meanwhile, have sounded the alarm on AI’s unknown and growing environmental impacts. Utilities from Georgia to Texas are planning to build new natural gas power plants and keep aging coal plants online to feed energy demand from data centers running AI models, threatening state climate targets. New infrastructure could strain water resources in drought-prone areas as banks of servers gobble more and more water for cooling. And while AI as a tool can serve environmental goals, such as optimizing solar power production and supercharging climate modeling, it can also aid oil and gas companies in fossil fuel extraction and production.
Clean energy experts and lawmakers have called for a range of policies to address these impacts, including requiring greater transparency from companies and even setting a moratorium on new data centers. Some researchers have said that, much like the general hype around AI, the technology’s environmental implications may be overblown. In a recent report, analysts at the Bipartisan Policy Center wrote that factors like manufacturing capacity limits and the sustainability of generative AI business models could ultimately constrain future growth in the sector. In the near term, however, a major uptick in AI data centers seems certain. The Trump administration has made AI development a key policy priority, and tech companies have pledged hundreds of billions of dollars in AI investments over the next four years.
Yet the public remains largely in the dark when it comes to understanding the consequences of this boom. In the United States, tech companies aren’t required to disclose how much energy and water data centers and the AI models they run actually use, making the total environmental footprint of these new systems difficult to pin down. That has left advocates, academics, and regulators alike scrambling to understand this new technology — and figure out how to minimize its harms on the environment and communities.
Despite its lofty name, there’s no consensus on what the term “artificial intelligence” actually means. In February 2023, an attorney at the Federal Trade Commission wrote that AI is essentially “a marketing term” with “many possible definitions.” Amba Kak, the executive director of the AI Now Institute, describes AI as algorithms that process large amounts of data, such as text and images, to generate predictions, scores, or rankings.
Algorithms vary in size and computational demands, and it’s the largest models, like those behind ChatGPT, that have gained the most attention for gobbling up electricity. Chatbots are built on a type of AI called large language models, which are trained on vast amounts of text data and predict the next word in a sequence based on the input they’ve received. Other generative models that produce images or software code work similarly.
Developing such a complex model involves huge power inputs. The research institute Epoch AI estimated that one training run for the latest version of ChatGPT consumes enough electricity to power about 20,000 U.S. homes for one day. And once an AI tool goes online, even more power is needed to actually use it. Total power usage is likely significant, given that ChatGPT alone boasts an estimated 300 million users per week.
For some businesses, AI has already driven a surge in greenhouse gas emissions. Last year, Google’s annual sustainability report revealed that the company’s carbon emissions had climbed nearly 50 percent over the past five years as a result of growing electricity consumption by data centers and supply chain emissions. Google stated that reaching its climate goal of net-zero emissions by 2030 could be difficult given “the uncertainty around the future environmental impact of AI, which is complex and difficult to predict.”
A 2024 study by the Department of Energy’s Lawrence Berkeley National Laboratory projected that data centers will use anywhere from 6.7 to 12 percent of total U.S. electricity by 2028, compared with 4.4 percent today. Climate advocates worry that unrestrained AI growth will lead to widespread construction of fossil fuel plants to meet rising power demand from data centers. But the wide range presented by federal scientists underlines how quickly the field is evolving, and how unpredictable the future of AI sector growth remains.
It’s important to understand that AI is only a subset of data centers, the category often used by policymakers when looking at future power demand, said Tanya Das, director of AI and energy technology policy at the Bipartisan Policy Center. Data centers also host websites, cloud storage, media streaming, and cryptocurrency mining, with AI making up only about 10 percent of all facilities, Das said. When we talk about growing electricity use by data centers, “Some of it is driven by AI, but much of it is also driven by our larger reliance on digital services, and on the Internet and the cloud as a whole,” she said.
In the past few years, alarming headlines about AI’s skyrocketing electricity demand have abounded. But in a February report by Das’s organization and the data firm Koomey Analytics, researchers cautioned that empirical data doesn’t support this narrative, and that in fact the future of the AI sector is highly uncertain. One reason this idea has taken root is grid forecasts from “influential management consulting and investment advising firms,” as the report puts it. Last November, for example, Boston Consulting Group said that data centers could use as much electricity as two-thirds of all U.S. households by 2030.
But Jonathan Koomey, one of the report authors, told Canary Media that these forecasts tend to extrapolate recent growth rates into the future, which isn’t necessarily accurate. Consulting firms could also be motivated to create hype in order to attract more business opportunities: “You get attention when you make aggressive forecasts,” he said.
In reality, the future of AI power demand is far less straightforward, Das said. Although the United States is experiencing significant electricity demand growth for the first time in decades, multiple reasons are behind the spike. One is increased electrification in the transportation and building sectors, as consumers purchase electric vehicles and swap out gas furnaces for heat pumps. The growth of domestic manufacturing is another. Policies like the Inflation Reduction Act and the 2022 CHIPS Act created incentives for clean energy and semiconductor companies to move overseas factories back home. Climate change has also driven higher residential power use, as households crank up heating and cooling to cope with more extreme weather.
Das and fellow researchers estimate that data centers will constitute at most a quarter of new electricity demand by 2030, with those other sources making up the rest. For AI in particular, there are many unknown factors that will determine how much the sector will grow. One is the uncertainty of future demand for AI services. “The industry’s current growth projections are aggressive, but whether they materialize depends on businesses realizing positive economic returns from AI investments and on whether users’ concerns about accuracy and reliability can be adequately addressed,” wrote the authors of the Bipartisan Policy Center report.
Another factor is potential supply chain constraints. Data centers require chips, servers, and other equipment like backup power generators. The speed of data center buildout depends on the manufacturing capacity of a handful of companies, Das said. Nvidia, the world’s primary provider of AI hardware, for example, mostly relies on a single semiconductor manufacturer in Taiwan. Yet another unknown is advances in computing efficiency. Historically, computing systems become more and more energy efficient as technology progresses, and the same has already begun to happen with AI.
The anxiety around AI reminds Das of outlandish narratives during the dot-com boom that the Internet would eventually use up half of all global electricity. “I think we’re in a similar moment right now, where there are some alarmist projections being made,” she said. “But in all likelihood, I think this is going to be a really manageable level of [electricity] load growth.”
At the local level, however, AI is already having major impacts on the communities living near data centers. The facilities are concentrated in a growing number of hotspots across the country, including Virginia, Texas, and Georgia. Ohio, Iowa, Arizona, Indiana, and Nevada have also seen a spike in data center proposals since January 2023. A 30-square-mile area in Loudoun County, Virginia, known as “Data Center Alley,” is home to the world’s biggest concentration of servers, with more than 200 structures consuming roughly the same amount of electricity as the city of Boston, according to Reuters. In 2023, data centers used around a quarter of Virginia’s electricity.
Some tech companies intend to power new data centers with zero-emission technology. Microsoft, Google, and Amazon, for example, have struck deals to operate some of their data centers with nuclear power. But other firms have embraced natural gas. Last December, Meta announced a $10 billion AI data center in northeast Louisiana that will be powered with new gas generation, although the company pledged to use more renewable energy later on. In Memphis, environmental groups have decried the use of on-site gas turbines to power a data center run by xAI, a company founded by Elon Musk.
Climate advocates worry that utilities preparing for a wave of new data centers will build gas plants that lock in decades of greenhouse gas emissions and air pollution risks—regardless of how much the AI sector ends up expanding. Utilities receiving proposals for data centers have already vastly increased forecasts for how much power they’ll need to provide in the coming years. In January, a report by the consultancy group Grid Strategies found that, in the past two years, utilities have upped their load-growth forecasts for the next five years by nearly five times, in response to new data center proposals as well as manufacturing.
Meeting that power demand will almost certainly require building new gas generation. Across the country, utilities are planning to build or extend the life of nearly 20 gigawatts of gas power plants, according to Canary Media. But tech companies, looking for the best deal, often submit duplicate data center proposals to multiple utilities. Only a fraction of those facilities will ultimately get built, meaning utilities are likely planning for more power capacity than they’ll end up needing. “The data center people are shopping these projects around, and maybe they approach five or more utilities. They’re only going to build one data center,” Koomey told Canary Media. “But if all five utilities think that interest is going to lead to a data center, they’re going to build way more capacity than is needed.”
Despite potentially inflated forecasts, gas expansion to serve data centers is already underway. According to a January report by the Center for Biological Diversity, six major utilities from California to the Carolinas are planning to build at least 22 gas projects through 2033, in part to power new data centers. Georgia is one of several states that have experienced recent growth in electricity use. In 2023, Georgia Power, the state’s largest utility, substantially increased projections of energy demand, forecasting about 75 percent growth in total electricity generation by 2033. As a result, Georgia Power is “fast-tracking the construction of three new methane and oil-burning units at Plant Yates and delaying the closure of two of its coal-fired units, Plants Bowen and Scherer, from 2027 to 2035,” according to the CBD.
New gas plants could disproportionately harm low-income communities and communities of color already overburdened with pollution. In South Carolina, reporting by Capital B News revealed that data center proposals would reopen at least two power plants in rural Black communities. The majority of the state’s power plants are located in areas with an above-average percentage of Black residents. Discounted utility rates for data centers also mean that local residents will end up footing the bill for those new power plants, Capital B News found.
Experts warn that an unrestrained buildout of data centers could conflict with statutory climate goals. A Virginia state report from December found that data centers could double the state’s power consumption by 2033 and nearly triple it by 2040. The utility Dominion Energy has proposed building 5.9 gigawatts of new gas plants in Virginia by 2039, which could jeopardize the state’s commitment to achieve 100 percent carbon-free electricity by 2050. Last July, an investigation by the Seattle Times and ProPublica found that Washington state’s tax incentives for data centers had encouraged a spike in energy demand that threatened its goal to reach carbon-neutral electricity by 2030.
Data center servers generate a lot of heat, and one of the most efficient ways to cool them is with water. Researchers at the Lawrence Berkeley National Laboratory estimate that data centers consumed 66 billion liters of water in 2023. By 2028, that amount could double—or even quadruple. Most of the water used by data centers evaporates, meaning it is lost to the local sources it was drawn from. That poses serious problems in regions that already face water shortages, says Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, who studies the environmental impacts of AI. In places where water is scarce, like Arizona or Chile, data centers end up competing with households and farms for the resource. As companies flock to areas with cheap land, the problems could get worse. A May investigation by Bloomberg News found that roughly two-thirds of new data centers built or in development since 2022 are located in areas experiencing water stress.
AI’s water footprint is just as murky as its energy consumption, and in many instances, residents have a difficult time obtaining accurate information about their local data centers. In 2022, after a legal battle between the Oregonian newspaper and the city of The Dalles, residents there learned that Google’s data centers had grown to the point that they constituted 29 percent of the town’s total water consumption. The Oregon city, with a population of 16,000, had been experiencing drought for years.
In 2023, Ren and fellow researchers estimated that global AI demand would account for 4.2 to 6.6 billion cubic meters of water consumption in 2027—four to six times the total annual water use of Denmark. He has also examined the water usage of individual AI models. Last year, Ren’s team calculated how much water OpenAI’s GPT-4 model used for a standard query. They found that generating a 100-word email using ChatGPT required a little more than a bottle of water. If 10 percent of working Americans did that once weekly for a year, the total water consumption would equal the amount used by all Rhode Island households for 1.5 days.
Ren’s team also ran the numbers for GPT-4’s power use, finding that generating a 100-word email consumes as much electricity as 14 LED light bulbs running for one hour. If 10 percent of all working Americans generate a 100-word email once weekly for a year, that adds up to more than 121,000 megawatt-hours of electricity—equal to the power used by all households in Washington, D.C., for 20 days.
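Those totals can be sanity-checked with a bit of back-of-the-envelope arithmetic. The sketch below assumes a roughly 10-watt LED bulb, about half a liter per “bottle” of water, and a U.S. workforce of around 167 million; those inputs are illustrative assumptions rather than figures from Ren’s study, but they reproduce the article’s order of magnitude.

```python
# Back-of-the-envelope check of the per-email estimates cited above.
# Assumed inputs (not from Ren's paper): ~10 W per LED bulb, ~0.5 L per
# "bottle of water," and a U.S. workforce of roughly 167 million people.

LED_BULB_WATTS = 10            # assumed typical LED bulb draw
BULB_HOURS_PER_EMAIL = 14      # "14 LED light bulbs running for one hour"
WATER_LITERS_PER_EMAIL = 0.5   # "a little more than a bottle of water"
US_WORKFORCE = 167_000_000     # assumed; 10% send one such email weekly
WEEKS_PER_YEAR = 52

emails_per_year = 0.10 * US_WORKFORCE * WEEKS_PER_YEAR   # ~868 million queries

# Energy: 14 bulbs x 10 W x 1 hour = 0.14 kWh per email
kwh_per_email = LED_BULB_WATTS * BULB_HOURS_PER_EMAIL / 1000
total_mwh = emails_per_year * kwh_per_email / 1000
print(f"Electricity: {total_mwh:,.0f} MWh per year")      # ~121,600 MWh

# Water: roughly half a liter per email
total_liters = emails_per_year * WATER_LITERS_PER_EMAIL
print(f"Water: {total_liters / 1e6:,.0f} million liters per year")  # ~434 million L
```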
Ren is confident in these estimates, in part because his team cross-checked their numbers with research published by Microsoft, a major investor in OpenAI. But in general, research on AI’s energy consumption can still end up far from reality, because the public knows little about how exactly models are designed and used in real life, he said. Companies, for example, have to ensure that these models respond to users almost instantaneously, which can drive even higher water and energy requirements.
AI’s environmental harms extend far beyond water and energy use. Data centers also add to air pollution because they rely on diesel-guzzling backup generators, which ensure that they can keep running in the event of a power outage. Although facilities in theory only rarely turn these generators on, they still emit a substantial amount of pollution: “A typical diesel generator can release 200 to 600 times more [nitrogen oxide emissions] than a natural gas power plant producing the same amount of electricity,” wrote Ren and Caltech professor Adam Wierman in a recent blog. Gas plants that power some data centers, meanwhile, also release harmful air pollutants, including fine particulate matter and nitrogen oxides. A study by Ren and Wierman found that in 2023, air pollution attributed to data centers accounted for about $6 billion in public health damages in the United States. “This is well recognized in the public health community, but it seems that it’s not being recognized in the tech industry yet,” said Ren.
In recent years, state lawmakers have led a push to require greater transparency from AI companies and establish minimum clean energy requirements for data centers. In February, in response to the Seattle Times and ProPublica investigation mentioned earlier, Washington governor Bob Ferguson signed an executive order to launch a study on the impact of data centers on energy use and state tax revenue. Legislators in Virginia introduced more than a dozen bills to address concerns around data centers’ energy and water use this year, although none succeeded. In Connecticut, lawmakers have put forth a bill that would require data centers to run on at least 50 percent renewable energy.
In New York, a new bill would require data centers to submit annual reports on their energy use and other impacts and use 100 percent renewable energy by 2040. Policymakers have also taken action at the federal level by introducing a bill last year that would mandate a study of AI’s environmental impacts. In Indiana, some advocates have pushed for a moratorium on new data centers until their impacts on residents are better understood. Policy innovators in the United States could also learn from actions taken abroad: Bloomberg News reported in mid-May that the European Union will propose a measure by the end of 2026 to curb water use in data centers.
But even in the absence of strong regulation, gas and electric companies and public utility commissions can still take commonsense measures to address the environmental implications of AI. Facing an influx of duplicate data center proposals, utilities in places like Georgia have started taking steps to more accurately predict power demand, such as assigning probabilities to projects at different stages of development. “We’re seeing utilities start to get smarter on this issue and present more accurate proposals to their state public utility commissions,” Das says. “We’re trying to encourage public utility commissions to take a measured approach to viewing these proposals.”
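The probability-weighting approach Das describes can be illustrated with a toy calculation. The project names, megawatt figures, and stage probabilities below are invented for illustration and do not come from any utility filing; the point is simply that discounting speculative proposals yields a much smaller forecast than taking every request at face value.

```python
# Toy illustration of probability-weighted load forecasting: instead of
# summing every proposed data center at face value, a utility discounts
# each project by the likelihood that it actually gets built.
# All figures below are hypothetical.

proposals = [
    # (project, requested megawatts, assumed probability of completion)
    ("Campus A (contract signed, under construction)", 300, 0.90),
    ("Campus B (land acquired, permits pending)",       500, 0.50),
    ("Campus C (early inquiry, shopping utilities)",    800, 0.10),
]

face_value_forecast = sum(mw for _, mw, _ in proposals)
weighted_forecast = sum(mw * p for _, mw, p in proposals)

print(f"Face-value forecast:       {face_value_forecast} MW")     # 1600 MW
print(f"Probability-weighted load: {weighted_forecast:.0f} MW")   # 600 MW
```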
Tech companies, for their part, can also take charge by scrutinizing their utilities’ decarbonization plans, committing to sourcing clean energy for data centers, and using batteries rather than diesel generators for backup power, Sierra Club advocates wrote in a report last year. They can also work with utilities to push for more aggressive climate action. “Large customers are often influential stakeholders with decisionmakers,” the authors wrote. “Large customers’ voices are needed to push utilities toward a system-wide transition to clean energy.”
Policymakers can also take steps to wield AI as a tool for furthering climate goals while addressing more immediate concerns posed by the impending buildout of data centers. AI has helped scientists manage power grids to conserve energy, provide more accurate weather forecasts, and improve cooling efficiency at data centers. Or as Representative Don Beyer (D-VA) says, “While recognizing the ways AI can help us decrease emissions in other sectors and develop innovative climate solutions, we need to ensure we are being responsible with the adverse impacts it may have on our environment now.”
BRIEFING: As AI expands its role in daily life, climate advocates have sounded the alarm on the technology’s growing impacts on energy, water, and public health. Yet the public remains largely in the dark when it comes to understanding the full consequences of this boom.
The Price of the Beautiful Bill’s Epic Wrong Turn on Climate Change
The stark juxtaposition of two unrelated events on July 4 exposed the bankruptcy of U.S. climate policy. In Washington, D.C., Republicans celebrated passage of the One Big Beautiful Bill Act, which rolled back tax credits and subsidies for clean energy enacted just three years earlier. The same day, an unprecedented rainfall in Texas caused the Guadalupe River to rapidly overflow its banks, with the loss of over a hundred lives and extensive property damage.
No one in the Trump administration acknowledged a causal connection between climate change and the devastating flooding in Kerr County. Yet for years, scientists and federal agencies have warned that extreme natural events (flooding from torrential rainfall and sea-level rise, wildfires, drought, life-threatening heat waves) are on the rise because of increasing global temperatures.
The Texas tragedy was neither an isolated nor an unexpected event. As with other weather-related disasters, a growing number of regions have experienced unusually heavy precipitation over short periods and rapidly rising waters in nearby rivers that inundate vulnerable communities and endanger life and property.
The public health and economic impacts of these destructive events are large and growing. According to a 2024 analysis by the National Oceanic and Atmospheric Administration, over the previous five years, the United States experienced 190 separate billion-dollar-plus disasters, which inflicted a total of $746.7 billion in damages on the U.S. economy. Annualized, these costs were more than double the average of disaster-related costs during 1980–2024. As global temperatures continue to rise, so will the staggering human and economic price of climate-related disasters.
Accelerating the transition to clean energy will not immediately reverse the buildup of GHGs in the atmosphere and the related increase in extreme weather events. But a sustained commitment to replace fossil fuels with non-emitting energy sources will over time slow and potentially stop that buildup, moderating increases in destructive climate impacts.
The tax incentives and subsidies in the 2022 Inflation Reduction Act stimulated a surge in clean energy investment, which, according to the Energy Information Administration, resulted in record levels of solar and battery installations and electric vehicle sales in the first half of 2025. But as the Rhodium Group recently concluded, the Beautiful Bill’s rollbacks of IRA provisions will reduce the build-out of new clean power generating capacity by 53-59 percent by 2035, putting more than $500 billion of clean energy and transportation investment at risk.
Rhodium has calculated that the Trump bill will cause an increase in GHG emissions of 315-574 million metric tons in 2035 compared to the IRA baseline, slowing U.S. progress in reducing its carbon footprint and making it impossible to achieve the Biden administration’s 2035 target of a 60-62 percent emissions decline from 2005 levels.
Many Republicans in Congress are concerned about the rise in extreme weather events, and some have acknowledged the connection between these events and climate change. But Republicans were silent about these issues during the debate on the Beautiful Bill and voted in lockstep for its passage after unsuccessfully trying to protect job-creating clean energy projects in their districts.
Why did Republicans acquiesce in this reckless reversal of climate policies that were backed by years of scientific research and a global consensus in support of confronting the climate threat? The simple answer is fear of reprisals by President Trump. Since the beginning of his first term in 2017, Trump has disparaged climate concerns as the “Green New Scam” and is now relentlessly purging climate research and mitigation programs, and even any mention of global warming, from the work of federal agencies.
Leading Trump officials have railed against “climate change religion,” “climate fanaticism,” “the climate change alarm industry,” “woke chimeras of the Left,” and “exaggerated and implausible climate threats.” The president has called for the “dominance” of fossil fuels and attacked renewable energy as unreliable, unsightly, and damaging to the environment in its own right.
It is not surprising that the president’s climate denialism, fossil fuel boosterism, and hatred of clean energy would translate into a wholesale repeal of the IRA incentives for low-carbon technologies. It’s also not surprising that, after regularly capitulating to Trump demands, Republicans who should know better would again be cowed into submission by their fear of a vengeful president. But in the end the One Big Beautiful Bill will be remembered as an unforced error of historic proportions that left the country more vulnerable to costly and life-threatening natural disasters like the tragic flooding in Kerr County, Texas.
Gender-Based Violence and Climate Change
Gender can define those most vulnerable to climate change. Climate migration demonstrates this vulnerability all too well. When extreme weather events occur, shelters offer little security from sexual violence. When population displacement is triggered by climate impacts, violence is not uncommon en route to and within relocated settlements. When natural disasters force family separation, a rise in sexual trafficking and gender-based labor exploitation has been observed among trapped populations. When climate-related events disrupt socialized gender roles, frustration can spill over into domestic violence. These are all situations in which gender-based violence, or GBV, comes to light.
According to the Convention on the Elimination of All Forms of Discrimination Against Women, GBV refers to “violence that is directed against a woman because she is a woman or [violence] that affects women disproportionately.” The UN Human Rights Council expands that to include “any harmful act directed against individuals or groups of individuals on the basis of their gender.” Thus, GBV may be directed against men, boys, and individuals who identify with an alternate gender concept.
While the gender convention and the international human rights regime will remain central to addressing GBV and its underlying factors, such as inequality and discrimination, the international climate change regime offers some avenues for incorporating GBV considerations. With over 80 decisions containing gender-related mandates and a rise in gender-responsive language in the climate convention’s annual conference documents, as well as in the Paris Agreement, the climate change legal regime is developing a strong track record. Whether that gender-responsive language has translated into systematically addressing gender inequality, including GBV in climate change contexts, remains a work in progress. Existing monitoring and implementation frameworks, such as the Nationally Determined Contributions under Paris, offer opportunities to translate that language into action.
Recognition of the intersection between GBV and climate change is of growing interest to governments of climate-vulnerable states, especially those with a long history of GBV. This coincides with the realization that climate change may not only generate new GBV issues but also unearth and exacerbate existing ones. In a region known for its history of GBV, Samoa and Fiji stand out for their use of NDCs to tackle GBV. Both countries included gender-responsive language in their climate goals, which subsequently led to dedicated national security policy that integrated gender and climate.
Climate finance mechanisms—like the Green Climate Fund and the Global Environment Facility—have also demonstrated a willingness to address gender inequality. While these mechanisms have not specifically included GBV, conditioning finance on the inclusion of GBV as a consideration in project implementation is relatively simple to put into practice.
Furthermore, Climate Change Gender Action Plans, known as ccGAPs, are a mechanism developed by IUCN that offers countries a framework for embedding the goals of the climate convention’s Gender Action Plan in a way that aligns with national circumstances, with the work supported by IUCN resources and expertise.
Already utilized by over 20 countries, ccGAPs have shown success in prompting the inclusion of gender considerations in NDCs. This mechanism warrants consideration for broader adoption by the international climate change regime.
Most importantly, addressing GBV in the climate change nexus necessitates elevating “invisible leaders” in this space to positions of decisionmaking power.
While it is true that climate change does not discriminate but that its impacts are often discriminatory, it is also true that addressing the gender dimensions of those impacts is both achievable and necessary.
Environmental Defense Is National Security
Last November, I heard Sherri Goodman, appointed in 1993 as the first-ever deputy undersecretary of defense for environmental security, speak about her new book, Threat Multiplier. After the event, one of her former DOD colleagues called Goodman “whip smart” and a “doer.” These qualities undergirded her efforts to move the Pentagon’s institutional behemoth toward investing first in environmental stewardship and, eventually, in climate action.
The task was, and remains, daunting. With a budget in 2024 of nearly $900 billion, DOD employs some 1.4 million active duty personnel and 950,000 civilians who operate at over 750 installations in the United States and across the globe. Its mission: “To provide the military forces needed to deter war and to protect the security of the United States.”
How Goodman helped introduce environmental stewardship and climate action into this ponderous institution presents lessons both in political craftsmanship and in creating conceptual linkages that affirm why environmental and climate action are integral to national security. Goodman’s task, as she saw it, was not primarily about saving the planet; it was about securing a nation and its people. In her view the effort was not, as some colleagues thought, mission creep.
Having come to DOD as a Clinton administration appointee from the Senate Committee on Armed Services, Goodman had deep knowledge of both military substance and political processes. She arrived at the Pentagon when most of the armed forces leadership thought of environmental investment as unrelated to security. How could she shift military awareness toward understanding that environmental stewardship is “not only good for trees and turtles” but also “good for the health and safety of our troops and the communities they served”?
Even before the effects of climate change had become apparent, Goodman had several immediate tasks in the 1990s. First, she had to ensure DOD properties transferred to other agencies and entities in the Base Realignment and Closure process were cleaned up and ready. Second, in an era of increasing focus on environmental protections and sustaining endangered species, she had to find ways to align such stewardship with the military’s security mission. Without such alignment, base commanders and DOD officials were likely to see such efforts as burdensome.
Goodman understood the importance of illustrating co-benefits—how measures to protect red-cockaded woodpecker habitat could serve as real-world landscape obstacles important to training exercises. She also understood the importance of collaboration, including with communities adjacent to properties being transferred under BRAC.
The shifting awareness of environmental investments as intrinsic to the defense mission was more than window-dressing. As Goodman recounts, cleaning up military waste is essential to the health of troops and their families. Sustaining clean water by shifting away from lead bullets helps maintain a healthy military. Eventually, in tackling climate change, Goodman and her DOD allies showed that reliance on fossil fuels in Iraq and elsewhere drove up costs and put lives at risk—“one soldier was killed for every twenty-four convoys to resupply fuel or water” in Afghanistan. And the costs were exorbitant: each gallon of fuel delivered to remote locations in Iraq and Afghanistan cost “a whopping $400 by the time the costs of transport and security were fully factored in.”
In the effort to shift away from fossil fuels, as with her other efforts, Goodman put a premium on data. One result: a study entitled “More Capable Warfighting Through Reduced Fuel Burden.”
Even with co-benefits, cost savings, and mission alignment, shifting an entire institutional culture is challenging. Though many military leaders began to embrace the linkages between the military mission, environmental stewardship, and climate action, embrace of this vision was neither immediate nor uniform.
During my eight years at the Department of the Interior, we sometimes struggled to come to agreement with DOD on “how clean is clean enough” in removing risks of unexploded ordnance on former military lands transferred to the Fish and Wildlife Service and the Bureau of Land Management. I recall BLM’s challenges at the transferred Fort Ord lands, arguing that firefighting on these lands required more extensive cleanup that the military was reluctant to undertake.
We faced DOD sluggishness in efforts to clean up lead paint devastating to Laysan albatross on Midway Island. I met with some military commanders who continued to see protections of endangered species as a distracting nuisance; others, as Goodman recounts, enthusiastically embraced environmental protections and found ways to make such protections, as in the case of the woodpecker, part of their training.
With the surge of climate change effects and despite deep partisan divisions on the issue, the case has become even more compelling for the military to invest in environmental protection, climate mitigation, and climate adaptation. Goodman has played a fundamental role in this evolution: climate change is a “threat multiplier,” a term Goodman invented that has been embraced not only by the U.S. military but by other nations, as well.
Goodman had ample direct evidence to substantiate this perspective. Laying out the case she articulated to military leadership both during and after her time at DOD, she recounts the effects of climate change on military installations, on other nations and the safety and stability of their populations, on geopolitical dynamics, and much more. Many of these effects were not hypothetical or the result of predictive modeling. They unfolded in real time.
Several dramatic events underscore threats to the military from climate change. Hurricane Sandy devastated Coast Guard Station Sandy Hook, rendering it inoperable. Goodman reports Admiral Phillips of the Coast Guard lamenting that “we were not able to execute our duties.” Contemplating the effects, Admiral Phillips observed: “It’s a series of cascading casualties with increasing significance over time, and we are behind the rate of change.”
Similar risks were (and are) evident elsewhere. Norfolk, Virginia, home of a large Navy base, faces the highest rates of sea-level rise on the East Coast, imperiling operations. Hurricane Michael decimated Tyndall Air Force Base in Florida in 2018. The list goes on, prompting former defense secretary Leon Panetta to opine, as Goodman recounts, that rather than wishing for the past, we need to prepare for a “future climate Pearl Harbor.”
Goodman’s threat multiplier framework extends beyond climate impacts to military installations. It extends to the nature of national security challenges. Goodman points to the Arctic, where melting sea ice “is the proverbial ‘canary’ in the earth climate system,” with significant geopolitical implications. Rear Admiral David Titley told Goodman, “We are witnessing a failure of imagination” in contemplating how climate change affects military readiness. Goodman suggests that “climate change has produced a different ‘battlefield’ for just about every scenario a planner can now imagine…altering the geostrategic landscape.” Chinese vessels now ply the Arctic; Russia and others eye potential fuels and minerals extraction on the Arctic seabed—all these activities in close proximity to the United States.
Goodman also points to natural resource conditions across the globe—whether chronic drought, extreme wildland fires, flooding, or other stresses—and what she calls a “direct correlation between stability and access to natural resources.” These conditions make climate change “inextricably linked” to national security, as they can bring civil unrest and heighten risks of conflict.
Despite the fits and starts and uneven embrace of investment in environmental protections and climate action across a sprawling national defense system, Goodman’s message is optimistic. She is optimistic because of the intrinsic linkage of climate and environmental investments and national security. She is optimistic because the military is a force of innovation, as in its pioneering of new liquid fuels for naval and aviation operations that reduce greenhouse gases. She is optimistic because so many allies are cooperating on climate action. She sees investments in innovation as “opportunity multipliers.”
But Goodman is also realistic. The event at which she spoke about the threat multiplier of climate change took place November 6, one day after the election. With a new administration for which climate action is viewed with skepticism or hostility, what, asked the audience, are the prospects for continued progress by the military in advancing clean energy, climate-resilient infrastructure, and more?
Her message is twofold. First, current actions to invest in climate resilience, greenhouse gas reductions, energy efficiency, and more are fundamentally imperatives for national security. The nation simply cannot have its military installations regularly flooding, its electric power going out for days on end, or troops at acute risk in transporting fossil fuels to battlefields. Business as usual is simply not sustainable, as one Navy vice admiral told Goodman. Second, sometimes labels matter. Some DOD activities come under labels of water security, or energy efficiency, or electric power reliability. All of these endeavors are part of a climate action portfolio, but they do not require that label for their justification. They will, Goodman argues, continue.
Threat Multiplier, packed full of facts about military operations, is also a story of political acuity in which the author understood how to emphasize “mission first,” multi-benefits, collaboration, and innovation to drive change. Looking at the results of her efforts, Goodman embraces Madeleine Albright’s quip: “I’m an optimist who worries a lot.”
Lynn Scarlett was deputy secretary of the interior in the Bush II administration.
Earth’s Climate Sensitivity and Related Institutional Sensitivity
Carbon dioxide is on track to double its pre-industrial level. It is now 420 parts per million, up from 280 ppm. How much will doubling carbon dioxide increase temperature? Though simple to pose, the question of climate sensitivity is devilishly complex to answer.
Earlier this year, in a cutting-edge Science Advances paper, the paleoclimatologist Vincent Cooper and colleagues estimate a climate sensitivity of 2.9°C. Their paper has data and computer models galore. Their statistics describe not just Earth’s current temperature and climate, but also temperature and ice sheets 21,000 years ago.
Those reconstructed temperatures, derived from marine microfossils, separately estimate past ocean temperatures in the west Pacific, east Pacific, and southern oceans near Antarctica. Their computer model of coupled atmosphere/ocean/ice climate dynamics is grounded in the chemistry, physics, and climatology of carbon dioxide, water, and other greenhouse gases, and contains a detailed description of continents, oceans, and ice fields.
A central difficulty faced by Cooper, and indeed by all scientific studies of climate sensitivity, is that the direct impact of carbon dioxide emissions gets amplified. With no amplification pathways, doubling carbon dioxide would cause only about 1.1°C of warming.
Yet actual climate sensitivity is at least double what carbon dioxide acting alone would cause. Water mediates many of those amplification pathways. Depending on whether it is liquid, vapor, ice, or clouds, water can either diminish or amplify the temperature increase from carbon dioxide alone. For example, ice reflects sunlight, but water vapor is itself a greenhouse gas.
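That 1.1°C no-feedback figure can be recovered from two textbook approximations: the logarithmic forcing formula for carbon dioxide and the Planck (blackbody) response. The values below are standard reference numbers, not results from the Cooper paper:

```latex
% Radiative forcing from doubling CO2 (standard logarithmic approximation):
\Delta F_{2\times} \approx 5.35 \,\ln 2 \;\mathrm{W\,m^{-2}} \approx 3.7 \;\mathrm{W\,m^{-2}}

% Warming with no amplification, using only the Planck response
% \lambda_0 \approx 3.3 \;\mathrm{W\,m^{-2}\,K^{-1}}:
\Delta T_{\mathrm{no\ feedback}} = \frac{\Delta F_{2\times}}{\lambda_0}
  \approx \frac{3.7}{3.3}\;\mathrm{K} \approx 1.1\,^{\circ}\mathrm{C}
```

Feedbacks from water vapor, ice, and clouds effectively shrink that denominator, which is how the direct 1.1°C response gets amplified into the roughly 2°C to 4°C range discussed below.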
Perhaps worse, those amplification pathways not only increase Earth’s temperature, but also increase error. Amplifying error is not good. In a study that extrapolates Earth’s climate out many decades, it is inevitable that small initial errors in temperature data or model specification will get amplified into big errors.
The Cooper paper is a tour de force. Yet, their estimate has wide error bounds, with a 95 percent chance that climate sensitivity is somewhere between 2.1°C and 4.1°C. I doubt future climate studies will materially reduce these error bounds.
What I found most striking about the Cooper paper is a pressing issue they do not mention at all—how will carbon dioxide alter human institutions? This might be dubbed institutional sensitivity. As an exemplar, consider the ecosystem of institutions that govern water in the western United States: prior appropriation water rights law, irrigation districts, the California Water Project, the Clean Water Act and its rules and case law, the Army Corps of Engineers, federal, state, and private governance of dams, state governance of groundwater, economic markets, and so forth.
These institutions are being stressed. Under threat of immediate water curtailments on hundreds of thousands of acres of Idaho farmland, senior and junior water rights holders agreed this past June to a one-year mitigation plan that will allow the Idaho Department of Water Resources to lift its proposed curtailment orders.
To be clear, it is not at all obvious that the Idaho water shortages are related to climate change. Over millennia, even in the absence of anthropogenic carbon dioxide, the western United States has been subject to regular extreme droughts and floods. Moreover, climate models typically do not allow a specific drought or flood to be attributed to climate change.
Even so, while the direct impacts of carbon dioxide on temperature, rainfall, and overall climate are quite worrisome, I suspect our more immediate concern should be how carbon dioxide impacts human institutions. Institutions, not temperature, will mediate nearly all climate impacts on humanity. Human institutional ecosystems now dominate natural ecosystems. They resolve conflicts and promote cooperative use and sharing of water and all other critical natural resources. Were they to fail under the pressure of climate change, there is an extreme downside.
Alas, we cannot quantify this downside risk. The Cooper paper is grounded in data going back tens of thousands of years. By contrast, essentially all western water institutions came into existence after the Civil War. And while the Cooper analysis of climate sensitivity is based on well-verified scientific principles, such as quantum mechanics and blackbody radiation, no comparable general principles exist to guide an investigation of institutional sensitivity.
When change, climate or otherwise, carries a material risk of catastrophic failure, the only rational politics and policy is one of extreme conservatism: conserving the existing climate, and thereby conserving existing natural ecosystems and existing human institutions.