AI’s Growing Environmental Costs

Data centers are responsible for approximately three percent of global electricity consumption and almost five percent of all the energy used in the U.S. Jovelle Tamayo/The New York Times.

About This Trend

While AI offers vast potential, its environmental cost is undeniable. Training and running large AI models require enormous computational resources and consume vast amounts of energy and water; a single ChatGPT query uses nearly 10 times more electricity than a standard Google search. As AI's popularity grows, its energy footprint could soon rival that of entire nations.

Central to AI's energy consumption are the data centers housing the servers that power these models. Data centers account for approximately three percent of global electricity consumption, and the International Energy Agency predicts that AI's electricity demand will double by 2026. This surging demand has already delayed the decommissioning of coal plants previously slated for closure. The microchips used in AI servers also depend on rare earth elements, which are often mined in environmentally destructive ways. And data centers generate large amounts of electronic waste; the mercury and lead in discarded servers add to the environmental damage.

AI's indirect environmental consequences may include diverting attention and investment away from critical areas such as climate technology. As more companies integrate AI into their operations, demand for data centers is only expected to rise, posing a challenge for communities and regions committed to reducing their carbon footprints.

In response to AI's growing environmental costs, companies and researchers are exploring ways to mitigate its impact. In Europe, net-zero efforts seek to reduce data center emissions through renewable energy use and heat capture. Along those lines, Google has signed a significant offshore wind agreement to power its data centers in the Netherlands, Meta is developing direct air capture (DAC) technology to reduce carbon emissions, and Norway's Green Mountain has agreed to send waste data center heat to warm the world's largest land-based trout farm.

Another approach is reducing AI's energy needs at the source. Neuromorphic computing systems, modeled on the human brain, can operate far more efficiently than conventional hardware, reducing the resource-intensive processes typically associated with AI development. These technological advances could help communities balance AI's benefits with its resource consumption. At the local level, planners can adopt regulations that determine where data centers may locate and establish development requirements for them.

Trend Reports

2025 Trend Report for Planners
2024 Trend Report for Planners
2023 Trend Report for Planners
2022 Trend Report for Planners
APA's foresight research is made possible in part through our partnership with the Lincoln Institute of Land Policy.