AI’s Thirst for Energy and Its Environmental Shadow
Artificial intelligence (AI) is rapidly transforming industries and aspects of our daily lives, but this revolution comes with a significant and growing challenge: its immense energy consumption and broader environmental impact. This isn’t just about rising electricity bills; it extends to vast water usage, increasing electronic waste, and a growing contribution to greenhouse gas emissions.
As AI models become more complex and integrated into everything we do, a critical question emerges: can we sustain this technological leap without severely harming the planet?
The Numbers Don’t Lie: AI’s Escalating Energy Demand
The computational power required for advanced AI is on an almost vertical ascent, with some estimates suggesting it doubles every few months. This rapid growth threatens to outpace even the most optimistic energy planning.
To put this into perspective, AI’s future energy needs could potentially consume as much electricity as entire countries like Japan or the Netherlands, or large US states such as California. This highlights the significant strain AI could place on existing power grids.
Global electricity demand surged by a significant 4.3% in 2024, driven partly by AI’s expansion alongside electric vehicles and increased factory activity. As of 2022, data centres, AI, and cryptocurrency mining already accounted for nearly 2% of global electricity use (around 460 terawatt-hours, TWh).
By 2024, data centres alone used about 415 TWh (roughly 1.5% of the global total), growing at 12% annually. While AI’s direct share is currently small (around 20 TWh or 0.02% globally), this figure is projected to increase dramatically.
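As a back-of-envelope check on these figures, we can compound the 2024 baseline at the stated annual rate. This is purely illustrative arithmetic on the numbers quoted above, not an independent forecast:

```python
# Compound the article's 2024 data-centre baseline (415 TWh)
# at its stated 12% annual growth rate. Illustrative only:
# real growth will not be uniform year over year.

baseline_twh = 415      # data-centre electricity use in 2024 (from the text)
annual_growth = 0.12    # 12% per year (from the text)

for year in range(2024, 2031):
    use_twh = baseline_twh * (1 + annual_growth) ** (year - 2024)
    print(f"{year}: ~{use_twh:.0f} TWh")
```

Steady 12% growth yields roughly 820 TWh by 2030, somewhat below the ~945 TWh projection cited later in this article, which suggests forecasters expect growth to accelerate rather than hold steady.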
Future Projections Are Eye-Opening:
- By late 2025, global AI data centres could require an additional 10 gigawatts (GW) of power, exceeding the total capacity of a state like Utah.
- By 2026, global data centre electricity use might reach 1,000 TWh, comparable to Japan’s current consumption.
- By 2027, AI data centres’ power hunger could hit 68 GW, close to California’s total capacity in 2022.
Looking towards 2030, the numbers become even more staggering. Global data centre electricity consumption is predicted to double to roughly 945 TWh, nearly 3% of all electricity used worldwide. OPEC forecasts data centre use could triple to 1,500 TWh by then. Goldman Sachs suggests global power demand from data centres could increase by up to 165% compared to 2023, with AI-specific data centres seeing a more than fourfold surge in demand. Some estimates even suggest data centres could account for up to 21% of global energy demand by 2030 when the energy needed to deliver AI services to users is included.
Training vs. Usage: Where the Energy Goes
AI’s energy use primarily divides into two areas: training and usage (inference). Training large models like GPT-4 requires a colossal amount of energy; training GPT-3 is estimated to have used 1,287 megawatt-hours (MWh), with GPT-4 needing potentially 50 times more.
However, running these trained models day-to-day can consume over 80% of AI’s total energy. A single ChatGPT query is reported to use about ten times more energy than a Google search (around 2.9 Wh vs. 0.3 Wh). The race in generative AI is driving the construction of increasingly powerful, and thus more energy-hungry, data centres.
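To make these figures concrete, a quick sketch relating per-query energy to the one-off training cost. The 2.9 Wh, 0.3 Wh, and 1,287 MWh inputs come from the estimates above; the break-even query count is simple arithmetic, not a measured result:

```python
chatgpt_wh = 2.9          # reported energy per ChatGPT query (from the text)
google_wh = 0.3           # reported energy per Google search (from the text)
gpt3_training_mwh = 1287  # estimated GPT-3 training energy (from the text)

# Ratio of per-query energy: ChatGPT vs a conventional search.
print(f"per-query ratio: ~{chatgpt_wh / google_wh:.0f}x")

# Number of queries whose cumulative inference energy
# equals the one-off training cost (MWh -> Wh conversion).
breakeven_queries = gpt3_training_mwh * 1e6 / chatgpt_wh
print(f"break-even: ~{breakeven_queries / 1e6:.0f} million queries")
```

At roughly 444 million queries, cumulative inference energy overtakes the training cost, which illustrates why day-to-day usage, at the scale of popular services, can come to dominate AI’s total energy budget.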
Can We Power AI – And Ourselves?
This is the critical question facing our planet’s energy systems. Meeting AI’s growing appetite sustainably requires rapidly increasing and diversifying our energy generation.
Renewables: A Key Piece of the Puzzle
Renewable energy sources like solar, wind, hydro, and geothermal are vital. In the US, renewables are projected to increase from 23% of generation in 2024 to 27% by 2026. Tech giants are making substantial commitments; Microsoft plans to purchase 10.5 GW of renewable energy between 2026 and 2030 for its data centres. AI itself can potentially boost renewable energy efficiency, possibly reducing energy use by up to 60% in some areas by optimizing storage and grid management.
However, renewables face challenges, particularly intermittency. Data centres need constant power, which solar and wind cannot always provide. Current battery storage solutions are often costly and require significant space. Connecting large new renewable projects to the grid can also be a slow process.
Nuclear Power: A Steady Alternative?
Nuclear power is gaining appeal as a stable, low-carbon option for AI’s massive energy needs, offering crucial 24/7 power. Small Modular Reactors (SMRs) are particularly discussed for their potential flexibility and enhanced safety. Companies like Microsoft, Amazon, and Google are actively exploring nuclear options. Matt Garman, head of AWS, called nuclear a “great solution” and an “excellent source of zero carbon, 24/7 power,” emphasizing its role in long-term energy planning.
Yet, nuclear power is not without hurdles. Building new reactors is time-consuming and expensive, involving complex regulatory processes. Public opinion remains a challenge, despite advancements in modern reactor safety. The speed of AI development also contrasts with the long timelines for nuclear plant construction, potentially leading to increased reliance on fossil fuels in the short term. Locating data centres near nuclear plants also raises concerns about electricity prices and grid reliability for others.
Beyond Kilowatts: AI’s Wider Environmental Shadow
AI’s environmental impact extends significantly beyond just electricity consumption. Data centres require vast amounts of water for cooling, with an average data centre using about 1.7 litres per kilowatt-hour of energy consumed.
In 2022, Google’s data centres used about 5 billion gallons of fresh water, a 20% increase from the previous year. Some estimates suggest up to two litres of water per kWh are needed for cooling. Globally, AI infrastructure could soon use six times more water than Denmark’s total consumption.
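Combining the average cooling figure above with the earlier per-query energy estimate gives a rough per-query water cost. Both inputs come from this article; the combination is illustrative only, since water intensity varies widely by site, cooling design, and climate:

```python
litres_per_kwh = 1.7     # average data-centre water use per kWh (from the text)
query_kwh = 2.9 / 1000   # ~2.9 Wh per ChatGPT query (from the text)

# Water attributable to a single query, in millilitres.
water_ml = litres_per_kwh * query_kwh * 1000
print(f"~{water_ml:.1f} mL of water per query")
```

About 5 mL per query sounds trivial, but multiplied across billions of queries it helps explain figures like Google’s 5 billion gallons a year.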
The rapid evolution of AI hardware, especially GPUs and TPUs, contributes to a growing electronic waste (e-waste) problem. Data centres could contribute up to five million tons of e-waste annually by 2030. Manufacturing AI chips and data centre components also depletes natural resources, requiring mining for critical minerals using methods that can harm the environment. Producing one AI chip can require over 1,400 litres of water and 3,000 kWh of electricity, driving the need for more semiconductor factories, often powered by gas plants.
Carbon emissions are another major concern. When AI is powered by fossil fuel-generated electricity, it exacerbates climate change. Training a large AI model can emit as much CO2 as hundreds of US homes in a year. Environmental reports from tech companies show AI’s impact; Microsoft’s yearly emissions rose about 40% between 2020 and 2023, largely due to data centre expansion for AI. Google reported a nearly 50% rise in total greenhouse gas emissions over five years, with AI data centre power demands being a major factor.
Can We Innovate Our Way Out?
Despite the challenges, promising innovations and strategies are emerging.
Efforts are focused on making AI algorithms more energy-efficient through techniques like model pruning, quantisation, and knowledge distillation. Designing smaller, specialized AI models also reduces power consumption. Within data centres, power capping and dynamic resource allocation optimize energy use. AI-aware software can schedule less urgent tasks for times when renewable energy is abundant or grid demand is low. AI can even improve the efficiency of data centre cooling systems.
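The scheduling idea above can be sketched as a simple policy: defer flexible jobs to the hours with the lowest forecast grid carbon intensity. This is a minimal illustration; the hourly intensity values below are invented, and a real scheduler would use live grid-intensity APIs, job deadlines, and capacity constraints:

```python
# Hypothetical hourly grid carbon intensity (gCO2/kWh) for one day,
# with a midday dip representing abundant solar. Values are invented.
forecast = {hour: 200 if 10 <= hour <= 16 else 450 for hour in range(24)}

def schedule(jobs, forecast, slots_needed):
    """Assign deferrable jobs to the lowest-carbon hours of the day."""
    greenest = sorted(forecast, key=forecast.get)[:slots_needed]
    return {job: hour for job, hour in zip(jobs, sorted(greenest))}

plan = schedule(["batch-train", "reindex", "backup"], forecast, 3)
print(plan)  # the three jobs land in the midday low-carbon window
```

The same greedy idea underlies "carbon-aware" batch scheduling: because training runs and index rebuilds rarely need to start immediately, shifting them by a few hours can cut their emissions at no cost to users.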
On-device AI, where processing happens locally on a phone or device rather than in cloud data centres, can significantly reduce energy use by utilizing chips designed for efficiency. Regulation and policy are also becoming crucial. Governments are beginning to address AI’s energy and environmental accountability. Standardized measurement and reporting of AI’s footprint are vital steps. Policies encouraging hardware longevity and recyclability can tackle e-waste. Energy credit trading systems could incentivize greener AI technologies. A recent deal between the UAE and the US to build a large AI campus highlights the global importance and the need to prioritize energy and environmental concerns in such projects.
Finding a Sustainable Future for AI
AI offers immense potential, but its rapidly increasing energy demand is a significant obstacle, with projections reaching levels comparable to entire countries. Meeting this demand requires a diversified energy mix. Renewables are key for the long term but have intermittency and scaling challenges. Nuclear power, including SMRs, offers a reliable, low-carbon option gaining interest from tech companies, but faces safety, cost, and timeline hurdles.
Crucially, AI’s impact extends beyond electricity to water consumption, e-waste, resource depletion in manufacturing, and carbon emissions. Addressing its ecological footprint requires a holistic approach.
The good news is that numerous innovations are underway, from energy-efficient algorithms and smart data centre management to on-device AI and supportive policies. Growing awareness of AI’s environmental impact is driving necessary discussions around regulation and sustainability.
Tackling AI’s energy and environmental challenges demands urgent collaboration among researchers, the tech industry, and policymakers. By prioritizing energy efficiency in AI development, investing in sustainable energy, managing hardware responsibly, and implementing supportive policies, we can work towards a future where AI’s potential is realized without damaging our planet. The global race for AI leadership must also be a race for sustainable AI.