Part 2 of AI and Sustainability: The Trade-offs, Risks, and the Path Forward
The AI Road Bumps
Beyond the risks discussed in a previous article, “Double-Edged Sword: How AI Can Become a Risk for Businesses”, several other considerations may determine whether full-scale AI adoption would benefit your business, particularly from a sustainability perspective.
First, data centres require significant energy. Take the example raised by an Electric Power Research Institute (EPRI) 2024 report on artificial intelligence (AI) power needs, which indicated that a single ChatGPT query uses five times the energy of a single Google search. As AI development progresses, energy demand grows rapidly with model size and capability. Data centres and the advanced chips that drive AI are thus accelerating the already fast-growing global electricity demand. A World Economic Forum (WEF) 2025 report estimates that by 2035, data centres in the US alone could account for 8.6% of total electricity use, more than double their current share. Globally, data centres consumed around 415 terawatt-hours (TWh) in 2024, and this is expected to more than double by 2030, according to the 2025 International Energy Agency (IEA) report on AI and energy. In addition, a 2024 US Congress report on AI energy needs indicates that new hyperscale data centres built in 2024 had energy capacities of 100 MW to 1,000 MW each, roughly equivalent to the load of 80,000 to 800,000 homes!
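To put the IEA figures in perspective, a quick back-of-the-envelope calculation shows the growth rate they imply. This sketch assumes consumption exactly doubles from 415 TWh (2024) by 2030; since the IEA projects "more than double", the result is a lower bound.

```python
# Back-of-the-envelope check of the data-centre growth figures above.
# Assumption: consumption exactly doubles from 415 TWh (2024) to 830 TWh (2030);
# the IEA projects "more than double", so this is a lower bound.

def implied_cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by start and end values."""
    return (end / start) ** (1 / years) - 1

consumption_2024_twh = 415.0
consumption_2030_twh = 2 * consumption_2024_twh  # doubling assumption
cagr = implied_cagr(consumption_2024_twh, consumption_2030_twh, years=6)
print(f"Implied growth rate: {cagr:.1%} per year")  # ~12.2% per year
```

In other words, even the conservative doubling assumption implies data-centre electricity demand compounding at roughly 12% a year, several times faster than overall grid demand has historically grown.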
Beyond AI’s massive energy needs, the IEA report highlights the huge amount of water required to cool servers in AI data centres, which adds stress to water-scarce regions and risks leaving neighbouring communities facing shortages. AI data centres are projected to withdraw 6.6 billion m³ of water annually by 2027, equivalent to the annual water needs of several million people.
Further, AI data centres create a “carbon lock-in” risk as the AI boom outpaces renewable energy deployment. Data centres are being built faster than green grids can scale, entrenching dependency on coal- and gas-fired plants, increasing Scope 3 emissions, and undermining the net-zero ESG pledges of many organisations reliant on AI.
Striking a Balance
AI data centres’ growing electricity demand is both a challenge and an opportunity. It can accelerate the development and commercialisation of green grid technologies, reshaping energy systems for the better, but only if neighbouring communities gain an ownership stake that keeps pace with data centre growth. The challenge is to ensure that AI innovation strengthens the entire energy ecosystem, powering not only the algorithms of the future but also the societies they claim to serve.
A Deloitte 2025 report on energy systems indicated that by 2030, AI-enabled energy savings could exceed 3,700 terawatt-hours (TWh), far outstripping the technology’s projected energy consumption: under EPRI’s 10% high-growth scenario, data centre electricity usage rises to an average of 296.4 TWh per year by 2030. The Deloitte report also anticipates that AI will deliver more than US$200 billion in annual cost savings by 2030 and almost US$500 billion by 2050. On the ESG front, the report touts AI-driven emission reductions of up to 660 megatonnes of carbon dioxide equivalent (MtCO2eq) in 2030, a significant contribution to global greenhouse gas mitigation efforts. Given these expectations, the AI bandwagon will very likely sweep up more organisations in the foreseeable future.
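The savings-versus-consumption claim can be made concrete with a rough comparison. This sketch treats Deloitte's 3,700 TWh savings figure and EPRI's 296.4 TWh/year usage figure as comparable annual 2030 values, which is an assumption; the two reports use different scopes and scenarios, so the result is illustrative only.

```python
# Rough comparison of the two 2030 figures cited above.
# Assumption: both are treated as annual 2030 values; the underlying reports
# use different scopes and scenarios, so this is illustrative only.

ai_enabled_savings_twh = 3700.0  # Deloitte: AI-enabled energy savings by 2030
data_centre_usage_twh = 296.4    # EPRI 10% high-growth scenario, 2030

net_benefit_twh = ai_enabled_savings_twh - data_centre_usage_twh
ratio = ai_enabled_savings_twh / data_centre_usage_twh
print(f"Net benefit: {net_benefit_twh:.0f} TWh; "
      f"savings are {ratio:.1f}x projected consumption")
```

Under these assumptions, projected savings exceed projected consumption by more than an order of magnitude, which is the basis for the optimistic framing in the reports.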
Major global tech companies have already begun to mitigate the adverse effects of their own AI development. The 2025 WEF report indicates that Amazon has matched 100% of the electricity consumed by its global operations with renewable energy. Microsoft has established a supplier code of conduct requiring suppliers to transition to 100% carbon-free electricity by 2030, and has committed to doing the same itself. Google likewise remains committed to running on carbon-free electricity by 2030 and has launched efforts to improve its AI model efficiency and data centre energy consumption.
Guardrails, however, are needed to govern the growth of AI data centres. According to the 2025 WEF report, AI should be pursued cautiously as a sustainability tool, with:
- Shared infrastructure planning: Align AI-related investments with regional grid expansion and resilience initiatives.
- Transparent energy sourcing: Encourage open reporting on electricity use, carbon intensity, and clean power procurement. ESG reporting should be aligned with IFRS S1 & S2, thereby building investor confidence and mitigating greenwashing risks. This also requires integration of carbon accounting into AI operational ESG metrics.
- Incentives for co-benefits: Support projects that advance both corporate energy reliability and local community decarbonisation goals.
- Integrated policy frameworks: Embed AI load forecasts into national and regional energy transition plans to avoid local grid bottlenecks.
Other measures to consider include:
- Mandatory renewable sourcing for all AI energy operations.
- Investment in “Green AI” hardware and cooling technologies.
- Implementing existing and emerging global standards for responsible AI deployment.
Without these guardrails, AI could undermine climate goals rather than advance them.
