The accelerating adoption of artificial intelligence (AI) is driving up energy consumption and greenhouse gas (GHG) emissions across the AI value chain, raising urgent concerns about the technology's long-term sustainability and the environmental footprint of tech giants.

In response, new research from GlobalData outlines key strategies to decarbonise AI, focusing on improvements in energy sourcing, data centre infrastructure, and technology innovation.

The research group’s latest Strategic Intelligence report – GlobalData’s Decarbonizing AI Framework – identifies the areas where the energy use of AI-related activities can be reduced and outlines the strategies for achieving these reductions, thus enhancing the overall sustainability and operational efficiency of data centres (DCs).

Martina Raveni, Strategic Intelligence analyst at GlobalData, says: “Growing power demands for AI applications drive up data centres’ energy consumption and carbon footprint. Data centres have various options to shift to low-carbon energy. Renewable energy and small modular nuclear reactors are emerging as the most viable solutions for sustainable power generation.”

As AI and generative AI workloads grow more complex and resource-intensive, the energy required to train and run these models places a significant strain on data centre capacity.

Designing, building, and managing data centre facilities sustainably is therefore essential. Key strategies include siting data centres in cooler climates or in areas rich in renewable energy, and using liquid cooling systems in AI processing zones to lower power consumption.

“The most significant advances to improve AI model efficiency come from innovative chip manufacturing and design, small language models (SLMs) trained for specific industries and tasks, and optimising training and inference processes,” says Raveni. “On the training side, reinforcement learning and joint embedding predictive architecture (JEPA) reduce energy consumption. On the inference side, innovative techniques can reduce energy costs per query.”