It’s easy to say that 2023 was the year of AI, or of generative AI to be more specific. Adjusting to its demands – and exploiting its potential – will continue to be a big ask for the datacentre industry this year, compounded by the need for stable energy supply, effective cooling and, potentially, alternative energy sources.
By Faith Waithaka, cloud and service provider segment sales lead: Anglophone Africa at Schneider Electric
As it stands, enterprises continue to adopt cloud computing and pursue digital transformation. However, for each new standard datacentre built, Schneider Electric believes there will be three additional AI datacentres offering higher server densities.
According to our recently released paper, The AI Disruption: Challenges and Guidance for Data Center Design, AI is estimated to represent 4.5 GW of power demand today, projected to grow at a CAGR of 25% to 33% and to reach a total of 14 GW to 18.7 GW by 2028. That is two to three times the roughly 10% CAGR forecast for overall datacentre power demand.
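As a quick sanity check, those bounds follow directly from compound annual growth over the five years from 2023 to 2028. The short sketch below is an illustrative calculation only; the starting figure, growth rates and horizon are taken from the projection above, not from any published model.

```python
# Illustrative compound-growth check on the AI power demand projection above.
# Assumed inputs: 4.5 GW of demand in 2023 and a CAGR of 25% (low case) to 33% (high case).

base_demand_gw = 4.5        # estimated AI power demand today (GW)
years = 2028 - 2023         # projection horizon in years

for cagr in (0.25, 0.33):
    projected_gw = base_demand_gw * (1 + cagr) ** years
    print(f"CAGR {cagr:.0%}: ~{projected_gw:.1f} GW by 2028")

# Prints roughly 13.7 GW at 25% and 18.7 GW at 33%, matching the 14 GW to 18.7 GW range.
```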
Our whitepaper outlines several key considerations, addressing the four physical infrastructure categories: power, cooling, racks and software tools.
But that’s not the only way AI will impact this expanding industry. Schneider Electric predicts that AI will also enable operators to better manage their assets.
This year we’ll see AI-driven tools that provide greater intelligence to utility and backup power control, cooling control, cross-domain operational optimisation (digital twins), datacentre design and construction, maintenance, robotics, and more.
Alternative energy in datacentres
Diesel is often the go-to energy source when the country experiences power supply challenges, particularly with major solar installations still underway at datacentres across the country.
Green (renewable) diesel has emerged as an option. Produced from cellulosic biomass such as crop residue, forestry waste or woody biomass, it has chemical properties identical to petroleum diesel. It can therefore be used in its pure form or blended with petroleum diesel as a viable alternative energy resource.
Liquid cooling – a feasible option
With datacentre workloads ever-increasing due to the AI requirements outlined above, average rack power draw has risen considerably. And with more power draw comes more waste heat per rack and per unit of floorspace.
Until recently, when racks consumed up to 20kW, air-based cooling could be relied on to keep IT hardware operating safely and efficiently. But as some racks begin to exceed 30kW, new cooling approaches are needed.
Liquid cooling leverages the superior thermal transfer properties of water and other fluids to cool high-density racks efficiently and cost-effectively, and can be up to 3000 times more effective than air.
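To illustrate why that difference matters, the short sketch below estimates the coolant flow needed to remove 30kW of waste heat with air versus water, assuming a 10°C temperature rise across the coolant. The rack size, temperature delta and fluid properties are illustrative assumptions rather than figures from the whitepaper.

```python
# Illustrative comparison of the coolant flow needed to carry away a rack's waste heat.
# Assumptions (not from the whitepaper): a 30kW rack, a 10 degC coolant temperature rise,
# and typical room-condition properties for air and water.

HEAT_LOAD_W = 30_000   # rack waste heat to remove (W)
DELTA_T_K = 10.0       # allowed coolant temperature rise (K)

fluids = {
    # name: (density in kg/m^3, specific heat in J/(kg*K))
    "air": (1.2, 1005.0),
    "water": (997.0, 4180.0),
}

for name, (density, specific_heat) in fluids.items():
    # Energy balance Q = density * flow * cp * deltaT  =>  flow = Q / (density * cp * deltaT)
    flow_m3_per_s = HEAT_LOAD_W / (density * specific_heat * DELTA_T_K)
    print(f"{name}: {flow_m3_per_s:.4f} m^3/s")

# Air needs roughly 2.5 m^3/s of flow; water needs only about 0.0007 m^3/s (around 43 L/min),
# a gap of several thousand times that shows why liquid cooling scales to dense AI racks.
```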
Ultimately, our whitepaper explores the intersection of AI and datacentre infrastructure, addressing key considerations such as:
* Guidance on the four key AI attributes and trends that underpin physical infrastructure challenges in power, cooling, racks and software management.
* Recommendations for assessing and supporting the extreme rack power densities of AI training servers.
* Guidance for achieving a successful transition from air cooling to liquid cooling to support the growing thermal design power (TDP) of AI workloads.
* Proposed rack specifications to better accommodate AI servers that require high power, cooling manifolds and piping, and a large number of network cables.
* Guidance on using datacentre infrastructure management (DCIM), electrical power management system (EPMS) and building management system (BMS) software to create digital twins of the datacentre and support operations and asset management.
* Future outlook of emerging technologies and design approaches to help address AI evolution.