
In a recent op-ed, Joe Craparotta of Schneider Electric identified many of the challenges faced by data centres as the use of AI skyrockets. Here are a few of his (lightly edited) thoughts.
Investments in generative AI reached US$25.2 billion in 2023, nearly nine times the amount invested in 2022, and approximately 20 times the funding seen in 2019.
This rapid growth presents data centre companies with opportunities to innovate, expand their service offerings, and cater to the evolving needs of AI-driven applications and enterprises. AI currently requires 4.3 GW of data centre power, projected to reach up to 18 GW by 2028. This surge surpasses current data centre power demand growth rates, presenting capacity and sustainability challenges.
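To put those figures in perspective, a quick back-of-envelope calculation shows the annual growth rate they imply. This is our own illustration, assuming the 4.3 GW figure is a 2024 baseline (so the 18 GW projection spans four years); the article itself does not state the baseline year.

```python
# Implied compound annual growth rate (CAGR) in AI data centre power demand,
# using the 4.3 GW and 18 GW figures quoted above.
# Assumption (ours): 4.3 GW is a 2024 baseline, giving four years to 2028.
start_gw = 4.3
end_gw = 18.0
years = 4

cagr = (end_gw / start_gw) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")  # roughly 43% per year
```

A sustained growth rate of that magnitude is well beyond the historical single- to low-double-digit annual growth of overall data centre power demand, which is the capacity challenge the article points to.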
AI workloads are expected to grow two to three times faster than legacy data centre workloads, representing 15 to 20% of all data centre capacity by 2028. More workloads will also start moving closer to users at the edge to reduce latency and enhance performance.
Currently, most data centres can only support rack power densities of about 10 to 20 kW; however, many AI implementations require large numbers of GPUs, pushing rack densities to anywhere from 25 kW to 120 kW, depending on the GPU model and quantity. Transitioning from low-density to high-density configurations can help address these challenges.
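A simple estimate shows why AI racks land so far above the 10 to 20 kW norm. The sketch below is illustrative only: the per-GPU draw, server overhead, and server counts are our assumed figures, not values from the article or any vendor specification.

```python
# Rough rack power density estimate for a GPU-dense AI training rack.
# All component wattages below are illustrative assumptions, not vendor specs.
GPU_WATTS = 700           # assumed draw for one high-end accelerator
GPUS_PER_SERVER = 8
SERVER_OVERHEAD_W = 2000  # assumed CPUs, memory, NICs, fans per server

def rack_density_kw(servers_per_rack: int) -> float:
    """Total rack power in kW for a given number of GPU servers."""
    server_w = GPUS_PER_SERVER * GPU_WATTS + SERVER_OVERHEAD_W
    return servers_per_rack * server_w / 1000

for n in (2, 4, 8):
    print(f"{n} servers/rack -> {rack_density_kw(n):.1f} kW")
```

Even a modest two-server configuration under these assumptions exceeds what many existing racks can support, and denser configurations quickly reach the upper end of the 25 to 120 kW range quoted above.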
With such substantial increases in power consumption, AI data centres will generate significant heat, necessitating the use of liquid cooling to ensure optimal performance, sustainability, and reliability. Liquid cooling offers many benefits here, including higher energy efficiency, a smaller footprint, lower total cost of ownership (TCO), enhanced server reliability, and lower noise levels.
Clearly, it's crucial to evaluate AI's broader impact on energy consumption and the environment. Gartner predicts that 80% of CIOs will have performance metrics tied to the sustainability of the IT organisation by 2027. According to the 2024 Sustainability Index, nearly one in 10 business decision-makers around Australia are already using AI as a resource for decarbonisation transformation.
Data centres operate with significant energy demands, posing challenges to environmental sustainability. Optimising energy efficiency, lowering carbon emissions, and enhancing operational resilience are essential to enable data centres to operate responsibly, fostering a more sustainable future.
The demand for AI and the evolution of the data centre are interconnected elements shaping the digital landscape. Increased workloads, especially deep learning AI models, require significant computing resources to train, which in turn demands data centres that can support the performance requirements of AI workloads.
As AI technology advances, it will continue to influence the design and operation of data centres. While these advancements bring efficiency and innovation, they also pose challenges related to energy consumption and to power and cooling systems.