How must data centres evolve to meet AI workloads?

By Darren Watkins, chief revenue officer at VIRTUS Data Centres.

Artificial Intelligence (AI) and Machine Learning (ML) are becoming mainstream, and they can't be avoided or ignored. This has a knock-on effect on the infrastructure that powers our lives, because these technologies only work if powerful computers can process millions of data points every second. As a result, the growth of AI and ML, along with the continuing increase in cloud and enterprise workloads, means the need for computing power is also growing - and the most efficient way to deliver computing power at scale is in data centres.

Demand for data centre space has been growing at an exponential rate for decades, driven by the digital economy. However, the overlay of AI and ML deployments is increasing demand further and faster across the world. Statista predicts that the data centre market will reach US$349.20bn in 2024, growing to US$438.70bn by 2028, causing data centre providers to re-evaluate their strategies.

Data centres have traditionally been the backbone of technological advancement, because computing power is the fuel of our technologically advanced society. Now they face the imperative to be more than infrastructure providers: their new, critical challenge is to deliver more than the essential network and infrastructure that supports data storage, management and cloud services in an always-on manner.

The rapid growth of AI and ML means that data centres need to be even more agile, innovative and collaborative to power this new era. This includes managing sustainable power at scale and implementing designs that support rapid, scalable AI deployments, whilst consciously aligning with values that benefit the data centre provider, the customer and wider society.

Location: where to build?

Many existing European hyperscale facilities simply aren't capable of meeting the short-term future demands of AI, and there is a shortage of the right type of supply, i.e. large-scale facilities with renewable energy that are close to, but not in, a major European metropolitan city. So where can data centres be built to accommodate this growing demand for computing power?

Over the past few years, location has been a very specific, deliberate choice, with the technological landscape meticulously mapped to minimise latency. Today, power-constrained central metropolitan areas and the integration of AI and ML workloads, which are less latency sensitive, are driving a shift in priorities. These advanced AI workloads challenge the traditional principles that dictated optimal data centre locations. The result is a profound reconsideration of what defines an ideal site, and new areas are opening up as potential locations for data centres.

This shift isn't about lessening the importance of low latency - it's about recognising the evolving needs of integrating AI and ML. The move towards larger campuses is a calculated strategy that acknowledges the non-linear cost relationship inherent in these operations; megascale campuses capable of 200-500MW can often afford providers - and therefore customers - greater efficiencies. This bold step challenges the long-standing industry norm, presenting a compelling argument that prioritising sheer scale and access to renewable energy over proximity can yield more efficient and sustainable outcomes.

Sustainability is even more important

Access to power, water and connectivity may be considered traditional requirements, and from a customer's perspective they remain unchanged. For data centre providers, however, the increase in computing power required to enable AI and ML workloads brings an increase in the power needed to operate facilities.

Worldwide, data centres consume about 200 terawatt-hours of electricity per year - more than some countries. And the forecast is for significant growth over the next decade, with some predicting that by 2030 computing and communications technology will consume between eight and 20 percent of the world's electricity, with data centres accounting for a third of that. This requires the technology industry and data centre providers to be even more innovative in lowering Power Usage Effectiveness (PUE) and Water Usage Effectiveness (WUE), and in turn reducing their reliance on diesel generators.
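Both metrics are simple ratios: PUE divides total facility energy by the energy delivered to IT equipment (so 1.0 is the theoretical ideal), and WUE divides site water use by IT equipment energy. The minimal Python sketch below illustrates the arithmetic; the annual figures are hypothetical and not drawn from this article.

```python
# Minimal sketch of the PUE and WUE ratios, using illustrative (hypothetical)
# annual figures for a single facility.

def pue(total_facility_energy_kwh: float, it_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return total_facility_energy_kwh / it_energy_kwh

def wue(site_water_litres: float, it_energy_kwh: float) -> float:
    """Water Usage Effectiveness: litres of water per kWh of IT equipment energy."""
    return site_water_litres / it_energy_kwh

if __name__ == "__main__":
    it_energy = 100_000_000        # 100 GWh of IT load over a year (hypothetical)
    facility_energy = 125_000_000  # 125 GWh drawn from the grid in total (hypothetical)
    water_used = 180_000_000       # 180 million litres of water in the same year (hypothetical)

    print(f"PUE: {pue(facility_energy, it_energy):.2f}")    # 1.25
    print(f"WUE: {wue(water_used, it_energy):.2f} L/kWh")   # 1.80
```

Lowering either number means delivering the same IT output with less overhead energy or water, which is exactly where design innovation pays off.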

Power must come from a renewable source and be used efficiently, which means the facility must be designed to be as efficient and sustainable as possible. Sourcing only 100% renewable energy and signing Power Purchase Agreements (PPAs) for dedicated solar and wind farms to power data centres are critical initiatives which the most sustainable data centre providers are embracing. In some countries, such as Germany, there are laws governing the PUE of data centres to drive responsible behaviour.

With regard to cooling, it is unfortunately impossible to beat the laws of thermodynamics; the heat generated by computer systems still needs to be removed from the facility, and that takes power. However, some methods are more efficient than others. Removing heat directly from the chip, for example, involves liquid rather than air - and it requires design changes to the infrastructure to enable this. Another alternative is immersion cooling, which is generally better suited to a bare-metal solution.
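A rough back-of-the-envelope comparison shows why liquid changes the picture: the heat a coolant can carry is its mass flow times its specific heat times the temperature rise, and water holds far more heat per litre than air. The sketch below assumes a hypothetical 1 kW heat load and a 10 K coolant temperature rise; the material properties are standard textbook values.

```python
# Back-of-the-envelope comparison of air vs water cooling for the same heat load.
# Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)

HEAT_LOAD_W = 1_000   # hypothetical 1 kW chip/module
DELTA_T_K = 10        # allowed coolant temperature rise

coolants = {
    # name:  (specific heat J/(kg*K), density kg/m^3)
    "air":   (1005.0, 1.2),
    "water": (4186.0, 998.0),
}

for name, (cp, rho) in coolants.items():
    mass_flow = HEAT_LOAD_W / (cp * DELTA_T_K)   # kg/s of coolant required
    volume_flow_l_s = mass_flow / rho * 1000     # litres per second
    print(f"{name:>5}: {mass_flow:.4f} kg/s, {volume_flow_l_s:.3f} L/s")

# Water needs over 3,000 times less volumetric flow than air for the same heat
# load and temperature rise, which is why direct-to-chip and immersion designs
# can cope with the rack densities that AI hardware demands.
```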

Reuse of waste heat has been in the headlines recently and is another way to improve efficiency. It is an interesting discussion, as traditional data centres do not produce heat of a high enough grade to be very useful. However, the higher-density solutions that support new AI and ML workloads will provide useful heat, and in some countries legislation is being introduced at municipal level to ensure investment in the capability to reuse waste heat.
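To give a sense of scale, essentially all of the electricity consumed by IT equipment ends up as heat, so the amount available for reuse is roughly the IT load times the hours of operation times whatever fraction the cooling system can actually capture. The sketch below uses hypothetical figures of my own choosing (a 50MW AI hall and a 70% capture fraction) purely for illustration.

```python
# Rough, illustrative estimate of annual waste heat available for reuse.
# Nearly all electricity consumed by IT equipment is converted to heat.

IT_LOAD_MW = 50          # hypothetical high-density AI hall
HOURS_PER_YEAR = 8760
CAPTURE_FRACTION = 0.7   # assumed share of heat recoverable via liquid cooling loops

recoverable_gwh = IT_LOAD_MW * HOURS_PER_YEAR * CAPTURE_FRACTION / 1000
print(f"Recoverable heat: ~{recoverable_gwh:.0f} GWh per year")
# ~307 GWh/year - and liquid-cooled, high-density halls can deliver it at higher
# temperatures than air-cooled rooms, which is what makes district-heating reuse practical.
```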

In this new era the industry is also placing unprecedented emphasis on the benefits a data centre can bring to the local community beyond waste heat reuse. This includes striving to build facilities that are harmonious with the local environment, reducing the negative aesthetics of data centre buildings, providing local employment and delivering potential upgrades to local infrastructure.

The spotlight on sustainability is not just about buzzwords; it is a strategic acknowledgment that data centres powered by renewable energy are integral to a future where efficiency and environmental consciousness go hand in hand. The technology industry and data centres must demonstrate a real commitment to sustainability and recognise the crucial role energy efficiency plays in the ongoing transformation of data centre operations. And the move towards larger campuses needs to align seamlessly with the imperative to reduce environmental impact.

It is clear that the data centre landscape is undergoing a profound evolution. The integration of AI and ML workloads, the redefinition of scalability, and the strategic development of AI-ready megascale campuses collectively mark a new chapter in the story of data centres. This is not merely about keeping up with demand; it's about steering a course towards a data-driven future that is as dynamic as it is sustainable.
