Data centre planning for an AI-dominated future

Operators wanting to make the most of 5G are embracing AI and newer RAN strategies like Cloud RAN and vRAN. Further down the line, operators are looking at developing network-as-a-service (NaaS) offerings. How much data capacity will this require?

Right now, we don’t know for certain. But we can reasonably guess that it will mean a lot more data centres. So, what sort of data centres can deliver the necessary compute power to support operators’ long-term strategies? And what will it mean for ongoing attempts to address carbon emissions if data centre growth undermines plans for carbon neutrality?

One view is that telcos will need access to new, massive hyperscale data centres to meet future demand. But that takes time: the builder has to secure power, buy land, and sign contracts for construction, a process that could take years, after which demand might no longer justify the wait. That is worth remembering: it is by no means certain that data demand will keep skyrocketing for years to come.

So, there needs to be a shorter-term option.

In fact, there is. Operators can test demand over much shorter timescales by taking compute from someone else: buying capacity from a pool of established regional data centres, or building their own, very small ones. These micro edge or localised data centres are an option, alongside, or even instead of, a hyperscale facility.

This is already being demonstrated in Germany, where 1&1 AG successfully launched Europe's first mobile network built on Open RAN technology in 2023.

“Four core data centres, 24 decentralised data centres and over 500 edge data centres are being created in the 1&1 Open RAN. They are connected to the 1&1 antenna sites via fibre optic cables over distances not exceeding ten kilometres. Solely gigabit antennas mounted on slim antenna masts are used in the 1&1 Open RAN. Applications running in the network benefit from extremely short transmission paths, rendering the 1&1 Open RAN ready for real-time applications without any further adaptations.”[1]

A similar example comes from Africa, where Amazon Web Services (AWS) has launched its second Wavelength Zone edge location on the continent – in Senegal, to be precise, in partnership with Orange. Wavelength embeds AWS compute and storage services at the edge of communications service providers’ 5G networks. It minimises latency and the number of network hops required to connect from a 5G device to an application hosted on AWS, because application traffic can reach those servers without leaving the mobile provider’s network. Amazon’s compute and storage services sit within the telco provider’s own data centres.

These are not necessarily alternatives to long-term investment; they could work alone or alongside a hyperscale play. They do, however, allow telcos to get data-reliant, AI-based services up and running more quickly – and, importantly, with minimal latency. Meanwhile, telcos can (tentatively) sign up for a larger facility in the longer term and see whether AI demand eventually justifies exercising the option on that location.

It’s a multi-location strategy: hedging bets rather than waiting for a single site to be built, and pooling available resources as required. As long as the pools are large enough, a telco does not have to find extra capacity at short notice.
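Why pooling delivers that safety margin can be sketched with a quick simulation: independent demand peaks across sites rarely coincide, so a shared pool needs less total headroom than provisioning every site for its own worst case. The site count, demand range and distribution below are illustrative assumptions, not real traffic figures.

```python
# Sketch: pooled capacity vs per-site provisioning, under assumed
# (illustrative) demand figures -- not real network traffic data.
import random

random.seed(0)
N_SITES = 20       # regional data centres sharing one capacity pool
SAMPLES = 10_000   # simulated time intervals

# Assume each site's compute demand varies between 10 and 100
# arbitrary units, independently of the others.
pooled_peak = 0.0
for _ in range(SAMPLES):
    demands = [random.uniform(10, 100) for _ in range(N_SITES)]
    pooled_peak = max(pooled_peak, sum(demands))

# Provisioning every site for its own worst case means 100 units each.
standalone_capacity = N_SITES * 100

print(f"Standalone provisioning: {standalone_capacity} units")
print(f"Pooled peak observed:    {pooled_peak:.0f} units")
```

Because the individual peaks do not line up, the observed pooled peak comes in well under the 2,000 units that per-site provisioning would require – which is the statistical intuition behind buying from a shared pool rather than overbuilding each location.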

Meanwhile, looking at a variety of options could also cut emissions. There’s already much talk about connecting a large AI data centre or hyperscale facility to a district heat system as an alternative to gas heating for local housing. At around 50 MW, such a data centre would certainly throw out a lot of heat, so reusing it this way could be a positive move in environmental terms, though it would require significant infrastructure.

But an even more compelling approach could be a network of 1 MW micro edge data centres housed in six-metre shipping containers, each producing 250 kilowatts of usable heat that could then serve, for example, a boiler within an old people’s home. Or one that gives free heat to a swimming pool and gets free cooling from the same source in return. That last example, incidentally, could make micro data centres running on renewable energy carbon negative.
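The arithmetic behind that comparison is simple enough to lay out. The recovery fraction below is an assumption implied by the text’s figures (250 kW of usable heat from a 1 MW container, i.e. roughly 25%); only the 1 MW, 250 kW and 50 MW numbers come from the text itself.

```python
# Back-of-envelope heat-reuse comparison using the article's
# illustrative figures. The 25% recovery fraction is an assumption,
# not an engineering value.

def recoverable_heat_kw(it_load_kw: float, recovery_fraction: float) -> float:
    """Heat usefully recoverable from a data centre of a given IT load.

    Nearly all electrical input ends up as heat; only a fraction is
    practical to capture at temperatures useful for district heating.
    """
    return it_load_kw * recovery_fraction

# One 1 MW micro edge container at the text's 250 kW usable-heat figure.
micro_heat = recoverable_heat_kw(1_000, 0.25)        # 250 kW

# A ~50 MW hyperscale site at the same assumed recovery fraction.
hyperscale_heat = recoverable_heat_kw(50_000, 0.25)  # 12,500 kW

# Containers needed to match the hyperscale site's recoverable heat.
containers_for_parity = hyperscale_heat / micro_heat  # 50

print(f"Per micro container:    {micro_heat:.0f} kW")
print(f"Hyperscale equivalent:  {hyperscale_heat:.0f} kW")
print(f"Containers for parity:  {containers_for_parity:.0f}")
```

The point of the sketch is distribution, not totals: fifty containers can put their 250 kW each exactly where a boiler or a pool already sits, whereas the single 50 MW site needs district-heat infrastructure to move the same energy to its users.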

But that still leaves quite a few areas of uncertainty. Is a strategy of pooling to support AI and RAN compute needs going to work for you? Would micro data centres be a worthwhile investment? When could NaaS be a saleable option? When and how could hyperscale data centres be a consideration? Could heat transfer really mitigate the carbon emissions that come from growth?

And there are other, less immediately urgent considerations that could become relevant over time. Will new chipset designs change the heat equation? Will battery energy storage solutions become more viable? How do you avoid underinvesting when you need to manage spikes in compute demand? Can you sell excess capacity when demand is low – at night, say?

There isn’t a simple answer to any of these questions. To find out what works for you means having the right information and asking the right questions at the right time.

These are conversations Real Wireless has every day: with data centre providers, AI operators, service providers, network planners, engineers, economists and regulators. We know how difficult it is to find the answers you need to ensure a viable compute strategy. But we also know the right questions to ask to make it more likely.

Posted by Paul Rhodes
