
AI datacentres take the nuclear option

By Martyn Warwick

Oct 22, 2024

A model of a small modular reactor designed by X-energy. Source: Amazon.

  • The US national grid is under increasing pressure from insatiable demand for power to run AI datacentres
  • The current model is already unsustainable and datacentres are approaching their practical size limits
  • The search is on for alternative power sources and methods to lessen the burden on the grid
  • The answer seems to be small modular reactors (SMRs), the focus of recent announcements from Google and Amazon

The power and utility of AI, be it unimodal or multimodal, is ultimately limited by the volume, and quality, of the data that is scraped, stored and manipulated to provide relevant and useful outputs to end users. The immense quantity of data required for AI systems to operate grows with every minute of every day, seemingly ad infinitum, and that in turn is putting ever-greater pressure on the resources required to support all that data processing. Nuclear power, it seems, is one of the ways to relieve some of that pressure.

The data underpinning generative AI (GenAI) has to be stored and processed somewhere, and that means the construction of bigger datacentres around the world, all of which have to access power sufficient to keep them cool (literally!), calm and collected. Datacentres are very expensive to build, operate and manage, and it is increasingly obvious that the current model supporting GenAI is becoming unsustainable and will have to be replaced. That will come at gigantic cost and huge disruption, but there will be no option in the medium to long term.

The data warehouses that enable the likes of ChatGPT are on the verge of exceeding their feasible size limits, Mark Russinovich, CTO at Microsoft Azure, told the Semafor news website recently, and he’s not alone. What is needed with some urgency, according to Russinovich, is a new model and method of connecting multiple datacentres so that they can handle the practically inordinate processing load that will characterise future generations of the GenAI we now take for granted and the emergence of the still-undefined and theoretical artificial general intelligence (AGI) of the future that may be capable of exceeding the intellectual capabilities of humanity – see Is multimodal AI a dead-end on the road to AGI?

AGI is a long way away and may never be realised. As things stand today, with the model we have and seem to be stuck with, even the most advanced AI is centred on large language models (LLMs) that are trained in datacentre facilities running hundreds of thousands of graphics processing units (GPUs), which can be chained together to run as a single gigantic computer.

Datacentre annual global energy consumption to double in the next five years

All of this requires astonishing levels of power. Research firm IDC expects global datacentre electricity consumption to more than double between 2023 and 2028, notching up a compound annual growth rate (CAGR) of 19.5% over the five-year period to reach 857 terawatt-hours (TWh) by 2028. (1 TWh is 1 billion kilowatt hours, roughly equivalent to the power needed to fully light 1 million average homes 24 hours a day for a full year.)
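IDC's figures hang together arithmetically: working backwards from the projected 857 TWh at a 19.5% CAGR implies a 2023 baseline of roughly 352 TWh, which the 2028 figure does indeed more than double. A minimal sanity check (the 2023 baseline is back-calculated here, not a figure quoted by IDC):

```python
# Sanity-check IDC's projection: 857 TWh by 2028 at a 19.5% CAGR over 5 years.
# The 2023 baseline is derived from those two numbers, not quoted by IDC.
cagr = 0.195
twh_2028 = 857.0

baseline_2023 = twh_2028 / (1 + cagr) ** 5   # ~352 TWh
growth_factor = twh_2028 / baseline_2023     # ~2.44x, i.e. more than double

print(f"Implied 2023 consumption: {baseline_2023:.0f} TWh")
print(f"Growth 2023->2028: {growth_factor:.2f}x")
```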

The amount of power required to perform massive AI operations is already putting immense strain on the ageing and often neglected national grid in the US, which is home to most of the world’s hyperscale datacentres. The megawattage that can be devoted to datacentre operations is close to its ceiling, and that is causing all manner of problems for the utility firms and the datacentre sector.

According to the analyst house Rystad Energy, headquartered in Oslo, Norway, total US electricity demand has been relatively stable at some 4,000 TWh since 2010, but that is now changing as the proliferation of giant datacentres and the increasing popularity of electric vehicles (EVs) drive demand for extra electricity generation. Rystad Energy’s research shows that the two sectors will add 290 TWh of new power demand by 2030.

Rystad Energy adds that electricity demand from datacentres and the chip foundries that supply them with GPU microprocessors will increase by a cumulative 177 TWh between 2023 and 2030, rising from the 130 TWh required in 2023 to a total of 307 TWh.

Meanwhile, the global management consultancy Bain & Company has published a report showing that “surging datacentre power consumption could require more than $2tn in new energy generation resources worldwide. Furthermore, US energy demand could outstrip supply within the next few years; meeting demand would require utilities to boost annual generation by up to 26% by 2028.” That would be a huge task.

The report adds: “Facing potentially overwhelming demand from datacentres, US utilities are grappling with a challenge few executives anticipated: After years of navigating flat or shrinking demand, many organisations have forgotten what it takes to grow. What’s more, they’re finding that capitalising on this new opportunity will require a rapid, unprecedented transformation of their operating and business models.”

It continues: “For nearly two decades, efficiency and operational nimbleness were paramount while US electricity demand plateaued as a result of advances in energy efficiency and distributed generation offsetting economic growth. But the era of stagnant demand is over. The late 2022 breakthrough in generative AI and the ensuing datacentre boom blindsided utilities just as demand was also rising because of repatriated manufacturing, industrial policy, and vehicle electrification.

“Now, US utilities are bracing for demand to outstrip supply during the next few years, with datacentres accounting for the bulk of the increase. Supply, both in generation and transmission, will take years to catch up… by 2028, utilities would need to increase annual energy generation by between 7% and 26% above the 2023 total to meet projected demand. That’s far beyond the largest five-year generation boost of about 5% that US utilities achieved from 2005 through 2023.”

The report also finds that datacentre annual global energy consumption could “double by 2027 from 2023 levels, growing at a compound annual rate of 10% to 24% and potentially surpassing 1 million gigawatt hours in 2027 [even greater than IDC’s projection]. Serving a 1-gigawatt datacentre requires the capacity of about four natural gas plants or around half of a large nuclear plant.”
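Bain’s headline numbers are internally consistent: a doubling of 2023 consumption by 2027 corresponds to a compound annual rate of about 18.9% over the four-year window, which sits inside the 10% to 24% band the report cites. A quick check (the four-year window is inferred from the 2023 and 2027 endpoints):

```python
# If datacentre energy use doubles between 2023 and 2027 (4 years),
# the implied compound annual growth rate is 2^(1/4) - 1.
years = 2027 - 2023
implied_cagr = 2 ** (1 / years) - 1   # ~18.9%

print(f"Implied CAGR for a 4-year doubling: {implied_cagr:.1%}")
# Falls inside the 10%-24% range Bain's report quotes.
assert 0.10 <= implied_cagr <= 0.24
```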

The Biden administration, well aware of the looming crisis in the US grid, signed into law the 2022 Inflation Reduction Act, which set aside $3bn for AI infrastructure such as new power cables, but that sum is a drop in the ocean of the investment required. To put things into perspective, in partnership with the US-headquartered BlackRock Inc, the world’s biggest asset manager, with $10tn to call on, Microsoft has launched its own $30bn AI infrastructure fund. The company has also been negotiating the re-opening of the Three Mile Island nuclear power plant – yes, the one that went into partial ‘China Syndrome’ meltdown back in March 1979 – and is simultaneously working to form partnerships to foster green energy initiatives. The problem is urgent and daunting.

Google says its first small modular reactor (SMR) will be in place and working by 2030

In his interview with Semafor, Microsoft Azure’s Russinovich said one answer to the problem might be constructing datacentres in multiple locations so as to spread the load across the power grids of different cities or regions. In a masterpiece of understatement, he said: “It would be technically challenging, but it may be necessary. I think it’s inevitable, especially when you get to the kind of scale that these things are getting to.”

However, connecting datacentre networks across long distances is fraught with problems, not least because maintaining the speed of AI data transfer across painstakingly synchronised fibre optic cable would be of paramount importance. The greater the distance, the higher the possibility that faults and mistimings would arise.

Certainly, datacentres are trying to reduce their energy consumption. For example, many are migrating from pumped-air systems to liquid cooling, but even liquid cooling draws a huge amount of power from the grid and becomes financially inefficient beyond a certain scale.

The only solution in the short to medium term appears to be nuclear power, and in the absence of any breakthrough in the quest for nuclear fusion, nuclear fission it will be, along with its attendant problems and dangers. 

In a recent blog, Google announced it had reached “the world’s first corporate agreement” to “purchase nuclear energy from multiple small modular reactors (SMRs) to be developed by Kairos Power” of California.

Water is used as a coolant in conventional nuclear reactors, but Kairos Power’s reactors use molten fluoride salts instead. The salts have exceptional chemical stability, a great capacity for transferring heat at high temperature, and the ability to retain fission products. Many studies have shown that molten fluoride salts are compatible with conventional high-temperature structural materials, such as stainless steel, and are ideal in terms of reliability and long operational life in a commercial environment.

The initial phase will see the first of Kairos Power’s SMRs in situ and operational by 2030. Thereafter, additional reactors will be deployed at intervals until 2035. Google says the deal “will enable up to 500 MW of new 24/7 carbon-free power to US electricity grids and help more communities benefit from clean and affordable nuclear power.”

Google’s announcement was quickly followed by news from Amazon that it has “signed three new agreements to support the development of nuclear energy projects – including enabling the construction of several new small modular reactors.” 

In Washington state, Amazon has struck an agreement with Energy Northwest, a consortium of state public utilities, for the development of four advanced SMRs, and it is also making an investment in X-energy, a developer of next-generation SMRs and their fuel: X-energy’s advanced reactor design will be used in the Energy Northwest project.

In addition, Amazon has signed an agreement with utility company Dominion Energy in Virginia, where many large datacentres are located, to “explore the development of an SMR project near Dominion’s existing North Anna nuclear power station. This will bring at least 300 megawatts of power to the Virginia region, where Dominion projects that power demands will increase by 85% over the next 15 years,” noted Amazon. 
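Dominion’s projection of an 85% demand increase over 15 years translates into a modest-sounding but relentless annual growth rate of roughly 4.2%. A small illustration (the annualised rate is derived from the quoted figures, not stated by Dominion):

```python
# Dominion projects Virginia power demand to grow 85% over 15 years.
# The equivalent compound annual growth rate is 1.85^(1/15) - 1.
total_growth = 1.85
years = 15

annual_rate = total_growth ** (1 / years) - 1   # ~4.2% per year
print(f"Implied annual growth: {annual_rate:.1%}")
```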

Matt Garman, CEO of Amazon Web Services (AWS), said: “Nuclear is a safe source of carbon-free energy that can help power our operations and meet the growing demands of our customers while helping us progress toward our Climate Pledge commitment to be net-zero carbon across our operations by 2040. One of the fastest ways to address climate change is by transitioning our society to carbon-free energy sources, and nuclear energy is both carbon free and able to scale – which is why it’s an important area of investment for Amazon. Our agreements will encourage the construction of new nuclear technologies that will generate energy for decades to come.”

It’s not an instant solution to a very pressing problem, but it is a start.

Martyn Warwick, Editor in Chief, TelecomTV
