Data centres will put increasing demands on Australia’s energy grid as the rise of AI computing continues. There is a real risk of power shortages for data centre operators as Australia transitions towards renewable energy infrastructure.
Ben Crowe, associate director for cloud and colocation at Vertiv in Australia and New Zealand, said providers of large data centre campuses could consider building their own localised source of power or microgrid to improve data centre resilience.
Operators will also have to think carefully about future infrastructure design, as the shift to AI computing, with denser, more powerful and hotter racks, demands building in new liquid cooling technologies alongside traditional air cooling.
There is no clear information telling Australians how much energy the local data centre industry uses, according to the CSIRO. The International Energy Agency estimated in 2020 that data centres account for 1–1.5% of global electricity use, a share that has changed little over the past decade.
But data centre energy usage could be about to spike. While advances in energy efficiency have offset growth in usage in recent times, Gartner recently predicted that the appetite for AI computing power could see AI alone consume 3.5% of the world’s electricity by 2030.
Crowe said power demand from Australian data centres has been growing. While a big data centre facility a few years ago consumed 5 megawatts of power, large multi-building campuses now account for 350 megawatts, adding up to a big demand for energy.
AI will demand even more energy. The current average rack density in local data centres, or the amount of power a single rack consumes, is about 10 kilowatts. AI systems will push this to 40 kilowatts, four times the current figure.
“The demand is growing pretty substantially, and it’s probably going to go higher,” Crowe said.
The growth in data centre power demand, just when Australia is moving to phase out baseload power generation from coal, could lead to risks for power consumers, including data centre customers. For instance, there have been recent warnings of blackouts over the summer of 2023.
SEE: AWS and Equinix offer strategies to mitigate the impact of data centre outages in Australia.
Crowe said he was not expecting blackouts for the next year or two because a lot of baseload power facilities remain online. However, he warned that there could be implications for data centre customers in the future if local investment in renewable power does not keep up.
“We could see a higher number of days with high heat, with a lot of air conditioners turned on,” said Crowe. “Power authorities might start reaching out to data centres to tell them to turn on their on-premises power generation technology because they can’t supply the energy.”
Crowe said that, while each data centre customer’s requirements and provider agreements will differ, data centres are ultimately designed to Uptime Institute Tier III and Tier IV standards, which means they have the capacity to run facilities on backup power generation.
While the risk to customer data is minimal, Crowe said that in many cases this would require using dirtier fuel sources such as diesel to run generators, which would not be the most sustainable outcome for either data centre providers or their environmental, social and governance-conscious customers.
The local data centre market may start to build more localised power storage facilities, or microgrids, near data centre infrastructure that has a lot of energy demand. These technologies will reduce reliance on the energy grid, building further resilience into data centre facilities.
Along these lines, Tesla famously helped stabilise South Australia’s energy grid with a massive lithium-ion battery, while this year Western Australia is using a solar and battery microgrid to power its largest rail infrastructure project, avoiding diesel backups and ensuring power delivery.
The CSIRO said the local data centre industry was increasingly considering developing microgrid technologies, which could enable them to provide energy to the grid or store it, while safeguarding their own supply. Vertiv recently unveiled a microgrid offering in the U.S.
Australia needs to build more than 10,000 kilometres of transmission lines to connect new renewable energy generation, according to the Australian Energy Market Operator. Delays in building these networks as coal-fired power stations reach their use-by dates could lead to power shortages or blackouts, the very thing new renewable energy is meant to mitigate.
AI computing will generate much more heat within data centres. While air conditioning and cooling have played a big role in the industry to date, Crowe said the “laws of thermodynamics” mean technologies like immersion or direct-to-chip cooling will need to be considered.
Immersion and direct-to-chip cooling methods, which pass liquids closer than ever before to equipment to absorb and disperse the heat generated by computing, will allow data centres to cool server equipment that is operating at higher temperatures more effectively.
Crowe said that, while there will always be a requirement for air cooling technologies as part of the cooling mix, air cooling is always going to have limitations.
“Water, or liquid in general, is a better medium for transferring heat,” said Crowe. “For higher heat load technologies, generally, a liquid cooled solution is probably going to be more prevalent and suitable. I think we’re going to see more adoption in liquid technologies, but we’re still going to see a lot of air-cooled technology for the foreseeable future.”
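Crowe’s point about liquid being a better heat-transfer medium can be illustrated with a back-of-the-envelope comparison of the volumetric heat capacity of water and air. The figures below are standard textbook values at room temperature, not data from Vertiv or the article:

```python
# Rough comparison of water vs air as a heat-transfer medium,
# using standard textbook properties at around room temperature.
# Illustrative only; real cooling design involves flow rates,
# pressure drops and approach temperatures, not just heat capacity.

AIR_DENSITY = 1.2           # kg/m^3
AIR_SPECIFIC_HEAT = 1005    # J/(kg*K)
WATER_DENSITY = 1000        # kg/m^3
WATER_SPECIFIC_HEAT = 4186  # J/(kg*K)

# Volumetric heat capacity: energy absorbed per cubic metre per degree.
air_vol_heat = AIR_DENSITY * AIR_SPECIFIC_HEAT        # ~1.2 kJ/(m^3*K)
water_vol_heat = WATER_DENSITY * WATER_SPECIFIC_HEAT  # ~4.19 MJ/(m^3*K)

ratio = water_vol_heat / air_vol_heat
print(f"Water absorbs ~{ratio:,.0f}x more heat per unit volume per degree")
# → Water absorbs ~3,471x more heat per unit volume per degree
```

The roughly three-and-a-half-thousand-fold gap is why high-density AI racks push operators towards liquid, even as air cooling remains in the mix.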
The data centre market will need to think carefully about design and strategy as some facilities reach end of life. While newer data centres will be easier to adapt for AI, Crowe said it will be interesting to see how the market as a whole balances a mix of technologies to manage AI.
SEE: Explore these best practices for data centre migration.
According to Crowe, this will include considerations like what mix of air and liquid cooling technologies to install to manage the heat of AI computing and whether facilities will need to be upgraded to handle the higher density and heavier equipment that will need to be housed in each hall.
“You’ve still got a lot of demand at the moment for utilising existing technologies, which are really good,” Crowe said. “But there’s a lot of conversation now going on around whether a shift might be necessary from some data centres in two to three years’ time.
“People are then saying, ‘If we’re going to a liquid or immersion cooling solution, how do we build that into our current facilities? What kind of designs and structures do we have to make to make all that work?’ So yeah, it’s a really interesting place at the moment.”
AI is due to drive power consumption within single data centre racks from about 10 kilowatts to 40 kilowatts. This could see data centre providers butting up against the energy consumption limits agreed with power authorities, even while using less space within their facilities.
“If the power authority has given you 40 megawatts of capacity, that full amount of power can now get utilised within one data hall,” Crowe said. “That’s going to be another challenge — companies getting more power into their buildings, asking for more power from the power authority.”
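The arithmetic behind Crowe’s hall-capacity point can be sketched with the figures quoted above (a 40-megawatt site budget and 10 kW versus 40 kW racks); the script itself is just an illustration:

```python
# With a fixed power budget from the utility, higher-density AI racks
# exhaust that budget in far fewer racks, and far less floor space.
# Figures (40 MW budget, 10 kW vs 40 kW racks) are from the article;
# the calculation is illustrative and ignores cooling/overhead power.

SITE_BUDGET_KW = 40_000  # 40 megawatts granted by the power authority

for rack_kw in (10, 40):  # traditional vs AI-era rack density
    racks = SITE_BUDGET_KW // rack_kw
    print(f"At {rack_kw} kW per rack: {racks:,} racks use the full budget")
# → At 10 kW per rack: 4,000 racks use the full budget
# → At 40 kW per rack: 1,000 racks use the full budget
```

In other words, the same grant of grid capacity that once powered an entire campus of traditional racks can now be consumed by a single AI-dense hall.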