AI, Energy, and the DeepSeek Disruption: Have We Been Planning for the Wrong Future?
For months, the consensus among policymakers, investors, and energy market analysts has been clear: AI is going to consume staggering amounts of electricity. Governments, including the UK's, have been preparing for a future in which data centers are as critical to national infrastructure as power plants or transport networks. The reasoning is simple: AI models require vast computational power, and the energy needed to train and run these systems will drive electricity consumption through the roof.
The UK government has already classified data centers as Critical National Infrastructure, anticipating that AI's energy demand could rise from 1% to 6% of total UK electricity use by 2030. Meanwhile, projects like Stargate, a $500 billion AI data center initiative, signal that industry leaders expect data processing needs to keep scaling exponentially, requiring vast new infrastructure.
But what if that assumption is completely wrong?