
Background information
AI boom: could SSDs and HDDs become the new graphics cards?
by Kevin Hofer
The announcement by OpenAI and Nvidia sounds insane: 100 billion US dollars for AI infrastructure that’ll consume an electricity output equivalent to ten nuclear power plants. No question, electricity is becoming the tech industry’s most valuable resource.
The numbers are staggering: 10 gigawatts correspond to between 4 and 5 million GPUs – equal to Nvidia’s total GPU shipments for this year. And while we’re still all getting used to gaming PCs with 1,000-watt power supplies, OpenAI is planning infrastructure totalling 10,000,000,000 watts. It would put all existing data centres in its shadow: those draw between 50 and 100 megawatts, i.e. 100 to 200 times less.
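The figures above can be sanity-checked with a quick back-of-envelope calculation. This sketch only uses the numbers from the article; the implied per-GPU power budget (which would include cooling and supporting hardware) is derived, not an official figure.

```python
# Back-of-envelope check of the article's figures.
TOTAL_POWER_W = 10e9             # 10 gigawatts of planned capacity
GPUS_LOW, GPUS_HIGH = 4e6, 5e6   # the article's estimate of GPU count

# Implied power budget per GPU (derived; includes cooling and infrastructure)
per_gpu_low = TOTAL_POWER_W / GPUS_HIGH   # 2000 W
per_gpu_high = TOTAL_POWER_W / GPUS_LOW   # 2500 W
print(f"Implied draw per GPU: {per_gpu_low:.0f}-{per_gpu_high:.0f} W")

# Comparison with today's large data centres (50-100 MW per the article)
dc_low_w, dc_high_w = 50e6, 100e6
ratio_low = TOTAL_POWER_W / dc_high_w     # 100
ratio_high = TOTAL_POWER_W / dc_low_w     # 200
print(f"Equivalent to {ratio_low:.0f}-{ratio_high:.0f} of today's data centres")
```

In other words, the plan would consume as much power as 100 to 200 of today's largest data centres combined.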
The financing is particularly circular: Nvidia is investing 100 billion dollars in OpenAI, which will then spend the money on Nvidia hardware. It’s as if a graphics card manufacturer were financing PC dealers who then buy its products exclusively. This way, Nvidia is cementing its quasi-monopoly in AI accelerators.
But even with an unlimited budget, the AI giants are coming up against physical limits: electricity is a finite commodity and power plants aren’t built in a day. In a world where AI infrastructure rivals the electricity needs of entire countries, energy is becoming a strategic resource.
The first gigawatt stage is scheduled to go online in 2026 – Nvidia’s Vera Rubin platform. In the AI era, the question of «How many AI accelerators do we need?» will probably be replaced by «How many nuclear power plants do we need?»
Sam Altman and Jensen Huang have announced a strategic partnership that’s massive even by tech industry standards. Their planned AI infrastructure is expected to draw 10 gigawatts – equivalent to the electricity requirements of multiple large cities. Nvidia’s CEO Huang put it in a nutshell: «This is a giant project.»
Tech companies are becoming power guzzlers because of AI, chasing gigawatt capacities and signing contracts with nuclear power plants. Microsoft has already reactivated a reactor, and Amazon is also investing in nuclear power.
Nvidia estimates that one gigawatt’s worth of data centre capacity will cost between 50 and 60 billion US dollars, 35 billion of which is intended for Nvidia hardware. Looks like 100 billion US dollars won’t be the final price tag, then. For ten gigawatts, a total investment of over 500 billion US dollars should be expected. By comparison, Switzerland’s gross domestic product amounted to around 800 billion US dollars in 2021.
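Scaling Nvidia’s per-gigawatt estimate to the full ten-gigawatt build-out makes the gap to the initial 100 billion obvious. A minimal sketch, using only the figures stated above:

```python
# Scaling Nvidia's per-gigawatt cost estimate to the full 10 GW build-out.
COST_PER_GW_LOW = 50e9       # USD, lower end of Nvidia's estimate
COST_PER_GW_HIGH = 60e9      # USD, upper end
NVIDIA_SHARE_PER_GW = 35e9   # USD earmarked for Nvidia hardware per GW
GIGAWATTS = 10

total_low = COST_PER_GW_LOW * GIGAWATTS      # 500 billion
total_high = COST_PER_GW_HIGH * GIGAWATTS    # 600 billion
nvidia_total = NVIDIA_SHARE_PER_GW * GIGAWATTS  # 350 billion

print(f"Total build-out: {total_low/1e9:.0f}-{total_high/1e9:.0f} billion USD")
print(f"Nvidia's share of hardware spend: {nvidia_total/1e9:.0f} billion USD")
# For scale: Switzerland's 2021 GDP was roughly 800 billion USD
```

So even at the low end, the project would cost five times the announced 100 billion, with 350 billion of it flowing straight back to Nvidia.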