TECH NEWS – Nvidia’s current dominance will not last forever, and an analyst has pointed to another growing problem, one that is not limited to AI-related products.
Chosun, a South Korean publication, reports that Nvidia’s AI accelerators are in widespread use, but many fear this will lead to very high power consumption. Nvidia’s A100 and H100 GPUs draw a lot of power, and data centers run thousands upon thousands of them, so the electricity bill is far from cheap, and the environmental impact is significant as well.
According to a semiconductor (chip) analyst, global data center power consumption will grow by 85–134% by 2027, roughly as much as the Netherlands, Argentina and Sweden consume. If Nvidia keeps pushing such power-hungry GPUs to the forefront, lower-power alternatives could gain traction and significantly reshape the market, challenging the dominance of the “greens”.
Nvidia’s AI accelerators can consume up to 13,000 GWh per year. One GWh equals 1 million kWh, and the average U.S. household used 10,791 kWh in 2022. The consumption of Nvidia’s products could rise significantly with the introduction of the Blackwell GPUs, which can draw up to 1,000 W (1 kW) in some configurations (the GeForce RTX 4090 already consumes a staggering 450 W, enough to power an entire PC!).

There has also been talk of nuclear-powered data centers, a project that Microsoft and OpenAI are working on. That is still several years away, so Nvidia will have to find some sort of interim solution if power consumption keeps climbing like this. (The same can be said of Intel’s processors.) The company will lose its monopoly over time for other reasons as well: apart from the fact that nothing lasts forever, the performance of competitors and the passivity of companies in the supply chain will also play a role…
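As a rough sanity check of the figures quoted above (treating the reported 13,000 GWh per year and the 2022 U.S. household average as given), that consumption works out to roughly 1.2 million average American households:

```python
# Back-of-the-envelope check of the consumption figures quoted in the article.
# The inputs are the reported/assumed values from the text, not measured data.
reported_gpu_consumption_gwh = 13_000   # Nvidia AI accelerators, GWh per year (reported)
kwh_per_gwh = 1_000_000                 # 1 GWh = 1 million kWh
us_household_kwh_2022 = 10_791          # average U.S. household, kWh per year (2022)

total_kwh = reported_gpu_consumption_gwh * kwh_per_gwh
equivalent_households = total_kwh / us_household_kwh_2022

print(f"{total_kwh:,.0f} kWh per year")                    # 13,000,000,000 kWh
print(f"≈ {equivalent_households:,.0f} U.S. households")   # ≈ 1,204,708 households
```

By the same conversion, a single 1 kW Blackwell-class GPU running around the clock would draw about 8,760 kWh a year, not far from the average household figure quoted above.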