TECH NEWS – Elon Musk’s company has huge expectations, but considering that in the first quarter of 2024 alone Tesla reported a 130% increase in the computing capacity it uses for training artificial intelligence, the company may well be able to reach that goal.
The company is hoping for a growth rate of nearly 500% for the year. That is certainly a lot, but Elon Musk’s ambition is perhaps understandable. Tesla has somewhere between 30 and 35 thousand Nvidia H100 GPUs, and the company said in its quarterly report that its total AI computing capacity had grown to the equivalent of nearly 40,000 H100s. Musk, replying to a tweet that has since been deleted, claimed that a correct measurement would put Tesla second and xAI third, presumably in terms of computing power.
In January, a $500 million investment (the equivalent of about 10,000 H100 GPUs) was announced for Tesla’s Dojo supercomputer, and at the time Musk said even more would be spent on Nvidia hardware during 2024, since in his view the table stakes for staying competitive in AI run to several billion dollars per year. By the end of 2024, Tesla plans to increase its AI computing capacity by 467%, which works out to about 85,000 Nvidia H100 equivalents.
This is not accurate. Tesla would be second highest and X/xAI would be third if measured correctly.
— Elon Musk (@elonmusk) April 8, 2024
TESLA $TSLA A.I. training capacity jumped from roughly 15,000 H100 GPU equivalents at the end of 2023, to nearly 40,000 H100s in Q1 2024.
By the end of this year, @elonmusk says Tesla will have 85,000 Nvidia H100 equivalents powering Tesla's A.I. program (nearly +500% Y/Y 🤯) pic.twitter.com/7i8tewGKml
— Stock Talk (@stocktalkweekly) April 23, 2024
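A quick back-of-the-envelope check of the year-end figure is straightforward. The sketch below simply uses the H100-equivalent numbers quoted in the tweets above (roughly 15,000 at the end of 2023 and a planned 85,000 at the end of 2024); the figures themselves are the quoted ones, not independently verified.

```python
# Back-of-the-envelope check of the growth figure quoted above,
# using the H100-equivalent numbers from the tweets (not independently verified).
baseline_end_2023 = 15_000    # H100 GPU equivalents at the end of 2023
target_end_2024 = 85_000      # H100 GPU equivalents planned for the end of 2024

growth_pct = (target_end_2024 - baseline_end_2023) / baseline_end_2023 * 100
print(f"Planned year-over-year growth: +{growth_pct:.0f}%")  # ~+467%, i.e. "nearly 500%"
```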
Because of this aggressive expansion, Tesla has had to sacrifice its free cash flow: in the first quarter of 2024 it came in at negative $2.5 billion, driven by a $2.7 billion increase in inventory and a $1 billion investment in AI infrastructure. Nvidia’s plans could change the picture, though: the company is set to release a new superchip, the GB200 Grace Blackwell, sometime this year. It promises far more computing power (it is designed to handle AI models with up to 27 trillion parameters and to deliver chatbot responses up to 30 times faster), presumably at a higher price, and probably with higher power consumption, a concern that has been in the news recently.
Musk is going all in on AI.