Nvidia Enters The ChatGPT Craze; There Might Be A Shortage Again!

TECH NEWS – According to Nvidia, demand for its GPUs is outstripping supply, as tech giants prepare to deploy thousands of artificial intelligence chips.


ChatGPT and other language-, image- and video-generating tools rely on enormous processing power for their AI workloads, and this is where Nvidia’s strength lies. That’s why major tech companies building on ChatGPT (such as Microsoft) are turning to Nvidia GPUs to meet their growing AI needs. This could lead to a shortage of these graphics cards in the coming months.

FierceElectronics reported that OpenAI trained ChatGPT on around 10,000 Nvidia GPUs, but since the service caught the public’s attention, its systems have become overloaded. That’s why OpenAI introduced ChatGPT Plus, which costs $20 per month and gives you access to the servers even during peak load, plus faster response times and early access to new features and improvements. “It is possible that ChatGPT or other deep learning models could be trained or run on GPUs from other vendors in the future. However, Nvidia GPUs are currently widely used in the deep learning community due to their high performance and CUDA support. CUDA is a parallel computing platform and programming model developed by Nvidia, allowing efficient computation on its GPUs. Many deep learning libraries and frameworks, such as TensorFlow and PyTorch, have built-in support for CUDA and are optimized for Nvidia GPUs,” the site says.
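To make that concrete, here is a minimal PyTorch sketch (our own illustration, not from FierceElectronics) of the built-in CUDA support the quote describes: the same code transparently targets an Nvidia GPU when one is present and falls back to the CPU otherwise.

```python
import torch

# PyTorch's CUDA backend is used automatically once tensors and
# models are placed on a "cuda" device; otherwise we fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A toy model and batch, purely illustrative of the device-placement API.
model = torch.nn.Linear(768, 768).to(device)
inputs = torch.randn(32, 768, device=device)

outputs = model(inputs)
print(f"Computation ran on: {outputs.device}")
```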

Forbes reports that both Microsoft and Google will integrate a ChatGPT-like LLM (large language model) into their search engines. If Google were to add this technology to every search, it would need 512,820 A100 HGX servers with a total of 4,102,568 A100 GPUs, a capital expenditure of about $100 billion in network and server costs, which the site says will never happen. And Investing.com points out that instead of the 10,000 Nvidia GPUs used for the ChatGPT beta, the company is now up to 25,000: “We think that GPT 5 is currently being trained on 25k GPUs – $225 mm or so of Nvidia hardware – and the inference costs are likely much lower than some of the numbers we have seen. Further, reducing inference costs will be critical in resolving the ‘cost of search’ debate from cloud titans.”
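As a quick sanity check of those figures (our own back-of-the-envelope arithmetic, not from Forbes or Investing.com), the quoted hardware spend works out to roughly $9,000 per GPU:

```python
# Illustrative back-of-the-envelope check of the quoted numbers.
gpus = 25_000                 # GPUs reportedly training the next model
hardware_cost = 225_000_000   # "$225 mm or so of Nvidia hardware"
print(f"Implied cost per GPU: ${hardware_cost / gpus:,.0f}")  # ~$9,000

# Forbes's Google scenario: A100 HGX servers hold 8 GPUs each,
# so ~512,820 servers give the article's ~4.1 million GPUs.
print(f"GPUs in 512,820 servers: {512_820 * 8:,}")  # 4,102,560
```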

Gamers beware: Nvidia’s GPU supply will be sparse in the first three months of the year due to the Chinese New Year, which will affect high-end cards the most. And these cards offer better AI capacity than server hardware at a fraction of the price, so companies could buy them up…

Source: WCCFTech
