TECH NEWS – Samsung and SK Hynix are raising prices on SSDs and DRAM. The cost of HBM3 has reportedly risen up to fivefold thanks to the Nvidia-ChatGPT duo.
We recently reported on how Nvidia stood to benefit from OpenAI’s ChatGPT, highlighting how companies could snap up its graphics cards for their AI performance. DRAM vendors are now reacting: they have started selling their high-bandwidth memory (HBM) at higher prices. These chips are needed to build Nvidia’s AI GPUs.
BusinessKorea, a South Korean site, reports that SK Hynix and Samsung have raised the prices of their memory components, with HBM commanding a particular premium. Nvidia has reportedly asked SK Hynix to increase HBM3 production. However, Intel is also planning to offer products using the memory, so it is unclear whether SK Hynix’s capacity can meet demand. As a result, it could charge up to five times more for HBM3.
“The advent of ChatGPT, an artificial intelligence (AI) chatbot, provides opportunities for Korean memory semiconductor makers to create a new business. ChatGPT learns a lot of data through super-large artificial intelligence (AI) and answers questions naturally. DRAM data processing speed has become important for better and faster ChatGPT services. Korean companies are producing all of the high-performance DRAMs essential for this. Nvidia, the world’s largest GPU company, has been asking SK Hynix to supply the latest product, HBM3 chips. Intel, the world’s No. 1 server CPU company, is also working hard to sell products equipped with SK Hynix’s HBM3. An industry insider said, ‘The price of HBM3 increased up to five times compared to the highest performance DRAM,’” the site wrote.
HBM2 and HBM2e, the two older standards found in Nvidia’s Volta and Ampere GPUs, are also suited to AI workloads and are in high demand. Since SK Hynix holds 60-70% of this market, price increases there cannot be ruled out either…