As the demand for AI chips increases with the popularity of large language models, so does the demand for computing memory.
In response to this demand, South Korea’s SK Hynix, which supplies memory chips to AI vendors such as Nvidia, said it plans to invest about $75 billion in AI chips through 2028.
The move on June 30 comes after South Korea announced last month that it would provide about $19 billion in financial assistance to its chipmakers.
SK Hynix plans to invest about 80% of the $75 billion in high-bandwidth memory (HBM) chips.
High-bandwidth memory chips consume less energy than other memory chips, and their DRAM dies can be stacked vertically, which increases capacity and bandwidth for AI workloads.
Demand for memory chips
Memory chip providers including Micron, Samsung and SK Hynix have sold out of their HBM chips for 2024 and 2025.
Demand for AI memory chips keeps growing, especially as AI technology is integrated into sectors such as mobile devices and contact centers.
“Fundamentally, we’re going to see continued demand and growth in this space,” said Dan Newman, an analyst at Futurum Group. “We’re seeing a front-loaded surge in demand for AI chips.”
Even with major companies like Google, Meta and Microsoft buying thousands of GPUs from Nvidia, AI systems still need plenty of memory, Newman added.
“Everybody’s talking about whether they can get enough Nvidia chips, but they also need access to enough memory,” he said. “As we continue to train larger models and power AI in more applications, we’re going to need access to the compute and memory that we need to do that.”
This nearly unquenchable demand for compute and memory is what is driving SK Hynix and competitors including Samsung and Micron to expand production of AI memory chips.
New investments
For example, Micron is building an HBM test and production line in Idaho. Micron also disclosed that it received $6.1 billion from the U.S. CHIPS and Science Act.
Samsung has also reportedly decided to start construction of a new semiconductor factory after delays.
“The war over memory chips is happening because ultimately AI needs memory chips,” said R. “Ray” Wang, founder of Constellation Research. “As AI sales increase, HBM will increase.”
Ultimately, the interest in memory chips shows that rapid developments in generative AI technology will benefit not just vendors like Nvidia, Intel and AMD, but also those that supply power and memory to data centers.
“This is an opportunity,” Newman said. “Memory is in the best position of any chip segment right now.”
Esther Ajao is a news writer and podcast host at TechTarget Editorial covering artificial intelligence software and systems.