Nabka News

Tech

AI memory is sold out, causing an unprecedented surge in prices

By i2wtc · January 10, 2026 · 7 min read


Eugene Mymrin | Moment | Getty Images

All computing devices require a part called memory, or RAM, for short-term data storage, but this year, there won’t be enough of these essential components to meet worldwide demand.

That’s because companies like Nvidia, Advanced Micro Devices and Google need so much RAM for their artificial intelligence chips, and those companies are the first ones in line for the components.

Three primary memory vendors — Micron, SK Hynix and Samsung Electronics — make up nearly the entire RAM market, and their businesses are benefiting from the surge in demand.

“We have seen a very sharp, significant surge in demand for memory, and it has far outpaced our ability to supply that memory and, in our estimation, the supply capability of the whole memory industry,” Micron business chief Sumit Sadana told CNBC this week at the CES trade show in Las Vegas.

Micron’s stock is up 247% over the past year, and the company reported that net income nearly tripled in the most recent quarter. Samsung this week said that it expects its December-quarter operating profit to nearly triple as well. Meanwhile, SK Hynix is considering a U.S. listing as its stock price in South Korea surges, and in October, the company said it had secured demand for its entire 2026 RAM production capacity.

Now, prices for memory are rising.

TrendForce, a Taipei-based researcher that closely covers the memory market, this week said it expects average DRAM prices to rise between 50% and 55% this quarter versus the fourth quarter of 2025. TrendForce analyst Tom Hsu told CNBC that such an increase in memory prices was “unprecedented.”

Three-to-one basis

Chipmakers like Nvidia surround the part of the chip that does the computation — the graphics processing unit, or GPU — with several blocks of a fast, specialized component called high-bandwidth memory, or HBM, Sadana said. HBM is often visible when chipmakers hold up their new chips. Micron supplies memory to both Nvidia and AMD, the two leading GPU makers.

Nvidia’s Rubin GPU, which recently entered production, comes with up to 288 gigabytes of next-generation HBM4 memory per chip. The HBM is installed in eight visible blocks above and below the processor, and the GPU will be sold as part of a single server rack called the NVL72, which fittingly combines 72 of those GPUs into one system. By comparison, smartphones typically come with 8GB or 12GB of low-power DDR memory.
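
As a rough back-of-the-envelope comparison using the figures above (the per-GPU and per-rack numbers are the article’s; the smartphone baseline is the high end of the range cited), a single NVL72 rack carries on the order of 20 terabytes of HBM:

    # Back-of-the-envelope comparison using the figures quoted above; illustrative only.
    hbm_per_gpu_gb = 288   # HBM4 per Rubin GPU, per the article
    gpus_per_rack = 72     # GPUs combined in one NVL72 rack, per the article
    phone_ram_gb = 12      # high end of the smartphone range cited above

    rack_hbm_gb = hbm_per_gpu_gb * gpus_per_rack
    print(f"HBM per NVL72 rack: {rack_hbm_gb} GB (~{rack_hbm_gb / 1024:.1f} TB)")
    print(f"Roughly {rack_hbm_gb // phone_ram_gb} smartphones' worth of RAM")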

Nvidia founder and CEO Jensen Huang introduces the Rubin GPU and the Vera CPU as he speaks during Nvidia Live at CES 2026 ahead of the annual Consumer Electronics Show in Las Vegas, Nevada, on Jan. 5, 2026.

Patrick T. Fallon | AFP | Getty Images

But the HBM that AI chips need is far more demanding to make than the RAM used in consumers’ laptops and smartphones. It is built to the high-bandwidth specifications AI chips require, in a complicated process in which Micron stacks 12 to 16 layers of memory on a single chip, turning it into a “cube.”

When Micron makes one bit of HBM memory, it has to forgo making three bits of more conventional memory for other devices.

“As we increase HBM supply, it leaves less memory left over for the non-HBM portion of the market, because of this three-to-one basis,” Sadana said.
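
A minimal sketch of that trade-off, assuming a fixed pool of production capacity and the three-to-one ratio Sadana describes (the capacity figure below is a hypothetical placeholder, chosen only to illustrate the squeeze):

    # Simplified model of the capacity trade-off: each bit of HBM produced
    # displaces roughly three bits of conventional memory. The total-capacity
    # figure below is a hypothetical placeholder.
    TRADE_OFF_RATIO = 3      # conventional bits forgone per HBM bit, per the article
    total_capacity = 100.0   # hypothetical capacity, in conventional-bit units

    def conventional_bits_left(hbm_bits: float) -> float:
        """Conventional-memory output remaining after diverting capacity to HBM."""
        return total_capacity - TRADE_OFF_RATIO * hbm_bits

    for hbm in (0, 10, 20, 30):
        print(f"HBM bits made: {hbm:2d} -> conventional bits left: {conventional_bits_left(hbm):.0f}")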

Hsu, the TrendForce analyst, said that memory makers are favoring server and HBM applications over other clients because there’s higher potential for growth in demand in that business and cloud service providers are less price-sensitive.

In December, Micron said it would discontinue a part of its business that aimed to provide memory for consumer PC builders so the company could save supply for AI chips and servers.

Some inside the tech industry are marveling at how much and how quickly the price of RAM for consumers has increased.

Dean Beeler, co-founder and tech chief at Juice Labs, said that a few months ago, he loaded up his computer with 256GB of RAM, the maximum amount that current consumer motherboards support. That cost him about $300 at the time.

“Who knew that would end up being ~$3,000 of RAM just a few months later,” he posted on Facebook on Monday.
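
Taking Beeler’s figures at face value (the $3,000 is his own rough estimate), the implied per-gigabyte math works out to roughly a tenfold increase:

    # Ballpark math on Beeler's anecdote: 256 GB for ~$300 then, ~$3,000 now.
    capacity_gb = 256
    price_then, price_now = 300, 3_000   # approximate figures from his post

    print(f"Then: ${price_then / capacity_gb:.2f} per GB")   # ~$1.17 per GB
    print(f"Now:  ${price_now / capacity_gb:.2f} per GB")    # ~$11.72 per GB
    print(f"Increase: roughly {price_now / price_then:.0f}x")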


‘Memory wall’

AI researchers started to see memory as a bottleneck just before OpenAI’s ChatGPT hit the market in late 2022, said Majestic Labs co-founder Sha Rabii, an entrepreneur who previously worked on silicon at Google and Meta.

Prior AI systems were designed for models like convolutional neural networks, which require less memory than large language models, or LLMs, that are popular today, Rabii said.

While AI chips themselves have been getting much faster, memory has not, he said, which leads to powerful GPUs waiting around to get the data needed to run LLMs.

“Your performance is limited by the amount of memory and the speed of the memory that you have, and if you keep adding more GPUs, it’s not a win,” Rabii said.

The AI industry refers to this as the “memory wall.”

Erik Isakson | Digitalvision | Getty Images

“The processor spends more time just twiddling its thumbs, waiting for data,” Micron’s Sadana said.
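
A minimal roofline-style sketch of the “memory wall” idea: a chip is memory-bound when moving the data takes longer than computing on it. The peak-compute and bandwidth figures below are placeholders, not the specs of any particular GPU:

    # Minimal roofline-style check of the "memory wall": a chip is memory-bound
    # when fetching the data takes longer than computing on it. The hardware
    # numbers here are placeholders, not the specs of any real GPU.
    peak_flops = 1e15        # hypothetical peak compute, in FLOP/s
    mem_bandwidth = 4e12     # hypothetical memory bandwidth, in bytes/s

    def bound_by(flops: float, bytes_moved: float) -> str:
        """Return which resource limits this workload."""
        compute_time = flops / peak_flops
        memory_time = bytes_moved / mem_bandwidth
        return "memory-bound" if memory_time > compute_time else "compute-bound"

    # LLM inference moves large weight and key/value tensors for relatively little
    # arithmetic per byte, so it tends to land on the memory-bound side.
    print(bound_by(flops=1e12, bytes_moved=1e12))   # memory-bound
    print(bound_by(flops=1e15, bytes_moved=1e11))   # compute-bound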

More and faster memory means that AI systems can run bigger models, serve more customers simultaneously and extend the “context windows” that allow chatbots and other LLMs to remember previous conversations with users, adding a touch of personalization to the experience.
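
One reason longer context windows are so memory-hungry: during inference an LLM keeps a key/value cache for every token it is remembering, for every concurrent user. A rough sizing sketch, with hypothetical model dimensions:

    # Rough KV-cache sizing: longer context windows and more concurrent users
    # both multiply directly into memory use. Model dimensions are hypothetical.
    def kv_cache_bytes(layers: int, kv_heads: int, head_dim: int,
                       context_len: int, users: int, bytes_per_value: int = 2) -> int:
        """Bytes of key/value cache held in memory while serving inference."""
        # 2x for keys and values, stored at every layer for every cached token.
        return 2 * layers * kv_heads * head_dim * context_len * users * bytes_per_value

    # Hypothetical mid-sized model, 32,000-token context, 16 concurrent users.
    total = kv_cache_bytes(layers=64, kv_heads=8, head_dim=128,
                           context_len=32_000, users=16)
    print(f"~{total / 1e9:.0f} GB of KV cache")   # roughly 134 GB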

Majestic Labs is designing an AI system for inference with 128 terabytes of memory, or about 100 times more memory than some current AI systems, Rabii said, adding that the company plans to eschew HBM in favor of lower-cost options. Rabii said the extra RAM, and the architecture built around it, will let its computers serve significantly more users at once than other AI servers while using less power.

Sold out for 2026

Wall Street has been asking companies in the consumer electronics business, like Apple and Dell Technologies, how they will handle the memory shortage and whether they might be forced to raise prices or cut margins. These days, memory accounts for about 20% of the hardware cost of a laptop, Hsu said, up from between 10% and 18% in the first half of 2025.

In October, Apple finance chief Kevan Parekh told analysts that his company was seeing a “slight tailwind” on memory prices but he downplayed it as “nothing really to note there.”

But in November, Dell said it expected its cost basis for all of its products to go up as a result of the memory shortage. COO Jeffrey Clarke told analysts that Dell planned to change its mix of configurations to minimize the price impact, but he said the shortage will likely affect retail prices for devices.

“I don’t see how this will not make its way into the customer base,” Clarke said. “We’ll do everything we can to mitigate that.”

Even Nvidia, which has emerged as the biggest customer in the HBM market, is facing questions about its ravenous memory needs — in particular, about its consumer products.

At a press conference Tuesday at CES, Nvidia CEO Jensen Huang was asked whether he was concerned that the company’s gaming customers might grow resentful of AI technology because of rising game console and graphics card prices driven by the memory shortage.

Huang said Nvidia is a very large customer of memory and has long-standing relationships with the companies in the space, but that, ultimately, there will need to be more memory factories because the needs of AI are so high.

“Because our demand is so high, every factory, every HBM supplier, is gearing up, and they’re all doing great,” Huang said.

At most, Micron can meet only two-thirds of the medium-term memory requirements of some customers, Sadana said. But the company is currently building two big factories, known as fabs, in Boise, Idaho, that will start producing memory in 2027 and 2028, he said. Micron is also set to break ground on a fab in the town of Clay, New York, that he said is expected to come online in 2030.

But for now, “we’re sold out for 2026,” Sadana said.


