AMD reveals next-generation AI chips with OpenAI CEO Sam Altman

By i2wtc | June 12, 2025


Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled “Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation,” in Hart building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday unveiled new details about its next-generation AI chips, the Instinct MI400 series, that will ship next year.

AMD said the MI400 chips can be assembled into a full server rack called Helios, which will allow thousands of the chips to be tied together and used as one "rack-scale" system.

“For the first time, we architected every part of the rack as a unified system,” AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared on stage with Su and said his company would use the AMD chips.

“When you first started telling me about the specs, I was like, there’s no way, that just sounds totally crazy,” Altman said. “It’s gonna be an amazing thing.”

AMD’s rack-scale setup will make the chips look to a user like one system, which is important for most artificial intelligence customers like cloud providers and companies that develop large language models. Those customers want “hyperscale” clusters of AI computers that can span entire data centers and use massive amounts of power.

“Think of Helios as really a rack that functions like a single, massive compute engine,” said Su, comparing it against Nvidia’s Vera Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses during the Artificial Intelligence (AI) Action Summit, at the Grand Palais, in Paris, on February 11, 2025. 

Joel Saget | Afp | Getty Images

AMD’s rack-scale technology also enables its latest chips to compete with Nvidia’s Blackwell chips, which already come in configurations with 72 graphics processing units stitched together. Nvidia is AMD’s only major rival in big data center GPUs for developing and deploying AI applications.

OpenAI — a notable Nvidia customer — has been giving AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year’s MI355X chips, AMD is planning to compete against rival Nvidia on price, with a company executive telling reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with “aggressive” prices.

So far, Nvidia has dominated the market for data center GPUs, partially because it was the first company to develop the kind of software needed for AI developers to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said that AMD’s MI355X can outperform Nvidia’s Blackwell chips, despite Nvidia using its “proprietary” CUDA software.

“It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made tremendous progress,” Su said.

AMD shares are flat so far in 2025, signaling that Wall Street doesn’t yet see it as a major threat to Nvidia’s dominance.

Andrew Dieckmann, AMD’s general manager for data center GPUs, said Wednesday that AMD’s AI chips would cost less to operate and less to acquire.

“Across the board, there is a meaningful cost of acquisition delta that we then layer on our performance competitive advantage on top of, so significant double-digit percentage savings,” Dieckmann said.

Over the next few years, big cloud companies and countries alike are poised to spend hundreds of billions of dollars to build new data center clusters around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD is expecting the total market for AI chips to exceed $500 billion by 2028, although it hasn’t said how much of that market it can claim — Nvidia has over 90% of the market currently, according to analyst estimates.

Both companies have committed to releasing new AI chips annually, rather than every two years, emphasizing how fierce competition has become and how important bleeding-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase of ZT Systems earlier this year, a server maker that developed the technology AMD needed to build its rack-sized systems.

“These AI systems are getting super complicated, and full-stack solutions are really critical,” Su said.

What AMD is selling now

Currently, the most advanced AMD AI chip being installed by cloud providers is the Instinct MI355X, which the company said started shipping in production last month. AMD said it will be available for rent from cloud providers starting in the third quarter.

Companies building large data center clusters for AI want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to fill a growing need for “inference,” or the computing power needed for actually deploying a chatbot or generative AI application, which can use much more processing power than traditional server applications.

“What has really changed is the demand for inference has grown significantly,” Su said.

AMD officials said Thursday that they believe their new chips are superior for inference to Nvidia’s. That’s because AMD’s chips are equipped with more high-speed memory, which allows bigger AI models to run on a single GPU.

The MI355X has seven times the computing power of its predecessor, AMD said. Those chips will be able to compete with Nvidia’s B100 and B200 chips, which have been shipping since late last year.

AMD said that its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI, and Cohere.

Oracle plans to offer clusters with over 131,000 MI355X chips to its customers, AMD said.

Officials from Meta said Thursday that the company is using clusters of AMD’s CPUs and GPUs to run inference for its Llama models, and that it plans to buy AMD’s next-generation servers.

A Microsoft representative said that it uses AMD chips to serve its Copilot AI features.

Competing on price

AMD declined to say how much its chips cost — it doesn’t sell chips by themselves, and end-users usually buy them through a hardware company like Dell or Super Micro Computer — but the company is planning for the MI400 chips to compete on price.

The Santa Clara company is pairing its GPUs with its CPUs and with networking chips from its 2022 acquisition of Pensando to build its Helios racks, which means greater adoption of its AI chips should also benefit the rest of AMD’s business. It is also using an open-source networking technology called UALink, rather than Nvidia’s proprietary NVLink, to closely integrate its rack systems.

AMD claims its MI355X can deliver 40% more tokens — a measure of AI output — per dollar than Nvidia’s chips because its chips use less power than its rival’s.

Data center GPUs can cost tens of thousands of dollars per chip, and cloud companies usually buy them in large quantities.

AMD’s AI chip business is still much smaller than Nvidia’s. It said it had $5 billion in AI sales in its fiscal 2024, but JP Morgan analysts are expecting 60% growth in the category this year.

