
AMD reveals next-generation AI chips with OpenAI CEO Sam Altman



Lisa Su, CEO of Advanced Micro Devices, testifies during the Senate Commerce, Science and Transportation Committee hearing titled "Winning the AI Race: Strengthening U.S. Capabilities in Computing and Innovation," in the Hart building on Thursday, May 8, 2025.

Tom Williams | CQ-Roll Call, Inc. | Getty Images

Advanced Micro Devices on Thursday unveiled new details about its next-generation AI chips, the Instinct MI400 series, which will ship next year.

The MI400 chips will be able to be assembled into a full server rack called Helios, AMD said, which will allow thousands of the chips to be tied together in a way that lets them be used as one "rack-scale" system.

"For the first time, we architected every part of the rack as a unified system," AMD CEO Lisa Su said at a launch event in San Jose, California, on Thursday.

OpenAI CEO Sam Altman appeared on stage with Su and said his company would use the AMD chips.

"When you first started telling me about the specs, I was like, there's no way, that just sounds totally crazy," Altman said. "It's gonna be an amazing thing."

AMD's rack-scale setup will make the chips look to a user like one system, which is important for most artificial intelligence customers, such as cloud providers and companies that develop large language models. Those customers want "hyperscale" clusters of AI computers that can span entire data centers and use massive amounts of power.

"Think of Helios as really a rack that functions like a single, massive compute engine," said Su, comparing it against Nvidia's Vera Rubin racks, which are expected to be released next year.

OpenAI CEO Sam Altman poses during the Artificial Intelligence (AI) Action Summit, at the Grand Palais, in Paris, on February 11, 2025.

Joel Saget | AFP | Getty Images

AMD's rack-scale technology also enables its latest chips to compete with Nvidia's Blackwell chips, which already come in configurations with 72 graphics processing units stitched together. Nvidia is AMD's primary and only rival in big data center GPUs for creating and deploying AI applications.

OpenAI, a notable Nvidia customer, has been giving AMD feedback on its MI400 roadmap, the chip company said. With the MI400 chips and this year's MI355X chips, AMD is planning to compete against rival Nvidia on price, with a company executive telling reporters on Wednesday that the chips will cost less to operate thanks to lower power consumption, and that AMD is undercutting Nvidia with "aggressive" prices.

So far, Nvidia has dominated the market for data center GPUs, partly because it was the first company to develop the kind of software needed for AI developers to take advantage of chips originally designed to display graphics for 3D games. Over the past decade, before the AI boom, AMD focused on competing against Intel in server CPUs.

Su said that AMD's MI355X can outperform Nvidia's Blackwell chips, despite Nvidia using its "proprietary" CUDA software.

"It says that we have really strong hardware, which we always knew, but it also shows that the open software frameworks have made huge progress," Su said.

AMD shares are flat so far in 2025, signaling that Wall Street doesn't yet see it as a major threat to Nvidia's dominance.

Andrew Dieckmann, AMD's general manager for data center GPUs, said Wednesday that AMD's AI chips would cost less to operate and less to acquire.

"Across the board, there is a meaningful cost of acquisition delta that we then layer on our performance competitive advantage on top of, so significant double-digit percentage savings," Dieckmann said.

Over the next few years, big cloud companies and countries alike are poised to spend hundreds of billions of dollars to build new data center clusters around GPUs in order to accelerate the development of cutting-edge AI models. That includes $300 billion this year alone in planned capital expenditures from megacap technology companies.

AMD expects the total market for AI chips to exceed $500 billion by 2028, although it hasn't said how much of that market it can claim. Nvidia has over 90% of the market today, according to analyst estimates.

Both companies have committed to releasing new AI chips on an annual basis, as opposed to a biannual basis, underscoring how fierce competition has become and how important bleeding-edge AI chip technology is for companies like Microsoft, Oracle and Amazon.

AMD has bought or invested in 25 AI companies in the past year, Su said, including the purchase of ZT Systems earlier this year, a server maker that developed the technology AMD needed to build its rack-sized systems.

"These AI systems are getting super complicated, and full-stack solutions are really critical," Su said.

What AMD is selling now

Currently, the most advanced AMD AI chip being installed by cloud providers is its Instinct MI355X, which the company said started shipping in production last month. AMD said the chip will be available for rent from cloud providers starting in the third quarter.

Companies building large data center clusters for AI want alternatives to Nvidia, not only to keep costs down and provide flexibility, but also to fill a growing need for "inference," or the computing power needed for actually deploying a chatbot or generative AI application, which can use much more processing power than traditional server applications.

"What has really changed is the demand for inference has grown significantly," Su said.

AMD officials said Thursday that they believe their new chips are superior to Nvidia's for inference. That's because AMD's chips are equipped with more high-speed memory, which allows bigger AI models to run on a single GPU.

The MI355X has seven times the amount of computing power as its predecessor, AMD said. These chips will be able to compete with Nvidia's B100 and B200 chips, which have been shipping since late last year.

AMD said that its Instinct chips have been adopted by seven of the 10 largest AI customers, including OpenAI, Tesla, xAI and Cohere.

Oracle plans to offer clusters with over 131,000 MI355X chips to its customers, AMD said.

Officials from Meta said Thursday that they were using clusters of AMD's CPUs and GPUs to run inference for its Llama model, and that Meta plans to buy AMD's next-generation servers.

A Microsoft representative said that the company uses AMD chips to serve its Copilot AI features.

Competing on price

AMD declined to say how much its chips cost (it doesn't sell chips by themselves, and end users usually buy them through a hardware company like Dell or Super Micro Computer), but the company is planning for the MI400 chips to compete on price.

The Santa Clara company is pairing its GPUs with its CPUs and with networking chips from its 2022 acquisition of Pensando to build its Helios racks. That means broader adoption of its AI chips should also benefit the rest of AMD's business. It's also using an open-source networking technology called UALink to closely integrate its rack systems, as opposed to Nvidia's proprietary NVLink.

AMD claims its MI355X can deliver 40% more tokens, a measure of AI output, per dollar than Nvidia's chips, because its chips use less power than its rival's.
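To see how lower power draw translates into a tokens-per-dollar advantage, here is a back-of-the-envelope sketch. Every number below is a made-up assumption for illustration, not an AMD or Nvidia specification, and it considers electricity cost only, ignoring hardware purchase price.

```python
# Hypothetical illustration of "tokens per dollar" as an operating-cost metric.
# All figures are invented assumptions, not vendor specifications.

def tokens_per_dollar(tokens_per_second: float, watts: float,
                      dollars_per_kwh: float) -> float:
    """Tokens generated per dollar of electricity (hardware cost excluded)."""
    kwh_per_second = watts / 1000 / 3600          # power draw in kWh each second
    cost_per_second = kwh_per_second * dollars_per_kwh
    return tokens_per_second / cost_per_second

# Two hypothetical GPUs with equal throughput but different power draw:
chip_a = tokens_per_dollar(tokens_per_second=1000, watts=1000, dollars_per_kwh=0.10)
chip_b = tokens_per_dollar(tokens_per_second=1000, watts=1400, dollars_per_kwh=0.10)

print(f"chip A advantage: {chip_a / chip_b:.2f}x")  # prints "chip A advantage: 1.40x"
```

At equal throughput, the efficiency ratio reduces to the inverse ratio of power draw, which is why a chip drawing roughly 40% less energy per token can claim roughly 40% more tokens per dollar of electricity.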

Data center GPUs can cost tens of thousands of dollars per chip, and cloud companies usually buy them in large quantities.

AMD's AI chip business is still much smaller than Nvidia's. It said it had $5 billion in AI sales in its fiscal 2024, but JPMorgan analysts predict 60% growth in the category this year.

WATCH: AMD CEO Lisa Su: Chip export controls are a headwind but we still see growth opportunity