OpenAI Partners With Broadcom To Deploy Custom AI Chips

OpenAI says developing its own chips will let the company use what it learned developing frontier AI models.

Imad Khan, Former Senior Reporter
Imad was a senior reporter covering Google and internet culture. Hailing from Texas, Imad started his journalism career in 2013 and has amassed bylines with The New York Times, The Washington Post, ESPN, Tom's Guide and Wired, among others.
Expertise: Google | AI | Internet Culture
[Image: OpenAI's logo, a hexagonal rosette pattern]
OpenAI and Broadcom are partnering to make new AI chips.
Illustration by Stephen Shankland/CNET

OpenAI is collaborating with chip maker Broadcom to develop and deploy newly designed AI chips, the companies said in blog posts on Monday. 


The deal will involve OpenAI designing systems and accelerators, which are specialized hardware used for demanding calculations, and Broadcom deploying these chips. OpenAI says developing its own chips will allow it to use what it learned developing frontier AI models and better tune its hardware to unlock new capabilities. 

(Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

"Partnering with Broadcom is a critical step in building the infrastructure needed to unlock AI's potential and deliver real benefits for people and businesses," said Sam Altman, co-founder and CEO of OpenAI, in the blog post. "Developing our own accelerators adds to the broader ecosystem of partners all building the capacity required to push the frontier of AI to benefit all humanity."


Broadcom, too, is excited about the OpenAI partnership. 

"Our partnership with OpenAI continues to set new industry benchmarks for the design and deployment of open, scalable and power-efficient AI clusters," said Charlie Kawwas, president of the semiconductor solutions group for Broadcom. "Custom accelerators combine remarkably well with standards-based Ethernet scale-up and scale-out networking solutions to provide cost and performance optimized next generation AI infrastructure."

The server racks will include Broadcom's suite of Ethernet, PCIe and optical connectivity products. 

Representatives for OpenAI and Broadcom did not immediately respond to requests for comment.

AI on the rise

Companies have been eager to make deals with OpenAI in recent months. The company's core product, ChatGPT, is now a household name and kicked off the AI revolution. It currently boasts 800 million weekly active users, and its new Sora 2 generative video model has been flooding the internet with convincing AI clips.

Last month, chip giant Nvidia said it would invest $100 billion in OpenAI to build out 10 gigawatts' worth of data centers. Earlier this month, AMD agreed to trade up to 160 million of its shares, or about 10% of the company, in exchange for OpenAI buying six gigawatts' worth of its upcoming MI450 chips, which are still being designed.

These deals continue to drive AI hype, although some fear such news might be fueling an AI bubble. At the same time, many of these tech companies buy from one another, trading investments and driving up each other's values, along with the overall market. 

Some fear the stock market is quickly becoming an AI-built house of cards, in which a significant shock could collapse the whole deck and hurt the economy. To put it into perspective, AI investments have been so vast that they surpassed US consumer spending. As a result, the valuations of Nvidia, Microsoft and Google have vaulted past $3 trillion. In fact, Nvidia's value has already surpassed $4 trillion, greater than the entire equity markets of Italy, Germany, France, the UK and Canada. The problem is that the profitability of these expensive AI systems still hasn't been proven.