By Max A. Cherney
SAN FRANCISCO (Reuters) - Broadcom is launching a new networking chip on Tuesday that will help companies build artificial intelligence computing systems by stringing together hundreds of thousands of chips that crunch data, deepening its rivalry with Nvidia.
The chip, called the Thor Ultra, enables computing infrastructure operators to deploy far more chips than they otherwise could, allowing them to build and run the large models used to power AI apps such as ChatGPT.
The Thor Ultra will battle Nvidia’s networking interface chips and aim to further entrench Broadcom’s control of network communications inside data centers designed for AI applications.
It comes after Broadcom on Monday unveiled a deal to roll out 10 gigawatts worth of custom chips for ChatGPT maker OpenAI beginning in the second half of 2026, challenging Nvidia’s grip on the AI accelerator market.
AI represents a big opportunity for Broadcom. Chief Executive Hock Tan said late last year that the market the company is targeting with its various AI chips will be in the range of $60 billion to $90 billion in 2027, split between its networking chips and the data center processors it helps Alphabet's Google and OpenAI build.
Broadcom reported AI revenue of $12.2 billion in fiscal 2024. In September, it announced a new, unnamed $10 billion customer for its custom data center AI chips.
The Thor Ultra chip, part of Broadcom’s growing catalog of networking chips, operates as a critical link between an AI system and the rest of the data center. The networking chips help data center operators move information around inside a facility.
“In the distributed computing system, network plays an extremely important role in building these large clusters,” Ram Velaga, a Broadcom senior vice president, told Reuters. “So I’m not surprised that anybody who’s in the GPU business wants to make sure that they are participating in the networking.”
FOCUSING ON CHIP DESIGN
While networking chips are crucial to the company's plans, the custom AI chips Broadcom helps design for large cloud computing companies such as Google are the more lucrative business.
Broadcom has worked on multiple generations of Google’s Tensor processor, which Google began designing more than a decade ago. The Tensor chips have generated billions of dollars in revenue for Broadcom, according to analyst estimates.
During a September tour of Broadcom's San Jose labs, where engineers test current and forthcoming networking chip designs, company executives detailed the measures taken to build and validate new networking processors.
Towards that end, Broadcom’s engineers doubled the bandwidth on Thor Ultra compared with the prior version. They put the chips through rigorous testing and evaluation from the earliest stages of production.
To make a chip like the Thor Ultra or its flagship Tomahawk series of networking switches, the engineers build an entire system around the chip. Working with the hardware system team, the engineers discuss what kind of package the chip will use, how much power it will need and how much heat it will emit, Velaga said.
Broadcom does not sell servers itself, but it offers customers the designs for the components and systems it builds for testing, giving them a reference point for constructing the networking infrastructure around its chips.
“For every dollar we invest in our silicon, there is at least $6 to $10 that our ecosystem partners are investing,” Velaga said. “So a lot of our focus is on design, as if we are almost ready to take it to production.”
(Reporting by Max A. Cherney in San Francisco; Editing by Muralikumar Anantharaman)