TechRadar
Wayne Williams

Wait, now Broadcom makes GPUs? Nvidia could face an unexpected foe in China as ByteDance taps a rival bigger than Intel, AMD, Arm and Qualcomm to design its AI chips


US trade restrictions have created significant obstacles for Chinese companies, limiting their access to advanced AI hardware needed to remain competitive globally.

Nvidia’s H20 GPUs, scaled-down versions of the powerful H100, were developed to meet export control requirements but still come with a hefty price tag of around $10,000 per unit.

Even at that price, the availability of these GPUs is limited, compounding the difficulties Chinese companies face. This shortage has fueled a thriving black market for Nvidia’s high-end chips, such as the H100 and A100, where prices continue to rise due to overwhelming demand. However, global companies, particularly ByteDance - the parent company of TikTok, already under intense scrutiny in the US - cannot afford the legal and reputational risks associated with engaging in such illicit markets.

Two AI chips

ByteDance has made significant investments in AI, reportedly spending over $2 billion on Nvidia's H20 GPUs in 2024. Now, according to The Information, the company is looking to develop its own AI chips to reduce its dependence on Nvidia.

The report adds that the plan covers two chips: one designed for AI training and another for AI inference. Both will be produced using TSMC’s advanced N4/N5 process, the same technology used for Nvidia’s Blackwell GPUs.

Broadcom, recognized for its AI chip designs for Google, will reportedly lead the development of these GPUs, which are expected to enter mass production by 2026. While several Chinese companies have developed their own AI GPUs to reduce reliance on Nvidia, most still depend on Nvidia’s hardware for more demanding tasks. Whether ByteDance can fully transition to its own hardware - and whether it would want to - remains to be seen.

The move will certainly not be without challenges. As Tom's Hardware notes, “The company now relies on Nvidia's CUDA and supporting software stack for AI training and inference. Once it goes with its AI GPUs, it must develop its software platform and ensure its software stack is fully compatible with its hardware.”
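To illustrate what that CUDA dependency looks like in practice, here is a minimal, hypothetical sketch of a PyTorch-style training step. It is not ByteDance's actual code, and the article does not say which frameworks the company uses; the point is simply that training code commonly targets the "cuda" device, so moving to custom silicon requires the vendor's own backend and device-agnostic code like the `pick_device` helper assumed below.

```python
# Hypothetical sketch: training code that assumes CUDA versus device-agnostic code.
# Nothing here is ByteDance's implementation; PyTorch is used purely as an example
# of a framework whose GPU path is built on Nvidia's CUDA stack.
import torch
import torch.nn as nn


def pick_device() -> torch.device:
    # Device-agnostic selection: fall back to CPU when no CUDA GPU is available.
    # A custom accelerator would need its own framework backend to appear here.
    return torch.device("cuda" if torch.cuda.is_available() else "cpu")


def train_step(model: nn.Module, batch: torch.Tensor, target: torch.Tensor,
               opt: torch.optim.Optimizer, device: torch.device) -> float:
    model.train()
    batch, target = batch.to(device), target.to(device)
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(batch), target)
    loss.backward()  # gradients execute on whatever backend the tensors live on
    opt.step()
    return loss.item()


if __name__ == "__main__":
    device = pick_device()
    model = nn.Linear(16, 1).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    x, y = torch.randn(32, 16), torch.randn(32, 1)
    print(f"device={device}, loss={train_step(model, x, y, opt, device):.4f}")
```

Keeping the device choice in one place, as in this sketch, is the kind of discipline that makes a later switch away from CUDA-backed hardware less painful; code that hard-codes "cuda" throughout would need far more rework.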
