Google Triggers AI Hardware Reset with Gemini 3 Performance
Google has significantly altered the competitive landscape in AI hardware: its Tensor Processing Units (TPUs) powered Gemini 3 to results that surpassed GPT-5 in independent tests, putting considerable pressure on both OpenAI and Nvidia.
Gemini 3's performance was primarily achieved using Google's proprietary TPUs, rather than the widely adopted Nvidia GPUs. Following these results, Sam Altman, CEO of OpenAI, reportedly instructed his staff to refocus efforts on improving ChatGPT and its foundational models.
This strategic shift within OpenAI occurred after what the company described as a "code red" situation. Concurrently, industry analysts have indicated that Google plans to more than double its TPU production by 2028, reflecting escalating demand for in-house AI chips.
Google Expands TPU Production and Enters Outside Sales Market
Google is now looking to extend the use of its TPUs beyond its own cloud infrastructure and into the broader market. In one recent deal, valued in the tens of billions of dollars, it agreed to supply 1 million TPUs to Anthropic, a transaction that rattled Nvidia's investors.
The core concern for Nvidia is that increased TPU sales to external companies could directly reduce demand for its data-center hardware.
According to chip analysts at SemiAnalysis, TPUs are now considered to be "neck and neck with Nvidia" for both the training and deployment of advanced AI systems. Morgan Stanley projects that each increment of 500,000 TPUs sold to external clients could generate up to $13 billion in revenue for Google.
The bank anticipates that TSMC will manufacture 3.2 million TPUs next year, with production rising to 5 million in 2027 and 7 million in 2028. Analysts now foresee stronger growth in 2027 than previously forecast.
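Taken at face value, Morgan Stanley's figure implies roughly $26,000 of revenue per externally sold TPU; a minimal back-of-the-envelope sketch of that arithmetic follows, treating the cited numbers purely as illustrative inputs rather than confirmed sales.

    # Back-of-the-envelope arithmetic from the cited estimates (illustrative only).
    revenue_per_increment = 13e9    # Morgan Stanley: up to $13B per external increment
    tpus_per_increment = 500_000
    implied_per_tpu = revenue_per_increment / tpus_per_increment
    print(f"Implied revenue per externally sold TPU: ${implied_per_tpu:,.0f}")  # ~$26,000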
Google develops its processors primarily with Broadcom, with additional support from MediaTek, while TSMC handles fabrication. The company emphasizes that its advantage lies in the vertical integration of hardware, software, and AI models within a single ecosystem. Koray Kavukcuoglu, Google's AI architect and DeepMind CTO, stated, "The most important thing is that full stack approach. I think we have a unique approach there."
He added that data from billions of users gives Google deep insight into how Gemini performs across products such as Search and AI Overviews.
Nvidia's stock declined last month after The Information reported that Meta had been in talks with Google about buying TPUs. Meta has declined to comment. Analysts now suggest that Google could strike similar supply agreements with other major AI players such as OpenAI, Elon Musk's xAI, or Safe Superintelligence, potentially generating over $100 billion in additional revenue over several years.
Nvidia Responds as TPU Pressure Mounts
In response to the market sell-off, Nvidia has asserted its continued leadership. The company stated that it remains "a generation ahead of the industry" and is "the only platform that runs every AI model." Nvidia also confirmed its ongoing supply relationship with Google, highlighting that its systems offer "greater performance, versatility, and fungibility" compared to TPUs, which it characterizes as targeting specific frameworks.
Meanwhile, developers are gaining access to new tools that simplify the transition away from Nvidia's proprietary CUDA software. AI coding tools can now rewrite workloads for TPU systems more rapidly than before, diminishing one of Nvidia's key competitive advantages.
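As a rough illustration of what framework-level portability looks like, the sketch below uses JAX, whose XLA compiler targets CPUs, Nvidia GPUs, and Google TPUs from the same Python code; the layer and shapes are illustrative and not drawn from any specific migration tool mentioned above.

    # Minimal JAX sketch: the same code compiles, via XLA, to whichever
    # accelerator backend is available (CPU, Nvidia GPU, or Google TPU),
    # with no hand-written CUDA kernels. Layer and shapes are illustrative.
    import jax
    import jax.numpy as jnp

    @jax.jit                              # XLA compiles this for the active backend
    def dense_layer(x, w, b):
        return jax.nn.relu(x @ w + b)

    key = jax.random.PRNGKey(0)
    x = jax.random.normal(key, (32, 512))
    w = jax.random.normal(key, (512, 256))
    b = jnp.zeros(256)

    print(jax.devices())                  # lists the GPU or TPU devices JAX found
    print(dense_layer(x, w, b).shape)     # (32, 256) on whichever device is present

In practice, the heavier migration work sits in custom CUDA kernels outside such frameworks, which is where the AI-assisted rewriting tools described above would be expected to do most of the lifting.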
The development of TPUs predates the current AI boom. In 2013, Jeff Dean, Google's chief scientist, delivered an internal presentation following a significant advancement in deep neural networks for speech systems. Jonathan Ross, who was then a hardware engineer at Google, recalled the situation: "The first slide was good news, machine learning finally works. Slide two said bad news, we can’t afford it." Dean had calculated that if hundreds of millions of users interacted with Google for three minutes daily, the company's data-center capacity would need to double, incurring costs in the tens of billions of dollars.
Ross initiated the development of the first TPU as a side project in 2013, working near the speech team. He stated in December 2023, "We built that first chip with about 15 people." Ross now leads the AI chip firm Groq.
TPUs helped power a major milestone in AI history in 2016, when AlphaGo defeated world Go champion Lee Sedol. Since then, they have driven Google's Search, advertising, and YouTube services.
Google previously updated its TPUs every two years, but it moved to an annual release cycle in 2023.
A Google spokesperson commented on the increasing demand, stating, "Google Cloud is seeing growing demand for both our custom TPUs and Nvidia GPUs. We will continue supporting both."

