“We need to build machines that learn from experience.” - Geoffrey Hinton, pioneer of deep learning
Google’s AI hardware splash
Data giant Google’s move to supply its tensor processing units (TPUs) to companies like Meta marks a major strategic shift in the AI industry. Until now, Google reserved these chips for its own data centres as a proprietary advantage. With Meta reportedly set to rent TPU capacity from 2026 and considering deploying the chips in its own data centres from 2027, Google is positioning itself as a direct competitor to Nvidia, which currently dominates the AI chip market. The market is thus entering a new phase of hyper-competition.
Gemini boosts TPU credibility
The launch of Google’s new Gemini model in late November boosted confidence in TPU capability, with early comparisons suggesting Gemini outperforms some rivals. Meta’s interest in Google hardware thus strengthens Google’s standing in the AI infrastructure race. Given Google’s size, the news had the expected market reaction: a temporary slide in Nvidia’s stock. TPUs could claim a share of Nvidia’s future annual revenues, and with AI chip demand showing no sign of slowing, even AMD is doubling down on its own in-house hardware.
Chips all the way
Giants like Amazon and Microsoft, along with AI labs like Anthropic, are also exploring Google TPUs, while Apple continues to train models on its own chips. The industry displays a mix of rivalry and interdependence: the Big Tech giants need to outdo each other while also needing each other! Google’s TPU approach differs from Nvidia’s more general-purpose GPU orientation, as TPUs are specialized for the matrix computation at the heart of deep learning and may prove cost-effective for LLMs (large language models). Of course, it was Google that gave the world the pathbreaking Transformer architecture in its 2017 paper ‘Attention Is All You Need’, and its hardware push is a natural extension of that.
Summary
Google’s shift into supplying TPUs to major companies challenges Nvidia’s dominance and widens competition in the AI chip market. Strong model performance, growing demand and high stakes for big tech make this a pivotal moment for AI hardware.
Food for thought
If TPUs become widely adopted, how might that alter the balance of power among AI companies?
AI concept to learn: Tensor Processing Units (TPUs)
Tensor processing units are custom chips (ASICs) designed by Google to accelerate the mathematical operations central to deep learning, above all matrix multiplication, which they handle more efficiently than general-purpose processors. This lets large language models train and run at lower cost and higher speed than on comparable general-purpose hardware.
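To make this concrete, here is a minimal sketch in Python using Google’s JAX library. The names, shapes and numbers are illustrative assumptions, not a real model. It shows the kind of matrix multiplication that dominates deep-learning workloads; when JAX code like this is jit-compiled on a TPU host, the XLA compiler maps the multiply onto the TPU’s matrix units.

import jax
import jax.numpy as jnp

@jax.jit  # XLA compiles this for whatever accelerator is present (CPU, GPU or TPU)
def dense_layer(x, w, b):
    # One transformer-style dense layer: a large matrix multiply plus a bias.
    # This multiply is exactly the operation TPUs are built to accelerate.
    return jnp.dot(x, w) + b

key = jax.random.PRNGKey(0)
kx, kw = jax.random.split(key)
x = jax.random.normal(kx, (128, 512))    # a batch of 128 token embeddings (illustrative shape)
w = jax.random.normal(kw, (512, 2048))   # a weight matrix (illustrative shape)
b = jnp.zeros(2048)

print(dense_layer(x, w, b).shape)  # (128, 2048)

The point of the sketch: an LLM forward pass is essentially millions of such multiplies chained together, which is why hardware specialized for them can win on cost and speed.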
[The Billion Hopes Research Team shares the latest AI updates for learning and awareness. Various sources are used; all copyrights acknowledged. This is not professional, financial, personal or medical advice. Please consult domain experts before making decisions. Feedback welcome!]