Apple Chooses Google TPUs for AI Infrastructure, Sidesteps Nvidia
Apple has opted to use Google’s Tensor Processing Units (TPUs) for its new AI software infrastructure, moving away from Nvidia’s GPUs.
Points
- Apple selects Google’s TPUs over Nvidia’s GPUs for AI infrastructure.
- Apple unveils its new AI system, Apple Intelligence.
- The new system includes advanced features like improved Siri and AI summaries.
- Google’s TPUs enable the building of more complex AI models.
- The decision reflects a shift in the AI hardware landscape.
Apple has revealed its choice to utilize Google’s Tensor Processing Units (TPUs) for its new AI software infrastructure, marking a significant departure from the industry norm where Nvidia’s GPUs are typically the go-to for AI model training.
Apple also unveiled its Artificial Intelligence system, Apple Intelligence, along with a technical paper. The new system boasts several advanced features, including an improved Siri, enhanced natural language processing, and AI-generated summaries. Over the next year, Apple plans to add generative AI capabilities to the system, such as image and emoji generation and a more advanced Siri capable of retrieving and using information from apps.
The technical paper detailed that Apple’s AFM on-device model was trained with 2048 TPU v5p chips, while the AFM-server utilized 8192 TPU v4 chips. Google’s TPU v5p, announced in December, enhances the efficiency of AI computation and remains one of the most sophisticated custom chips for running AI operations.
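For context, training on TPU pod slices of this kind is typically orchestrated through a framework such as JAX, which exposes the pod's chips as a device mesh and shards data and model state across them. The sketch below is purely illustrative: Apple has not published its training code, and only the chip counts above come from the paper; the mesh layout, array shapes, and everything else here are generic assumptions.

```python
# Illustrative sketch of how TPU-pod training is commonly sharded in JAX.
# This is NOT Apple's code; it only shows the general device-mesh pattern.
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# On a TPU pod slice, jax.devices() lists every attached TPU chip
# (e.g. thousands of chips on a large slice; a handful locally).
devices = jax.devices()
print(f"Connected accelerator chips: {len(devices)}")

# Arrange the chips into a 2D mesh with named axes. Here the layout is a
# trivial data-parallel-only arrangement for demonstration purposes.
mesh = Mesh(np.array(devices).reshape(len(devices), 1), ("data", "model"))

# Shard a batch of activations across the "data" axis of the mesh.
batch = jnp.ones((len(devices) * 8, 1024))
sharded_batch = jax.device_put(batch, NamedSharding(mesh, P("data", None)))

# A jit-compiled computation then runs in parallel across all shards.
@jax.jit
def forward(x):
    return jnp.tanh(x @ jnp.ones((1024, 1024)))

out = forward(sharded_batch)
print(out.shape, out.sharding)
```

On an actual TPU v4 or v5p slice the same program would see thousands of chips, and a production setup would usually split the mesh between data-parallel and model-parallel axes rather than the trivial layout shown here.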
Nvidia has dominated the market for GPUs used in training neural networks, but their high costs and limited availability have led organizations to seek alternatives. Nvidia's key customers include OpenAI, Microsoft, and Anthropic, while other tech giants such as Google, Meta, Oracle, and Tesla also buy its hardware as they build out their own AI capabilities.
Tech leaders like Meta CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai have expressed concerns about overinvestment in AI infrastructure, but agree that underinvestment carries the greater risk of losing ground in a technology they see as defining the future. Google, despite being one of Nvidia's largest customers, also uses its own TPUs to train AI models and offers Nvidia GPUs through its cloud services.
Apple’s decision to use Google’s TPUs is a strategic move to build more complex and larger AI models. The detailed research paper, which includes information disclosed at Apple’s Worldwide Developers Conference in June, indicates that these TPUs provide the computational power needed for Apple’s advanced AI plans. When Apple releases its AI features to beta users, the effects of its hardware decisions will become more apparent.
Analysis
- Strategic Shift: Apple’s decision to use Google’s TPUs marks a significant shift in the AI hardware landscape, moving away from Nvidia’s GPUs.
- Advanced Capabilities: The new AI system, Apple Intelligence, includes advanced features that enhance user experience and interaction.
- Market Implications: This move could influence other tech companies to explore alternatives to Nvidia’s GPUs, potentially reshaping the AI hardware market.