NIX Solutions: Apple to Rely on Google TPUs

In its quest to develop artificial intelligence systems, Apple is currently relying on third-party components, specifically Google’s tensor processors. This choice sets Apple apart from many competitors who predominantly use Nvidia hardware. CNBC’s report, based on Apple’s explanatory note, sheds light on the computing clusters used by the company to train its large language models, which will underpin the Apple Intelligence technology.


Training Process and Hardware

Apple’s documentation reveals the use of “cloud clusters based on TPUs” (Tensor Processing Units), Google’s proprietary hardware for AI tasks. This indicates that Apple is leveraging Google’s cloud computing resources for its AI development stage. The company trained its Apple Foundation Model using two clusters:

  1. A cluster of 2,048 of Google’s TPU v5p chips (its most advanced) for the model that will run on end devices.
  2. A larger cluster of 8,192 TPU v4 chips for training the server-side model.

Google rents out these clusters at $2 per hour per processor. Interestingly, while Apple relies on Google’s hardware, Google itself uses Nvidia chips alongside its own processors for training language models.
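At the stated rate of $2 per hour per processor, the rental cost of each cluster is straightforward to estimate. The sketch below is a back-of-the-envelope calculation based only on the figures above; the rate is the one quoted in the article, not an official Google Cloud price list.

```python
# Back-of-the-envelope TPU rental cost, using the figures cited above.
# Assumption: a flat $2/hour per chip, as quoted in the article.

RATE_PER_CHIP_HOUR = 2  # USD, per the article

clusters = {
    "on-device model (TPU v5p)": 2048,
    "server model (TPU v4)": 8192,
}

for name, chips in clusters.items():
    hourly = chips * RATE_PER_CHIP_HOUR
    print(f"{name}: {chips} chips -> ${hourly:,}/hour")
```

By this estimate, the larger v4 cluster alone would cost $16,384 per hour to rent, which helps explain why a company of Apple's scale might eventually consider building its own AI server hardware.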

Future Developments

Although Apple currently depends on third-party hardware, there is speculation that the company may develop its own server processors for AI workloads in the future, notes NIX Solutions. As Apple continues to refine its AI technology, we’ll keep you updated on any significant changes to its approach or infrastructure.

The preliminary version of Apple Intelligence has already been demonstrated on some devices, signaling the company’s progress in this field. As Apple moves forward with its AI initiatives, the tech community will be watching closely to see how its strategy evolves and how it competes with other major players in the AI space.