Apple recently revealed that it has opted to use Google’s Tensor Processing Units (TPUs) to train the artificial intelligence models that power Apple Intelligence. The choice is notable because it bypasses Nvidia’s GPUs, which dominate the market for high-end AI training chips. Apple detailed the decision in a technical paper, shedding light on the reasoning behind this strategic choice.
Apple entered the generative AI space later than many of its competitors. The company introduced Apple Intelligence, a system that adds features such as improved natural language processing and AI-generated summaries in text fields. Apple also plans to roll out generative AI functions such as image and emoji generation, and to enhance Siri so it can draw on personal information and perform actions within applications. These moves signal Apple’s effort to catch up and avoid falling behind in the rapidly evolving field of artificial intelligence.
In its technical paper, Apple disclosed that its Apple Foundation Model (AFM) was trained on “Cloud TPU clusters,” meaning the company rented capacity from a cloud provider to carry out the necessary computation. Both the on-device and the server variants of AFM were trained this way, with Google’s TPUs providing the scalability needed to train the models efficiently. The specifics in the paper reveal the complex and extensive infrastructure required to train cutting-edge AI models.
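Apple’s paper does not publish training code, but Cloud TPU workloads are commonly written against JAX. The sketch below is a minimal, illustrative example of how a data-parallel step can be spread across the TPU cores that a Cloud TPU VM exposes; the function names, shapes, and batch sizes are assumptions for illustration, not anything from Apple’s setup.

```python
# Minimal sketch (not Apple's actual training code): how a JAX program
# discovers the TPU cores on a Cloud TPU VM and runs a toy data-parallel step.
# All shapes and values below are illustrative assumptions.
import jax
import jax.numpy as jnp


def main():
    # On a Cloud TPU VM this lists the local TPU cores; on CPU it falls back
    # to a single device, so the sketch runs anywhere.
    devices = jax.devices()
    print(f"Backend: {jax.default_backend()}, device count: {len(devices)}")

    # Toy "training step": one matrix multiply per device, data-parallel style.
    @jax.pmap
    def step(x, w):
        return jnp.tanh(x @ w)

    n = len(devices)
    x = jnp.ones((n, 128, 512))   # hypothetical per-device batch shard
    w = jnp.ones((n, 512, 256))   # weights replicated once per device
    y = step(x, w)
    print("Per-device output shape:", y.shape)  # (n, 128, 256)


if __name__ == "__main__":
    main()
```

In practice, large-scale runs layer optimizer state sharding, checkpointing, and multi-host orchestration on top of this kind of per-core parallelism, which is the scale of infrastructure the paper alludes to.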
Google’s latest TPUs are reported to cost under $2 per hour when the chip is in use, making them a cost-effective option for organizations looking to train AI models. First built in 2015 for Google’s internal workloads and made available to the public in 2017, TPUs are among the most mature custom chips designed specifically for artificial intelligence. Even so, Google remains a prominent Nvidia customer, using both Nvidia’s GPUs and its own TPUs for various AI training tasks.
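To put the quoted per-chip rate in perspective, here is a rough back-of-the-envelope estimate. Only the “under $2 per hour” figure comes from the reporting above; the chip count and training duration are hypothetical assumptions chosen purely for illustration.

```python
# Back-of-the-envelope cost estimate using the quoted upper-bound TPU price.
# NUM_CHIPS and TRAINING_DAYS are hypothetical, not figures from Apple's paper.
PRICE_PER_CHIP_HOUR = 2.00   # USD, upper bound quoted for Google's latest TPUs
NUM_CHIPS = 1024             # hypothetical cluster size
TRAINING_DAYS = 30           # hypothetical training duration

chip_hours = NUM_CHIPS * TRAINING_DAYS * 24
estimated_cost = chip_hours * PRICE_PER_CHIP_HOUR
print(f"{chip_hours:,} chip-hours -> roughly ${estimated_cost:,.0f} at the quoted rate")
# 1,024 chips x 720 hours = 737,280 chip-hours -> about $1.47M as an upper bound
```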
Apple’s approach to AI development combines in-house technology with external resources. By leveraging Google’s TPUs to train its models, Apple is showing a willingness to work with other industry leaders to stay competitive in the AI space, while its focus on enhancing Siri and adding generative AI functions points to a pipeline of AI-powered features for its user base.
Taken together, the decision to train on Google’s TPUs marks a pragmatic shift in how Apple builds AI. By pairing Google’s mature TPU infrastructure with its own generative AI features, and by prioritizing scalability and efficiency in training, Apple is positioning itself to compete in a fast-moving field and to deliver cutting-edge AI-powered experiences to its customers.