Sundar Pichai and Tim Cook
Source: Reuters; Apple
Apple said on Monday that the artificial intelligence models underpinning Apple Intelligence, its AI system, were pretrained on processors designed by Google, a sign that big tech companies are looking for alternatives to Nvidia when it comes to training cutting-edge artificial intelligence.
Apple’s choice of Google’s homegrown tensor processing unit (TPU) for training was detailed in a technical paper the company just published. Separately, Apple on Monday released a preview version of Apple Intelligence for some devices.
Nvidia’s expensive graphics processing units (GPUs) dominate the market for high-end AI training chips and have been in such high demand over the past few years that they have been difficult to procure in the required quantities. OpenAI, Microsoft and Anthropic all use Nvidia GPUs for their models, while other technology companies, including Google, Meta, Oracle and Tesla, are snapping them up to build out their AI systems and products.
Meta CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai both made comments last week suggesting that their companies and others in the industry may be overinvesting in AI infrastructure, but acknowledged that the business risk of not doing so was too high.
“The disadvantage of falling behind is that you won’t be able to master the most important technologies of the next 10 to 15 years,” Zuckerberg said on a podcast with Bloomberg’s Emily Chang.
Apple doesn’t mention Google or Nvidia in its 47-page paper, but it does note that its Apple Foundation Model (AFM) and AFM server were trained on “Cloud TPU clusters.” That means Apple rented servers from a cloud provider to perform the calculations.
“This system enables us to efficiently and scalably train AFM models, including on-device AFM, AFM servers, and larger models,” Apple said in the paper.
Representatives for Apple and Google did not respond to requests for comment.
Apple announced its artificial intelligence plans later than many of its peers, which loudly embraced generative AI soon after OpenAI launched ChatGPT in late 2022. Apple’s system, called Apple Intelligence, includes several new features, such as a refreshed look for Siri, better natural language processing and AI-generated summaries in text fields.
Over the next year, Apple plans to roll out features based on generative artificial intelligence, including image generation, emoji generation and an enhanced version of Siri that can access a user’s personal information and take actions inside apps.
Apple said in Monday’s paper that the on-device AFM was trained on a single “slice” of 2,048 TPU v5p chips working together. That is the most advanced TPU, first released in December. The AFM server was trained on 8,192 TPU v4 chips that were configured to work together as eight slices across a data center network, according to the paper.
Google’s latest TPUs cost under $2 per hour of use when booked for three years in advance, according to Google’s website. Google first introduced TPUs for internal workloads in 2015 and made them available to outside customers in 2017.
Still, Google remains one of Nvidia’s largest customers. It uses Nvidia’s GPUs and its own TPUs to train artificial intelligence systems, and sells access to Nvidia’s technology on its cloud.
Apple has previously said that inference, which is taking a pre-trained artificial intelligence model and running it to generate content or make predictions, will occur in part on its own chips in Apple data centers.
This is the second technical paper about Apple’s AI system, following the release of a more general version in June. Apple said at the time that it was using TPUs while developing its AI models.
Apple is scheduled to report quarterly results after the market close on Thursday.