Apple has been surprised by the rapid development of AI tools like OpenAI’s ChatGPT and is now moving fast to develop competitive features for its devices.
At its recent developer conference, Apple did not use the term "AI." Instead, it talked at length about "machine learning," which will power many Apple products. Examples include the improved autocorrect in iOS 17, which is based on a language model, and the computer vision capabilities of the Apple Vision Pro headset.
Has Apple slept on AI?
Apple’s confident messaging at its developer conference has countered the perception that the company is sleeping on AI.
But according to Bloomberg’s Apple correspondent Mark Gurman, there is some truth to that impression.
Apple was caught off guard by the tech industry’s sudden focus on AI and has been working hard since late last year to catch up, Gurman reports, citing insider information.
Gurman quotes an insider as saying that Apple fears it has missed out on an important development: "There's a lot of anxiety about this and it's considered a pretty big miss internally."
Even Siri, which seems like an obvious candidate for AI improvements, especially given ChatGPT's latest audio features, has yet to benefit from advances in generative AI.
Leaders in Apple’s AI offensive
John Giannandrea and Craig Federighi, Apple's senior vice presidents of AI and software engineering, and Eddy Cue, head of services, are leading the company into generative AI. Together, they are said to have an annual budget of $1 billion. Compared to the investments of Microsoft, OpenAI, or Google, that is a modest sum for a tech giant like Apple.
Giannandrea’s team is focused on developing the technology for a new AI system and reworking Siri to make it smarter. The improved version of Siri could come out next year, but the technology is still immature and there are safety concerns.
Federighi’s software development group is integrating AI into the next version of iOS, and is tasked with incorporating features based on Apple’s large language model (LLM). Meanwhile, Cue’s organization is exploring how to integrate AI into as many applications as possible.
AI in Apple’s product line
Apple’s software development teams are considering integrating generative AI into development tools like Xcode, according to internal sources. The move could speed up the app development process and align Apple’s services with Microsoft’s GitHub Copilot, which offers automatic code completion suggestions to developers, Gurman reports.
Cue’s team is also exploring the potential of generative AI in productivity apps. For example, AI could help with writing in apps like Pages, or automatically create slide presentations in Keynote, similar to features Microsoft has introduced in its Word and PowerPoint apps.
Another application area is implementing AI in internal customer service tools within the AppleCare Group. This would make Apple’s customer service more efficient and responsive, and improve the overall user experience.
Apple AI: On-device or cloud-based?
There is a fundamental internal debate about how generative AI should be deployed, Gurman notes: either as an on-device capability, a cloud-based solution, or a combination of the two.
The on-device approach would be consistent with Apple’s commitment to privacy. It would also be faster to implement, but cloud-based LLMs would give Apple more flexibility, which would be a big advantage given the rapid evolution of AI. A combined approach is likely, Gurman adds.
Apple reportedly spends millions of dollars a day on AI training
According to an earlier report by Gurman, Apple is already using a language model internally for prototyping features, summarizing text, and answering questions about the data it was trained on. The model, called Apple GPT, is based on the Ajax framework. Whether it will appear in customer-facing applications is still unclear.
The Information previously reported that Apple is spending millions of dollars a day to train its AI systems. The 16-person "Foundational Models" team, led by Giannandrea, is focused on conversational AI similar to ChatGPT.
The previously leaked “Ajax GPT” model is said to be more capable than OpenAI’s GPT-3.5, with more than 200 billion parameters. Research on this model is expected to benefit users in the form of more accurate voice commands for the iPhone.
For example, Apple plans to introduce a voice feature that lets users send GIFs via voice command through Apple's Shortcuts app. This feature is expected to arrive with a new version of iOS next year.