Apple expanded the AI prowess of its upcoming iOS 18, iPadOS 18 and macOS Sequoia operating systems during its WWDC 2024 keynote on Monday, using the catch-all term “Apple Intelligence” as the moniker for its version of artificial intelligence.
The company's broad concept combines generative AI models with personal context in iPhone, iPad and Mac to "deliver truly helpful intelligence" to users, CEO Tim Cook said.
Officials described how most AI processing will be done locally on each Apple device, running on powerful silicon where it can be kept separate from cloud servers, where data privacy can be imperiled. In the cases where cloud servers are needed, they will run Apple silicon and receive only the data needed for a specific task.
“This is AI for the rest of us,” said software chief Craig Federighi in a pre-recorded presentation that devoted 45 minutes to the new AI content, following an hour-long series of separate updates to Vision Pro, Apple Watch and the rest of Apple’s personal product line.
Presenters ticked off a number of new Apple Intelligence capabilities. They ranged widely, including something as simple as making Siri smarter and able to understand corrections a user makes to natural-language commands. (For example, if a user tells Siri, “I meant we should meet at Muir Woods instead of the beach,” the AI capabilities would understand the intent. In that example, the user could follow up and instruct Siri to create an invitation to be emailed based on time and place, with driving directions, to “there,” which Siri would understand to be the previously mentioned Muir Woods.)
More complex features will allow an iPhone user to send a message to a friend with Genmoji and Image Playground, generating an image on the fly using, perhaps, that person’s image found on the phone and stylized with colors and themes. The presentation also included multiple examples of using AI to suggest wording for emails or texts based on context from previous messages, as well as to correct grammar and structure.
A connection to OpenAI’s GPT-4o will also be possible when the OS updates arrive in beta later this year. The connection to ChatGPT will work without creating a separate account, and an existing ChatGPT account can be used in Siri.
The various new capabilities are, of course, software-based, but they are built on acceleration hardware in Apple processors such as the A17 and the M family, Apple noted. That silicon has gradually taken on a bigger role in recent OS generations, although Apple’s marketing materials have mostly called the capability “machine learning” rather than the more current and widely used “artificial intelligence.”
Apple also made the case for local, on-device processing of data, enabled by the compute power of its chips. That prevents any personal data held on a user’s iPhone, iPad or Mac from being used by a cloud provider without the user’s permission.
Apple introduces Private Cloud Compute
In some cases, Apple acknowledged, a server in the cloud might be needed for computation in a complex AI model. To keep private data secure in the cloud, Apple has created Private Cloud Compute, in which certain cloud servers run Apple silicon. Apple Intelligence analyzes the request so that only the user’s data relevant to the task is processed on the server, and that data is never used later by Apple. With Private Cloud Compute, a person’s data is never stored, is used only for their requests, and the privacy promise can be verified.
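Apple did not share implementation details during the keynote, but the on-device-first routing it described can be sketched conceptually. The Swift snippet below is a hypothetical illustration only, not Apple’s API: IntelligenceRequest, ProcessingTarget and route(_:) are invented names, and the complexity check stands in for whatever capability test the on-device models actually apply.

import Foundation

// Hypothetical sketch of the routing Apple described: handle a request on the
// device when the local model can, and otherwise forward only the minimal
// personal context the task needs to a Private Cloud Compute server.
// All names and thresholds here are invented for illustration.

struct IntelligenceRequest {
    let prompt: String
    let personalContext: [String: String]   // e.g. calendar or contact snippets
    let estimatedComplexity: Int            // stand-in for a real capability check
}

enum ProcessingTarget {
    case onDevice
    case privateCloudCompute(relevantContext: [String: String])
}

func route(_ request: IntelligenceRequest) -> ProcessingTarget {
    // Simple requests stay on the device, where personal data never leaves.
    if request.estimatedComplexity <= 3 {
        return .onDevice
    }
    // For heavier models, forward only the context entries the prompt actually
    // touches, mirroring the claim that servers see just task-relevant data.
    let relevant = request.personalContext.filter { key, _ in
        request.prompt.localizedCaseInsensitiveContains(key)
    }
    return .privateCloudCompute(relevantContext: relevant)
}

// Example: a scheduling request that mentions the calendar forwards only that entry.
let request = IntelligenceRequest(
    prompt: "Summarize my calendar for Thursday and draft an invite",
    personalContext: ["calendar": "Thu: 3 meetings", "photos": "2,140 items"],
    estimatedComplexity: 7
)

switch route(request) {
case .onDevice:
    print("Handled on device")
case .privateCloudCompute(let context):
    print("Sent to Private Cloud Compute with context keys: \(context.keys.sorted())")
}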
Apple Intelligence “is the beginning of an exciting new chapter,” Cook concluded. “We are just getting started and I hope you are as excited as I am. … We think AI is going to be indispensable.”
Early reactions to Apple Intelligence announcement at WWDC
A number of analysts viewed the Apple Intelligence concept as important for Apple, especially as the company's AI strategy has taken some time to be formalized.
Apple Intelligence "is probably their most massive 'just one more thing' moment in years," remarked Leonard Lee, analyst and founder of NeXt Curve. Even so, he raised some questions in an online post, including how Private Cloud Compute will work. It "allows the scaling of Apple Intelligence to servers using Apple silicon. I'm not sure how this is supposed to make this hybrid architecture more private or secure. Lots of question marks here and tough questions to be asked."
Lee also wondered how users will pay for Apple Intelligence capabilities. "There was no indication of Apple Intelligence fees unless the costs will be absorbed by iCloud subscription revenue or offloaded onto devices."