Apple Unveils “Apple Intelligence” with OpenAI Integration & Updated Siri

On Day 1 of WWDC, Apple announced Apple Intelligence, which will bolster Siri with AI capabilities, power AI workflows within apps like Notes, and bring an OS-level partnership with OpenAI.

The first day of Apple’s WWDC finally made the company’s push into the generative AI arena official. Collectively dubbed “Apple Intelligence,” it is a host of improvements to existing Apple products such as Siri, Notes, and Phone that let you accomplish more, faster. Notably, the announcement also included a partnership with OpenAI in the form of an integration at the OS level (for now, Apple seems content to rely on a third party for such a core function instead of rolling out a novel LLM of its own).

In its official announcement, Apple calls this group of updates a “personal intelligence system that puts powerful generative models at the core of iPhone, iPad, and Mac.”

Siri’s next generation will roll out with a slew of AI-powered updates in the next Apple operating systems, making it more capable and more in line with the expectations of people who want raw generative AI power to be a tap away. In particular, Siri will be able to remember context across multiple requests, understand voice and text commands better, see what’s on your screen, and take actions on the device for you (yes, creepily similar to Microsoft’s Recall).
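
Apple hasn’t published the new Siri’s internals, but the “take actions for you” part maps onto the kinds of actions apps already declare through Apple’s existing App Intents framework. Here is a minimal sketch, assuming a hypothetical notes app; the intent name and behavior are illustrative, not something Apple showed:

```swift
import AppIntents

// Hypothetical action a notes app could expose to the system.
// The new Siri is expected to invoke app actions like this one on your behalf.
struct OpenNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Note"
    static var description = IntentDescription("Opens a note with a matching title.")

    // Siri can fill this parameter from a voice or text request.
    @Parameter(title: "Note Title")
    var noteTitle: String

    func perform() async throws -> some IntentResult {
        // A real app would look up the note here and navigate to it.
        print("Opening note titled: \(noteTitle)")
        return .result()
    }
}
```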

In Apple’s defense, the announcement says that all of this gets private processing on the remote server and that the features will be off by default (something Microsoft only decided, or at least announced, after heavy backlash against its Recall feature, which can track your screen all the time).

Onscreen awareness in operating systems is going to be the next big debate. Only time will tell how all this will pan out.

Now, apart from a buffed-up Siri, Apple also announced AI support across its products and apps to supercharge productivity. AI writing tools will be built into Mail, Notes, and Messages. Mail will also use AI to better organize your inbox. Phone will auto-generate transcriptions and summaries of calls. Photos will let you search images in a more conversational way. You will be able to create Genmoji for use throughout the Apple ecosystem. And the new “Image Playground” feature will let you generate images from text prompts.
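
On the developer side, Apple’s sessions also previewed a control over how much of the system Writing Tools UI a given text field exposes. A minimal UIKit sketch, assuming the writingToolsBehavior property as previewed (names could differ in the shipping SDK):

```swift
import UIKit

// Sketch of limiting the system Writing Tools UI for one text view.
// Assumed API surface: .complete shows the full rewrite/proofread panel,
// .limited keeps suggestions inline, .none opts the view out entirely.
final class ComposeViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(textView)

        if #available(iOS 18.0, *) {
            textView.writingToolsBehavior = .limited
        }
    }
}
```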

Much of this will run on-device, with heavier requests routed through Apple’s Private Cloud Compute, or PCC, system. Apple says the data sent as part of these AI workflows is used only to fulfill your request and is never stored or made accessible to Apple. And you will need to opt into these features; they won’t be on by default.

Now, let’s talk about the bigger piece of news: the partnership with OpenAI. It will essentially let Siri call on the latest ChatGPT model when it needs help with more complex problems or more detailed answers to questions. OpenAI published its own announcement, noting that this OS-level integration will enable capabilities such as image and document understanding.

ChatGPT will power the system-wide Writing Tools, image generation, and Siri.
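
To make “image and document understanding” concrete, here is a minimal Swift sketch of the kind of request such an integration could hand off, written against OpenAI’s public chat-completions REST API rather than any Apple-internal interface; the function, model choice, and prompt are placeholders of mine, not something Apple or OpenAI has published for this integration:

```swift
import Foundation

// Ask ChatGPT a question about an image via OpenAI's public chat-completions API.
// This only illustrates the capability; Apple's actual handoff is not public.
func askChatGPTAboutImage(base64JPEG: String, question: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    // OpenAI's vision-style message format: a text prompt plus an inline base64 image.
    let content: [[String: Any]] = [
        ["type": "text", "text": question],
        ["type": "image_url",
         "image_url": ["url": "data:image/jpeg;base64,\(base64JPEG)"]]
    ]
    let userMessage: [String: Any] = ["role": "user", "content": content]
    let body: [String: Any] = ["model": "gpt-4o", "messages": [userMessage]]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    // Send the request and pull the first choice's message text out of the reply.
    let (data, _) = try await URLSession.shared.data(for: request)
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let reply = choices?.first?["message"] as? [String: Any]
    return reply?["content"] as? String ?? ""
}
```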

Apparently, Elon Musk didn’t like this OS-level, system-wide integration, particularly given his ongoing feud, and lawsuit, with the company he helped found. In a barrage of tweets, he likened the move to “creepy spyware” and said that if it goes ahead, visitors to his companies’ offices will have to check their Apple devices at the door, to be stored in a Faraday cage, and Apple devices will be banned from the premises altogether. The replies also hint that Musk might be mulling an X/Grok phone, and there’s apparently a lot of support for that, far outweighing any concern in the comments about X’s own somewhat creepy terms and policies. Personally, I doubt he watched the full keynote (or he simply chose to ignore Apple’s privacy-focused promises about how all this will be handled).

By Abhimanyu

Unwrapping the fast-evolving AI popular culture.