A series of tweets from a few people connected to Apple has teased what Siri’s AI-powered capabilities could look like.
To be clear, all of this is speculation. Apple has been working on its own large language models (LLMs) as well as other machine learning research, much of which is broadly useful and has been published as research papers; none of those papers mention any specific use within Apple’s ecosystem, however. Awni Hannun, a research scientist in the Machine Learning Research group at Apple, tweeted “LLMs are faster and more memory efficient in MLX!” on March 8. This is just one of the many advancements that Apple’s ML and AI research teams are working on.
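For the curious, here is roughly what running an LLM locally with Apple’s open-source MLX framework looks like today, using the companion mlx-lm package. This is a minimal sketch, not a description of anything Apple ships in Siri; the model name below is just an example, and any MLX-converted model would work in its place:

```python
# Minimal sketch: running an LLM locally on Apple silicon with MLX.
# Assumes: pip install mlx-lm (requires an Apple silicon Mac).
from mlx_lm import load, generate

# Example model only; swap in any MLX-converted model you prefer.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.2-4bit")

prompt = "Summarize why on-device language models matter."
# Generates text entirely on the local machine, no cloud round trip.
response = generate(model, tokenizer, prompt=prompt, verbose=True)
print(response)
```

Everything here runs on the laptop itself, which is exactly the kind of efficiency Hannun’s tweet was pointing at.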
Robert Scoble, an ex-Microsoft employee and current host of Unaligned, a podcast on AI startups, was one of the many people to retweet this. His retweet drew a comment from Dag Kittlaus, co-founder and former CEO of Siri, who said:
Siri will do some cool new things in 2024. Then accelerate and become a real force in the AI arena. Apple is uniquely positioned to enable new, useful and unexpected LLM use cases.
His most recent tweet at the time of writing, dated January 15, links to a CNBC video titled “The AI dark horse: Why Apple could win the next evolution of the AI arms race” with a comment that reads, “Siri is quietly training hard for the comeback. Hey Siri, time to dominate.”
It’s not clear how much Kittlaus really knows. But it’s widely reported that Apple is working on its own LLM designed to run locally on devices, unlike cloud-based tools such as ChatGPT or Google Gemini. Apple has published research papers on these advancements, and a major AI-related announcement could be coming soon.
Notably, Apple has also built major AI capabilities into its new M3 chip lineup. In its announcement of the new MacBook Air laptops, Apple dedicated a whole section to consumer applications of AI, which ties in neatly with the rest of this story. Consumer use of AI differs from corporate use: it covers things like editing images with AI or getting answers from the web, tasks that can be accomplished locally and that Apple’s research has already made more efficient.
In all likelihood, Apple’s rollout of a more advanced, AI-powered Siri or other AI capabilities will largely be a matter of integration with the various apps on your device, powered by an Apple LLM, to make your life easier and more productive.
The new MacBook Air lineup also offers better AI inference capabilities, meaning models can run (and even be trained) more efficiently than on consumer-grade Windows laptops that lack powerful GPUs. The M3 chip, in other words, packs a lot of AI capability.