Apple has ramped up its hiring and acquisitions as it works on artificial intelligence tools and models for its next generation of products, with the notable goal of running them locally on the devices themselves.
According to some, Apple has been late to the party, but new reports suggest the company has been quietly working on its own AI model and applications that will soon surface in its next generation of products.
It’s a no-brainer that a company of Apple’s size would want in on the action; there’s simply no reason not to dedicate a department to it. Since 2017, the company has already acquired 21 startups, and its job postings mentioning machine learning appear to be on the rise.
What this means is that Apple is working behind closed doors and will likely launch its own AI model, tools, and software, or at the very least an AI-upgraded Siri, at its next major announcement. So far, the company has played it safe, declining to roll out tools and chatbots the way peers such as Google and Microsoft have, effectively letting users test the products for them.
According to a recent Ars Technica article, the company’s goal is to run generative AI directly on mobile devices. Just last month, Apple published a research paper on an efficient way to run a model locally on a device like an iPhone, which was quite the breakthrough. Running a model locally is not impossible, but even a relatively lightweight Stable Diffusion model requires a dedicated GPU on a PC to produce results in a practical timeframe. Running a full generative AI system such as ChatGPT or Google Bard on an iPhone is currently unfathomable, and that is what Apple wants to change.
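To give a rough sense of what “running locally” means in practice, here is a minimal sketch of on-device text generation with a small open model. It assumes the Hugging Face transformers library and the distilgpt2 model, chosen purely for illustration; Apple’s own on-device models are not publicly available and would look nothing like this.

```python
# Minimal sketch: text generation on the local CPU, with no network calls at
# inference time (the model weights are downloaded once and cached).
# Assumes the Hugging Face `transformers` library and the small open model
# "distilgpt2" -- illustrative stand-ins, not anything Apple has shipped.
from transformers import pipeline

# device=-1 forces CPU, so generation happens on the machine itself.
generator = pipeline("text-generation", model="distilgpt2", device=-1)

result = generator("On-device AI means your prompts", max_new_tokens=40)
print(result[0]["generated_text"])
```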
A local LLM like the one Apple is reportedly working on is inherently more secure and respects the user’s privacy, as it won’t send prompts to Apple or rely on Apple’s servers to generate responses. The reason a chatbot like ChatGPT, Bing Chat, Bard, Claude, Poe, or Perplexity is so fast is that it runs in a data center; your device merely uses its internet connection to send prompts and receive responses. In fact, data shared with most of the world’s popular commercial chatbots can be used by the companies to further train their models.
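For contrast, a cloud chatbot is essentially an HTTP round trip: the prompt leaves your device, a model in the provider’s data center generates the reply, and the response comes back over the network. The sketch below uses OpenAI’s public chat completions endpoint as one example of this pattern; the API key is assumed to be set in the environment.

```python
# Sketch of a cloud chatbot call: the prompt is sent over the network to the
# provider's servers, which run the model and return the response.
# Assumes the `requests` library and an OpenAI API key in OPENAI_API_KEY.
import os
import requests

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "What is on-device AI?"}],
    },
    timeout=30,
)
print(resp.json()["choices"][0]["message"]["content"])
```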