I hadn’t seen the new Perplexity voice mode until Federico Viticci blogged about it. Looks impressive. I also think Federico is exactly right on this:
> Looking at the big picture for a second, I do think Apple is in a precarious situation here. The fact that the company makes the best computers for AI is a double-edged sword: it’s great for consumers, but those same consumers are increasingly using Apple devices as mere conduits for other companies’ AIs, funneling their data, context, and – most importantly – habits into systems that cannot be controlled by Apple.
There used to be a lot of talk of AI companies not having a “moat” that would protect them against competition from Apple and Google as everyone caught up to the latest advanced models. It’s clear now the moat is the product, not the model. ChatGPT with memory based on everything you’ve asked it is starting to become a light form of lock-in.
Perhaps this iOS integration with Perplexity could become the same kind of thing if it takes off. I’m a little skeptical because Perplexity doesn’t have the reach of OpenAI or Anthropic, and, as Federico says, many folks still have a bad first impression of Perplexity from its skirting of the gray areas of copyright and crawling.
As I blogged last month, Apple has the added challenge of not yet knowing whether what they are trying to do is even possible. Their competition isn’t limited in the same ways Apple is: not relying on local models, not focused on privacy, not announcing features only once a year in June. OpenAI, Perplexity, and others are developing at a different pace.