Apple Users Have A Lot To Say About Its Rumored Google AI Plan: 'Anything Is Better Than Current Siri'

According to Bloomberg News, Apple is about to sign off on a landmark deal to use Google's AI technology to power Siri. This story is being reported by Mark Gurman, who's built a reputation for consistently getting early scoops on Apple-related stories, such as when the iPhone switched to USB-C. Gurman writes that Apple is planning to pay $1 billion per year to use a 1.2 trillion-parameter AI model developed by Google. Bloomberg says that Google's tech will "dwarf the level of Apple's current models," suggesting that Apple users are likely to see major changes to Siri in the near future.

This might come as a surprise to some, since Apple and Google are considered two of the biggest rivals in the tech industry. Google, after all, created and owns Android, which makes it easy to frame Apple and Google's competition as Blue Bubble vs. Green Bubble. But this move might also make sense to the many Apple users who've been frustrated with Siri's limitations pretty much since the supposedly smart assistant launched in 2011. This includes many iPhone and iPad users, of course, but Siri is just about everywhere now — including macOS, AirPods, Apple Watch, and Apple HomePod.

I'm one of those elder millennials who prefers to do everything on my MacBook, and I occasionally reach for the Siri shortcut button on my keyboard, though I mostly use Siri with my iPhone and Apple Watch. So, when I see redditors say that "anything is better than what Siri currently is," believe me, I get it. But I think there's more nuance to this surprising new collaboration — and based on online chatter, I'm not the only one.

Apple Intelligence hasn't made Siri any smarter

This isn't the first time Apple has relied on a rival tech company to improve Siri. As part of its Apple Intelligence overhaul last year, the company began working with OpenAI to pair Siri with ChatGPT. This was such an exciting development that I even upgraded to a beyond-my-budget iPhone 16 Pro to take full advantage of the new AI-enhanced iOS. There are some great Apple Intelligence features available on newer devices, but very few of them have anything to do with Siri.

Generally, Siri is fine with simple use cases like setting timers or asking for the weather. Even this isn't perfect, though. One longstanding issue has been Siri's infamously poor ability to understand what you're saying. Slightly more complicated voice prompts can really trip Siri up, such as when I'm trying to ask for specific settings on my smart lights, or sometimes just trying to get them to turn on or off. Ideally, the super-smart Google AI that Apple is reportedly paying $1 billion a year for (hopefully the ad-free plan) will also improve Siri's ability to hear and decipher voice prompts, so it can execute them more reliably.

But what Siri really needs help with is planning, which, according to Bloomberg, is actually what Apple intends to use Google AI for — in addition to summarizing. Let's break down the latter first, since it's simpler. Many iPhone users have seen a summary of a text from a friend or group chat that either didn't make sense or emphasized the wrong detail, leading to momentary panic. Any improvement in condensing texts into accurate, briefer snippets will be welcome. Planning, though, is next level. That's the AI we've all been wishing for all this time — the iPhone version of Tony Stark's J.A.R.V.I.S.

Siri is a natural platform for agentic AI

AI is so much more than the advanced chatbots that have propelled the technology into the mainstream over the past few years. Agentic AI, which is more autonomous, adaptable, and goal-driven, is already here — but it's still not sophisticated enough to be the game-changer it could be. An AI "agent" is much closer to the idea of a smart assistant than an LLM like ChatGPT: ideally, it could read your emails, schedule appointments for you, and handle other, more "mundane" tasks. For example, you could ask it to book you a flight with an aisle seat that isn't a red-eye, and the agentic AI could do everything from finding that flight to booking it for you.

If Siri evolved into something like this, "Hey Siri" could be used for so much more than setting timers. It already works with core Apple apps like Reminders and Calendar, and it's gotten a lot better at suggesting times and locations to add to your schedule. But because it's not totally reliable, you have to constantly double-check that it got things right, which can make it more trouble than it's worth (a missed check can mean a missed meeting). If Apple's partnership with Google is what's needed for Siri to grow into a truly agentic AI, it would be great not only for Apple users but for the industry as a whole. Just as iTunes did for music and the iPhone did for smartphones, Siri could become the face of agentic AI that brings the technology into mainstream adoption.

Many Apple users will take what they can get

All this is to say that I'm trying to stay optimistic about this mega deal between two of the world's biggest companies. I'm trying to imagine the best-case scenario for what Google's AI could do for Siri, but at the same time, I understand that what we get may not be nearly as advanced. It might just make Siri slightly better at what it already does, or perhaps a little faster at it — I can't stand waiting five seconds, staring at my phone, to make sure a reminder went through.

This much more limited expectation of Apple's Google AI plan seems to dominate much of the online discourse among regular users. Many of them have been so disappointed and frustrated with Siri for so long that ANY plan to improve it — even if it involves paying a direct competitor — is welcome. As one r/Apple commenter puts it, "Siri has been around since 2011. You'd think that in 14 years they could have made some improvements." If Apple isn't going to do it itself, why not let the company outsource the problem?

That user who said "anything" is better than the current Siri also explained their reasoning: "It's absolutely baffling how terrible Siri is at even the most basic things." Multiple Reddit posts discussing the deal are filled with complaints about Siri's limitations, including one from a user frustrated with its voice recognition who reports that Siri is "often ridiculously wrong" when they request songs through their HomePod.

Others don't think working with Google is worth it

Some users would be excited if Siri could be used more like Google Gemini and other LLMs, as they've gotten a taste for the latter and are frustrated that Apple's smart assistant doesn't work the same way. However, others — presumably those like me who want Siri to become a useful agentic AI — opine that "an LLM-based Siri designed to capitalize on the LLM 'AI' hype train would very much be a waste of effort." Apple Intelligence already hands certain Siri prompts off to its integrated ChatGPT as it is.

Perhaps the main concern Apple users have with a Google collaboration revolves around privacy. Apple has leaned hard into personal privacy to separate itself from the competition, constantly playing up its encryption policies and app-tracking transparency. When it launched Apple Intelligence, CEO Tim Cook made sure to note that prompts would be processed either on-device or on specialized private servers. Google, infamously, is all about sucking up as much of your data as possible — it's what makes the company a lot of its money (and, in its defense, what makes its searches more "relevant" for a user).

Some fear that whatever new AI powers Siri will treat personal data more like Google and less like Apple. A poster in r/Privacy declares, "I'd much rather have a useless Siri than Google's AI on my iPhone. Yes, they claim Google won't have access to our data. Sure." Then again, some users suggest they're okay with sacrificing privacy if it makes Siri more useful, since restricting processing to on-device computation yields poorer results. On r/OpenAI, a redditor writes that Apple's current commitment to privacy has "crippled Siri to the point that my Robovacuum understands commands better."

Some users wish Apple would just focus on improving Siri by itself

One debate already happening in light of this announced deal is whether this is the right move for Apple. The company has been notably lagging in the AI revolution: Microsoft is OpenAI's largest investor, and Google's Gemini quickly became one of the "big" LLMs used by the mainstream. By paying for Google's AI, Apple almost seems to be giving up on the AI race. Or is it?

After all, Apple wasn't the first to launch a smartphone or an augmented reality headset. The company famously takes its time developing its own take on a product or technology and is seemingly guided by the philosophy that it's not about being first — it's about being the best. One redditor suggests that Apple may just be using Google as a "stopgap" to give users a better Siri while it works behind closed doors to develop its own, smarter artificial intelligence platform.

Others find the deal with Google to be a cop-out. I wonder, though, if this opinion isn't based more on Apple fandom than anything else. I'm fully invested in the Apple ecosystem, but only because it's convenient that all my devices work harmoniously together and because I find both Apple's interface and hardware to be superior to most of the competition. I don't consider myself a "fanboy" with any allegiance to the corporation, though. If I were, I might be personally offended that Apple is asking Google, of all companies, for help. But I'm not. I do share the concerns about privacy and about wasting resources on gimmicky LLMs, but if, in the end, this results in a better product and a better user experience, then what do I care who's paying whom a billion dollars?
