After years of watching Siri fall behind ChatGPT, Alexa, and Google Assistant, Apple is about to make its biggest move yet: a completely rebuilt Siri powered by Google’s Gemini AI, arriving this month.
TL;DR — The Quick Take
Apple will preview the new Gemini-powered Siri in late February 2026, alongside the iOS 26.4 beta. It promises genuine conversational AI, screen awareness, and deep app integration — while still running on Apple’s Private Cloud Compute for privacy. The biggest Siri upgrade since 2011 is nearly here.
What’s Actually Happening
According to Bloomberg’s Mark Gurman — who has an exceptional track record with Apple leaks — Apple is preparing to unveil the new Siri as early as late February 2026. The reveal may come through private briefings or a dedicated event rather than a major keynote.
This isn’t just a Siri update. It’s a fundamental rebuild.
The new assistant runs on Google’s Gemini AI models as part of a collaboration announced in January 2026. During Apple’s Q1 2026 earnings call, Tim Cook confirmed the partnership structure:
“We basically determined that Google’s AI technology would provide the most capable foundation for AFM (Apple Foundation Models), and we believe that we can unlock a lot of experiences and innovate in a key way due to the collaboration.”
Translation: Apple tried to build competitive AI on its own. It wasn’t enough. Gemini fills the gap.
What the New Siri Will Actually Do
Based on reporting from Bloomberg, CNET, and MacRumors, here’s what’s changing:
1. Conversational Intelligence
The current Siri handles commands. The new Siri handles conversations. You’ll be able to ask follow-up questions, reference previous context, and have actual back-and-forth exchanges instead of starting fresh every time.
2. Screen Awareness
Ask Siri about what’s on your screen. Point to an image and ask questions. Reference content in apps without explaining everything from scratch. This is functionality that ChatGPT and Google Assistant have had — Siri is finally catching up.
3. Deep App Integration
The new Siri will integrate more tightly with Mail, Messages, Calendar, and other core apps. Think: summarizing emails, scheduling meetings based on message context, or finding information across multiple apps without manual input.
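Apple hasn’t said how this integration will surface to developers, but the existing App Intents framework (shipping since iOS 16) is the most plausible hook. Here’s a minimal, hypothetical sketch of an app action Siri could invoke — the intent name and behavior are illustrative, not a confirmed API for the new Siri:

```swift
import AppIntents

// Hypothetical intent a mail app might expose so Siri can act on it.
// The App Intents framework shown here already exists; whether the
// Gemini-powered Siri builds on it is an assumption, not confirmed.
struct SummarizeInboxIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Inbox"
    static var description = IntentDescription("Summarizes unread email in a mailbox.")

    @Parameter(title: "Mailbox")
    var mailbox: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Real app logic would fetch and summarize messages here.
        let summary = "3 unread messages in \(mailbox), mostly about the Q1 launch."
        return .result(dialog: "\(summary)")
    }
}
```

If Apple does route the new Siri through App Intents, apps that already expose actions this way would get the conversational layer “for free.”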
4. Personal Context
This is the big one. Apple is calling it the “personalized version of Siri” — an assistant that understands your habits, preferences, and data to provide relevant responses without you spelling everything out.
5. Richer Responses
Gone are the days of “Here’s what I found on the web.” Gemini’s language capabilities mean Siri can actually explain things, provide detailed answers, and have opinions about recommendations.
The Privacy Question (And Apple’s Answer)
Here’s where things get interesting. Apple built its brand on privacy. Google built its brand on… well, data. So how does this work?
Apple addressed this directly on the earnings call:
“We’ll continue to run on the device and run in Private Cloud Compute and maintain our industry-leading privacy standards in doing so.”
What this means in practice:
- On-device processing first: Simple tasks run locally on your iPhone, iPad, or Mac
- Private Cloud Compute for complex requests: When cloud processing is needed, it happens on Apple’s custom silicon servers — not Google’s infrastructure
- Data isolation: Apple’s Private Cloud Compute isolates user data, preventing it from being combined with other users’ data or used for model training
- No data sharing with Google: Based on Apple’s statements, your queries and personal data aren’t sent to Google’s servers — Gemini’s technology powers the system, but runs within Apple’s privacy framework
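The routing Apple describes can be sketched in a few lines — purely illustrative logic based on Apple’s public description, not Apple’s actual implementation:

```swift
// Illustrative model of on-device-first routing. The names, the
// decision rule, and the two destinations are assumptions drawn from
// Apple's public statements, not published Apple code.
enum Destination {
    case onDevice             // simple tasks stay local
    case privateCloudCompute  // complex tasks go to Apple's servers, not Google's
}

func route(_ request: String, deviceCanHandle: (String) -> Bool) -> Destination {
    deviceCanHandle(request) ? .onDevice : .privateCloudCompute
}
```

Even this toy version captures the property Apple is claiming: the decision to leave the device is made locally, and the cloud path stays within Apple’s infrastructure.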
Is this trustworthy? Apple has a strong track record here. They refused to unlock iPhones for the FBI. They built differential privacy into their analytics. They’ve consistently chosen user privacy over advertising revenue.
That said, we’ll need to see the technical implementation once iOS 26.4 ships. “Trust but verify” remains good advice.
When Is It Coming?
Here’s the timeline based on current reporting:
- Late February 2026: Apple previews new Siri (private briefings or small event)
- Late February 2026: iOS 26.4 enters beta with new Siri capabilities
- March/April 2026: General release of iOS 26.4 with Gemini-powered Siri
- June 2026 (WWDC): Even bigger Siri overhaul expected with iOS 27 — potentially a full chatbot-style assistant comparable to ChatGPT
The February preview is the appetizer. WWDC 2026 may be the main course.
What Devices Will Support It?
Apple Intelligence requires sufficient memory and processing power, which limits compatibility. Based on current requirements:
Likely supported:
- iPhone 15 Pro / Pro Max and newer
- All iPhone 16 models
- iPad Pro (M1 and later)
- iPad Air (M1 and later)
- MacBook Air, MacBook Pro, Mac mini, iMac, Mac Studio, Mac Pro (M1 and later)
Likely NOT supported:
- iPhone 15 / 15 Plus (non-Pro)
- Older iPhones (14 and earlier)
- Intel-based Macs
- Older iPads without M-series chips
If your device doesn’t currently support Apple Intelligence features, it probably won’t get the new Siri either. Apple hasn’t confirmed device requirements, but the AI processing demands haven’t changed.
Why Apple Partnered With Google (Instead of Building It Themselves)
This is the part that surprised everyone. Apple partnering with a competitor? For Siri, the product that launched the voice assistant revolution?
The reality: Apple was losing the AI race badly.
When ChatGPT launched in late 2022, it could do things Siri couldn’t dream of. Google followed with Gemini. Even Microsoft bolted GPT-4 onto everything. Apple’s attempts to build competitive AI in-house weren’t keeping pace.
The January 2026 announcement was a pragmatic admission: building a state-of-the-art language model from scratch takes years and billions of dollars. Google already has one. Rather than let Siri become irrelevant, Apple chose to collaborate.
Tim Cook framed it as a partnership, not a surrender:
“You should think of it as a collaboration. And we’ll obviously independently continue to do some of our own stuff, but you should think of what is going to power the personalized version of Siri as a collaboration with Google.”
Apple keeps its privacy architecture, user experience control, and hardware integration. Google provides the AI engine. Both companies benefit — and more importantly, users finally get a Siri that works.
What This Means for ChatGPT, Claude, and Other AI Assistants
The competitive landscape is shifting:
For ChatGPT: Apple already integrated ChatGPT as an optional fallback in iOS 18. With Gemini powering core Siri functionality, that relationship becomes less central. ChatGPT may still be available, but it’s no longer the only “smart” AI option on iPhone. (See our ChatGPT vs Claude comparison for how these chatbots stack up.)
For Claude: Anthropic’s assistant has been gaining ground in the AI market. The Apple-Google partnership doesn’t directly affect Claude’s position, but it does mean iPhone users will have a competent built-in assistant for the first time — reducing the need to seek alternatives.
For Google: This is a massive win. Google gets its AI technology deeply integrated into the iPhone — the most valuable smartphone platform in the US and Europe. Even if Google doesn’t get user data, Gemini’s capabilities become familiar to hundreds of millions of Apple users.
For Users: More competition is good. A better Siri pushes Google, Amazon, and others to improve their assistants. Everyone benefits.
Should You Be Excited?
Here’s my honest take:
Yes, if:
- You’ve given up on Siri and want to try again
- You prefer native integrations over third-party apps
- You care about privacy but still want modern AI features
- You have a compatible device
Hold your expectations if:
- You expect ChatGPT-level capabilities from day one
- You think this will replace dedicated AI tools for work
- You’re on an older device that won’t get the update
The February preview will likely show polished demos. Real-world performance? That’s what the beta period is for. I’d wait for iOS 26.4’s public release before judging.
What Comes Next
February’s reveal is just the beginning. Apple is reportedly planning an even bigger Siri transformation for WWDC 2026 in June, potentially including:
- A full chatbot-style conversational interface
- Siri capabilities competitive with Gemini 3 (Google’s latest)
- Deeper integration across all Apple platforms
- New interaction paradigms beyond voice
If Apple executes well, 2026 could be the year Siri becomes genuinely useful again. And given Google’s Gemini technology powering it, there’s actually reason to believe they might pull it off.
Curious about AI agents that can automate tasks for you right now? See our best AI agents guide.
The Bottom Line
Apple’s Gemini-powered Siri is the most significant update to the assistant since it launched in 2011. By partnering with Google while maintaining its privacy architecture, Apple is trying to have it both ways: cutting-edge AI with industry-leading privacy protections.
Will it work? We’ll find out this month. But for the first time in years, there’s a real reason to be optimistic about Siri. For another model shaking up the AI landscape, see our DeepSeek AI complete guide.
For AI research tools you can use right now, see our Perplexity AI review or our guide to the best AI writing tools.
📬 Get weekly AI tool reviews and comparisons delivered to your inbox — subscribe to the AristoAIStack newsletter.
Keep Reading
- Claude vs Gemini 2026
- ChatGPT vs Claude: Which Should You Use?
- Best AI Agents 2026
- Best Free AI Tools 2026
- Getting Started With AI Tools
Last updated: February 2026



