Why Apple Intelligence Hasn't Delivered

(A sequel to “Why You Shouldn’t Buy the iPhone 16 for Apple Intelligence”)
Disclaimer: I write these pieces as a former technology columnist and current industry observer. They reflect my personal analysis and opinions only, not those of my employer or any affiliated organization.
When iOS 18 launched in September 2024 alongside the iPhone 16, Apple made Apple Intelligence the headline feature. This was Apple’s long-awaited entry into generative AI, and it came wrapped in promises of a smarter Siri, on-device writing tools, and context-aware assistance. The implication was clear: this would be the moment the iPhone became more than a device — it would become a partner.
At the time, I wasn’t convinced. A year later, as iOS 26 rolls out with the iPhone 17, I can say confidently: my doubts were justified.
Promises vs. Reality
Apple billed “Super Siri” as its flagship upgrade. Instead, Siri received only a minor facelift, a friendlier voice, and a ChatGPT add-on that feels more like a patch than a breakthrough. The long-promised leap forward never materialized. Meanwhile, Amazon managed to outpace Apple with Alexa+, which, although imperfect, already surpasses Siri in conversational abilities and smart-home integration.
Apple also emphasized its AI-powered email and notification summaries. But in practice they proved inconsistent and frustrating: summaries lost nuance, buried essential details, or failed to convey the original meaning at all. Apple even had to suspend its news notification summaries after they repeatedly garbled BBC headlines. A feature meant to simplify our digital lives often ended up complicating them.
The “Super Siri” moment never arrived. What we got was a reminder that bold promises are easy; delivering on them is another matter.
Developers Aren’t Buying In
Apple framed on-device AI as a natural advantage: private, instant, and always available. It should have been a developer magnet. But in practice, there has been no rush to build with Apple Intelligence. Only now, with iOS 26, have developers gained meaningful access to Apple’s foundation models, and so far the response has been tepid.
Why? Because the world already runs on cloud AI. Meta apps use Meta AI. Microsoft apps run on Copilot. Google apps are woven into Gemini. Independent developers, such as Grammarly, integrate with Azure’s OpenAI Service. These platforms are larger, more capable, and improving at a pace Apple hasn’t matched. Apple’s three-billion-parameter model doesn’t compete with trillion-parameter giants in the cloud.
Apple is trying to sell developers on privacy. But developers follow users, and users are already embedded in ecosystems where cloud AI is the default.
Cloud AI Won the Year
In the meantime, cloud services continued to advance rapidly. People continued to rely on ChatGPT for research, Copilot for productivity, Gemini for search, Perplexity for discovery, and Meta AI for social apps. Even Grok on X has carved out its niche.
Apple Intelligence, in contrast, still feels like scaffolding—a framework with potential, but little real-world relevance. The privacy-first story is elegant, but capability is what matters. And Apple still hasn’t closed that gap.
Where Apple Intelligence Could Still Matter
That doesn’t mean Apple is out of the game. There are areas where on-device AI could deliver real differentiation.
Health and wellness are the most obvious. With the Apple Watch collecting massive amounts of biometric data, on-device models could provide powerful insights into sleep, stress, and activity without sending sensitive information to the cloud. Privacy matters in health, and here Apple has an edge.
Real-time translation is another. Apple’s vision — seamless cross-language conversation via iPhone and AirPods Pro 2 — could be transformative for travelers and business users alike. If Apple can make it reliable, it’s a feature people would pay for.
And then there’s the living room. Imagine Apple Intelligence deployed to HomePod and Apple TV, learning your content preferences locally and surfacing shows, music, or even apps you’d never normally discover. Unlike Netflix or Disney’s cloud-driven recommendation engines, this could be done privately, on-device, and across Apple’s ecosystem. If Apple plays this right, it could create a content experience no rival can match.
The hardware is ready. The opportunity is there. What’s missing is execution.
Privacy vs. Capability
Apple continues to bet on privacy as its brand-defining strength. And while that’s commendable, privacy alone isn’t enough. Users want tools that work, and developers want platforms that move quickly. Right now, those needs are being met elsewhere.
Until Apple can integrate its privacy-first approach with features that genuinely alter how people use their devices, Apple Intelligence will remain an impressive idea rather than a must-have capability.
The Bottom Line
Apple Intelligence has matured over the past year, but it hasn’t delivered the breakthrough Apple promised. Siri is still behind. Summaries are buggy. Developers aren’t interested. And while Apple talks about the future, its rivals are shipping it.
Yes, the iPhone 17 is Apple’s best phone yet. But if you’re considering it for Apple Intelligence, my advice remains the same as it was a year ago:
Don’t buy it for Apple Intelligence.