Apple has released iOS 26 (in beta), and its headline feature is a cosmetic flourish: Liquid Glass. Billed as a refined, futuristic aesthetic meant to elevate the visual experience of iPhones and iPads, Liquid Glass is smooth, translucent, and undeniably pretty. It looks impressive in marketing shots, glinting with the Apple-esque polish we’ve come to expect.
But once again, we’re facing an update that values looks over substance. The actual user experience? Text is harder to read, navigation is more frustrating, and it all arrives alongside yet another year of Siri falling behind. Yes, Siri can still set an alarm, but ask her, “How long until my alarm goes off?” and she draws a blank. After more than a decade of development, that’s not just an oversight. It’s a failure.
Let’s start with Liquid Glass. This redesign covers the system UI in layered translucency, giving the illusion of depth and responsiveness. Icons float atop hazy gradients. Widgets melt seamlessly into the background. On paper, this sounds innovative. In practice, it’s difficult to use. The transparency is often so aggressive that text becomes unreadable depending on your wallpaper. Notifications blur with backgrounds. Control Center elements overlap in a haze. You end up straining your eyes trying to determine where one visual element ends and another begins.
There’s a long-standing design principle Apple used to champion: clarity over decoration. That principle appears to be dead. The new interface isn’t more intuitive. It’s not more functional. It’s just prettier—for a few moments, until you actually try to use your phone outdoors, in bright light, or at a glance. For users with visual impairments or even just aging eyesight, it’s a step backward.
There’s no real way to opt out. There’s no “classic view,” and the existing Reduce Transparency accessibility toggle doesn’t restore usability to what it was. You’re stuck with Apple’s vision of futuristic glass, whether it helps or hurts your experience.
But the most frustrating part of iOS 26 isn’t the design misstep. It’s the ongoing embarrassment that is Siri. Year after year, Apple promises improvements. And yet, the basic functionality still isn’t there.
Siri can set an alarm, sure. “Hey Siri, wake me up at 6:30 a.m.”—no problem. But if you then ask, “How long until my alarm?” you get nothing useful. No time estimate. No confirmation. No recognition that this is a completely logical, human question. Instead, Siri says something like, “You have one alarm set for 6:30 a.m.” It’s a non-answer. And if it’s currently 11:17 p.m., why can’t Siri subtract one from the other?
This should be a simple function. We’re not asking Siri to write code or compose music. We’re asking her to do a basic time calculation—something your 1985 Casio wristwatch could’ve handled. Why is this still not possible?
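To show how small the missing piece is, here’s a minimal Swift sketch of that calculation. It’s hypothetical: iOS has no public API that exposes Clock alarms to third-party code, so the alarm time is passed in as plain hour and minute values. The point is only that the arithmetic Siri can’t perform fits in a few lines of Foundation code.

```swift
import Foundation

/// Returns how long until the next occurrence of the given alarm time.
/// Hypothetical helper: the Clock app's alarms aren't exposed publicly,
/// so the alarm is supplied as hour/minute values purely to show the math.
func timeUntilAlarm(hour: Int, minute: Int, from now: Date = Date(),
                    calendar: Calendar = .current) -> DateComponents? {
    var target = DateComponents()
    target.hour = hour
    target.minute = minute

    // Find the next occurrence of that wall-clock time, rolling over to
    // tomorrow if it has already passed today.
    guard let alarmDate = calendar.nextDate(after: now,
                                            matching: target,
                                            matchingPolicy: .nextTime) else {
        return nil
    }

    // The "hard" part: subtract one time from the other.
    return calendar.dateComponents([.hour, .minute], from: now, to: alarmDate)
}

// Example: at 11:17 p.m., an alarm set for 6:30 a.m. is 7 hours 13 minutes away.
if let remaining = timeUntilAlarm(hour: 6, minute: 30),
   let hours = remaining.hour, let minutes = remaining.minute {
    print("Your alarm goes off in \(hours) hours and \(minutes) minutes.")
}
```

That’s the whole job: one date lookup and one subtraction, the answer Siri still can’t give.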
And this isn’t an isolated annoyance. Siri is riddled with shallow capabilities that stop short of actual utility. Ask her to reschedule an event conditionally, to remember how you like your reminders worded, or to read your last text and reply with something specific, and you’ll be met with confusion or another round of “Here’s what I found on the web.”
Compare that to what AI can do elsewhere in 2025. Google Assistant is contextually aware, ties deeply into apps and routines, and can have nuanced conversations. Alexa offers consistent routines, smart home integration, and dynamic scheduling. ChatGPT and other large language models can handle complex queries with layered context and real human nuance.
Meanwhile, Siri feels frozen in time. Apple claims a more advanced Siri is coming soon—contextual awareness, on-device processing, GPT-style responses—but none of that is here now. And what is here is unimpressive.
The contrast is sharp: Apple devotes immense energy to designing reflective UI layers, but doesn’t fix the fundamental parts of its ecosystem that people use daily. Setting timers, managing reminders, checking appointments, communicating—these are real-world tasks that Siri should master. And yet, here we are.
The issue is even more frustrating given Apple’s hardware prowess. The A-series chips are among the fastest in mobile history. The iPhone is a powerhouse of computing capability. The privacy infrastructure Apple has built is admirable. So why does Siri lag so far behind in functionality?
One possible answer is Apple’s privacy-first stance. By processing Siri requests on-device, Apple avoids storing or analyzing user data in the cloud. That limits personalization and contextual memory—two things that make assistants genuinely useful. But this is a problem Apple needs to solve, not an excuse to lag behind. Other companies have figured out opt-in personalization while respecting user choice. Apple should do the same.
Instead, users are given a new interface no one asked for, and told it’s revolutionary. But the most revolutionary change Apple could deliver at this point isn’t visual—it’s practical. Give us Siri that finally works. Let her understand basic time questions. Let her interact with apps meaningfully. Let her speak like an assistant that belongs in this decade.
Until that happens, iOS 26 is just another layer of glass: smooth, shiny, and ultimately transparent in its lack of meaningful progress. We don’t need another re-skin. We don’t need ethereal aesthetics. We need tools that work—and a voice assistant who can do basic math.
So go ahead and ask, “Siri, how long until my alarm?”
She still doesn’t know.
And that says everything.