I think Henry Ford once said, “If I’d asked customers what they wanted, they would’ve told me a faster horse.” People don’t know what they want until you show it to them.
—Steve Jobs
UI instead of AI? Investors and users are expressing disappointment — they were expecting AI, not just a polished or overhauled UI. As a result, the shareholders’ lawsuit (Tucker v. Apple Inc. et al.) is now making headlines, accusing the company of overstating the capabilities of Apple Intelligence unveiled at the previous WWDC. The suit claims Apple lacked a realistic plan for integrating AI, dismissing the design updates as ‘empty gloss’ amid delays in delivering real AI features.
But let’s be fair — design is a form of innovation. If Apple has truly delivered a visual revolution, it could be considered a breakthrough in itself. From a designer’s point of view, it’s not just ‘gloss’ — it’s a shift in how digital surfaces feel and behave, both today and in the future.
Apple’s decisions (even the controversial ones) tend to ripple across the industry. Whether praised or criticized, they set a tone. Time and again, we’ve seen rivals eventually adopting similar patterns and overall style.
We know that Apple has withstood outrage more than once — and each time, they’ve proven the move was calculated, serving both their existing ecosystem and future products. If today they’re building seamless cross-device experiences and defining their unified design language, tomorrow we’ll see the bigger picture.
What others launched
Apple isn’t the only company trying to shape the future of design. Let’s take a look at how others are approaching it, focusing on Google’s Material 3 Expressive and Airbnb’s refined design language.
Airbnb’s redesign has sparked a mix of admiration and skepticism within the design community. On one hand, professionals praise the new animated icons, the so-called “Lava” 3D icon format, calling them “a bold shift in UI” and “a redefinition of icon behavior.” Presented as 3D micro-videos, the icons appear alive, fluid, animated and… dimensional. But when static, they feel like throwbacks to the CorelDRAW era, don’t they?
My first reaction was that the complexity of Airbnb’s new icons feels like a direct response to AI-generated iconography. Michal Malewicz made the same point: https://www.youtube.com/shorts/3k3V-0dbRe8
As tools now make it easier to enrich icons with detail and texture (and AI-animate them afterwards), we as designers are almost being pushed to keep up, not just in creativity but in intricacy, with the help of Cinema 4D, Blender, and other non-AI tools. When visual abundance becomes effortless, minimalism suddenly feels… underdressed.
However, community feedback, especially on Reddit, highlights tensions in execution. One user noted, “Icons look like illustrations, not actionable menu items”.
OK, let’s also look at what Google has brought to the table. With Material 3 Expressive, Google introduced a motion‑physics system that goes beyond traditional easing curves. This approach uses spring-based, physics-driven motion, making interactions feel more natural, responsive, and alive. It’s still a vivid, flat design — but now rigged with fresh springs that shine in motion.
Once again, motion emerges as a defining force in visual language. For designers, this signals a major shift: motion is no longer just decoration — it’s a core design element. Whether through spring-based dynamics or fluid lensing, interfaces now “feel” alive, responding in ways that align with our physical expectations. Google uses 2D-motion to inject emotional depth into UI, while Apple leans toward immersive, 3D/spatial interaction.
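To make the difference from easing curves concrete, here is a minimal sketch of spring-based motion in TypeScript. The names and constants are illustrative, not Google’s actual implementation or parameters — the point is that the animation has no fixed duration or curve; it settles when the spring’s energy runs out.

```typescript
interface SpringState {
  position: number; // current animated value (e.g. a panel's offset in px)
  velocity: number; // current rate of change of that value
}

// One simulation step of a damped spring pulling `state` toward `target`.
// Higher stiffness feels snappier; higher damping kills oscillation sooner.
function springStep(
  state: SpringState,
  target: number,
  stiffness: number,
  damping: number,
  dt: number
): SpringState {
  const force = stiffness * (target - state.position) - damping * state.velocity;
  const velocity = state.velocity + force * dt; // semi-implicit Euler
  const position = state.position + velocity * dt;
  return { position, velocity };
}

// Animate from 0 toward 100 at 60 fps: the value overshoots, oscillates,
// and settles — unlike a fixed easing curve, interruption and retargeting
// just mean changing `target` mid-flight.
let state: SpringState = { position: 0, velocity: 0 };
for (let i = 0; i < 300; i++) {
  state = springStep(state, 100, 170, 14, 1 / 60);
}
console.log(state.position.toFixed(2)); // settles near 100
```

Because velocity is carried in the state, a gesture can hand its fling speed directly to the spring, which is why physics-driven motion feels continuous where easing curves feel scripted.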
The end of the flat era?
As screenshots of iOS 26 debuted, comparisons to Windows Vista began to spread: translucent panels, glossy reflections, soft shadows, and generous light play. Vista’s look felt futuristic in 2007, but it also became infamous for draining battery life and overburdening underpowered hardware.
Designers coined the term Frutiger Aero — a retro-futuristic aesthetic known for gradients, lens flares, and hyper-polished glassy surfaces. It merged the Aero (not to be confused with Aqua) visual style with the typeface clarity of Frutiger, defining an entire era of mid-2000s UI culture.
It may seem that Liquid Glass shares the same visual DNA, but the intent feels different. Vista dressed up a static UI with glassy chrome. iOS 26, on the other hand, turns the interface into a reactive surface. And this isn’t just visual styling — it’s a computational material, tuned to light, motion, and context. Where Vista’s UI just shimmered, Liquid Glass bends light. Where Aero just floated, this one flexes. It’s a material in all its shine and glory — and ironically, Apple has made it feel even more “Material” than Google’s UI, despite the name.
Sure, iOS 26 inherits glassmorphism but pushes it forward. Glassmorphism itself can be seen as a subset of neumorphism, borrowing from it the idea of using depth to define interaction. Personally, I appreciate how the Z-axis is used to build layered, spatial interfaces.
At its core, glassmorphism seems to chase the goal of becoming invisible. But once you layer rim lights, background distortions, and frosted glass effects, the screen can quickly feel cluttered and distracting, tiring your eyes faster than expected.
“In many places it’s a nightmare of cognitive overload and low readability. The glass refractions need a certain level of transparency to look the best and that level will crash and burn on half of the backgrounds.”
This quote is from the ever-sarcastic Michal Malewicz (one of my favorite authors), who claims to have coined the term ‘glassmorphism.’
That said, Liquid Glass feels elegant. It really does. And, like with every bold design refresh, you’ll probably stop noticing it after a week. But if the visual “wow” fades, the clutter stays… We’ll need time to see how this plays out.
Are we witnessing just a new visual style or the slow sunset of flat design? Minimalism vs ‘a sense of substance’… unflattened minimalism? Light, depth, motion, transparency — all working together to make the interface feel more alive and more physical.
Liquid Glass doesn’t reject minimalism, but adds more visual… mmm, tactility? ‘Visual tactility’—can I put it that way?
Could flat design become obsolete so unexpectedly? No. But expectations are shifting, and users who get used to this new interface will start to demand a more sensory-driven experience everywhere.
Android, your move?
Design becomes computational?
“Liquid Glass” signals a deeper shift for us, interface/UX/product designers and for developers as well. We’ve long treated design systems as collections of UI patterns, components, guidelines, fonts, and sometimes animations. But Apple shows that consistency isn’t just about alignment and spacing anymore.
It begins with consistency of the material itself — its responsiveness, its viscosity, its behavior under light and pressure. In Liquid Glass, material consistency becomes the basis for design consistency. The way the interface flexes, reacts, and adapts becomes part of the unified design language — not just how it looks, but how it behaves and responds.
This new digital meta-material doesn’t just look different — it acts like a responsive or even living substance. It conveys flexibility and tactile feedback through motion, not just appearance. It’s no longer just visual consistency — it’s computational consistency.
Speaking of, we used to hear a lot about computational photography from Apple, where a camera doesn’t just capture the world, but interprets and calculates it through layers, filters and algorithms. Now, it seems, computational design is emerging.
Liquid Glass isn’t just rendered once — it’s computed continuously. It dynamically adapts to light, content, and context. It’s a live, real-time response to surroundings and user input. The material flexes, refracts, and reacts; its every appearance is calculated across multiple layers: highlights, shadows, tint, luminance, and surface graphics.
This changes the game: interfaces are no longer just layouts. They’re behaviors. Designers and developers are no longer working on static interface elements — we’re shaping real-time, responsive matter.
As a practical designer though, I don’t love the fact that building a high-fidelity prototype in Figma (or Framer) now means emulating the effect by layering styles and tweaking backdrop blurs. Flat design spared us from that kind of hassle. Sure, premade UI kits for Liquid Glass are already available (and more will follow), but to convincingly present a product in a realistic digital environment, we’ll now also have to deal with computationally heavy visuals, and this will inevitably burden our projects, not to mention the final interfaces.
Apple cautions designers not to overuse Liquid Glass. Yet we’ve already seen early concepts (like the all-glass Spotify or Instagram redesign) misuse and overdo these glass panels, turning elegance into ‘eye candy overload’.
Apple clearly wants the rollout to be fast and smooth. They’ve prepared us for a seamless transition to Liquid Glass. Designers already have guidelines and even tools like Icon Composer to help adapt apps to the new style. And for developers in SwiftUI, it’s as simple as .buttonStyle(.glass) — a single line to apply Liquid Glass to your interface. I’ll dive deeper into official guidelines in my upcoming article.
Can hardware keep up with Liquid Glass?
At first glance, the glassy interface might seem like just another visual layer. But try to replicate it and you’ll quickly realize the scale of the challenge. One Android developer who attempted to recreate the effect shared this:
“I honestly don’t know how Apple pulled it off. After trying to mimic it (using AGSL shaders — author’s note), I’ve gained a whole new respect for Liquid Glass. My guess is they’re using shaders too — just far more refined. Apple has spent years building a system that makes this possible. Android isn’t quite there yet. Maybe Xiaomi or another Chinese brand will figure out a workaround. But until we get an open-source solution, true glassmorphism on Android remains out of reach.”
This isn’t just eye-catching animation — it’s a coordinated system of material behaviors, lighting, touch feedback, and real-time shaders. Liquid Glass demands power, precision… and ideally a 120Hz ProMotion display to truly shine.
On Reddit, one such attempt (this time in a web environment) ran into severe limitations:
“We discovered limited browser support, forcing us to use suboptimal workarounds. Over time, WebKit introduced the backdrop-filter CSS property, but it’s still a performance killer — browsers have to recalculate the blur on every scroll. Maybe Apple has optimized this across their devices, but I strongly advise anyone building a Liquid Glass design on platforms other than Apple to thoroughly test performance.”
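For context, here is the kind of glass-panel recipe that quote is describing — a minimal, illustrative CSS sketch, with values picked for demonstration rather than taken from any shipping design system:

```css
/* Minimal glassmorphism panel. The backdrop-filter blur is the expensive
   part: the browser must re-render the blurred backdrop whenever the
   content behind the panel changes, e.g. on every scroll frame. */
.glass-panel {
  background: rgba(255, 255, 255, 0.15);               /* translucent tint */
  backdrop-filter: blur(20px) saturate(160%);          /* the performance killer */
  -webkit-backdrop-filter: blur(20px) saturate(160%);  /* older WebKit builds */
  border: 1px solid rgba(255, 255, 255, 0.3);          /* faux rim light */
  border-radius: 16px;
}
```

Even this static approximation hints at the gap: it gives you a frozen blur, while Liquid Glass adds refraction and motion on top — which is why shader-based attempts, not CSS, are the closest replications so far.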
It seems that once again, Apple’s tight integration of hardware and software is working in its favor.
Translucent or transparent? Let’s talk accessibility
Today, Apple is often praised as a leader in inclusive design, but that reputation was earned over time. In the 80s and 90s, people with vision loss had almost no built-in accessibility. They had to rely on third-party screen readers like Echo II — and even those didn’t always work well. While Windows users benefited from more mature tools like JAWS and Window-Eyes, Apple lagged behind. Their first major attempt at accessibility — Universal Access in OS X 10.2 Jaguar — was buggy and unreliable.
It wasn’t until OS X 10.4 Tiger and the iPhone 3GS that a new era began: VoiceOver was integrated into the system, braille displays were supported, and zooming became native. In 2013, Tim Cook said, “When we work on making our devices accessible, the ROI is not the important thing. We do it because it’s just and right.”