Apple started WWDC off with a slight brag about how much Apple Intelligence they delivered last year, paired with a minor apology for the delayed features in the form of “took more time to reach our high quality criteria” and “sharing more about this in the coming year”. It was glossed over, but it was more than I thought Apple would do.
Foundational Model Framework
Talking about AI, the Foundation Models framework is a great tool for developers, allowing their apps to access the on-device LLM, which works offline and has privacy built in. This will enable both third-party apps and Apple’s own apps to offer meaningful AI experiences in people’s day-to-day lives. Apple didn’t spend much time discussing this, but I feel that in a few years, in retrospect, it might be seen as one of the defining features of this WWDC, alongside the new design language across all of Apple’s software platforms.
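For a sense of what this looks like in practice, here is a minimal sketch of calling the on-device model from the Foundation Models framework, based on the API Apple announced at WWDC25. It requires the iOS 26 / macOS 26 SDKs; the `summarize` helper and its instructions string are my own illustration, not from the keynote.

```swift
// Minimal sketch of the Foundation Models framework (WWDC25).
// Requires iOS 26 / macOS 26; everything runs on-device and offline.
import FoundationModels

// Hypothetical helper: ask the on-device LLM to summarize some text.
func summarize(_ text: String) async throws -> String {
    // A session wraps the system language model; `instructions`
    // steer the model's behaviour for the whole session.
    let session = LanguageModelSession(
        instructions: "Summarize the user's text in one sentence."
    )
    let response = try await session.respond(to: text)
    return response.content
}
```

Because the model ships with the OS, apps get this for free: no API keys, no network calls, and no user data leaving the device, which is presumably why Apple leads with the privacy angle.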
Liquid Glass
I think the new design is good, but I might be biased as it’s something new and novel. There are definitely places where it doesn’t look as good as the design it’s replacing, but this is version one of the new look, and expecting it to have the same level of polish as its predecessor isn’t really a fair comparison.
Since the keynote, I’ve been on Twitter and Mastodon, where users are sharing screenshots of the new UI. I find myself alternating between “this is great” and “this is bad”, and it’s reminiscent of how I felt when iOS 7 launched. It’s not as drastic a change as the shift from skeuomorphism to flat design. The move from iOS 6 to iOS 7 was a complete turnaround—shadows were gone, depth was gone, and everything became flat—perhaps a bit too flat. But this feels like the next step from flat design. The icons now have layers, and the UI elements have a bit of depth—or should I say “Liquid Glass”—which I really like.
All that said, this marks a departure from the flat design era on Apple’s devices, and it looks quite different from other operating systems out there, especially Android. This is Apple’s attempt to be unique and stand out amidst the other software platforms out there. It also means the look is unified across Apple’s platforms, and consistency is good.
That said, it appears there are quite a few accessibility concerns—very much like the reaction when iOS 7 launched—but I’m sure Apple will introduce options to bring the UI back to a usable state for those who need it. When iOS 7 beta 1 came out, the changes felt too drastic, and there was a lot of pushback online, particularly from designers. I’m seeing the same reaction now, and it feels like classic beta one behaviour for iOS 26. As more beta versions roll out, I expect Apple to tone down a few things. They should definitely keep the Liquid Glass elements, but place greater emphasis on accessibility and readability, and perhaps not make the changes quite so extreme.
I really like the new ‘Clear’ option that’s available alongside the existing light and dark modes. It looks fantastic in motion, though perhaps not quite as good in still screenshots—but when interacting with the device, it looks and feels brilliant.
iOS 26
They actually renamed the operating systems as rumoured, which makes it much easier to remember which platform is on what version.
Almost all the apps have gone through a redesign, and among the ones demoed during the keynote, I really appreciated the UI changes to the Camera app. The Camera app had started to feel bloated with all the different modes, but now things have been simplified at launch. If you want the full feature set, it’s still there, just tucked away until you need it.
The icons have been redesigned too, and I especially like the new clock and camera icons. The camera icon now resembles the one from iOS 6 and earlier, which I’ve always preferred. I never liked the outgoing camera icon, whose iconography imitates a “real” camera rather than an iPhone camera. Now we’re back to how the camera icon originally was, and how it should have been all along.
Two iPhone features I really liked were the new Call Screening and Hold Assist, which I can definitely see myself using. I tend to get a lot of scam calls, and these will definitely help me avoid them. Speaking of phone calls, the live translation feature they showcased was nicely executed. I’ll reserve judgment until I see how it performs in real-world use, but it’s an impressive little feature—and even more so because it runs entirely on-device.
The iMessage updates are solid, but as someone living in the UK, most of my friends and family are on WhatsApp. These feel more like catch-up features than game-changers, but if your friends and family live on iMessage, these can be beneficial.
The new Games app is a great addition to the first-party app lineup. I always missed the old Game Center app, and I think Apple removing it was a mistake. But now, with the Games app, there’s finally a unified place for your games, where you can track your progress and compete with friends. The app doesn’t really do much yet, but Apple should never have killed off Game Center in the first place.
The Visual Intelligence features are fantastic—especially the one where you can take a screenshot of an event, and the iPhone automatically pulls the event details into your calendar. Brilliant. I’ve always wanted something like this. So many times, I’ve received an invite as an image on WhatsApp and had to manually input the details into Fantastical. This feature would make that process seamless. I know Android has had something similar, so it’s great to finally see this on iPhone. I’m really glad Visual Intelligence is evolving; it’s one of those features I constantly use. I would never have guessed they would tie it to the screenshot gesture, but if it works, then why not.
I am aware this was something that was bundled with the promised Apple Intelligence features last year; we never got those features, and we might never, but this will do for now. If I can’t ask Siri to add something on the screen to my calendar, I’ll take this over having to add events to the calendar manually myself.
watchOS 26
Workout Buddy is the newest feature—aside from the design changes, of course. I don’t see myself using it, as I prefer listening to my music or podcasts without being distracted by notifications while working out. That said, I’ll at least give it a try to see if it delivers on its promises. Even if it isn’t for me, it could still be a major addition to the Apple Watch and turn out to be one of the features that helps lock people into the Apple Watch and iPhone ecosystem.
The Notes app coming to the Watch might not sound like a big deal, but it’s long overdue, and I already know it’ll become one of my most-used apps on the Watch.
tvOS 26
The fresh coat of paint—thanks to Liquid Glass—can really only be felt in the playback and Control Centre controls. The Home Screen still looks very similar to the existing tvOS interface. tvOS seems to have the fewest new features, but the Automatic Sign-In API is something that should have been available already. If I have an Amazon Prime account, why should I have to spend time logging into the app across all the different platforms? This should simplify the process. It’s one of those features you use once and then forget about, but I’m glad Apple has finally brought it in.
(I gave the example of Amazon Prime here instead of the obvious Netflix, because Netflix has so far refused to implement many of the Apple TV related features)
macOS 26
Talking about the Mac, I’m not sure if I’m a fan of the redesign. I don’t mind Liquid Glass, but the Mac now looks a bit too much like the iPad. I don’t know if I like my Mac looking like my iPad. I guess that’s the trade-off for consistency, and I hope I warm up to it.
Bringing Live Activities from the iPhone onto the Mac is a pretty nifty idea. The example they showed with Uber Eats is exactly why I’d want something like this. I often place orders on Uber Eats and then have to keep my phone nearby, constantly checking the Home Screen and Lock Screen widget to see how far along my delivery is. That’s a perfect example of an iPhone feature making its way to the Mac in a meaningful way.
One of the coolest features is the ability to launch iPhone apps via Mac Spotlight, which is simply brilliant. iPhone Mirroring, introduced last year, is one of my favourite and most-used features in the Apple ecosystem. I’m glad they’ve now extended it with more capabilities. I tend not to log into social media accounts on my MacBook since it’s my machine for “serious productivity.” But sometimes, I want to post something on Threads or Mastodon, so I use iPhone Mirroring to connect to my iPhone and post from there. Until now, that meant opening iPhone Mirroring, searching for the app I needed, and then doing the task. But now, this simplifies the process—I can open the app in iPhone Mirroring straight from Spotlight on the Mac.
If I pick up my iPhone to do something, I know I’ll end up distracted and doomscrolling. iPhone Mirroring has helped me stay productive, and these enhancements will only make it better.
We’re also getting a native clipboard manager, a feature I’ve always wanted—and it’s finally here. At the moment, I use Alfred as my clipboard manager. I have a lifetime licence for Alfred, but I feel like I may end up switching to Spotlight with macOS 26 Tahoe. I’ve been considering moving to Raycast for a while, but haven’t had the time to really test it; jumping straight from Alfred to Spotlight would be ideal. Of course, Raycast is much more feature-rich, but Spotlight’s new features—clearly inspired by Raycast—are genuinely impressive.
visionOS 26
The way Apple has handled widgets—where you can place, for example, a clock on the wall, and even after rebooting your Vision Pro it remembers the exact location—is simply brilliant. I know Apple is working on glasses that can display information to you, and when those arrive, features like this will likely come built-in. As Apple works to reduce the price of the Vision Pro, this really does feel like the future of computing.
Another standout feature was shared spatial experiences: two people can put on these headsets and look at the same virtual object or environment, whether it’s for gaming or for something productive like designing in 3D. I really hope Vision Pro becomes the success it deserves to be, because this is the coolest thing I’ve seen from Apple in a long time—easily since the launch of the iPhone.
Finally, PSVR2 controllers can now be paired with Vision Pro, which had been rumoured for a while. I’m glad it’s finally a reality, as it means more high-end games can come to the Apple Vision Pro. Sony doesn’t currently sell the controllers separately, so I suspect they might start offering them individually, perhaps only at Apple Stores. If and when they do, it will even help PSVR2 players on PlayStation who ever need to purchase a replacement controller.
iPadOS 26
Basically, all the multitasking changes on the iPad introduced during WWDC25 are what Apple should have introduced five years ago. Over the past few years, Apple experimented with various design and multitasking paradigms, but this update essentially discards all of them. They’ve now brought pretty much everything multitasking-related from the Mac to the iPad. iPad multitasking is fixed, finally.
Apple could have stopped there and I would have been happy, but they also brought the menu bar over to the iPad, another feature borrowed from the Mac. It finally feels like the iPad is having its moment, where the software has caught up to the hardware. These features are thoughtfully implemented: the iPad can still be a simple device for those who just want a larger iPhone, but if you want the advanced tools, they’re there too.
You could argue that these are “Pro” features, and Apple could have limited them to just the iPad Pro models—but instead, they’re rolling them out across the entire iPad lineup. I was genuinely surprised to see multitasking options enabled on my now 7-year-old iPad.
There’s always been this question: will the Mac get touch support first, or will the iPad gain Mac features first? I suppose this WWDC gave us the answer. I still have a 2018 iPad Pro, and now I’m seriously considering upgrading; these new features have finally given me a reason.
Closing Thoughts
The WWDC 2025 keynote was surprisingly short, running for around ninety minutes; I was expecting a full two-hour presentation. They didn’t talk much about AI, which felt quite different from the demos at Google I/O, where AI was the main focus. Apple’s approach felt much more grounded. I think what they’ve shown is enough for this year, but they’ll need to seriously step up their AI game next year to stay competitive. That said, it’s a good start, and the Foundation Models framework could change the way our apps work. These on-device AI features could certainly position Apple as a serious contender in AI-related technology, and with Apple’s industry-leading processors, this might finally put their AI struggles behind them. All this said, let’s not forget that last year Apple promised and showed off AI-related features that were not ready yet. My trust in the company has eroded, but I am hoping Apple will not make the same mistake twice.
For the new design language and this framework to be adopted by apps, Apple also needs to sort out its relationship with developers. If developers are expected to spend their summer doing the hard work of porting their apps to the new software versions, Apple has to play nice—resolve the issues around App Store fees, provide alternate payment systems, and basically be a good partner. When the iOS 7 redesign came out, every developer I knew jumped in to update their apps. I hope we see the same momentum this year because—let’s be honest—Apple needs developers for all of this to come to fruition.