
Did you know?
In the technology world, Apple has always demanded attention, and that isn't changing for the foreseeable future. This is especially true for anyone who builds products that run on its devices. Whether you like the company or not, WWDC 2025 wasn't just another round of software updates: it introduced Liquid Glass, a unified, fluid design language that spans iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, tvOS 26, and visionOS 26, bringing translucent, dynamic visuals to every device. If you make products for Apple devices, here is a rundown from a business/product perspective.
Under the hood, on-device AI took center stage: Live Translation now works system-wide across Messages, FaceTime, and phone calls, while an enhanced Visual Intelligence lets you search, identify, and act on anything on your screen, all processed privately on the device.
Developers gain access to Apple’s Foundation Models framework, enabling rich LLM-powered app features without cloud dependency. On iPad, true windowed multitasking, including resizable floating windows and tabs, transforms tablets into Mac-like workstations, and a dedicated Games app, along with new controller support, blurs the line between mobile and console gaming.
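For developers, the entry point is surprisingly small. Below is a minimal sketch of what an on-device call could look like, assuming the session-based API Apple previewed (LanguageModelSession and respond(to:)); the summarize(ticket:) helper and its instructions are purely illustrative, and names may shift as the framework matures.

```swift
import FoundationModels

// Hypothetical helper: summarize a support ticket entirely on device.
// Assumes the session-based API Apple previewed for the Foundation
// Models framework; exact names may change as it evolves.
func summarize(ticket: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You summarize customer support tickets in two sentences."
    )
    let response = try await session.respond(to: ticket)
    return response.content
}
```

No network round trip, no customer data leaving the device, and no per-token cloud bill.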
Ok, So What?
In today’s hyper-competitive marketplace, a consistent, intuitive UI across platforms isn’t a luxury; it’s a baseline expectation. Adopting Liquid Glass helps your customers’ experience feel seamless whether they’re on mobile, desktop, or mixed reality, reducing cognitive load and reinforcing brand recall.
On-device AI lets you embed privacy-first intelligence into your products. Think of real-time translation in your support chat, or image-based inventory checks in your field-service app, all without latency or data-leak concerns. Opening the Foundation Models framework to third-party developers signals Apple’s commitment to democratizing AI and lets businesses prototype intelligent capabilities faster.
Finally, with pro-grade iPad multitasking and cross-device APIs, your teams can build lean, powerful workflows that operate seamlessly, driving productivity and innovation.
Now What?
- Prototype an AI-powered feature: Use Apple’s Foundation Models framework to build a secure, offline-first translation assistant or automated call-screening module in your existing mobile app, then measure the uplift in user engagement and satisfaction.
- Revamp your UI kit: Align your design system with Liquid Glass principles by updating translucency, animations, and control bars in your customer-facing apps to create a cohesive experience across iPhone, iPad, and Mac (a rough SwiftUI sketch follows this list).
- Pilot iPad multitasking workflows: Develop a resizable, multi-window version of a core internal tool (e.g., a dashboard or reporting app) to test how your team leverages floating windows and tabs for quicker decision-making; the second sketch below shows one possible starting point.
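To make the UI-kit revamp concrete, here is a rough idea of what adopting the new material could look like in SwiftUI, assuming the glassEffect() modifier Apple previewed for the OS 26 SDKs; the StatusCard view is hypothetical, and earlier systems fall back to an existing translucent material.

```swift
import SwiftUI

// Hypothetical status card restyled for Liquid Glass. glassEffect() is the
// modifier Apple previewed for the OS 26 SDKs; older systems keep a
// standard translucent material as a fallback.
struct StatusCard: View {
    let title: String

    var body: some View {
        let label = Text(title)
            .padding()

        if #available(iOS 26.0, *) {
            label.glassEffect()                                  // Liquid Glass surface
        } else {
            label.background(.ultraThinMaterial, in: Capsule())  // pre-26 fallback
        }
    }
}
```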
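For the multitasking pilot, the building blocks already exist in SwiftUI: a second WindowGroup scene plus the openWindow action let a dashboard spin a report off into its own resizable window on iPad. The sketch below is illustrative; DashboardView, ReportDetail, and the "report-detail" identifier are made-up names, and the app must opt in to multiple scenes via the UIApplicationSupportsMultipleScenes key in its Info.plist.

```swift
import SwiftUI

// Hypothetical internal dashboard that opens each report in its own
// resizable window on iPad. Requires UIApplicationSupportsMultipleScenes
// to be enabled in Info.plist.
@main
struct DashboardApp: App {
    var body: some Scene {
        WindowGroup {
            DashboardView()
        }
        // A second scene so individual reports can float in separate windows.
        WindowGroup(id: "report-detail", for: Int.self) { $reportID in
            ReportDetail(reportID: reportID ?? 0)
        }
    }
}

struct DashboardView: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open report #42 in a new window") {
            openWindow(id: "report-detail", value: 42)
        }
    }
}

struct ReportDetail: View {
    let reportID: Int
    var body: some View { Text("Report \(reportID)") }
}
```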
Questions to think about
- How could on-device AI enhance your customer interactions while keeping their data truly private?
- Where in your product portfolio could unified design principles reduce friction and reinforce your brand identity?
- What first-mile use case would justify investing in Foundation Models today, and how will you measure its ROI?
- Are your teams ready to adopt pro-grade iPad multitasking, and what training might they need?