On June 9, Apple staged its Platforms State of the Union as part of its 2025 Worldwide Developers Conference. This serves as a secondary keynote presentation, aimed at app developers, that explores some of the announcements in greater depth.
SEE: Apple’s WWDC 2025 Keynote Roundup
Apply ‘Liquid Glass’ to app design
One of the largest announcements was the unveiling of "Liquid Glass," the new interface design to be used in the next operating system update. Developers can now adopt the glassy, shiny new look in their apps using SwiftUI, UIKit, or AppKit, including in the app icon.
Billy Sorrentino, Apple's senior director of human interface, said at WWDC that the new design aims to bring greater hierarchy, harmony, and consistency to devices.
Key user interface (UI) elements now float above content for clarity and focus, clarifying the hierarchy. At the same time, harmony is created as UI shapes and interactions align more closely with system geometry and natural touch patterns. Because the new design is universal across Apple platforms, apps feel consistent and familiar wherever they run.
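In SwiftUI, adopting the new material is essentially a one-modifier change. A minimal sketch, assuming the `glassEffect` modifier Apple demonstrated; the view contents here are illustrative:

```swift
import SwiftUI

struct FavoritesBadge: View {
    var body: some View {
        // glassEffect renders this view on the Liquid Glass material,
        // floating it above the content behind it.
        Label("Favorites", systemImage: "star")
            .padding()
            .glassEffect()
    }
}
```

UIKit and AppKit expose equivalent hooks, so apps on any Apple platform can pick up the same look.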
Create AI features with Apple Intelligence
The keynote also delivered news about Apple Intelligence, including that developers can now access Apple's on-device foundation models and integrate them into their apps.
For example, an education app could use them to generate a personalized quiz from a student's notes. There are no associated cloud API or server costs, making this a cheaper alternative to paying for a third-party service. It works even when the user is offline, and privacy is preserved.
To make this integration even more seamless, Apple introduced the Foundation Models framework, which is embedded in the Swift programming language. Developers can unlock generative capabilities such as text summarization and custom tool calling with just a few lines of code. This framework is designed to be developer-friendly.
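The framework exposes the on-device model through a session API. A minimal sketch of the quiz example above, assuming the announced `FoundationModels` module and its `LanguageModelSession` interface; the function and prompt wording are illustrative:

```swift
import FoundationModels

// Generate a short quiz from a student's notes, entirely on device.
func makeQuiz(from notes: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "You write short practice quizzes for students."
    )
    let response = try await session.respond(
        to: "Write a three-question quiz based on these notes:\n\(notes)"
    )
    return response.content
}
```

Because the model runs locally, the call involves no network request and no per-token billing.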
Building on this, Apple has also enhanced its App Intents platform. Previously a tool for integrating app content with features like Siri, Spotlight, or widgets, it now supports Visual Intelligence. This advancement lets developers incorporate visual search capabilities into their apps, allowing users to identify and interact with content directly from images using AI, such as recognizing an object in a photo or scanning handwritten notes.
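For context, exposing app functionality to the system still uses the same declarative App Intents pattern. A minimal sketch of that existing entry point; the intent and parameter names are hypothetical:

```swift
import AppIntents

// A simple intent that Siri, Spotlight, or a widget can invoke.
struct SearchNotesIntent: AppIntent {
    static let title: LocalizedStringResource = "Search Notes"

    @Parameter(title: "Search Term")
    var term: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would run its own search here; this just echoes the term.
        return .result(dialog: "Searching notes for \(term)")
    }
}
```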
Xcode 26 has ChatGPT
Apple's development toolset, Xcode, is gaining built-in support for ChatGPT in its next iteration.
When developers build their apps in Xcode 26, they can ask the AI assistant for bug fixes, documentation, and any other help they need while coding. It both responds to natural language prompts and suggests actions based on activity. Developers can also use API keys to integrate third-party AI chatbots other than ChatGPT, and use the timeline feature to undo any AI-generated changes that don't meet their requirements.
Matthew Firlik, Apple's senior director of developer relations, also announced that macOS Tahoe, the next operating system release, will be the last to support Intel Macs. He encouraged developers to rebuild their apps with Xcode 26 so they can take advantage of the performance features of the M-series chips found in newer Macs.
All of these developer offerings are now available for testing, with a public beta scheduled for July.
visionOS 26 supports spatial app development
visionOS 26, the upcoming operating system for the Vision Pro headset, will include several features that enable spatial app development, including:
- New volumetric APIs: These allow the creation of fully 3D UI layouts using SwiftUI, for example through 3D anchoring of widgets.
- Nearby Window Sharing: Developers can build shared spatial experiences for users wearing two different Vision Pro headsets in the same room.
- RealityKit's Image Presentation Component: Allows 2D images to turn into immersive 3D scenes within the app.
- Support for immersive media formats: Including 180°, 360°, and wide-field-of-view video via the Apple Projected Media Profile.
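The volumetric APIs build on SwiftUI's existing volumetric window style. A minimal sketch, assuming a 3D asset named "Globe" is bundled with the app:

```swift
import SwiftUI
import RealityKit

@main
struct SpatialApp: App {
    var body: some Scene {
        // A volumetric window renders its SwiftUI content
        // inside a bounded 3D volume rather than a flat pane.
        WindowGroup {
            Model3D(named: "Globe")  // "Globe" is a hypothetical asset name.
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 0.5, height: 0.5, depth: 0.5, in: .meters)
    }
}
```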
Swift 6.2 includes support for WebAssembly and more
Swift has reached version 6.2. The new version offers:
- New APIs for working efficiently with different kinds of memory, such as Span and InlineArray.
- Help for WebAssembly.
- Improved interoperability with other languages, such as C++ and Java.
- A containerization tool that enables Linux container images to run on a Mac.
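The memory additions can be sketched briefly. This assumes a Swift 6.2 toolchain, and the exact APIs should be checked against the release notes:

```swift
// InlineArray fixes the element count in the type itself, so the four
// integers are stored inline rather than allocated on the heap.
var samples: InlineArray<4, Int> = [1, 2, 3, 4]
samples[0] = 10

// Span provides safe, bounds-checked, non-owning access to that
// contiguous storage without copying it.
let view: Span<Int> = samples.span
```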
Swift now bridges AI, design, and performance, giving developers the tools to create cohesive cross-platform apps.