At its annual Worldwide Developers Conference (WWDC) 2025, Apple unveiled a major advancement in its artificial intelligence efforts with the launch of the Foundation Models framework. This new toolset allows developers to integrate Apple’s on-device AI models directly into their apps, offering features like summarization, natural language generation, and personalized recommendations. The framework is a critical part of Apple’s AI initiative, known as Apple Intelligence (a system-wide layer of intelligence being rolled out across iOS, macOS, and iPadOS).
Unlike many cloud-based models that require constant internet access and send user data to remote servers, Apple’s Foundation Models operate locally on the device. This ensures user interactions remain private while still enabling real-time text generation, summarization, and other advanced AI tasks. According to the company, these models are optimized for performance on Apple Silicon, and developers can integrate them with just a few lines of Swift code.
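To give a sense of what "a few lines of Swift code" looks like in practice, the sketch below shows a minimal on-device summarization call. The session and response names mirror the pattern Apple demonstrated at WWDC 2025, but they should be read as an illustrative sketch rather than a verified API reference.

```swift
import FoundationModels

// Minimal sketch of on-device text generation with the Foundation Models framework.
// LanguageModelSession and respond(to:) follow the pattern shown at WWDC 2025;
// treat this as illustrative, not an authoritative API surface.
func summarize(_ text: String) async throws -> String {
    // Create a session backed by the on-device foundation model.
    let session = LanguageModelSession()

    // Ask the model for a summary; the request is processed locally,
    // so the text never leaves the device.
    let response = try await session.respond(to: "Summarize the following text:\n\(text)")
    return response.content
}
```

Because the model runs on Apple Silicon rather than in the cloud, a call like this works offline and does not send the user's text to a remote server.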
The company also showcased early examples of apps already using the framework. The hiking app AllTrails, for instance, uses the models to provide personalized trail recommendations, even without an internet connection. Apple said the models support a broad range of use cases, from education and productivity to fitness and creative tools, and confirmed that there are no usage fees for developers: running the models on the device is free, unlike many cloud AI services that charge for each request.
The Foundation Models framework became available in beta on June 9, 2025, for members of the Apple Developer Program, with a public beta expected in July 2025. The full public rollout is planned to coincide with the launch of iOS 26 and other software updates in September 2025.
Further advancing its AI tools for developers, the company also announced a major update to Xcode (its software development environment). As part of its AI push, the company has integrated support for OpenAI’s ChatGPT and other large language models directly into Xcode 26. This allows developers to benefit from advanced features like smarter code completion, natural language explanations, inline documentation, and helpful debugging suggestions (all powered by generative AI).
Developers now have the option to use Apple’s on-device models through the Foundation Models framework or to tap into powerful external models, such as those from OpenAI, for more complex, cloud-based tasks.
“Developers play a vital role in shaping the experiences customers love across Apple platforms. With access to the on-device Apple Intelligence foundation model and new intelligence features in Xcode 26, we’re empowering developers to build richer, more intuitive apps for users everywhere,” Susan Prescott (VP of Worldwide Developer Relations at Apple) said.