As the generative AI craze sweeps the globe, Apple clearly no longer intends to remain a mere bystander. According to the latest rumors, at the WWDC 2026 developer conference Apple is expected to launch a brand-new "Core AI" software framework, replacing (or renaming) the long-standing Core ML. On top of that, the next generation of Apple Foundation Models, trained with Google Gemini technology, and a new Siri with chatbot capabilities will be the focus of this developer conference.
From Machine Learning to Artificial Intelligence: Core AI's Strategic Transformation
Since Apple introduced the Core ML framework, it has been a core tool for developers to integrate machine learning technologies into iOS and macOS applications. In recent years, with technological advancements, Core ML's functionality has expanded to include on-device deployment of generative AI tools such as Large Language Models (LLMs) and Diffusion Models.
According to Bloomberg reporter Mark Gurman in his "Power On" newsletter, Apple plans to officially unveil the "Core AI" framework at the upcoming WWDC 2026. While it remains unclear whether this represents a complete overhaul of the underlying architecture or simply a renaming of Core ML to more accurately reflect its current main functions, the two frameworks are expected to coexist for a period of time initially.
This change may seem like a minor tweak, but the shift from "ML" (machine learning) to "AI" (artificial intelligence) strongly signals a change in Apple's internal priorities, indicating that Apple is beginning to fully embrace this wave of AI, rather than passively waiting for the "AI bubble" to burst.
Integrating third-party models and opening up the ecosystem become the focus.
Looking back at WWDC 2025 last year, the biggest highlight of Apple's iOS 26 update was enabling developers to integrate Apple Foundation Models into their own applications, using the device's AI computing power to generate text or perform other AI application functions.
One of the core tasks of Core AI this year is rumored to be further expanding its "openness," helping developers more easily integrate "third-party AI models" into their apps. Although the specific implementation details are not yet clear, it is speculated that Apple is very likely to adopt a technical standard similar to MCP (Model Context Protocol) to make cross-model collaboration more seamless.
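Since Apple's actual implementation is unknown, the following is only a hypothetical Swift sketch of what such cross-model "openness" might look like from a developer's perspective: a single protocol that any model backend conforms to, plus a registry that routes prompts to whichever model is installed. All type and method names here (`TextModel`, `ModelRegistry`, `EchoModel`) are invented for illustration and are not part of any Apple or MCP API.

```swift
// Hypothetical sketch: one abstraction for swappable AI model backends,
// loosely inspired by the idea of an MCP-style integration layer.
// Names and shapes are assumptions, not a real API.

/// Anything that can turn a prompt into a reply.
protocol TextModel {
    var name: String { get }
    func respond(to prompt: String) -> String
}

/// A trivial stand-in model used to demonstrate the plumbing.
struct EchoModel: TextModel {
    let name = "echo"
    func respond(to prompt: String) -> String { "echo: \(prompt)" }
}

/// Routes prompts to a registered model by name, so an app could
/// swap a first-party model for a third-party one without code changes.
struct ModelRegistry {
    private var models: [String: TextModel] = [:]

    mutating func register(_ model: TextModel) {
        models[model.name] = model
    }

    func respond(using name: String, prompt: String) -> String? {
        models[name]?.respond(to: prompt)
    }
}

var registry = ModelRegistry()
registry.register(EchoModel())
print(registry.respond(using: "echo", prompt: "hello") ?? "no model")
// prints "echo: hello"
```

The point of the sketch is the design choice, not the toy model: if Core AI exposes something like a common protocol plus a registry, third-party models become interchangeable plug-ins rather than bespoke integrations.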
WWDC 2026 Highlights: The New Siri Powered by Gemini
Despite earlier rumors suggesting that the iOS 27 update would focus on fixing system bugs and improving stability, the agenda for WWDC 2026 will definitely be packed with AI elements.
Besides the Core AI framework, the core focus of the entire presentation is expected to be the new version of Apple Foundation Models, trained with Google Gemini technology. As the market has recently anticipated, this model will give Siri true chatbot conversational capabilities, allowing this long-standing digital assistant to receive a more complete "intelligence upgrade" at WWDC 2026, following its spring update.
Analysis
In the past few years, Apple has deliberately avoided using the widely used buzzword "AI" in its external communications, preferring to use "machine learning" to emphasize its practicality. But now, with the launch of Apple Intelligence and its deep alliance with Google Gemini, it's clear that Apple has fully embraced "AI" in both marketing and technology development.
For developers, the launch of Core AI, coupled with seamless integration with third-party models, will significantly lower the barrier to entry for building AI applications on Apple devices. Once Apple's massive ecosystem, with over 2 billion active devices, officially provides more standardized integration tools for third-party AI models, it will be a crucial turning point for the entire generative AI industry in truly reaching mass-market use on mobile devices.