At WWDC 2025, Apple emphasized the progress of its "Apple Intelligence" service and announced that it would open the Foundation Models framework so more developers can build "Apple Intelligence" features into their apps. However, it conspicuously avoided the new version of Siri that was heavily promoted last year.

When walking through the iOS 26 update, Apple said the new version of "Apple Intelligence" adds features such as Live Translation, an updated Image Playground, a workout companion, and more "intelligent" Shortcuts. It also highlighted that the updated Visual Intelligence can search Google for items in a user's screenshot, look up matching products on Etsy, and even recognize details such as event dates and locations so they can be added quickly to the calendar.

To let more apps access "Apple Intelligence," Apple announced it is opening the Foundation Models framework, which allows developers to build the service into their apps and run AI features through the on-device model even when offline. The framework has native support for the Swift language, and Apple says an AI feature can be added with as few as three lines of code.
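As a rough illustration of what "three lines of code" might look like, here is a minimal sketch of a Swift call into the Foundation Models framework. The session and response names follow Apple's announcement, but the exact API surface here is an assumption, not confirmed from the article:

```swift
import FoundationModels

// Sketch only: names like LanguageModelSession and respond(to:) are
// based on Apple's Foundation Models announcement and may differ in
// the shipping SDK.
let session = LanguageModelSession()

// Ask the on-device model for a short completion; works offline
// because the model runs entirely on the device.
let response = try await session.respond(to: "Suggest a title for a hiking journal entry.")

print(response.content)
```

Because the model runs on-device, a call like this would not require a network round trip, which is the offline capability the announcement emphasizes.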

Apple also explained that "Apple Intelligence" is now woven into many app experiences. For example, Reminders can automatically identify related items, such as emails and websites, and categorize them, while Apple Wallet can recognize order information in emails and automatically track the progress of those orders. Live Translation, Writing Tools, Photos, smart replies, and more are likewise powered by the service.

However, although Apple said Siri can now pick up where a conversation left off after an interruption and understand the context of surrounding sentences, Siri still cannot interact as naturally as Apple demonstrated at WWDC 2024. Apple gave no specific update on the new Siri's development progress this time, and it may take some time before it becomes available.
As for language support, the new version of "Apple Intelligence" currently works in English, French, German, Italian, Portuguese (Brazil), Spanish, Japanese, Korean, and Chinese (Simplified). Support for Danish, Dutch, Norwegian, Portuguese (Portugal), Swedish, Turkish, Chinese (Traditional), and Vietnamese is expected by the end of this year.
Apple also noted that, due to regulatory restrictions in different regions, some features may not be available in all languages or countries.



