Earlier this year, at WWDC 2025, Apple introduced the Foundation Models framework, which lets developers power features in their applications with the company's local AI models.
The company's pitch is that developers get access to AI models without worrying about inference costs. These local models also come with built-in capabilities such as guided generation and tool calling.
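For a sense of what this looks like in practice, here is a minimal sketch using the framework's Swift API as shown at WWDC 2025; the `summarize` function and the prompt are illustrative, not taken from any of the apps below:

```swift
import FoundationModels

// Minimal sketch: confirm the on-device model is available, then ask it
// for a response. Everything runs locally, so there is no per-request
// inference cost.
func summarize(_ text: String) async throws -> String? {
    // The model can be unavailable, e.g. if Apple Intelligence is
    // turned off or the device doesn't support it.
    guard case .available = SystemLanguageModel.default.availability else {
        return nil
    }
    let session = LanguageModelSession()
    let response = try await session.respond(to: "Summarize in one sentence: \(text)")
    return response.content
}
```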
As iOS 26 rolls out to all users, developers have been updating their apps to include features powered by Apple's local AI models. Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, or Meta, which is why these local features tend to deliver quality-of-life improvements rather than significant changes to an app's workflow.
Below are some of the first apps to put Apple's AI framework to use.
Lil Artist offers various interactive experiences that help children build skills such as creativity, math, and music. With the iOS 26 update, developer Arima Jain shipped an AI story creator, which lets users choose a character and a theme and then generates a story with AI. The text generation in the story is powered by the local model, the developer said.
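A hypothetical sketch of how such a story generator might use guided generation; the `Story` type, instructions, and prompt below are assumptions for illustration, not Lil Artist's actual code:

```swift
import FoundationModels

// Guided generation: @Generable asks the model to fill in a typed
// Swift value rather than return free-form text.
@Generable
struct Story {
    @Guide(description: "A short, playful title")
    var title: String
    @Guide(description: "A story of four to six sentences for young children")
    var body: String
}

func generateStory(character: String, theme: String) async throws -> Story {
    let session = LanguageModelSession(
        instructions: "You write short, friendly stories for children."
    )
    let response = try await session.respond(
        to: "Write a story about \(character) with the theme '\(theme)'.",
        generating: Story.self
    )
    return response.content
}
```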
The same developer is also working on a feature for the Daily Planner app that automatically suggests emojis for schedule entries based on their titles.
MoneyCoach, a finance-tracking app, has two neat features powered by the local models. First, the app surfaces insights about your spending, such as whether you spent more than your weekly average on groceries. Second, it automatically suggests categories and subcategories for a spending item, making for quicker entries.
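Category suggestion maps naturally onto guided generation. A sketch under the assumption that the app prompts the on-device model with the raw entry; the `SpendingCategory` type and the category list are made up for illustration:

```swift
import FoundationModels

// Hypothetical sketch of on-device category suggestions for an expense entry.
@Generable
struct SpendingCategory {
    @Guide(description: "One of: Groceries, Dining, Transport, Utilities, Other")
    var category: String
    @Guide(description: "A more specific subcategory, e.g. 'Coffee' under Dining")
    var subcategory: String
}

func suggestCategory(for entry: String) async throws -> SpendingCategory {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Categorize this expense: \(entry)",
        generating: SpendingCategory.self
    )
    return response.content
}
```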
This word-learning app has added two new features built on Apple's AI models. A new learning mode uses a local model to generate example sentences for a word, and it then asks users to explain how the word is used in a sentence.
The developer also uses the on-device models to create a map view of a word's origin.
Like a few other apps, the Tasks app has implemented a feature that automatically suggests tags for an entry using the local models. It also uses these models to detect recurring tasks and schedule them accordingly. Plus, users can dictate a few things and have the local model break them down into individual tasks, all without using the internet.
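Breaking dictated text into discrete tasks could look something like the sketch below; the `TaskList` type and the prompt are assumptions for illustration:

```swift
import FoundationModels

// Hypothetical sketch: split a free-form dictated note into separate
// to-do items, entirely on device.
@Generable
struct TaskList {
    @Guide(description: "Each distinct action item as a short imperative phrase")
    var items: [String]
}

func extractTasks(from dictation: String) async throws -> [String] {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Break this note into separate to-do items: \(dictation)",
        generating: TaskList.self
    )
    return response.content.items
}
```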
Day One, Automattic's journaling app, uses Apple's models to surface highlights and suggest titles for your entries. The team has also implemented a feature that generates prompts nudging you to dive deeper and write more, based on what you have already written.
This recipe app uses Apple Intelligence to suggest tags for a recipe and to match names to timers. It also uses AI to break a block of text into easy-to-follow cooking steps.
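Splitting pasted recipe text into steps is another natural fit for guided generation. A speculative sketch, with the `RecipeStep` shape and its `minutes` field invented here to show how a step could carry a duration for a timer:

```swift
import FoundationModels

// Hypothetical sketch: turn a block of recipe text into ordered steps,
// each with an optional duration the app could attach to a timer.
@Generable
struct RecipeStep {
    @Guide(description: "One cooking step, phrased as a short instruction")
    var instruction: String
    @Guide(description: "Duration in minutes if the step is timed, otherwise 0")
    var minutes: Int
}

@Generable
struct ParsedRecipe {
    var steps: [RecipeStep]
}

func parseSteps(from text: String) async throws -> [RecipeStep] {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Split this recipe into ordered cooking steps: \(text)",
        generating: ParsedRecipe.self
    )
    return response.content.steps
}
```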
This digital signature app uses Apple's local models to extract key information from a contract and give users a summary of the document they're signing.
We will continue to update this list as we find more apps using Apple's local models.
