Earlier this year, Apple introduced its Foundation Models framework at WWDC 2025. The framework lets developers use the company's local AI models to power features in their apps. Because the models run entirely on-device, Apple highlighted that developers get access to them without paying any inference costs. The local models also come with built-in capabilities such as guided generation and tool calling.
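For a sense of what adoption looks like in code, here is a minimal sketch of prompting the on-device model in Swift, based on the framework's public API. The function name and prompt text are illustrative placeholders, not taken from any of the apps below.

```swift
import FoundationModels

// A minimal sketch: ask the on-device model to suggest a title.
// The function name and prompt are illustrative placeholders.
func suggestTitle(for entry: String) async throws -> String? {
    // The model can be unavailable (unsupported hardware, Apple
    // Intelligence turned off, or the model not yet downloaded).
    guard case .available = SystemLanguageModel.default.availability else {
        return nil
    }

    // A session holds the conversation state with the local model.
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Suggest a short title for this note: \(entry)"
    )
    return response.content
}
```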
As iOS 26 rolls out to all users, developers have been updating their apps with features powered by Apple's local AI models. Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, and Meta, which is why features built only on local models tend to be quality-of-life improvements rather than major changes to an app's core workflow.
Below are some of the first apps to tap into Apple’s AI framework.
The Lil Artist app offers various interactive experiences to help kids learn skills like creativity, math, and music. With the iOS 26 update, developer Arima Jain shipped an AI story creator: users select a character and a theme, and the app generates a story from that pairing. Jain said the story's text generation is powered by the local model.
The developer of the daily planner app Daylish is working on a prototype for automatically suggesting emojis for timeline events based on the event title.
Finance tracking app MoneyCoach has two neat features powered by local models. The app surfaces insights about your spending, such as whether you spent more than average on groceries in a given week. It also automatically suggests categories and subcategories for a spending item, so you can log entries quickly.
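Category suggestion is a natural fit for the framework's guided generation, where the developer declares a Swift type and the model fills it in. Here is a rough sketch under that assumption; the type and field names are hypothetical, not MoneyCoach's actual schema.

```swift
import FoundationModels

// Hypothetical schema for illustration only.
@Generable
struct SpendingCategory {
    @Guide(description: "A top-level category, e.g. Groceries")
    var category: String

    @Guide(description: "A more specific subcategory, e.g. Produce")
    var subcategory: String
}

func categorize(_ item: String) async throws -> SpendingCategory {
    let session = LanguageModelSession()
    // Guided generation constrains the model's output to the
    // @Generable type, so there is no string parsing to do.
    let response = try await session.respond(
        to: "Categorize this expense: \(item)",
        generating: SpendingCategory.self
    )
    return response.content
}
```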
The word learning app LookUp has added two new modes built on Apple's AI models. A new learning mode uses a local model to generate example sentences for a word, then asks users to explain how the word is used in each sentence. The developer is also using the on-device models to generate a map view of a word's origin.
Like several other apps on this list, the Tasks app uses local models to automatically suggest tags for an entry. It also uses them to detect recurring tasks and schedule them accordingly. Users can also dictate a list out loud and have the local model break it down into individual tasks, all without an internet connection.
Automattic-owned journaling app Day One uses Apple's models to generate highlights and suggest titles for your entries. The team has also built a feature that generates prompts nudging you to dive deeper and write more, based on what you have already written.
The recipe app Crouton uses Apple Intelligence to suggest tags for a recipe and to name timers. It also uses AI to break a block of text into easy-to-follow cooking steps.
The digital signing app SignEasy is using Apple’s local models to extract key insights from a contract and give users a summary of the document they are signing.
We will continue updating this list as we discover more apps using Apple’s local models.

