How developers are using Apple’s local AI models with iOS 26

Earlier this year, at WWDC 2025, Apple introduced its Foundation Models framework, which lets developers use the company's local AI models to power features in their apps.
Apple pitched the framework as a way for developers to tap into AI models without worrying about inference costs. These local models also support capabilities such as guided generation and tool calling.
With iOS 26 now rolling out to all users, developers have been updating their apps with features powered by Apple's local AI models. Apple's models are small compared with the leading models from OpenAI, Anthropic, Google, or Meta, so these features tend to be quality-of-life improvements rather than major changes to an app's workflow.
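To give a sense of what building on the framework looks like, here is a minimal sketch of guided generation with Apple's FoundationModels API. The `Story` type, prompt, and helper function are illustrative assumptions, not taken from any of the apps below:

```swift
import FoundationModels

// A @Generable type asks the on-device model for structured output
// ("guided generation") instead of free-form text. This Story type
// is a hypothetical example.
@Generable
struct Story {
    @Guide(description: "A short, kid-friendly title")
    var title: String
    @Guide(description: "The body of the story")
    var body: String
}

enum StoryError: Error { case modelUnavailable }

// Illustrative helper: generate a story entirely on-device.
func makeStory(character: String, theme: String) async throws -> Story {
    // Apple Intelligence must be available on this device.
    guard case .available = SystemLanguageModel.default.availability else {
        throw StoryError.modelUnavailable
    }
    let session = LanguageModelSession(
        instructions: "You write short stories for children."
    )
    let response = try await session.respond(
        to: "Write a story about \(character) with a \(theme) theme.",
        generating: Story.self
    )
    return response.content
}
```

Because inference runs locally, a call like this works offline and costs the developer nothing per request, which is the trade-off the apps below are leaning on.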
Below are some of the first apps using Apple's AI framework.
Lil Artist
The Lil Artist app offers various interactive experiences to help kids learn different skills, such as creativity, math, and music. Developer Arima Jain shipped an AI story creator with the iOS 26 update, which lets users select a character and a theme; the app then generates a story using AI. The developer said the story's text generation is powered by the local model.

Daylist
The developer of the daily planner app Daylist is working on a prototype that automatically suggests emojis for timeline events based on their titles.
MoneyCoach
The finance tracking app MoneyCoach has two neat features powered by local models. First, the app shows insights about your spending, such as whether you spent more than average on groceries in a given week. Second, it automatically suggests a category and subcategory for an expense item, making for quicker entries.

LookUp
The word-learning app LookUp has added two new modes using Apple's AI models. A new learning mode uses a local model to generate examples that match a word, and then asks users to explain the word's usage in a sentence.

The developer also uses on-device models to generate a map view of a word's origin.

Tasks
Like a few other apps, the Tasks app has implemented a feature that automatically suggests tags for an item using local models. It also uses them to detect a recurring task and schedule it accordingly. Users can even speak a few items aloud, and the app will use the local model to break them down into separate tasks, all without touching the internet.

Day One
The Automattic-owned journaling app Day One uses Apple's models to surface highlights and suggest titles for your entries. The team has also implemented a feature that generates prompts, which pop up to nudge you to dive deeper and write more based on what you have already written.

Crouton
The recipe app Crouton uses Apple Intelligence to suggest tags for a recipe and assign names to timers. It also uses AI to break a block of text into easy-to-follow cooking steps.
SignEasy
The digital signing app SignEasy uses Apple's local models to extract key insights from a contract and give users a summary of the document they are signing.
We will continue to update this list as we discover more apps using Apple's local models.
