Apple Intelligence, Apple’s new generative AI offering, won’t just be a consumer-facing feature; developers will be able to take advantage of the latest technology, too. In its keynote address at WWDC 2024 on Monday, the company announced that developers will be able to integrate experiences powered by Apple Intelligence into their own apps.
Apple’s SDKs (software development kits) have been updated with a variety of new APIs and frameworks that will allow app makers to do things like integrate Image Playground — Apple’s generative AI image creation experience — with just a few lines of code. Apple showed how an app like Craft could use this offering to make users’ documents more visual by letting them add AI-generated images.
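As a rough illustration of what "a few lines of code" could look like, here is a hedged SwiftUI sketch. The `imagePlaygroundSheet` modifier name, its parameters, and the URL-based completion handler are assumptions based on Apple's announcements, not a confirmed shipping API; `DocumentView` and `showPlayground` are hypothetical names.

```swift
import SwiftUI

// Hypothetical sketch: presenting the system Image Playground experience
// from an app's document view. The modifier name and signature below are
// assumptions; consult the shipping SDK documentation before relying on them.
struct DocumentView: View {
    @State private var showPlayground = false
    @State private var generatedImageURL: URL?

    var body: some View {
        Button("Add AI Image") {
            showPlayground = true
        }
        // Presents the system-provided Image Playground UI; the completion
        // is assumed to hand back a file URL for the generated image.
        .imagePlaygroundSheet(isPresented: $showPlayground,
                              concept: "a sunny beach") { url in
            generatedImageURL = url
        }
    }
}
```

Because the generation UI and the model run in a system-provided sheet, the app itself never touches the model directly; it only receives the finished image.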
AI-powered writing tools will also be automatically available in any app that uses the standard editable text view. For this, Apple demonstrated how an app like Bear Notes would automatically be able to allow users to rewrite, proofread and summarize their notes.
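The point of "automatically available" is that apps built on the standard system text views get these writing tools with no integration work at all. A minimal sketch, assuming a SwiftUI app (`NoteView` and `noteText` are hypothetical names):

```swift
import SwiftUI

// Because the AI writing tools attach to the standard system text views,
// a plain TextEditor is all an app needs -- no extra API calls.
struct NoteView: View {
    @State private var noteText = ""

    var body: some View {
        // Rewrite, proofread, and summarize actions surface in the
        // system-provided text UI automatically on supported OS versions.
        TextEditor(text: $noteText)
    }
}
```

Apps with fully custom text engines would not get this behavior for free; the automatic path applies to the standard editable text views.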
In addition, Apple is building more ways for developers to take actions in apps with Siri.
Developers who have already adopted SiriKit — an SDK for integrating Siri into their apps — will see immediate enhancements for many of Siri’s new capabilities without any extra work on their part, Apple said. This includes areas like Lists, Notes, Media, Messaging, Payments, Restaurant reservations, VoIP calling and Workouts.
In its Developer keynote, Apple said that there are two new Siri capabilities that developers will be able to benefit from without additional work. First, Siri will be able to invoke any item from an app’s menus. That means a user could say something like “show my presenter notes” when in their slide deck or even something more conversational, like “I need to see my speaker notes.”
Secondly, Siri will be able to access any text displayed on the page using Apple’s standard text systems. This will allow users to reference and act on text on the screen. For instance, if you had a reminder or note to “wish grandpa a happy birthday,” you could just say “FaceTime him” to take action on that note.
Meanwhile, Apple’s App Intents framework will also gain access to Apple Intelligence. Apple is defining new intents and making them available to developers, starting with a subset of categories: Books, Browsers, Cameras, Document Readers, File Management, Journals, Mail, Photos, Presentations, Spreadsheets, Whiteboards and Word Processors.
These intents are defined and tested so they’re easier for developers to adopt, Apple claims.
With the intents, a photo-editing app like Darkroom could leverage the Apply Filter intent so users could just say “Apply a cinematic preset to the photo I took of Ian yesterday” to have the app take action. More domains will be added over time.
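A hedged sketch of what adopting one of these predefined intents might look like. The core `AppIntent` shape (title, `@Parameter`, `perform()`) is the established App Intents API; the `@AssistantIntent(schema:)` annotation and the `.photos.setFilter` schema name are assumptions based on the keynote, and `ApplyFilterIntent` is a hypothetical type:

```swift
import AppIntents

// Hypothetical "apply filter" intent for a photo-editing app.
// The @AssistantIntent(schema:) macro and the schema identifier are
// assumptions; the rest follows the standard App Intents framework shape.
@AssistantIntent(schema: .photos.setFilter)
struct ApplyFilterIntent: AppIntent {
    static var title: LocalizedStringResource = "Apply Filter"

    // Siri resolves "a cinematic preset" from the user's utterance
    // into this parameter.
    @Parameter(title: "Filter Name")
    var filterName: String

    func perform() async throws -> some IntentResult {
        // App-specific editing code would apply the named filter here.
        return .result()
    }
}
```

Because the intent conforms to a schema Apple has already defined and tested, the developer's job is mostly to fill in the `perform()` body rather than to model the interaction from scratch.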
Initially, these intents will be invocable through the Shortcuts app, but over time, Siri will gain the ability to call app intents in the supported domains directly.
Plus, Apple shared in its keynote address, apps that fit an existing SiriKit domain will be able to benefit from Siri’s enhanced conversational capabilities, like responding correctly even if you stumble over your words or understanding references to an earlier part of the conversation.
Siri will also be able to search data from apps using a new Spotlight API that lets app entities be included in Apple Intelligence’s semantic index, alongside things like photos, messages, files and calendar events.
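A hedged sketch of how an app might expose its entities to that index. The `AppEntity` protocol and `CSSearchableIndex` are established APIs; the `IndexedEntity` conformance and the `indexAppEntities(_:)` method follow the direction Apple described, but the exact names are assumptions, and `PhotoEntity`/`PhotoQuery` are hypothetical types:

```swift
import AppIntents
import CoreSpotlight

// Hypothetical app entity representing a photo in a photo-editing app.
// The IndexedEntity conformance is an assumption based on the announced
// Spotlight integration for app entities.
struct PhotoEntity: AppEntity, IndexedEntity {
    static var defaultQuery = PhotoQuery()
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Photo"

    var id: String
    var caption: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(caption)")
    }
}

struct PhotoQuery: EntityQuery {
    // A real implementation would look the entities up in the app's store.
    func entities(for identifiers: [String]) async throws -> [PhotoEntity] {
        []
    }
}

// Hand the entities to Spotlight so Siri can find and act on them.
// The indexAppEntities(_:) method name is an assumption.
func indexPhotos(_ photos: [PhotoEntity]) async throws {
    try await CSSearchableIndex.default().indexAppEntities(photos)
}
```

Once indexed this way, the entities become searchable in the same semantic index Siri uses, rather than only through the app's own search UI.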
Also on Monday, the company announced its own password-manager app, AI-generated emoji (Genmoji) and a Calculator app for the iPad.
This post was updated after publication with more information from the Developer keynote.