Apple Preparing for Upcoming Siri Onscreen Awareness Feature With New iOS 18.2 API for Developers



Apple is working on more advanced Siri functionality as part of its Apple Intelligence feature set, and to prepare, it has been providing developers with App Intents APIs so apps will be ready for the new capabilities. With the latest wave of betas, Apple has a new API that lets developers make onscreen content in their apps available to Siri and Apple Intelligence. From Apple's documentation:

"When a user asks a question about onscreen content or wants to perform an action on it, Siri and Apple Intelligence can retrieve the content to respond to the question and perform the action. If the user explicitly requests it, Siri and Apple Intelligence can send content to supported third-party services. For example, someone could view a website and use Siri to provide a summary by saying or typing a phrase like 'Hey Siri, what's this document about?'"

In the iOS 18.2 beta, ChatGPT integration with Siri allows users to ask questions about photos and documents, such as PDFs and presentations, and get information about them.
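
Apple hasn't shared much beyond that documentation yet, but on the developer side, adoption presumably looks something like the Swift sketch below: the app models its onscreen content as an App Intents entity, makes it Transferable so the content itself (not just a screenshot) can be handed off, and ties it to the view's current NSUserActivity. The BrowserDocument type, its query, and the activity identifier are invented for illustration, and the appEntityIdentifier property and EntityIdentifier initializer reflect Apple's iOS 18.2 documentation as best we can tell, so treat the exact names as assumptions rather than a drop-in implementation.

import AppIntents
import CoreTransferable
import UIKit

// Hypothetical entity representing the document currently on screen.
struct BrowserDocument: AppEntity, Transferable {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
    static var defaultQuery = BrowserDocumentQuery()

    var id: UUID
    var title: String
    var fullText: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // Transferable is what lets the system (and, with the user's consent,
    // a third-party service) receive the content itself rather than a screenshot.
    static var transferRepresentation: some TransferRepresentation {
        ProxyRepresentation(exporting: \.fullText)
    }
}

// Minimal query so Siri can look the entity back up by identifier.
struct BrowserDocumentQuery: EntityQuery {
    func entities(for identifiers: [UUID]) async throws -> [BrowserDocument] {
        // A real app would fetch from its own store; stubbed for the sketch.
        []
    }
}

final class DocumentViewController: UIViewController {
    var document: BrowserDocument?

    // Associate the onscreen content with the current NSUserActivity so
    // Siri and Apple Intelligence know what "this" refers to.
    func advertiseOnscreenContent() {
        guard let document else { return }
        let activity = NSUserActivity(activityType: "com.example.viewing-document")
        activity.title = document.title
        activity.appEntityIdentifier = EntityIdentifier(for: document) // new in the iOS 18.2 SDK, per Apple's docs
        userActivity = activity
        activity.becomeCurrent()
    }
}

In the iOS 18.2 beta itself, though, the ChatGPT hand-off is far simpler and works at the screenshot level.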



You can, for example, ask Siri "what's in this photo?" and Siri will take a screenshot and hand it over to ChatGPT. ChatGPT then relays what's in the image, and the same feature works for PDFs and other documents.

It does not seem that the iOS 18.2 ChatGPT integration is the onscreen awareness functionality that Apple has planned for Siri, but it could be related. Apple describes onscreen awareness as the ability for Siri to understand and take action on things on the screen. If someone texts you an address, for example, you'll be able to say "Add this address to their contact card," and Siri will do it.
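
That kind of request would presumably flow through the same App Intents plumbing Apple is already seeding to developers. Purely as an illustration, and not Apple sample code, a contacts app might expose an intent along these lines so Siri has a concrete action to invoke with an address it finds on screen; the intent name, parameters, and dialog are all invented for the sketch.

import AppIntents

// Hypothetical intent a contacts app could expose so Siri can act on an
// address it sees on screen once onscreen awareness ships.
struct AddAddressToContactIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Address to Contact"

    @Parameter(title: "Contact Name")
    var contactName: String

    @Parameter(title: "Address")
    var address: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would save the postal address with
        // CNContactStore; the sketch just confirms the action.
        return .result(dialog: "Added the address to \(contactName)'s contact card.")
    }
}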

This functionality is not available in iOS 18.2, and the ChatGPT integration is limited to assessing screenshots, so the overlap between the two features is somewhat confusing. Onscreen awareness, like personal context and in-app actions, is a feature that Apple has planned for Siri, but it's probably not something that we're getting this year.

Many of the Siri features are coming in a future version of iOS 18, and Bloomberg's Mark Gurman has said we can expect to see them in iOS 18.4, an update set to be released in the spring of 2025. While there are multiple Siri features that won't be coming until next year, Apple is providing the APIs in advance so that developers have several months to prepare and the features are ready for the public when the updates actually ship.
