Apple’s plan to improve App Store discoverability with AI-assigned tags is now live in the iOS 26 beta build.
However, the tags are not yet appearing on the public App Store, nor are they informing the public App Store’s search algorithm so far.
Of course, as with any upcoming App Store change, there is speculation about how the update will affect app search.
One new analysis from app intelligence provider Appfigures, for example, suggests that metadata extracted from an app’s screenshots is affecting its ranking.
The firm theorized that Apple was extracting text from screenshot captions. Previously, only an app’s name, subtitle, and keyword list counted toward its search ranking, it said.
The conclusion that screenshots now inform app discoverability is accurate, based on what Apple announced at its Worldwide Developers Conference (WWDC 25), but the way Apple extracts this data involves AI, not OCR techniques, as Appfigures had guessed.
At the annual developer conference, Apple explained that screenshots and other metadata will be used to help improve an app’s discoverability. The company said it uses AI techniques to extract information that would otherwise be buried in an app’s description, category information, screenshots, or other metadata. This also means developers should not need to add keywords to their screenshots or take other steps to influence the tags.
This allows Apple to assign tags that better categorize the app. Eventually, developers will be able to control which of these AI-assigned tags are associated with their apps, the company said.
In addition, Apple assured developers that humans will review the tags before they go live.
Over time, as the tags roll out to App Store users, it will be important for developers to understand which tags help their app get discovered.
