For years, websites have used a robots.txt file to signal which crawlers are not allowed on their site. Adobe, which wants to establish a similar standard for images, has added a tool to its content credentials with the intention of giving creators a little more control over what is used to train AI models.
Persuading AI companies to actually adhere to Adobe’s standard may be the primary challenge, especially given that AI crawlers are already known to ignore the directives in robots.txt files.
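As a point of comparison, here is a minimal Python sketch of how a well-behaved crawler consults robots.txt before fetching a page (the site and user-agent are illustrative examples); nothing in the mechanism forces compliance, which is exactly the weakness Adobe’s image-level signal inherits:

```python
from urllib.robotparser import RobotFileParser

# Read a site's robots.txt and check whether a given crawler may fetch a URL.
# Honoring the answer is entirely voluntary -- the same limitation the article
# raises for Adobe's image-level opt-out.
parser = RobotFileParser("https://example.com/robots.txt")
parser.read()

allowed = parser.can_fetch("GPTBot", "https://example.com/some-article")
print("GPTBot may fetch:", allowed)
```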
Content Credentials are information embedded in a media file’s metadata that is used to establish authenticity and ownership. They are an implementation of the Coalition for Content Provenance and Authenticity (C2PA) standard for content authenticity.
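Conceptually, a content credential is a signed manifest that travels with the file. The sketch below uses simplified, illustrative field names rather than the actual C2PA manifest schema, just to show the kind of information involved:

```python
# Illustrative sketch only: these field names are simplified stand-ins,
# not the real C2PA manifest schema, which is cryptographically signed
# and considerably richer.
content_credential = {
    "creator": {
        "name": "Jane Doe",                                   # identity attached by the creator
        "verified_accounts": ["linkedin.com/in/janedoe"],     # e.g. a LinkedIn-verified profile
    },
    "claim_generator": "Adobe Content Authenticity App",      # tool that wrote the claim
    "assertions": [
        {"label": "ai_training_preference", "value": "do_not_train"},  # the opt-out flag
    ],
    "signature": "<cryptographic signature over the manifest>",        # tamper evidence
}
```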
Adobe is releasing a new web tool that lets creators attach Content Credentials to any image file, even ones that were not created or edited with its own tools. It also gives creators a way to signal to AI companies that a particular image should not be used for model training.
Adobe’s new web app, called the Adobe Content Authenticity app, allows users to attach their credentials, including their name and social media accounts, to a file. Users can attach these credentials to up to 50 JPG or PNG files at a time.
Adobe is working with LinkedIn to use the Microsoft-owned platform’s verification program, which helps prove that the person attaching credentials to an image has a verified name on LinkedIn.


Users can also attach their Instagram or X profiles to an image, but there is no verification integration with those platforms.
The app also lets users check a box to indicate that their images should not be used for AI model training.
While the flag exists in the app and is embedded in the image’s Content Credentials metadata, Adobe has not signed agreements with any AI model makers to adopt the standard. The company said it is in talks with the top AI model developers to persuade them to adopt and respect the flag.
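If model developers did adopt the flag, honoring it would amount to a simple check during data ingestion. The helper below is hypothetical and assumes the simplified manifest layout sketched earlier, not the real, signed C2PA structure:

```python
# Hypothetical ingestion-side check, assuming the simplified manifest above.
# A real pipeline would first verify the manifest's signature before trusting it.
def allowed_for_training(manifest: dict) -> bool:
    for assertion in manifest.get("assertions", []):
        if assertion.get("label") == "ai_training_preference":
            return assertion.get("value") != "do_not_train"
    return True  # no stated preference; the crawler's own policy still applies

example = {"assertions": [{"label": "ai_training_preference", "value": "do_not_train"}]}
print(allowed_for_training(example))  # False -> skip this image
```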
Adobe’s intentions are in the right place in giving creators a way to signal their preferences to companies collecting AI training data, but the initiative won’t work if AI companies don’t adopt the standard or respect the indicator.


Last year, Meta’s automatic labeling of images on its platforms caused an uproar, as photographers complained that their edited photos were tagged with a “Made with AI” label. Meta later changed the label to “AI info.”
The episode underscored that, even though Meta and Adobe both sit on the C2PA steering committee, implementation differs across platforms.
Andy Parsons, senior director of the Content Authenticity Initiative at Adobe, said the company built the new Content Authenticity app with creators in mind. Since regulations around intellectual property rights and AI training data are scattered across the world, the company wants to give creators a way to signal their intent to AI platforms through the app.
“Content creators want a simple way to indicate that they don’t want their content used for GenAI training. We have heard from small creators and organizations that they want more control over their creations [in terms of AI training on their content],” Parsons told TechCrunch.
Adobe is also releasing a Chrome extension that lets users spot images with Content Credentials.
The company said that with Content Credentials, it uses a mix of digital fingerprinting, open-source watermarking, and cryptographic metadata to embed the information into an image’s pixels, so the metadata remains intact even if the image is modified. This means users can use the Chrome extension to check Content Credentials on platforms, such as Instagram, that don’t natively support the standard. Users will see a small “CR” symbol on an image if Content Credentials are attached to it.
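The layered recovery idea, as described, is roughly: read the cryptographic metadata if it is still attached, otherwise fall back to the invisible watermark, and finally to a fingerprint match against previously registered credentials. Below is a toy, self-contained sketch under those assumptions; the names and storage here are placeholders, not Adobe’s implementation:

```python
import hashlib

# Placeholder stores: in practice these would live in a cloud-side registry.
MANIFESTS_BY_WATERMARK = {"wm-1234": {"creator": "Jane Doe", "do_not_train": True}}
MANIFESTS_BY_FINGERPRINT: dict[str, dict] = {}

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual fingerprint; a real one survives resizing and re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

def recover_credentials(image_bytes: bytes, embedded_manifest=None, watermark_id=None):
    if embedded_manifest is not None:                 # 1. metadata still attached to the file
        return embedded_manifest
    if watermark_id in MANIFESTS_BY_WATERMARK:        # 2. watermark decoded from the pixels
        return MANIFESTS_BY_WATERMARK[watermark_id]
    return MANIFESTS_BY_FINGERPRINT.get(fingerprint(image_bytes))  # 3. fingerprint lookup

# Example: the platform stripped the metadata, but the watermark survived.
print(recover_credentials(b"...pixel data...", watermark_id="wm-1234"))
```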
In a world where there is a lot of debate about AI and art, Parsons says that C2PA doesn’t take a position on the matter or on the direction of art. But he believes Content Credentials could be an important indicator of ownership.
“There is a gray area [of when an image is edited using AI but is not 100% AI-generated], and what we are saying is let’s allow artists and creators to sign their work and take credit for it. That doesn’t confer legal IP or copyright, but it shows that someone made it,” Parsons said.
Adobe said that while its new tool is designed for images, it wants to add support for video and audio down the line as well.
