EyeEm, the Berlin-based photo-sharing community that was sold last year to the Spanish company Freepik after going bankrupt, is now licensing its users’ photos to train AI models. Earlier this month, the company notified users via email that it was adding a new clause to its Terms and Conditions that would give it the right to use user content to “train, develop and improve software, algorithms and machine learning models.” Users were given 30 days to opt out by removing all their content from EyeEm’s platform. Otherwise, they were consenting to this use of their work.
At the time of its acquisition in 2023, EyeEm’s photo library included 160 million images and nearly 150,000 users. The company said it will merge its community with Freepik’s over time.
Once considered a potential challenger to Instagram — or at least the “Instagram of Europe” — EyeEm was down to a staff of three before selling to Freepik, TechCrunch’s Ingrid Lunden previously reported. Joaquin Cuenca Abela, CEO of Freepik, hinted at the company’s potential plans for EyeEm, saying it will explore how to bring more AI into the equation for creators on the platform.
As it turns out, that meant selling their work to train AI models.
EyeEm’s updated Terms & Conditions now read as follows:
8.1 Grant of Rights – EyeEm Community
By uploading Content to the EyeEm Community, you grant us with respect to your Content the non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, convert, adapt, create derivative works of, communicate to the public and/or promote such Content.
This specifically includes the sublicensable and transferable right to use your Content to train, develop and improve machine learning software, algorithms and models. If you do not agree to this, you should not add your Content to the EyeEm Community.
The rights granted in this Section 8.1 with respect to Your Content remain valid until fully deleted from the EyeEm Community and Partner Platforms in accordance with Section 13. You may request deletion of Your Content at any time. The conditions for this are in section 13.
Section 13 outlines a complex process for deletions that begins with deleting photos outright — which will not affect content previously shared in EyeEm magazine or on social media, the company notes. To remove content from EyeEm Market (where photographers sell their photos) or other content platforms, users must submit a request to support@eyeem.com, provide the Content ID numbers for the photos they want deleted, and indicate whether the content should also be removed from their account or only from EyeEm Market.
It’s worth noting that the notice states that these deletions from EyeEm Market and partner platforms may take up to 180 days. Yes, that’s right: Requested deletions take up to 180 days, but users only have 30 days to opt out. That means the only practical option is to manually delete photos one by one.
Even worse, the company adds that:
You hereby acknowledge and agree that your authorization for EyeEm to market and license your Content in accordance with sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all Partner Platforms within the time frame mentioned above. All license agreements entered into prior to full deletion and usage rights granted are not affected by the deletion request or deletion.
Section 8 details the licensing rights for AI training. In Section 10, EyeEm informs users that they will waive their right to any payments for their work if they delete their account — something users may consider doing to avoid feeding their data to AI models. Gotcha!
EyeEm’s move is an example of how AI models are trained on the backs of users’ content, sometimes without their explicit consent. Although EyeEm offered an opt-out process, any photographer who missed the announcement would have lost the right to dictate how their photos are used in the future. Since EyeEm’s position as a popular Instagram alternative has waned significantly over the years, many photographers may have forgotten they ever used it in the first place. They may well have ignored the email, if it didn’t land in a spam folder somewhere.
Those who noticed the changes were upset that they were given only 30 days’ notice and no option to bulk delete their contributions, making it more painful to opt out.
Requests for comment sent to EyeEm were not immediately returned, but given the opt-out’s 30-day deadline, we chose to publish before hearing back.
This kind of dishonest behavior is why users today are considering a move to the open social web. The federated platform Pixelfed, which runs on the same ActivityPub protocol that powers Mastodon, is leveraging the EyeEm situation to attract users.
In a post from its official account, Pixelfed announced, “We will never use your images to help train AI models. Privacy First, Pixels Forever.”