Meta said Thursday it is testing new features on Instagram intended to help protect young people from sextortion and intimate-image abuse. These include a feature called “Nudity Protection in DMs,” which automatically blurs images detected as containing nudity.
The tech giant also said it would nudge teenagers to protect themselves by serving a warning encouraging them to think twice about sharing intimate images. Meta hopes this will strengthen protection against scammers who may send nude images to trick people into sending their own images in return.
The company said it is also making changes that will make it harder for potential scammers and criminals to find and interact with teenagers. Meta said it is developing new technology to identify accounts that are “potentially” involved in sextortion scams and will place limits on how those suspicious accounts can interact with other users.
In another step announced Thursday, Meta said it has increased the data it shares with Lantern, the cross-platform online child safety program, to include more signals related to sextortion.
The social networking giant has long-standing policies that prohibit people from sending unwanted nudes or seeking to coerce others into sharing intimate images. However, that does not stop these problems from occurring and causing misery for many teenagers and young adults — sometimes with extremely tragic results.
We’ve rounded up the latest changes in more detail below.
Nudity screens
Nudity Protection in DMs aims to protect teenage Instagram users from cyberflashing by placing nude images behind a safety screen. Users will be able to choose whether or not to view such images.
“We’ll also show them a message encouraging them not to feel pressured to respond, with an option to block the sender and report the conversation,” Meta said.
The nudity safety screen will be turned on by default for users under 18 worldwide. Older users will see a notification encouraging them to turn the feature on.
“When nudity protection is enabled, people sending images containing nudity will see a message reminding them to be careful when sending sensitive photos and that they can unsend those photos if they’ve changed their mind,” the company added.
Anyone trying to forward a nude image they have received will see the same warning encouraging them to reconsider.
The feature is powered by on-device machine learning, so Meta said it will work in end-to-end encrypted conversations because the image analysis takes place on the user’s device.
The nudity filter has been in development for nearly two years.
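Meta has not published implementation details of its classifier, but the broad idea of on-device screening (classify the image locally, blur it before display, never send the raw image to a server) can be sketched roughly as follows. The model, threshold, and blur radius below are placeholders for illustration, not Meta’s actual implementation.

```python
# Hypothetical sketch of on-device nudity screening: the classifier runs locally,
# so the image never has to leave the device or be decrypted server-side.
# The model is an untrained MobileNet placeholder; Meta has not published
# its architecture or weights.
import torch
from torchvision import models, transforms
from PIL import Image, ImageFilter

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Placeholder two-class head (nudity / not nudity); a real deployment would ship
# a small, quantized model with the app.
model = models.mobilenet_v3_small(num_classes=2)
model.eval()

def screen_image(path: str, threshold: float = 0.8) -> Image.Image:
    """Return the image, blurred behind a 'safety screen' if it is flagged."""
    img = Image.open(path).convert("RGB")
    with torch.no_grad():
        probs = torch.softmax(model(preprocess(img).unsqueeze(0)), dim=1)
    nudity_score = probs[0, 1].item()
    if nudity_score >= threshold:
        # Blur heavily; the recipient can still opt in to view the original.
        return img.filter(ImageFilter.GaussianBlur(radius=40))
    return img
```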
Safety tips
In another safeguard, Instagram users who send or receive nudes will be directed to safety tips (with information about potential risks), which Meta says have been developed with expert guidance.
“These tips include reminders that people can screenshot or forward images without your knowledge, that your relationship with the person may change in the future, and that you should review profiles carefully in case people are not who they say they are,” the company wrote in a statement. “They also link to a range of resources, including Meta’s Safety Center, support helplines, StopNCII.org for those over 18, and Take It Down for those under 18.”
The company is also testing pop-up messages for people who may have interacted with an account that has been removed for sextortion. These pop-ups will also direct users to relevant resources.
“We’re also adding new child safety helplines from around the world to our in-app reporting flows. This means that when teens report relevant issues — such as nudity, threats to share private images, or sexual exploitation or solicitation — we’ll direct them to local child safety helplines where they exist,” the company said.
Technology to spot sextortionists
While Meta says it removes sextortionists’ accounts when it detects them, it must first identify bad actors in order to shut them down. So the company is trying to go further by “developing technology to help identify accounts that may potentially be engaging in sextortion scams, based on a range of signals that could indicate sextortion behavior.”
“While these signals are not necessarily evidence that an account has violated our rules, we take precautionary steps to help prevent these accounts from finding and interacting with teen accounts,” the company said. “This builds on work we are already doing to prevent other potentially suspicious accounts from finding and interacting with teens.”
It’s not clear what technology Meta uses for this analysis, nor which signals might flag a potential sextortionist (we’ve asked for more details). Presumably, the company may analyze patterns of communication to try to detect bad actors.
Accounts flagged by Meta as potential sextortionists will face restrictions on messaging and interacting with other users.
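Meta has not disclosed which signals it uses or how they are combined, so the following is purely an illustrative sketch of signal-based flagging: the signal names are invented, the weights are hand-picked, and the result is a set of precautionary limits applied above a threshold rather than an outright ban.

```python
# Hypothetical illustration only: Meta has not disclosed its actual signals.
# A signal-based flagging pass might weight behavioral indicators and restrict
# accounts that cross a threshold, without treating the score as proof of abuse.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    # All field names are invented for illustration.
    account_age_days: int
    messages_to_unconnected_teens: int
    prior_reports: int
    mass_follow_requests: int

def sextortion_risk_score(s: AccountSignals) -> float:
    """Combine weak behavioral signals into a 0..1 risk score."""
    score = 0.0
    if s.account_age_days < 30:
        score += 0.2
    score += min(s.messages_to_unconnected_teens, 20) * 0.03
    score += min(s.prior_reports, 5) * 0.1
    if s.mass_follow_requests > 100:
        score += 0.2
    return min(score, 1.0)

def apply_protections(s: AccountSignals, threshold: float = 0.6) -> list[str]:
    """Return precautionary limits to apply above the threshold; friction, not a ban."""
    if sextortion_risk_score(s) < threshold:
        return []
    return [
        "route_message_requests_to_hidden_folder",
        "hide_message_button_on_teen_profiles",
        "demote_in_teen_search_results",
    ]
```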
“[A]ny message requests potential sextortion accounts try to send will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it,” the company wrote.
Users who are already chatting with potential scam or sextortion accounts will not have their chats shut down, but will be shown Safety Notices “encouraging them to report any threats to share their private images and reminding them that they can say no to anything that makes them feel uncomfortable,” according to the company.
Teen users are already protected from receiving DMs from adults they are not connected with on Instagram (and also from other teens, in some cases). But Meta is taking it a step further: the company said it is testing a feature that hides the “Message” button on teens’ profiles for potential sextortion accounts — even if they’re already connected.
“We’re also testing hiding teens from these accounts in people’s follower, following, and like lists and making it harder for them to find teen accounts in search results,” the company added.
It’s worth noting that the company has come under increasing scrutiny in Europe over child safety risks on Instagram, and enforcement authorities have questioned its approach since the bloc’s Digital Services Act (DSA) came into force last summer.
A long, slow crawl to safety
Meta has announced measures to combat sextortion in the past — most recently in February, when it expanded access to Take It Down. The third-party tool allows users to create a hash of an intimate image locally on their own device and share it with the National Center for Missing and Exploited Children, helping to build a repository of non-consensual image hashes that companies can use to find and remove the images.
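The privacy-preserving idea behind tools like Take It Down and StopNCII is that only a fingerprint of the image leaves the device. The real systems use perceptual hashes (such as PDQ) so that visually similar copies still match; the cryptographic hash in this sketch, and the payload shape, are stand-ins for illustration only.

```python
# Illustrative sketch of "hash locally, share only the fingerprint".
# Real tools use perceptual hashing so near-duplicate images still match;
# SHA-256 is used here only to keep the example dependency-free.
import hashlib
from pathlib import Path

def fingerprint_image(path: str) -> str:
    """Hash the image file on-device; the image itself is never uploaded."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def build_submission(image_hash: str) -> dict:
    # Hypothetical payload shape; the actual submission format is not public here.
    return {"hash": image_hash, "hash_type": "sha256-demo"}
```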
The company’s previous approaches to tackling this problem had been criticized because they required young people to upload their nudes. In the absence of tough laws regulating how social networks must protect children, Meta was left to self-regulate for years – with patchy results.
However, some requirements have landed on platforms in recent years – such as the UK Children’s Code, which came into force in 2021, and the more recent DSA in the EU – and tech giants like Meta are finally paying more attention to protecting minors.
For example, in July 2021, Meta began defaulting young people’s Instagram accounts to private just before the UK compliance deadline. Even tighter privacy settings for teens on Instagram and Facebook followed in November 2022.
This January, the company announced it would default to tighter messaging settings for teens on Facebook and Instagram, just before the full DSA compliance deadline in February.
This slow, iterative rollout of safeguards for young users raises questions about why it took the company so long to implement stronger protections. It suggests that Meta has opted for a cynical minimum of safeguarding in a bid to manage the impact on usage and prioritize engagement over safety — exactly what Meta whistleblower Frances Haugen repeatedly accused her former employer of.
Asked why the company isn’t implementing these new protections on Facebook, a Meta spokeswoman told TechCrunch, “We want to respond where we see the greatest need and relevance — which, when it comes to unwanted nudity and educating teens on the dangers of sharing sensitive images — we think it’s in Instagram DMs, so that’s where we’re focusing first.”