Instagram will start alerting parents if their teen repeatedly tries to search terms related to suicide or self-harm within a short period of time, the company announced on Thursday. The notifications are rolling out in the coming weeks to parents who have signed up for parental controls on Instagram.
The Meta-owned social media platform says that while it already blocks users from searching for suicide and self-harm content, the new alerts are designed to make sure parents know if their teen is repeatedly attempting those searches, so they can offer support.
Searches that might trigger an alert include phrases that encourage suicide or self-harm, phrases that suggest a teen may be at risk of harming themselves, and terms like “suicide” or “self-harm.”
Instagram says parents will receive the notification via email, text or WhatsApp, depending on the contact information they’ve provided, along with an in-app notification. The notice will include resources designed to help parents approach conversations with their teen.
The move comes as Meta and other major tech companies face several lawsuits seeking to hold social media giants accountable for harming teenagers.
During testimony this week in an ongoing social media addiction lawsuit in the U.S. District Court for the Northern District of California, Instagram head Adam Mosseri was grilled by plaintiffs' attorneys over the app's delayed release of key safety features, including a nudity filter for private messages sent to teenagers.
Additionally, depositions in a separate lawsuit in Los Angeles County Superior Court revealed that an internal Meta study found parental supervision and controls had little effect on children's compulsive use of social media. The study also found that children who experienced stressful life events were more likely to have difficulty regulating their social media use.
Given the ongoing lawsuits accusing the company of failing to protect teens on its platforms, the timing of these new notifications is not surprising.
The company notes that it will seek to avoid sending these notifications unnecessarily, as overuse could reduce their overall effectiveness.
“In an effort to strike this important balance, we analyzed search behavior on Instagram and consulted with experts from our Suicide and Self-Harm Advisory Group,” Instagram explained in a blog post. “We’ve chosen a threshold that requires a few searches in a short period of time while still being cautious. While that means we may sometimes alert parents when there may be no real cause for concern, we believe — and experts agree — that this is the right place to start, and we’ll continue to monitor and listen to feedback to make sure we’re in the right place.”
The notifications will roll out in the US, UK, Australia and Canada next week, and will be available in other regions later this year.
In the future, Instagram plans to roll out these alerts when a teen tries to engage the app’s AI in discussions about suicide or self-harm.
