Bluesky on Friday posted its moderation report for the past year, noting the significant growth the social network experienced in 2024 and how that affected the workload of its Trust & Safety team. It also noted that the largest number of reports came from users flagging accounts or posts for harassment, trolling, or intolerance. That issue has plagued Bluesky as it has grown, at times even sparking widespread protests over individual moderation decisions.
The company’s announcement did not address or explain why it did or did not take action against individual users, including those on the lists of most-blocked accounts.
The company added over 23 million users in 2024 as Bluesky became a new destination for ex-Twitter/X users for a variety of reasons. Throughout the year, the social network benefited from several changes at X, including its decision to change how blocking works and to train AI on user data. Other users left X after the results of the US presidential election, based on how the politics of X owner Elon Musk began to dominate the platform. The app also gained users while X was temporarily banned in Brazil in September.
To meet the demands brought on by that growth, Bluesky has grown its moderation team to roughly 100 moderators, it said, and is continuing to hire. The company also began offering team members psychological counseling to help them cope with the difficult job of constant exposure to graphic content. (An area we hope AI will one day tackle, as humans aren’t built to handle this kind of work.)
In total, there were 6.48 million reports to Bluesky’s moderation service in 2024, roughly 17 times more than in 2023, when there were just 358,000 reports.
Starting this year, Bluesky will begin accepting moderation reports directly in its app, which, similar to X, will let users more easily track actions and updates. Later, it will also support appeals in the app.
When Brazilian users flocked to Bluesky in August, the company was seeing as many as 50,000 reports per day at the peak. That led to a backlog of moderation reports and required Bluesky to hire more Portuguese-speaking staff, including through a contract vendor.
In addition, Bluesky began automating more categories of reports beyond just spam to help it deal with the influx, though this sometimes led to false positives. Still, automation helped cut processing time to just “seconds” for “high certainty” accounts. Before automation, most reports were handled within 40 minutes. Now, human moderators are kept in the loop to address false positives and appeals, even if they don’t always make the initial decision.
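Bluesky hasn’t published how that automation works, but the flow it describes, auto-resolving “high certainty” reports in seconds while routing everything else (plus false positives and appeals) to humans, amounts to confidence-threshold triage. Here is a minimal sketch, where the `Report` fields, the `triage` function, and the 0.98 cutoff are all hypothetical, not Bluesky’s actual pipeline:

```python
from dataclasses import dataclass

AUTO_ACTION_THRESHOLD = 0.98  # hypothetical "high certainty" cutoff


@dataclass
class Report:
    report_id: str
    category: str      # e.g. "spam", "harassment"
    confidence: float  # classifier score in [0.0, 1.0]


def triage(report: Report, human_queue: list[Report]) -> str:
    """Route a report: auto-resolve high-certainty cases in seconds;
    send everything else to human moderators. Appeals of auto-actions
    would also land in the human queue."""
    if report.confidence >= AUTO_ACTION_THRESHOLD:
        return "auto-actioned"
    human_queue.append(report)
    return "queued-for-human-review"


# Usage: a near-certain spam report is handled instantly;
# an ambiguous harassment report waits for a human.
queue: list[Report] = []
print(triage(Report("r1", "spam", 0.995), queue))       # auto-actioned
print(triage(Report("r2", "harassment", 0.61), queue))  # queued-for-human-review
```

The design tradeoff is the one the report hints at: a lower threshold clears the backlog faster but produces more false positives, which is why humans still review appeals.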
Bluesky says 4.57% of its active users (1.19 million) made at least one moderation report in 2024, down from 5.6% in 2023. Most of those, 3.5 million reports, were about individual posts. Account profiles were reported 47,000 times, often over a profile picture or banner photo. Lists were reported 45,000 times and DMs 17,700 times, with feeds and starter packs receiving 5,300 and 1,900 reports, respectively.
Most of the reports involved anti-social behavior such as trolling and harassment, a signal from Bluesky users that they want a less toxic social network than X.
Other reports were for the following categories, Bluesky said:
- Misleading content (impersonation, misinformation, or false claims about identity or affiliations): 1.20 million
- Spam (excessive mentions, replies or repetitive content): 1.40 million
- Unsolicited sexual content (nudity or inappropriately labeled adult content): 630,000
- Illegal or urgent matters (clear violations of the law or Bluesky’s terms of service): 933,000
- Other (subjects not included in the above categories): 726,000
The company also offered an update on its labeling service, which involves labels added to posts and accounts. Human labelers added 55,422 “sexual-figurative” labels, followed by 22,412 “rude” labels, 13,201 “spam” labels, 11,341 “intolerant” labels, and 3,046 “threat” labels.
In 2024, 93,076 users submitted a total of 205,000 appeals of Bluesky’s moderation decisions.
There were also 66,308 account takedowns by moderators and 35,842 automated account takedowns. Bluesky additionally fielded 238 requests from law enforcement, governments, and law firms. The company responded to 182 of them and complied with 146. Most were law enforcement requests from Germany, the US, Brazil, and Japan, it said.
Bluesky’s full report also delves into other types of issues, including trademark and copyright claims and child safety/CSAM reports. The company noted that it submitted 1,154 confirmed CSAM reports to the National Center for Missing & Exploited Children (NCMEC).