Meta’s Oversight Board has now expanded its scope to include the company’s newest platform, Instagram Threads. Designed as an independent appeals body that hears cases and then makes precedent-setting content moderation decisions, the board has so far ruled on matters such as Facebook’s ban of Donald Trump, COVID-19 misinformation, and the removal of breast cancer photos, among others.
Now the board has begun hearing cases arising from Threads, Meta’s Twitter/X competitor.
This is a major point of differentiation between Threads and rivals like X, where Elon Musk relies heavily on crowdsourced fact-checking via Community Notes to supplement the platform’s otherwise lightweight moderation. It’s also very different from how decentralized solutions like Mastodon and Bluesky handle moderation on their platforms. Decentralization allows community members to create their own servers with their own moderation rules and gives them the option to disconnect from other servers whose content violates their guidelines.
Startup Bluesky is also investing in stackable moderation, meaning community members can build and run their own moderation services, which can be combined with others to create a customized experience for each individual user.
Meta’s move to offload difficult decisions to an independent board that could override the company and its CEO Mark Zuckerberg was meant to be the solution to Meta’s problem of centralized authority and control over content moderation. But as these startups have shown, there are other ways to do this that allow the user to have more control over what they see, without trampling on the rights of others to do the same.
Nevertheless, the Oversight Board announced on Thursday that it will hear its first case arising from Threads.
The case involves a user’s reply to a post containing a screenshot of a news article in which Japanese Prime Minister Fumio Kishida made a statement about his party’s alleged underreporting of income. The reply criticized him for tax evasion, included derogatory language and the phrase “drop dead,” and also used derogatory language about a person who wears glasses. Because of the “drop dead” element and hashtags calling for death, a human reviewer at Meta decided the post violated the company’s rule on violence and incitement, even though it looks a lot like a run-of-the-mill X post these days. After the appeal was rejected a second time, the user appealed to the Oversight Board.
The Board says it chose this case to review Meta’s content moderation policies and enforcement practices regarding political content on Threads. This is a timely move, given that it’s an election year and that Meta has said it will not proactively recommend political content on Instagram or Threads.
This case will be the first involving Threads, but it won’t be the last. The organization is already preparing to announce another batch of cases tomorrow focused on criminal charges based on nationality. Those cases were referred to the Board by Meta, but the Board will also receive and weigh appeals from Threads users, as it did with the case involving Prime Minister Kishida.
Decisions made by the Board will shape how far Threads goes in supporting users’ ability to express themselves freely, and whether it ends up moderating content more closely than Twitter/X does. That, in turn, will help shape public opinion of the platforms and influence users to choose one or the other, or perhaps a startup experimenting with more personalized approaches to content moderation.