Meta has fixed the bug that led people to believe the company had adjusted their political content settings without their consent. The issue affected users on both Instagram and Threads, seemingly resetting their content settings to the default, which limits the amount of political content users see from people they don’t follow.
Earlier on Wednesday, Meta confirmed that it was looking into the problem and working on a fix.
Later Wednesday afternoon, Meta communications director Andy Stone announced in a Threads post that the problem had been resolved. He also shared more information about the nature of the bug: Meta hadn’t actually changed people’s political content settings on the back end. The bug only made it look as if people’s choices had been reset in the settings, “even though no changes had been made,” Stone wrote on Threads.
The company didn’t share more information about how the bug first appeared, but Stone encouraged users to check to make sure their settings now reflect their preferences.
You can do this from your Instagram Settings, where you’ll scroll down to ‘Content Preferences’ and then select ‘Political Content’. From here, you can choose whether or not you want to limit political content from the people you don’t follow. The setting affects the recommendations that appear in Explore, Reels, Feed Recommendations, and Suggested Users, the page explains, and it also applies to Threads.
The fact that Meta even has a political content setting demonstrates the power of algorithm-based social media apps, where content surfaces based on multiple factors rather than as a reverse-chronological stream of posts from people users chose to follow. Other networks, such as Bluesky and the broader ecosystem of federated platforms, are exploring new models for how content on social platforms should be moderated or blocked. Bluesky, for example, allows users to create their own feeds and subscribe to moderation services. However, the app’s user base of more than 5.9 million is not as large as Threads’ 170 million monthly active users or Instagram’s more than 2 billion.
The political content setting was first announced earlier this year. It serves as a way for Meta to deflect responsibility when it comes to the power its apps have to influence people – something Meta didn’t want to be accused of ahead of the US election.
The move is not surprising given that the tech giant has faced criticism from both sides of the US political spectrum after being accused by Republicans of censoring free speech and by Democrats of being too soft on misinformation and disinformation. Just weeks after the launch of X competitor Threads, House Judiciary Chairman Jim Jordan (R-OH) wrote to Meta CEO Mark Zuckerberg with questions about the app’s content moderation policies.
Later, Meta announced that it would no longer “proactively” recommend political content, leading to backlash from creators.
Fortunately for those using Instagram and Threads, the bug was fixed before Trump and Biden’s first presidential debate on Thursday night.