Italy’s competition and consumer authority, the AGCM, has fined TikTok €10 million (nearly $11 million) following an investigation into algorithmic safety issues.
The authority opened an investigation last year into a ‘French scar’ challenge in which users of the platform were reported to have shared videos of their facial scars created by pinching their skin.
In a press release on Thursday, the AGCM said three regional companies in the ByteDance group (Ireland-based TikTok Technology Limited, TikTok Information Technologies UK Limited and TikTok Italy Srl) had been sanctioned for what it described as an “unfair commercial practice”.
“The company has failed to implement appropriate mechanisms to monitor content published on the platform, particularly those that may threaten the safety of minors and vulnerable individuals. In addition, this content is systematically re-suggested to users as a result of their algorithmic profile, stimulating an ever-increasing use of the social network,” AGCM wrote.
The authority said its investigation confirmed TikTok’s responsibility in spreading content “likely to threaten the psycho-physical safety of users, especially if they are minor and vulnerable”, such as videos related to the “French Scar” challenge. It also found that the platform did not take appropriate measures to prevent the spread of such content and said it did not fully comply with its own platform guidelines.
The AGCM also criticized how TikTok applies the guidelines, which it says are implemented “without sufficient consideration of the specific vulnerability of teenagers.” It pointed out, for example, that teenage brains are still developing and young people may be at particular risk, as they may be susceptible to peer pressure to imitate group behavior in order to fit in socially.
The authority’s remarks particularly highlight the role of TikTok’s recommendation system in spreading “potentially dangerous” content, pointing to the platform’s incentive to drive engagement and increase user interactions and time spent on the service in order to boost advertising revenue. The system powers TikTok’s “For You” and “Following” feeds and, by default, relies on algorithmic profiling of users, tracking their digital activity to determine which content to show them.
“This causes unwarranted conditioning of users who are motivated to increasingly use the platform,” the AGCM suggested, in another remark notable for criticizing the engagement-driving role of profiling-based content feeds.
We reached out to the authority with questions. Its negative assessment of the risks of algorithmic profiling is interesting, however, in light of renewed calls by some lawmakers in Europe for profiling-based content feeds to be disabled by default.
Civil society groups, such as the ICCL, also argue that this would shut off the flow of resentment monetized by ad-funded social media platforms through engagement-focused recommendation systems, which have the side effect of reinforcing division and undermining social cohesion for profit.
TikTok is challenging AGCM’s decision to impose a penalty.
In a statement, the platform sought to play down the AGCM’s assessment of the algorithmic risks posed to minors and vulnerable people, framing the intervention as relating to a single controversial but small-scale challenge. Here’s what TikTok told us:
We disagree with this decision. The so-called “French Scar” content averaged just 100 daily searches in Italy before the AGCM’s announcement last year, and we long ago restricted the visibility of this content to under-18s and also made it ineligible for the For You feed.
While Italian enforcement is limited to one EU member state, the European Commission is responsible for overseeing TikTok’s compliance with algorithmic accountability and transparency provisions in the EU-wide Digital Services Act (DSA) — where penalties for non-compliance can scale up to 6% of global annual turnover. TikTok was designated as a very large platform under the DSA in April last year, with compliance expected in late summer.
One notable change resulting from the DSA is that TikTok now offers users non-profiling-based feeds. However, these alternative feeds are turned off by default, meaning users remain subject to AI-based tracking and profiling unless they take action to switch to the non-personalized feeds themselves.
Last month, the EU opened a formal investigation into TikTok, citing addictive design, harmful content and the protection of minors among its areas of focus. That process remains ongoing.
TikTok said it looks forward to the opportunity to provide the Commission with a detailed explanation of its approach to the protection of minors.
However, the company has had several previous run-ins with regional enforcement agencies concerned about child safety in recent years, including a child protection intervention by the Italian data protection authority; a €345 million fine last fall for data protection failures also related to minors; and long-running complaints from consumer protection groups concerned about minors’ safety and profiling.
TikTok also faces the potential for tighter regulation from member state-level bodies enforcing the bloc’s Audiovisual Media Services Directive, such as Ireland’s Coimisiún na Meán, which has been considering applying rules to video sharing platforms that would require profiling-based recommendation algorithms to be turned off by default.
The picture isn’t any brighter for the platform in the US, either, where lawmakers have just proposed a bill to ban TikTok unless it cuts ties with Chinese parent ByteDance, citing national security concerns over the platform’s ability to track and profile users, and its potential as a route for a foreign government to manipulate Americans.