Brazilian Institute Slaps Meta, TikTok, Kwai With $525M Lawsuit Over Insufficient Minor Protection
Brazil’s Collective Defense Institute has filed lawsuits seeking 3 billion reais ($525.27 million) against the Brazilian operations of TikTok, Kwai, and Meta Platforms, alleging failure to implement adequate protections for minors using their platforms.
According to Reuters, the consumer rights group’s legal action demands that these social media companies establish clear data protection mechanisms and warn children and teenagers about the mental health risks associated with platform addiction.
The lawsuits cite multiple studies examining the possible negative effects of unsupervised social media use by young users.
“It is urgent that measures be adopted in order to change the way the algorithm works, the processing of data from users under 18, and the way in which teenagers aged 13 and over are supervised and their accounts created, in order to ensure a safer, healthier experience,” lawyer and one of the plaintiffs, Lillian Salgado, said in a statement.
According to Salgado, the goal is to achieve safety standards comparable to those in developed nations.
Meta Platforms responded by highlighting its decade-long commitment to youth safety and citing the development of over 50 tools, resources, and features supporting teens and their guardians. The company also announced plans to launch a new “Teen Account” feature on Instagram in Brazil, designed to automatically restrict account visibility and user contact options for teenage users.
TikTok – which faces lawsuits from 14 U.S. attorneys general over allegations of harming young users’ mental health – said it had not received legal notice of the case. Meanwhile, Kwai affirmed that user safety, particularly the protection of minors, was a primary concern.
This legal action follows recent heightened scrutiny of social media regulation in Brazil, notably after disputes between X owner Elon Musk and a Brazilian Supreme Court justice, which resulted in significant fines for the platform.
Earlier in October, the European Commission (EC) intensified its scrutiny of major social media platforms under the Digital Services Act (DSA), focusing on the algorithms that power content recommendations.
“Under the DSA, platforms have to assess and adequately mitigate risks stemming from their recommender systems, including risks for the mental health of users and the dissemination of harmful content arising from the engagement-based design of these algorithms,” the EC stated in a press release.