Meta Faces Lawsuit Over Social Media Addiction, with Other Platforms Under Scrutiny for Harmful Effects on Teens
Meta must face claims that it designed its social media platforms to be addictive to young people, after a federal judge in California ruled that a lawsuit brought by 34 state attorneys general may proceed. The lawsuit, part of a broader multidistrict litigation (MDL), focuses on the design of Meta's platforms, particularly Facebook and Instagram, and their alleged impact on young users' mental health and well-being.
While Meta is at the center of this case, other major social media platforms, including Google (YouTube), TikTok, and Snapchat, are facing similar legal battles. These companies are being sued for employing addictive design features that allegedly harm young users by promoting compulsive behavior, anxiety, depression, and body image issues.
Notably, TikTok has been under intense scrutiny for promoting harmful challenges, such as the infamous "blackout challenge," which led to multiple lawsuits after causing fatalities among children. Similarly, Snapchat is being sued for enabling harmful interactions, such as those involving sexual exploitation and drug-related deaths, linked to its disappearing messages feature.
Background on the Lawsuit
The lawsuit accuses Meta of intentionally designing addictive features on its platforms to increase user engagement and thereby drive advertising revenue. The states allege that these platforms, especially Instagram, are deliberately crafted to foster compulsive usage among teenagers and younger audiences. Features cited include infinite scrolling, autoplay, and content recommendation algorithms designed to keep users engaged for extended periods.
In 2023, California's Attorney General, Rob Bonta, accused Meta of violating federal and state laws, including the Children's Online Privacy Protection Act (COPPA), California's False Advertising Law (FAL), and California’s Unfair Competition Law (UCL). COPPA specifically aims to protect the online privacy of children under 13 years old. The lawsuit further alleges that Meta failed to remove or appropriately modify features harmful to young people's mental health, despite internal research and external warnings about these dangers.
Meta's Response
Meta has pointed to its efforts to mitigate potential harm to younger users. For instance, the company has limited who can contact teens on Instagram and restricted the types of content those users can see. The lawsuit, however, argues that these measures fall short of addressing the broader problem of addiction-driven design.
The judge’s ruling reflects growing concern about social media's impact on adolescent mental health. As the case proceeds, it could set significant precedents for how companies design online platforms and manage younger audiences, particularly regarding regulatory scrutiny and the legal obligations tech companies owe to user well-being.
Legal Claims and Theories
The lawsuit leans on key legal frameworks to substantiate its claims against Meta:
- COPPA Violations: The lawsuit alleges that Meta collected data from users under 13 without proper parental consent. This data collection violated COPPA, which mandates strict guidelines for children's privacy protection online.
- False Advertising: Under California’s False Advertising Law, Meta is accused of misrepresenting the safety and positive effects of its platforms, thereby misleading parents and users about the real risks involved in social media use.
- Unfair Competition: Through California’s Unfair Competition Law, the states argue that Meta engaged in unlawful business practices by prioritizing profit over user safety, particularly that of young and vulnerable audiences.
What’s Next?
Meta, which has faced multiple lawsuits and public outcry over its platforms' effects on mental health, now operates in a legal environment where Section 230 (the law shielding tech companies from liability for user-generated content) faces renewed scrutiny. Although Section 230 may bar some claims related to user content, claims about the design and addictive nature of the platforms may fall outside its scope. The judge's ruling allows these design-based claims to proceed, indicating that the court is willing to examine whether Meta's platform-design choices were detrimental to public health.
This case marks a critical moment in the regulation of social media platforms, particularly in how they design and market products to younger users.