MADISON — Wisconsin is joining a nationwide lawsuit against Meta over the harmful mental health effects that the company’s social media platforms have on children and teens.
The lawsuit alleges that Meta knowingly designed and deployed harmful features on Instagram and its other social media platforms that intentionally addict young users. It also alleges that Meta falsely assured the public that those features were safe and suitable for young users.
“We must keep our kids safe—and that includes from dangers online,” said Attorney General Josh Kaul in a press release today. “Adequate protections should be in place to protect kids from harms associated with social media, and parents must receive accurate information about potential dangers to their kids.”
Kaul and 42 other attorneys general say Meta’s business practices violate state consumer protection laws and the federal Children’s Online Privacy Protection Act.
They allege that Meta was aware that young users, some under 13 years of age, were active on its platforms, and that it knowingly collected data from these users without parental consent.
“Consumer rights and protections matter, especially when it involves our youth,” said Secretary Randy Romanski of the Wisconsin Department of Agriculture, Trade and Consumer Protection in the press release. “It doesn’t matter whether your child uses Facebook, Instagram, or another social media platform. Companies should not be allowed to misrepresent their products and their impact, or use tactics to manipulate youth and their parents into using those products.”
The federal complaint, which draws on publicly available sources as well as materials not yet released to the public, says that Meta concealed the extent of the psychological and health harms suffered by young users addicted to its platforms.
One of those sources was a former Meta employee, who released detailed information about Meta generating profits by purposely making its platforms addictive to children and teens. The complaint says platform algorithms, the infinite-scroll feature and near-constant alerts were designed to repeatedly lure children and teens back to the platforms.
The multistate coalition that brought Tuesday’s complaint is also investigating TikTok over a similar set of concerns. That investigation remains ongoing as the states push for adequate disclosure of information and documents in litigation.