Meta Faces Federal and State Lawsuits Over Child Safety Concerns

More than 30 states have joined to file a federal lawsuit against Meta, the parent company of Facebook and Instagram.

The legal action alleges that Meta’s social media platforms are intentionally designed to be addictive and that they harm children’s mental health. The lawsuit, signed by 33 state attorneys general, claims that Meta has violated federal children’s online privacy laws and state consumer protection laws by creating addictive products and providing misleading information about their impact on children.

In addition to the federal lawsuit, eight state attorneys general and the District of Columbia are filing separate lawsuits in their own courts, bringing the total to 41 states, plus the District of Columbia, involved in the legal action.

The allegations center on Meta’s child-safety practices and the impact of its platforms on young users. The federal lawsuit accuses Meta of deceiving users with false and misleading claims that its features are not manipulative, that its products do not promote unhealthy engagement among children, and that they are safe for younger users.

These lawsuits seek to hold Meta accountable for its actions and to change how the company designs and markets its platforms. If successful, the legal action could result in substantial fines and major changes to how Meta operates its social media platforms, much as the 1990s lawsuits against the tobacco industry led to large financial penalties and new restrictions on marketing practices.

California Attorney General Rob Bonta, who is leading the federal lawsuit, stated, “We refuse to allow Meta to trample on our children’s mental and physical health, all to promote its products and increase its profits. We refuse to allow the company to feign ignorance of the harm that it’s causing, and we refuse to let it continue business as usual.”

The legal strategy employed by the states is designed to bypass Section 230 of the Communications Decency Act, which traditionally protects online platforms from liability for user-generated content. Instead of targeting specific content, the consumer protection lawsuits claim that Meta deceived the public about the safety of children using its apps.

Meta responded to the lawsuits by highlighting the more than 30 design changes it has made to improve children’s safety on its platforms. A spokesperson for the company expressed disappointment with the legal action, stating, “We’re disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”

The federal lawsuit is the most extensive state-led challenge alleging violations of the Children’s Online Privacy Protection Act (COPPA) and consumer protection laws by a social media company. The states allege that Meta knowingly allowed children under the age of 13 to use its platforms without obtaining parental consent, in violation of COPPA. Meta has a policy of banning users under 13, but the lawsuit claims the company does not enforce this restriction effectively.

The states are seeking significant changes to how Meta’s platforms operate, including limiting the time and frequency of use by young people and altering how algorithms display content. While Meta has not yet engaged in settlement discussions, California Attorney General Bonta emphasized that the door is open for such negotiations.

These lawsuits come at a time when Congress has failed to pass legislation to update COPPA or enact new protections for children’s online safety. The legal actions reflect growing concerns about the impact of social media on children’s mental health and the lack of regulatory measures to address these issues.

The multi-state lawsuit is the result of an investigation that began in November 2021 following Facebook whistleblower Frances Haugen’s testimony before Congress. Haugen had revealed that Instagram’s algorithms were pushing harmful content to teen girls. While litigation is an important step, children’s safety advocates stress the need for legislation to ensure long-term solutions. Alix Fraser, director of the Council for Responsible Social Media, emphasized that “Congress needs to step up with solutions that hold the platforms accountable.”

The lawsuits are expected to face challenges related to Section 230 and the First Amendment, as Meta may seek to have the cases dismissed on these grounds. However, the state attorneys general remain confident in the strength of their cases.

The legal proceedings are likely to continue for years, and Congress is under pressure to address the issue of child safety on social media platforms through legislation.
