A recent court filing has accused Meta, the owner of Facebook and Instagram, of implementing a policy that allowed sex traffickers to post content related to sexual solicitation or prostitution 16 times before their accounts were suspended on the 17th “strike.” The filing also alleges that Meta prioritized profit and user engagement over the safety of children.
The lawsuit, brought by children, parents, school districts, and states including California, accuses Meta, Google’s YouTube, Snap, and TikTok of intentionally addicting children to harmful products. The filing claims that the companies targeted children and schools and misrepresented their social media products.
Meta has denied the allegations, stating that it has made efforts to protect teens by introducing Teen Accounts with built-in protections and by providing parents with controls to manage their teens’ experiences. Snap criticized the allegations as misrepresenting its platform and highlighted the safety features it has implemented.
The plaintiffs cited internal company communications, research reports, and sworn depositions by current and former employees to support their claims. The filing also alleges that Meta recommended nearly 2 million minors to adults seeking to sexually groom children on Instagram in 2023.
The lawsuit also criticizes Meta’s approach to children’s mental health and its impact on schools, claiming that social media has created a compromised educational environment. Internal messages from Meta researchers referred to teens as “hooked” on Instagram despite the negative impact on their mental health.
The filing accuses Zuckerberg and Meta of lying to Congress about the company’s goals to increase users’ time spent on Meta’s platforms. It also alleges that Meta discontinued a research effort called “Project Mercury” after finding that people who stopped using Facebook for a week reported feeling less depressed, anxious, lonely, and socially judged.
Senate Hearing Reveals Troubling Findings About Meta’s Impact on Users
In a recent Senate hearing, a company representative, whose identity the filing does not disclose, was asked whether Facebook could detect a link between increased use of the platform by teenage girls and a rise in signs of anxiety. The representative responded with a firm “No,” according to the filing.
Despite Meta’s public claim that only a small fraction of users were exposed to harmful content related to suicide and self-harm, an internal study conducted by the company revealed a much higher figure of approximately 7%, as stated in the filing.
Former Meta executive Brian Boland, who served as a vice-president for over a decade before departing in 2020, expressed his belief during a deposition that the company lacked genuine concern for user safety. He stated, “My feeling then and my feeling now is that they don’t meaningfully care about user safety.”
Furthermore, when a Meta employee raised concerns about the decision to conceal “likes” on posts as a measure to safeguard children’s mental well-being, a member of the company’s growth team dismissed the criticism by stating, “It’s a social comparison app. [Expletive] get used to it,” as outlined in the filing.