Former Meta employees made a shocking revelation: the corporation behind Facebook and Instagram is fully aware that its platforms pose serious risks to children.
According to these former employees, Meta knows that the content circulating on its services damages children’s self-esteem, resulting in anxiety, depression, and, alarmingly, suicidal thoughts.
Moreover, children constitute a significant portion of users in Meta’s virtual reality spaces, where they face regular incidents of stalking, harassment, and sexual abuse.
Whistleblowers Jason Sattizahn and Cayce Savage testified this month before a Senate Judiciary subcommittee about how the company consistently prioritizes profits over the safety of children.
They disclosed that Meta actively destroys evidence related to sexual abuse and alters data to downplay the harm caused to its users.
The whistleblowers stated that, to avoid creating an incriminating record, Meta’s legal team prohibits its researchers from probing matters that might yield unfavorable findings.
Having encountered many distressing accounts of Big Tech’s harmful practices on youth, I was not surprised by this testimony.
My family has its own tragic story.
My daughter, Becca, experienced a deeply distressing turn of events due to social media.
She was like most teens—hanging out with friends, texting constantly, and engaging on social platforms. However, when she was just 15, she and some peers encountered a group of 18-year-old boys online.
They arranged to meet at a party, where one of the boys drugged and assaulted her.
The impact on Becca was profound. In an earlier time, she might have recovered. However, when someone shared a compromising photo of her on Snapchat, she became a target of relentless cyberbullying, compounding her suffering.
Despite having a supportive family and access to counseling, Becca resorted to drug use, a struggle that culminated in an overdose.
In an effort to protect her from nearby drug dealers, we relocated from Massachusetts to my sister’s home in Maine, believing it would be a safer environment.
Unfortunately, social media thwarted our attempts to keep her safe.
In Maine, Becca and a friend turned to Facebook to find drugs. They intended to get one last fix before entering a residential treatment program the following day. The drugs they acquired online were tainted with fentanyl.
The next morning, I found her friend unconscious but alive, while my daughter, at just 18, was gone.
There is no escaping these platforms and the easy access they provide to illicit substances; buying drugs online is as simple as ordering a pizza or requesting a ride.
Becca’s death exemplifies why I advocate for children to be removed from social media. If complete removal isn’t feasible, we desperately need regulations to safeguard young users online.
Big Tech companies must be held accountable for creating environments that enable illegal activities, especially when these activities result in the exploitation and even deaths of children.
We must also hold these entities legally liable for the repercussions of their design choices, including the content disseminated through their algorithms.
Senators from both parties were outraged by the Meta whistleblowers’ revelations. They recognize that the time for discussion has passed and that action is needed now, and they are committed to passing the Kids Online Safety Act.
This legislation would compel Big Tech to take necessary measures to safeguard their users, similar to what is required across other industries that market products to children.
The Kids Online Safety Act would impose a duty of care, mandating that online platforms like Facebook and Instagram must design their offerings to prevent and address specific threats to young users, such as addiction, sexual exploitation, and bullying.
It would also equip parents with tools to disable personalized algorithms, enhance privacy protections for their children, and combat addictive engagement patterns.
Though the Senate overwhelmingly supported KOSA last year, House leadership declined to bring it to a vote. Big Tech has invested huge sums in lobbying efforts against the bill.
Meta, for instance, announced plans last year to develop a $10 billion AI data center in Louisiana, the home state of House Speaker Mike Johnson and Majority Leader Steve Scalise.
An investment of that scale puts significant pressure on congressional leaders to yield to Big Tech’s demands.
Allowing KOSA to fade, or diluting its provisions, would be a grave mistake.
With every new congressional hearing, we hear more gut-wrenching stories from grieving parents who have lost children.
In a press briefing before this month’s hearing, Maurine Molak spoke about her son, David, who died by suicide at 16 after enduring months of bullying on Instagram.
Brian Montgomery shared about his son Walker, who fell victim to an online predator on Instagram. Posing as a teenage girl, the predator convinced Walker to exchange sexual images and subsequently extorted him for $1,000 to keep those images private. Just hours later, Walker took his own life.
American teens spend an average of nine hours a day online, and most parents are unaware of the dangers lurking in that digital space.
Social media companies excel at capturing attention, feeding insecurities, and fostering social anxiety. Unlike many other sectors, tech firms benefit from legal protections that allow irresponsible behavior, even malevolent practices.
The whistleblowers who addressed the Senate were not short-timers passing through Meta.
Sattizahn, with his PhD in integrative neuroscience, aimed to improve these technologies. However, during his six years at Meta, he witnessed a consistent theme: the prioritization of profit over safety. “Product teams are hesitant to implement changes that could reduce engagement,” he shared. “We were instructed to generate reports aiming to minimize harm to Facebook. It was legal teams managing my research.”
He noted that Meta dictated the scope and methods of research and monitored studies closely, shutting them down if the findings looked damaging. To further insulate itself, Sattizahn said, Meta required external contractors to handle harm reports so the company could claim ignorance of the findings.
When asked why the company conducts research at all, Sattizahn’s response revealed its underlying cynicism. “Some research is necessary,” he stated, “to create a paper trail demonstrating that it was conducted.” The research was merely performative.
Savage, who has a degree in experimental psychology and led youth safety research at Meta for four years, said she is convinced that, for the company, user engagement is the paramount concern.
Meta knows the harm being inflicted on children, yet does nothing. “In virtual reality, it is common for children to face bullying, sexual assault, solicitation for explicit images, and exposure to inappropriate content such as gambling and violence, as well as participation in adult activities like visiting strip clubs and watching pornography with strangers,” she remarked.
“I wish I could tell you,” Savage told senators, “how many children using VR experience these harms,” but Meta barred her from conducting such research.
Judging users’ ages by voice alone, she noted that “in almost every instance, the majority appeared to be under 13 years of age.” But acknowledging underage users would compel Meta to remove them, which the company prefers not to do, since removals would lower the active-user figures it reports to shareholders. “It is more profitable,” Savage disclosed, “to feign ignorance regarding the actual ages of their users.”
If the Kids Online Safety Act were enacted, Meta would no longer be able to ignore the virtual assaults and other prevalent harms on its platforms.
Nor could it feasibly bury its research or restrict its researchers to avoid confronting these issues: the bill would require social media companies to report how often minors experience harms and what measures they are taking to address threats like sexual exploitation and cyberbullying.
I was completely unaware of what my daughter had access to through her phone.
Parents need to recognize that the parental controls available on their children’s devices are insufficient.
The Meta whistleblowers indicated that fewer than 10% of parents use these controls, and even when they do, the controls are often ineffective because kids are adept at bypassing them.
Parents can’t keep pace with this rapidly changing landscape. The only viable solution is for tech companies to accept legal responsibility for the damage they inflict, much like companies in other sectors.
Perhaps they will earn a little less profit, but is that truly a tragedy?
As a mother forever grieving a child I will always love and miss, I firmly believe it is not.
Deb Schmill is the founder and president of the Becca Schmill Foundation, a nonprofit established in memory of her daughter, Becca, who passed away at 18.