
Unredacted complaint alleges Meta knew of 'huge volume' of child sexual harassment on its platforms

  • A new legal filing alleges a 2021 Meta internal estimate found as many as 100,000 children every day received sexual harassment on Facebook and Instagram.
  • The filing is part of a complaint by the attorney general of New Mexico in an ongoing lawsuit against Meta over the company's steps to protect children online.
  • "The complaint mischaracterizes our work using selective quotes and cherry-picked documents," a Meta spokesperson said.

WASHINGTON – A new legal filing about child exploitation on Meta's Facebook and Instagram apps alleges a 2021 internal company estimate found as many as 100,000 children every day received sexual harassment, such as pictures of adult genitalia, on the platforms.

This was revealed in newly unredacted portions of a complaint from the attorney general of New Mexico in an ongoing lawsuit against the social media giant over the company's steps to protect children online as the platforms exploded in popularity with young people. 

Also included in the complaint is a description of a 2020 Meta internal company chat, in which one employee asked a colleague: "What specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?"

"Somewhere between zero and negligible," the colleague responded. "Child safety is an explicit non-goal this half."

That same year, Meta executives scrambled to respond to a complaint from an executive at Apple, whose 12-year-old child was solicited on Facebook, according to the newly unredacted filing.

"This is the kind of thing that pisses Apple off to the extent of threatening to remove us from the App store," a Meta employee told his colleagues. The same employee also asked when, "we'll stop adults from messaging minors on (Instagram) Direct." 

A Meta spokesperson said the company has fixed many of the problems identified in the complaint. In one month alone, the company said, it disabled more than a half million accounts for violating child safety policies.

"We want teens to have safe, age-appropriate experiences online, and we have over 30 tools to support them and their parents. We've spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online. The complaint mischaracterizes our work using selective quotes and cherry-picked documents," the company said.

The lawsuit alleges that Facebook and Instagram failed to protect underage users from predators online, and that Meta employees urged the company to make safety changes that the company did not implement.

The suit, filed Dec. 5, alleges that the company refused to make the recommended changes because it placed a higher priority on increased social media engagement and advertising growth than on child safety. Meta founder and CEO Mark Zuckerberg is named as a defendant.

Mark Zuckerberg told the world in October 2021 that he was rebranding Facebook to Meta as the company pushed toward the metaverse.

"For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation," New Mexico Attorney General Raul Torrez said Thursday.

"Meta executives, including Mr. Zuckerberg, consistently made decisions that put growth ahead of children's safety. While the company continues to downplay the illegal and harmful activity children are exposed to on its platforms, Meta's internal data and presentations show the problem is severe and pervasive," said Torrez.

Meta has long faced criticism surrounding its handling of problematic content targeting younger users. In 2021, whistleblower Frances Haugen leaked internal documents to the Wall Street Journal showing that the company knew of the harm caused to teenage girls by toxic content on Instagram, but did nothing to fix the problem.

Haugen later testified before a Senate panel, where she faced questions from outraged lawmakers who were concerned that the company was putting profits over the safety of users.
