Meta suppressed children’s safety research, 4 whistleblowers claim

Two current and two former Meta employees have disclosed documents to Congress alleging that the company may have suppressed research on children’s safety. According to a report from The Washington Post, the whistleblowers claim that Meta changed its policies around researching sensitive topics such as politics, children, gender, race, and harassment. The policy shift came just six weeks after whistleblower Frances Haugen leaked internal documents showing that Meta’s own research found Instagram can damage teen girls’ mental health. Those 2021 revelations set off years of Congressional hearings on child internet safety, an issue that remains a priority for governments around the world.

The report indicates that Meta proposed two ways for researchers to limit the risks of studying sensitive topics. One suggestion was to loop lawyers into their research so that communications would be protected under attorney-client privilege. Researchers could also write about their findings more vaguely, avoiding direct terms like “not compliant” or “illegal.”

Jason Sattizahn, a former Meta researcher specializing in virtual reality, told The Washington Post that his boss made him delete recordings of an interview. In that interview, a teen claimed his 10-year-old brother had been sexually propositioned on Meta’s VR platform, Horizon Worlds.

A Meta spokesperson told TechCrunch that global privacy regulations require information collected from minors under 13 without verifiable parental consent to be deleted. However, the whistleblowers claim the documents they submitted to Congress show a pattern of employees being discouraged from discussing and researching concerns about how children under 13 were using Meta’s social virtual reality apps.

Meta told TechCrunch that these examples are being stitched together to fit a false narrative. The company stated that since the start of 2022, it has approved nearly 180 Reality Labs-related studies on social issues, including youth safety and well-being.

In a lawsuit filed in February, Kelly Stonelake, a former Meta employee of 15 years, raised concerns similar to those of the four whistleblowers. She told TechCrunch earlier this year that she led strategies to bring Horizon Worlds to teenagers, international markets, and mobile users. She felt the app did not have adequate ways to keep out users under 13, and she also flagged persistent issues with racism. The suit alleges that the leadership team was aware that, in one test, users with Black avatars were called racial slurs within an average of 34 seconds of entering the platform. Stonelake has separately sued Meta for alleged sexual harassment and gender discrimination.

While these whistleblowers’ allegations center on Meta’s VR products, the company is also facing criticism for how other products, like AI chatbots, affect minors. Reuters reported last month that Meta’s AI rules previously allowed chatbots to have romantic or sensual conversations with children.