
AI Generated Newscast About Meta Whistleblowers: Hidden Child Safety Scandal Exposed!

2025-09-08T17:29:08Z


Did Meta try to sweep child safety concerns under the rug—or is this just the tip of the iceberg for Big Tech secrecy? In a story that feels ripped straight from a Silicon Valley thriller, two current and two former Meta employees have now blown the whistle, handing over explosive documents to Congress. Their allegations? That Meta, the tech giant behind Facebook, Instagram, and the VR platform Horizon Worlds, may have suppressed crucial research into children’s safety on its platforms.

Let’s rewind: You might remember the Frances Haugen bombshell in 2021, when internal files showed Instagram hurting teen girls' mental health. That sparked a firestorm of Congressional hearings, and suddenly, child safety online became one of the internet’s hottest battlegrounds. Now, we find out that just six weeks after those revelations, Meta changed how its own people could research sensitive topics—politics, gender, children, race, harassment, you name it. The new rules? Either loop in the legal team for 'protection' (hello, attorney-client privilege!), or sugarcoat the research language. Gone were blunt terms like 'not compliant' or 'illegal.' Instead, everything became…vaguer.

But the most chilling detail comes from former Meta researcher Jason Sattizahn. He claims his boss made him delete the recording of an interview with a teen who said his 10-year-old brother had been sexually propositioned inside Horizon Worlds, Meta’s virtual reality playground. You read that right: a platform supposedly meant for older teens and adults, but one where kids under 13 were finding ways in, with frightening results.

Now, Meta, for its part, says it’s all being twisted. A spokesperson told TechCrunch that privacy regulations require any information about kids under 13 to be wiped if it was collected without parental consent. And the company insists it has approved nearly 180 studies on youth safety and social issues since early 2022. But the whistleblowers say the reality on the ground is very different. Their documents, now in Congress’ hands, allegedly reveal a culture where employees were quietly discouraged from even talking about underage kids in Meta’s virtual spaces.

This isn’t just about VR, either. Kelly Stonelake, who spent 15 years inside Meta, sued the company after trying to launch Horizon Worlds to teens and mobile users. She says the leadership knew that their platform couldn’t keep out under-13s and had a racism problem so bad that Black avatars could be hit with slurs within 34 seconds of joining. Stonelake has filed other lawsuits too, including claims of gender discrimination and sexual harassment at Meta.

And if you thought that was the end of it, Reuters uncovered that Meta’s AI chatbots were once allowed to have “romantic or sensual” chats with children. Yes, you heard that right: an AI generated newscast about this scandal could go on for hours, and every detail just gets darker.

With tech giants like Meta under increasing scrutiny, whistleblowers are forcing us to ask: Who’s really looking out for our kids in the digital age? Stay tuned for more AI generated newscasts about these explosive revelations—because this story is far from over.

Elena Petrova

Source of the news: TechCrunch
