World News

Meta allowed sex-trafficking posts on Instagram as it put profit over kids’ safety – The Denver Post

Last updated: 2025/11/29 at 12:28 AM

A significant new court filing claims Facebook and Instagram owner Meta had a “17x” policy allowing sex traffickers to post content related to sexual solicitation or prostitution 16 times before their accounts were suspended on the 17th “strike.” The allegation is one of many in the filing claiming Meta chose profit and user engagement over the safety and well-being of children.

The description of the purported sexual content policy at Instagram is contained in a new court filing by plaintiffs in an ongoing lawsuit against Meta, Google’s YouTube, Snapchat owner Snap and TikTok brought by children and parents, school districts and states — including California. They accuse the companies of intentionally addicting children to products they knew were harming them.

The filing makes the same general allegations against the four companies — that they targeted children and schools, and misrepresented their social media products — along with company-specific claims.

“Despite earning billions of dollars in annual revenue — and its leader being one of the richest people in the world — Meta simply refused to invest resources in keeping kids safe,” claimed the Friday filing in Oakland’s U.S. District Court. The lawsuit also targets products it deems harmful from Facebook, YouTube, Snapchat and TikTok. The plaintiffs seek unspecified damages, along with a court order requiring the companies to stop alleged “harmful conduct” and to warn minor users and parents that their products are addictive and dangerous.

The new filing also claims Meta’s “outright lies” about its products’ harms prevented “even the most vigilant administrators, teachers, parents, and students from understanding and heading off the dangers inherent to Instagram and Facebook.”

A Meta spokesperson denied the accusations: “We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture.” For more than a decade, Meta has “listened to parents, researched issues that matter most, and made real changes to protect teens – like introducing Teen Accounts with built-in protections and providing parents with controls to manage their teens’ experiences,” the spokesperson said.

Snap criticized the allegations for misrepresenting its platform, which unlike other social media has no public likes or comparison metrics. “We’ve built safeguards, launched safety tutorials, partnered with experts, and continue to invest in features and tools that support the safety, privacy, and well-being of all Snapchatters,” a Snap spokesperson said in an emailed statement Tuesday.

Google and TikTok did not immediately respond to requests for comment.

The plaintiffs cited what they described as internal company communications and research reports, and sworn depositions by current and former employees. The records are largely sealed by the court and could not be verified by this news organization.

The new filing claimed an account-recommendation feature on Instagram in 2023 recommended nearly 2 million minors to adults seeking to sexually groom children. More than 1 million potentially inappropriate adults were recommended to teen users in a single day in 2022, an internal audit found, according to the filing.

Facebook’s recommendation feature, according to a Meta employee, “was responsible for 80% of violating adult/minor connections,” the filing said.

In March 2020, Meta CEO Mark Zuckerberg told reporters his Menlo Park company was “actually surging the number of people” working on sensitive content including “child exploitation,” but internal communications cited in the court filing indicated that was not true, and the company had only about 30% of the staff it needed to review child-exploitation imagery.

Instagram as of March 2020 had no way for people to report the presence of child sexual abuse material, the filing said. When Vaishnavi Jayakumar, head of safety at Instagram from 2020 to 2023, first started at the company, she was told a reporting process would be too much work to build, and even more work to review reports, the filing said.

Even when Meta’s artificial intelligence tools identified child pornography and child sexualization material with 100% confidence, the company did not automatically delete it, the filing claimed. The company, which made $62.4 billion in profit last year, declined to tighten enforcement for fear of “false positives,” but could have solved the problem by hiring more staff, the filing said.

Jayakumar testified in a deposition that when any proposed changes that might reduce user engagement went up to Zuckerberg for review, the result would be a decision to “prioritize the existing system of engagement over other safety considerations,” the filing said.

Instead of simply making kids’ accounts private by default to protect them from adult predators, Meta “dragged its feet for years before making the change — allowing literally billions of unwanted adult-minor interactions to happen,” the filing claimed. Key to the delay, the filing alleged, was the internal projection that the change would cut daily users by 2.2%.

The company didn’t apply default privacy to all teens’ accounts until the end of last year, the filing said.

The lawsuit — still in an evidence-gathering discovery phase — also takes aim at Meta’s approach to children’s mental health and the purported damaging fallout for schools, where social media, the filing claims, has created “a compromised educational environment” and forced school districts to spend money and resources “to address student distraction and mental health problems.”

Internally, Meta researchers said of Instagram, “We’re basically pushers,” and “Teens are hooked despite how it makes them feel,” the filing said.

Meta “allowed its products to infiltrate classrooms, disrupt learning environments, and contribute directly to the youth mental health crisis now overwhelming schools nationwide,” claimed the filing, which accused Zuckerberg and Meta of lying to Congress.

Zuckerberg testified three times to Congress that he didn’t give his teams goals to increase time users spent on Meta’s platforms. But several internal messages referred to goals for teens’ time spent, and Zuckerberg himself said of growth metrics, “The most concerning of these to me is time spent,” the filing said.

When Meta in a late 2019 internal study called “Project Mercury” found that people who stopped using Facebook for a week reported feeling less depressed, anxious, lonely and judged socially, the company killed the project, the filing alleged. But in a U.S. Senate hearing, a company representative (who was not identified in the filing) was asked if Facebook could tell if increased use of its platform by teen girls was connected to increased signs of anxiety, and responded, “No,” the filing claimed.

Meta said publicly that a maximum of half a percent of users were exposed to suicide and self-harm material, but its own study found the number was around 7%, the filing said.

Brian Boland, who rose over 11 years to a vice president position at Meta and left in 2020, testified in a deposition, “My feeling then and my feeling now is that they don’t meaningfully care about user safety.”

When a Meta employee criticized the company’s decision to hide “likes” on posts only for users who opted in, a move intended to protect children’s mental health, a member of Meta’s “growth team” responded: “It’s a social comparison app,” the member said, “[expletive] get used to it,” the filing said.
