
Meta Purges 600,000+ Predator Accounts, Supercharges Teen Protection

Meta has taken significant steps to enhance the safety of young users on its platforms by purging over 600,000 accounts linked to sexual exploitation. The company also introduced updated safety tools that let teenagers block and report suspicious accounts more effectively; in June, teens blocked 1 million accounts and reported another 1 million after receiving safety prompts. The move comes amid increasing scrutiny from policymakers and the reintroduction of the Kids Online Safety Act, which aims to regulate social media to protect children.
Meta has intensified efforts to safeguard young users after criticism from policymakers that it was not doing enough to prevent sexual exploitation. The measures include updated safety tools and the removal of offending accounts.
The Details: Meta has unveiled updated safety tools aimed at protecting teenagers on its platforms, especially against exploitative content in direct messages, according to CNBC.
The improvements include additional protections in direct messages, such as displaying extra information about the person a teen is chatting with—like when an Instagram account was created—along with safety tips to help recognize scammers.
Teens can now also block and report suspicious accounts simultaneously.
According to Meta, teens made substantial use of these features: in June alone, they blocked 1 million accounts and reported another 1 million after being prompted by Safety Notices.
Earlier in the year, the company removed nearly 135,000 Instagram accounts accused of sexualizing children, citing activities such as leaving sexualized comments on, or soliciting sexual images from, adult-managed profiles featuring children.
Additionally, about 500,000 Instagram and Facebook profiles linked to these offenders were taken down.
Why It Matters: Meta continues to face scrutiny over its platforms' impacts on young people, especially after allegations from state attorneys general who claim the company's app features are addictive and harm children’s mental health.
In addition, Meta recently removed about 10 million Facebook profiles that were impersonating major content creators, part of its broader crackdown on spam and misleading accounts in the first half of 2025.
Amid these challenges, Congress has renewed its focus on social media regulation, particularly child safety, by reintroducing the Kids Online Safety Act, which would legally require social media platforms to actively prevent harm to children on their services.
In a related case highlighting the industry-wide issue, Snapchat was sued by the state of New Mexico for allegedly enabling predators to target minors through sextortion schemes.