February 26, 2024

Instagram used to connect network of pedophiles to child pornography, report claims

“That a team of three academics with limited access could find such a huge network should set off alarms at Meta” 

Instagram is being used by a network of child pornography distributors to commission and sell child sexual abuse material openly, according to a report.

The distributors use Instagram’s algorithms and hashtag system to promote content to users, the Stanford Internet Observatory said in a report Wednesday. The posts themselves do not display child pornography directly; instead, they present “content menus” offering links to outside venues where the material can be purchased, including Telegram groups and Discord servers. The distribution of child pornography violates most websites’ terms of service as well as federal law.

“Due to the widespread use of hashtags, relatively long life of seller accounts and, especially, the effective recommendation algorithm, Instagram serves as the key discovery mechanism” for this network, the Observatory said. Stanford estimates that the network had between 500 and 1,000 accounts.

Meta, the parent company of Instagram, acted quickly after the Wall Street Journal asked it for comment. The social media company acknowledged flaws in its anti-child-pornography enforcement operation and said it was creating an internal task force to look into these networks.

“Child exploitation is a horrific crime,” the company said. “We’re continuously investigating ways to actively defend against this behavior.”

Instagram has taken down 27 child pornography networks and intends to remove more, Meta said. It has also blocked thousands of hashtags used to sexualize children and restricted the platform from recommending search terms associated with sexual abuse.

Twitter is also commonly used to promote child pornography, Stanford said, but the platform removes those accounts more aggressively than Instagram does. Twitter’s response has reportedly improved since a February New York Times report, which tested the site’s content moderation and found it struggling to keep up with child pornography accounts.

Instagram Connects Vast Pedophile Network – WSJ

The Meta unit’s systems for fostering communities have guided users to child-sex content; company says it is improving internal controls

Instagram, the popular social-media site owned by Meta Platforms, helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content, according to investigations by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst.

Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people with an interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the Journal and the academic researchers found.

Though out of sight for most on the platform, the sexualized accounts on Instagram are brazen about their interest. The researchers found that Instagram enabled people to search explicit hashtags such as #pedowhore and #preteensex and connected them to accounts that used the terms to advertise child-sex material for sale. Such accounts often claim to be run by the children themselves and use overtly sexual handles incorporating words such as “little slut for you.”

Instagram accounts offering to sell illicit sex material generally don’t publish it openly, instead posting “menus” of content. Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and “imagery of the minor performing sexual acts with animals,” researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person “meet ups.” 

The promotion of underage-sex content violates rules established by Meta as well as federal law.

In response to questions from the Journal, Meta acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised. “Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”

Meta said it has in the past two years taken down 27 pedophile networks and is planning more removals. Since receiving the Journal queries, the platform said it has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse. It said it is also working on preventing its systems from recommending that potentially pedophilic adults connect with one another or interact with one another’s content. 

Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, said that getting even obvious abuse under control would likely take a sustained effort. 

“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, noting that the company has far more effective tools to map its pedophile network than outsiders do. “I hope the company reinvests in human investigators,” he added.

Technical and legal hurdles make the full scale of the network difficult for anyone outside Meta to measure precisely.

Because the laws around child-sex content are extremely broad, investigating even the open promotion of it on a public platform is legally sensitive. 

In its reporting, the Journal consulted with academic experts on online child safety. Stanford’s Internet Observatory, a division of the university’s Cyber Policy Center focused on social-media abuse, produced an independent quantitative analysis of the Instagram features that help users connect and find content.

The Journal also approached UMass’s Rescue Lab, which evaluated how pedophiles on Instagram fit into the larger ecosystem of online child exploitation. Using different methods, both entities were able to quickly identify large-scale communities promoting criminal sex abuse. 

Test accounts set up by researchers that viewed a single account in the network were immediately hit with “suggested for you” recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children.

The Stanford Internet Observatory used hashtags associated with underage sex to find 405 sellers of what researchers labeled “self-generated” child-sex material—or accounts purportedly run by children themselves, some saying they were as young as 12. According to data gathered via Maltego, a network mapping software, 112 of those seller accounts collectively had 22,000 unique followers.

Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions.

A Meta spokesman said the company actively seeks to remove such users, taking down 490,000 accounts for violating its child safety policies in January alone.

“Instagram is an on ramp to places on the internet where there’s more explicit child sexual abuse,” said Brian Levine, director of the UMass Rescue Lab, which researches online child victimization and builds forensic tools to combat it. Levine is an author of a 2022 report for the National Institute of Justice, the Justice Department’s research arm, on internet child exploitation.

Instagram, estimated to have more than 1.3 billion users, is especially popular with teens. The Stanford researchers found some similar sexually exploitative activity on other, smaller social platforms, but said they found that the problem on Instagram is particularly severe. “The most important platform for these networks of buyers and sellers seems to be Instagram,” they wrote in a report slated for release on June 7.
