Instagram's Algorithms Connect 'Vast Paedophile Network' Seeking Child Pornography: Report

A Meta spokesperson acknowledged that the company had received the reports and failed to act on them.

Instagram’s recommendation algorithms linked and promoted a “vast network of paedophiles” seeking illegal underage sexual content and activity, according to a Wall Street Journal (WSJ) report. The algorithms also advertised the sale of illicit “child-sex material” on the platform, the outlet said. The report is based on an investigation into child pornography on the Meta-owned platform conducted by the WSJ and researchers at Stanford University and the University of Massachusetts Amherst. Some accounts even allowed buyers to “commission specific acts” or arrange “meet ups”.

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” the WSJ report said. “Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

The investigation found that Instagram allowed users to search for child-sex-abuse hashtags such as #pedowhore, #preteensex and #pedobait.

These hashtags led users to accounts that offered to sell paedophilic material and even featured videos of children harming themselves, the researchers said.

Anti-paedophile activists brought these issues to the company’s attention, flagging one account that claimed to belong to a girl selling underage-sex content and another featuring a scantily clad young girl with a graphically sexual caption.

The activists received automated responses saying, “Because of the high volume of reports we receive, our team hasn’t been able to review this post.” In another case, the response suggested that the user hide the account to avoid seeing its content.

A Meta spokesperson acknowledged that the company had received the reports and failed to act on them, attributing the inaction to a software error.

The company told the WSJ that it has fixed the bug in its reporting system and is providing new training to its content moderators.

“Child exploitation is a horrific crime. We’re continuously investigating ways to actively defend against this behaviour,” the spokesperson said.

Meta said it has taken down 27 paedophile networks in the past two years and is planning more removals. It also said it has blocked thousands of hashtags that sexualise children, some of which had millions of posts.

The report comes at a time when Meta and other social media platforms face scrutiny over their efforts to prevent the spread of abusive content on their platforms.


