
NCMEC Report Again Debunks CSAM Myths

WASHINGTON — The National Center for Missing & Exploited Children (NCMEC) this month issued its 2021 CyberTipline Report, showing that child sexual abuse material (CSAM) is a problem across all kinds of online platforms.

The report clearly shows that mainstream sites like Facebook far outrank adult sites in the number of attempted illegal posts that must be caught by moderation.

NCMEC’s report applauded sites “that made identifying and reporting child abuse content a priority,” The Guardian reported last week.

NCMEC (usually pronounced “Nick Mick”) reported a 35% rise over 2020 in items found and removed, and noted that this may reflect improved moderation efforts on the part of platforms.

In February 2021, Pornhub announced a series of safety and security policies to enhance measures for verification, moderation and detection of content with the goal of positioning the company “at the forefront of combating and eradicating illegal content.”

The new measures were adopted during the independent review of the company’s content compliance program conducted by Kaplan Hecker & Fink LLP. The review began in April 2020 and focused on various MindGeek platforms’ “moderation practices and other procedures for preventing users from uploading non-consensual content, child sexual abuse material (CSAM) and any other content that lacks the consent of all parties.”

According to NCMEC, the record 29.3 million items of CSAM found and removed across the internet in 2021 “can be indicative of a variety of things including larger numbers of users on a platform or how robust an ESP’s [electronic service provider’s] efforts are to identify and remove abusive content.”

“NCMEC applauds ESPs that make identifying and reporting this content a priority and encourages all companies to increase their reporting to NCMEC,” the non-profit organization added. “These reports are critical to helping remove children from harmful situations and to stopping further victimization.”

The report noted that the overwhelming majority of reports made to NCMEC came from Facebook and other Meta companies, including 22 million pieces of CSAM reported from Facebook alone, 3.3 million reports from Instagram and 1.3 million from WhatsApp.

Google made 875,783 reports and Snap 512,522.

OnlyFans’ parent company Fenix International made 2,984 reports in 2021.

MG Freesites, reporting on behalf of Pornhub, made 9,029 CSAM reports in 2021.