Cloudflare Makes It Easier For All Its Users To Help Stop Child Porn Distribution

from the this-is-good dept

We recently wrote about how Senators Lindsey Graham and Richard Blumenthal are preparing for FOSTA 2.0, this time focused on child porn — which is now being rebranded as “Child Sexual Abuse Material,” or “CSAM.” As part of that story, we highlighted that these two Senators and some of their colleagues had begun grandstanding against tech companies in response to a misleading NY Times article that seemed to blame internet companies for the rising number of CSAM reports to NCMEC, when that number should be seen as more evidence of how much the companies are doing to try to stop CSAM.

Of course, working with NCMEC and other such organizations takes a lot of effort. Being able to scan for shared hashes of CSAM isn’t something that every internet site can do; it’s mostly just done by the larger companies. But last week Cloudflare (one of the companies that Senators are demanding “answers” from) did something quite fascinating: it enabled all Cloudflare users, no matter what level of service, to start using Cloudflare’s CSAM scanning tools for free, even allowing them to set their own rules and preferences (something that might become very, very important if the Graham/Blumenthal bill becomes law).

I highly recommend reading the entire article, because it’s a clear, interesting, and easy-to-read explanation of how fuzzy hashing works (including pictures of dogs and bicycles). As the Cloudflare post notes, those who use such fuzzy hashing tools have intentionally kept at least some of the details secret, because being too public about them would allow those who produce and distribute CSAM to make changes that “dodge” the various tools and filters, which would obviously be a problem. However, that secrecy also creates two potential issues: (1) a lack of transparency in how these filtering systems really operate, and (2) an inability for all but the largest players to make use of these tools, which would be disastrous for smaller companies if they were required to use such things.
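The core idea behind fuzzy hashing can be sketched in a few lines of code. What follows is a toy “average hash,” purely for illustration; the hashes actually used for CSAM scanning (such as PhotoDNA) are far more sophisticated and, as noted above, deliberately kept partly secret.

```python
def average_hash(gray_8x8):
    """Toy fuzzy hash: given an image downscaled to an 8x8 grid of
    grayscale values (0-255), emit a 64-bit fingerprint where each bit
    records whether that region is brighter than the image's mean.
    Small edits to the image flip only a few bits."""
    flat = [p for row in gray_8x8 for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def bit_distance(h1, h2):
    """Count differing bits. Unlike a traditional hash, nearly
    identical images yield nearly identical fingerprints."""
    return bin(h1 ^ h2).count("1")
```

A traditional cryptographic hash changes completely if a single pixel changes; here a one-pixel change flips at most a bit or two, which is what lets a scanner catch re-encoded or slightly cropped copies of a known image.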

And that’s where Cloudflare’s move is quite interesting. By providing the tool for free to all of its users, Cloudflare keeps the inner workings of the tool secret while still letting each site set its own matching thresholds.

If the threshold is too strict — meaning that it’s closer to a traditional hash and two images need to be virtually identical to trigger a match — then you’re more likely to have many false negatives (i.e., CSAM that isn’t flagged). If the threshold is too loose, then it’s possible to have many false positives. False positives may seem like the lesser evil, but there are legitimate concerns that increasing the possibility of false positives at scale could waste limited resources and further overwhelm the existing ecosystem. We will work to iterate the CSAM Scanning Tool to provide more granular control to the website owner while supporting the ongoing effectiveness of the ecosystem. Today, we believe we can offer a good first set of options for our customers that will allow us to more quickly flag CSAM without overwhelming the resources of the ecosystem.

Different Thresholds for Different Customers

The same desire for a granular approach was reflected in our conversations with our customers. When we asked what was appropriate for them, the answer varied radically based on the type of business, how sophisticated its existing abuse process was, and its likely exposure level and tolerance for the risk of CSAM being posted on their site.

For instance, a mature social network using Cloudflare with a sophisticated abuse team may want the threshold set quite loose, but not want the material to be automatically blocked because they have the resources to manually review whatever is flagged.

A new startup dedicated to providing a forum to new parents may want the threshold set quite loose and want any hits automatically blocked because they haven’t yet built a sophisticated abuse team and the risk to their brand is so high if CSAM material is posted — even if that will result in some false positives.

A commercial financial institution may want to set the threshold quite strict because they’re less likely to have user generated content and would have a low tolerance for false positives, but then automatically block anything that’s detected because if somehow their systems are compromised to host known CSAM they want to stop it immediately.

This is an incredibly thoughtful and nuanced approach, recognizing that when it comes to any sort of moderation, one size can never fit all. And, by allowing sites to set their own thresholds, it actually does add in a level of useful transparency, without exposing the inner workings that would allow bad actors to game the system.
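As a purely hypothetical sketch (Cloudflare hasn’t published its actual configuration API or distance metric), the per-site choices quoted above boil down to two knobs: how close a fuzzy-hash match has to be, and what happens on a hit.

```python
from dataclasses import dataclass

@dataclass
class ScanPolicy:
    max_distance: int  # strict = low (near-identical only), loose = high
    auto_block: bool   # block immediately, or queue for a human abuse team

# Hypothetical settings for the three customer profiles Cloudflare describes:
POLICIES = {
    "mature_social_network": ScanPolicy(max_distance=12, auto_block=False),
    "parenting_startup":     ScanPolicy(max_distance=12, auto_block=True),
    "financial_institution": ScanPolicy(max_distance=2,  auto_block=True),
}

def handle_upload(site, distance_to_known_csam):
    """Apply the site's own policy to a fuzzy-hash scan result."""
    policy = POLICIES[site]
    if distance_to_known_csam > policy.max_distance:
        return "allow"
    return "block" if policy.auto_block else "queue_for_review"
```

The same borderline match (say, a bit-distance of 5) is allowed by the strict financial-institution policy, auto-blocked by the startup, and sent to human review by the social network; that is exactly the one-size-doesn’t-fit-all point.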

That said, I can almost guarantee that before too long someone (or perhaps multiple someones) will incorrectly or misleadingly spin Cloudflare’s efforts to help all of its users combat CSAM into a claim that Cloudflare is somehow helping sites hide or enable CSAM. No good deed goes unpunished.

However, if you want to support actual solutions to deal with CSAM — not grandstanding nonsense — approaches like Cloudflare’s are ones worth paying attention to. This is especially true if Graham/Blumenthal and others get their way. Under proposals like the one they’re suggesting, it will become virtually impossible for smaller companies to take the actions necessary to meet the standards to avoid legal liability. And that means that (once again) the big internet companies will end up getting bigger. They all have access to NCMEC and the necessary tools to scan for and submit CSAM. Smaller companies don’t. Cloudflare offering up its scanning tool to everyone helps level the playing field in a really important way.

Companies: cloudflare


Comments on “Cloudflare Makes It Easier For All Its Users To Help Stop Child Porn Distribution”

17 Comments




Rekrul says:

If the threshold is too strict — meaning that it’s closer to a traditional hash and two images need to be virtually identical to trigger a match — then you’re more likely to have many false negatives (i.e., CSAM that isn’t flagged). If the threshold is too loose, then it’s possible to have many false positives.

One only has to search for an image on Google and look at the "visually similar" pictures it suggests to see how good computers are at choosing similar photos.

Anonymous Coward says:

Cloudflare is to the internet as a dam is to a flowing river.

Ever since Cloudflare hit the internet, it’s been slowly eroding it by pretending it’s not responsible for the actions of its users.

Before anyone chimes in and defends this bullshit, just remember one thing: Snowden released documents upon documents regarding how the NSA spied on Americans and the world.

Is there a particular reason you never once asked the question of where that software came from?

Mel Feasans says:

Hope springs eternal.

"It will become virtually impossible for smaller companies to take the actions necessary to meet the standards to avoid legal liability. And that means that (once again) the big internet companies will end up getting bigger."

I keep wondering, whenever I see articles that discuss legislative efforts concerning the internet, if anyone will ever come to the simple realization that making the internet less safe for citizens and making internet mega-corps bigger is actually the true purpose behind 99% of such legislative proposals.
Oh well. Perhaps some day.


Anonymous Coward says:

Cloudflare will automatically send a notice to you when it flags CSAM material, block that content from being accessed (with a 451 “blocked for legal reasons” status code), and take steps to support proper reporting of that content in compliance with legal obligations.

Sadly, that means that the webmaster who enables this is only inviting unwanted attention from the feds the moment some random user uploads something inappropriate. Not only does every trigger of the CSAM flag (false-positive or otherwise) bring the site one step closer to being shut down by Cloudflare, it also brings the site one step closer to becoming a target of a federal fishing expedition… like the one that brought down Backpage.

Backpage used to like to rat users out to NCMEC. Eventually, they realised the price of filing too many of those reports was that investigations were directed not against the offending users but against Backpage itself. At some point, "shoot, shovel and shut up" looks really tempting when dealing with net.abuse as not every site has the resources to defend itself from abusive governments.
