DSA Framers Insisted It Was Carefully Calibrated Against Censorship; Then Thierry Breton Basically Decided It Was An Amazing Tool For Censorship

from the our-new-truth-czar-is-overreaching dept

A few weeks ago, I highlighted how the EU’s chief Digital Services Act enforcer, Thierry Breton, was making a mess of things by sending broadly threatening letters (which have since been followed up with the opening of official investigations) to all the big social media platforms. His initial letter highlighted the DSA’s requirements regarding takedowns of illegal content, but very quickly blurred the line between illegal content and disinformation.

Following the terrorist attacks carried out by Hamas against Israel, we have indications that your platform is being used to disseminate illegal content and disinformation in the EU.

I noted that the framers of the DSA have insisted up, down, left, right, and center that the DSA was carefully designed such that it couldn’t possibly be used for censorship. I’ve highlighted throughout the DSA process how this didn’t seem accurate at all, and a year ago when I was able to interview an EU official, he kept doing a kind of “of course it’s not for censorship, but if there’s bad stuff online, then we’ll have to do something, but it’s not censorship” dance.

Some people (especially on social media and especially in the EU) got mad about my post regarding Breton’s letters, either saying that he was just talking about illegal content (he clearly is not!) or defending the censorship of disinformation as necessary (one person even told me that censorship means something different in the EU).

However, it appears I’m not the only one alarmed by how Breton has taken the DSA and presented it as a tool for him to crack down on legal information that he personally finds problematic. Fast Company had an article highlighting experts saying they were similarly unnerved by Breton’s approach to this whole thing.

“The DSA has a bunch of careful, procedurally specific ways that the Commission or other authorities can tell platforms what to do. That includes ‘mitigating harms,’” Keller says. The problem with Breton’s letters, she argues, is that they “blow right past all that careful drafting, seeming to assume exactly the kind of unconstrained state authority that many critics in the Global South warned about while the DSA was being drafted.”

Meanwhile, others are (rightfully!) noting that these threat letters are likely to lead to the suppression of important information as well:

Ashkhen Kazaryan, senior fellow of free speech and peace at the nonprofit Stand Together, objects to the implication in these letters that the mere existence of harmful, but legal, content suggests companies aren’t living up to their obligations under the DSA. After all, there are other interventions, including warning labels and reducing the reach of content, that platforms may be using rather than removing content altogether. Particularly in times of war, Kazaryan, who is a former content policy manager for Meta, says these alternative interventions can be crucial in preserving evidence to be used later on by researchers and international tribunals. “The preservation of [material] is important, especially for things like actually verifying it,” Kazaryan says, pointing to instances where evidence of Syrian human rights offenses have been deleted en masse.

The human rights civil society group Access Now similarly raised concerns about how Breton’s move-fast-and-break-speech approach might come across:

Firstly, the letters establish a false equivalence between the DSA’s treatment of illegal content and “disinformation.” “Disinformation” is a broad concept and encompasses varied content which can carry significant risk to human rights and public discourse. It does not automatically qualify as illegal and is not per se prohibited by either European or international human rights law. While the DSA contains targeted measures addressing illegal content online, it more appropriately applies a different regulatory approach with respect to other systemic risks, primarily consisting of VLOPs’ due diligence obligations and legally mandated transparency. However, the letters strongly focus on the swift removal of content rather than highlighting the importance of due diligence obligations for VLOPs that regulate their systems and processes. We call on the European Commission to strictly respect the DSA’s provisions and international human rights law, and avoid any future conflation of these two categories of expression.

Secondly, the DSA does not contain deadlines for content removals or time periods under which service providers need to respond to notifications of illegal content online. It states that providers have to respond in a timely, diligent, non-arbitrary, and objective manner. There is also no legal basis in the DSA that would justify the request to respond to you or your team within 24 hours. Furthermore, by issuing such public letters in the name of DSA enforcement, you risk undermining the authority and independence of DG Connect’s DSA Enforcement Team.

Thirdly, the DSA does not impose an obligation on service providers to “consistently and diligently enforce [their] own policies.” Instead, it requires all service providers to act in a diligent, objective, and proportionate manner when applying and enforcing the restrictions based on their terms and conditions and for VLOPs to adequately address significant negative effects on fundamental rights stemming from the enforcement of their terms and conditions. Terms and conditions often go beyond restrictions permitted under international human rights standards. State pressure to remove content swiftly based on platforms’ terms and conditions leads to more preventive over-blocking of entirely legal content.

Fourthly, while the DSA obliges service providers to promptly inform law enforcement or judicial authorities if they have knowledge or suspicion of a criminal offence involving a threat to people’s life or safety, the law does not mention a fixed time period for doing so, let alone one of 24 hours. The letters also call on Meta and X to be in contact with relevant law enforcement authorities and EUROPOL, without specifying serious crimes occurring in the EU that would provide sufficient legal and procedural ground for such a request. 

Freedom of expression and the free flow of information must be vigorously defended during armed conflicts. Disproportionate restrictions of fundamental rights may distort information that is vital for the needs of civilians caught up in the hostilities and for recording documentation of ongoing human rights abuses and atrocities that could form the basis for evidence in future judicial proceedings. Experience shows that shortsighted solutions that hint at the criminal nature of “false information” or “fake news” — without further qualification — will disproportionately affect historically oppressed groups and human rights defenders fighting against aggressors perpetrating gross human rights abuses. 

No one is suggesting that the spread of mis- and disinformation regarding the crisis is a good thing, but the ways to deal with it are tricky, nuanced, and complex. And having a bumbling, egotistical blowhard like Breton acting like the dictator for social media speech is going to cause a hell of a lot more problems than it solves.

Companies: meta, tiktok, twitter, x, youtube


Comments on “DSA Framers Insisted It Was Carefully Calibrated Against Censorship; Then Thierry Breton Basically Decided It Was An Amazing Tool For Censorship”

17 Comments

This comment has been flagged by the community.

That One Guy (profile) says:

Unless of course that's a feature and not a bug...

Well, for all the supporters of the bill who insisted it was absolutely not going to be used for censorship, now’s their chance to put their money where their mouths are.

If that’s not the purpose of the bill, then its current spokesman running around as a one-man Ministry of Truth is in pretty clear violation of the spirit and letter of the law and, as such, should be roundly and publicly condemned for his actions, with calls for him to immediately walk back his overreaching demands and/or resign so the job can be held by someone less censorship-happy.

Should that not be done then the only reasonable explanation that comes to mind is that he’s acting entirely within the scope of the law and those that claimed it wasn’t a censorship bill and couldn’t be used as such will have been outed as lying through their teeth the entire time they’ve been defending it.

Anonymous Coward says:

Re:

Every time someone tells me to put my money where my mouth is, I go “eww, germs. And money laundering is illegal!”

You know very well that the supporters of the bill are going to sit on their asses and snigger; their role was done when the bill was passed, and they’ll claim “I didn’t do that.”

You should start in on the “outing”, as that’s always the difficult part: holding people accountable.

The avalanche has started. It is too late for the pebbles to vote. — Kosh

John says:

Re: Nope.

Should that not be done then the only reasonable explanation that comes to mind is that he’s acting entirely within the scope of the law and those that claimed it wasn’t a censorship bill and couldn’t be used as such will have been outed as lying through their teeth the entire time they’ve been defending it.

They won’t be outed as lying since any claims to that effect will also be censored. That’s the beauty of the law.

This comment has been flagged by the community.

Anonymous Coward says:

The only expert in that group of people they got quotes from is the one from the CER. Keller is part of Stanford’s think-tank whose board has people from Facebook and politicians like Lofgren who get donations from tech companies, so of course they’re up in arms about the companies that pay them facing scrutiny. Stand Together is a right-wing think tank with Koch money funneled into it and believes “cancel culture” is a real thing when it’s not.


Anonymous Coward says:

Let’s take down everything on the Internet that anyone objects to, akin to what’s happening in libraries. Eventually, the Internet will be totally free of content, and there will be no libraries, enabling us all to be equally ignorant (and, probably, unemployed). If newspapers happen to reappear, we can just ban them, too. Eventually, humans will become extinct, thereby solving all the world’s problems…

Anonymous Coward says:

Re:

Eventually, humans will become extinct, thereby solving all the world’s problems…

You forget that people are impatient. They won’t wait for “eventually”.

And even if every human on the planet was turned into an easily-crumbleable dodecahedron, there’d still be the existing pollutants, radiation, and environmental change (even discounting the environmental damage caused by leaving human toys around unattended).

solongsowrong (profile) says:

The business of self-proclaimed “news debunkers”

It is sad to see that several people who found a way to make money and gain visibility by proclaiming themselves “news debunkers” helped a great deal to create this climate of witch hunting, and made it easier for governments to impose censorship by hiding their attempts under the veil of hypocrisy. These people, these so-called “news debunkers,” are not, however, as evil as one might think; no, they are, according to the definition in a famous book by Italian professor Carlo Cipolla, stupid.

The stupid, in fact, harm not only other people, but themselves too.
