The Tech Policy Greenhouse is an online symposium where experts tackle the most difficult policy challenges facing innovation and technology today. These are problems that don't have easy solutions, where every decision involves tradeoffs and unintended consequences, so we've gathered a wide variety of voices to help dissect existing policy proposals and better inform new ones.

Social Media Regulation In African Countries Will Require More Than International Human Rights Law

from the isps-and-human-rights dept

There has been a lot of focus on moderation as carried out by platforms: the rules social media companies use to decide what content remains online. There has, however, been limited attention to how actors other than the platforms themselves, in this case governments, seek to regulate them.

African governments carry out this regulation primarily through laws, which can be broadly divided into two categories: direct and indirect regulatory laws. Direct regulatory laws can be seen in countries like Ethiopia and, more recently, Nigeria. They are similar to Germany’s Network Enforcement Act and France’s Online Hate Speech Law, which place responsibilities directly on platforms and require them to remove online hate speech within a specific time frame, with failure to do so attracting heavy sanctions.

Section 8 of Ethiopia’s Hate Speech and Disinformation Prevention and Suppression Proclamation 1185/2020 sets out various responsibilities for social media platforms and other actors. These include the suppression and prevention of disinformation and hate speech content by social media platforms, and a twenty-four-hour window within which such content must be removed from their platforms. It also provides that platforms should bring their policies in line with the first two responsibilities.

The Proclamation further vests responsibility for reporting on, and raising public awareness of, social media platforms’ compliance in the Ethiopian Broadcasting Authority, a body empowered by law to regulate broadcasting services. The Ethiopian Human Rights Commission (EHRC), Ethiopia’s National Human Rights Institution (NHRI), also has public awareness responsibilities. But it is the Council of Ministers, which is responsible for implementing laws in Ethiopia, that may give further guidance on the responsibilities of social media platforms and other private actors.

In Nigeria, the legislative proposal, the Protection from Internet Falsehoods, Manipulation and Other Related Matters bill, has yet to become law. The bill seeks to regulate disinformation and coordinated inauthentic behaviour online. It is similar to Singapore’s law, which the current United Nations Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression has criticised for the threats it poses to online expression and online rights in general.

Major criticisms of these laws include their opacity and the threats they pose to online expression. For example, the Ethiopian law defines hate speech broadly and does not include the contextual factors that must be considered in categorising online speech as hateful. With respect to the Nigerian bill, there are no clear oversight, accountability or transparency mechanisms in place to check the government’s unlimited power to decide what constitutes disinformation.

The indirect regulatory laws are those used by governments, through their telecommunications regulatory agencies, to compel Internet Service Providers (ISPs) to block social media platforms. This type of regulation requires ISPs to block platforms on the basis of public emergencies or national interests. What constitutes these emergencies or interests is vague, and in many instances the real targets are voices or platforms critical of government policies.

In January 2021, the Ugandan government ordered ISPs to block Facebook, Twitter, WhatsApp, Signal and Viber. The order, issued through the communications regulator, came a day after Facebook announced that it would close pro-government accounts sharing disinformation.

In June 2021, the Nigerian government ordered ISPs to block access to Twitter, stating that the platform’s activities constituted threats to Nigeria’s corporate existence. However, contrary views hold that the order was the result of both remote and immediate causes. The remote cause was the role Twitter played in connecting and rallying the public during the #EndSARS protests against police brutality, while the immediate cause was Twitter’s deletion of a tweet by President Muhammadu Buhari that referred to the country’s civil war, contained veiled threats of violence, and violated Twitter’s policies on abusive behaviour.

In May 2021, Ethiopia lifted a block on social media platforms in six locations across the country. Shutdowns like these have become routine for African governments, often occurring during elections or major political developments.

On closer inspection, the cross-cutting challenge posed by both forms of regulation is the lack of accountability and transparency, especially on the part of governments, in how these provisions are enforced. Social media platforms are also complicit, as there is little or no information on the nature of the pressure they face from these government actors.

Alongside the mainstream debates on how to govern social media platforms, it is time to also consider wider forms of regulation, especially how they manifest outside Western systems and the threats such regulation poses to online expression.

One solution that has been suggested, but also severely criticised, is the application of international human rights standards to social media regulation. This standard has been argued to be the most preferable because of its customary application across contexts. However, its biggest strength also seems to be its biggest weakness: how does this standard apply in local contexts, given the complexity of governing online speech and the myriad actors involved?

In order to work towards effective solutions, we will need to re-imagine and re-purpose the traditional governance roles not only of governments and social media platforms, but also of ISPs, civil society, and NHRIs. For example, the unchecked powers of most governments to determine what constitutes online harm must be revisited to ensure that there are judicial reviews and human rights impact assessments (HRIAs) of proposed government social media bans.

ISPs must also be encouraged to join the fray, choose human rights, and not roll over each time governments make such problematic demands to block social media platforms. For example, they should begin to join other actors like civil society and academia in lobbying for laws and policies that make judicial reviews and HRIAs requirements before any government request to block platforms, or even content, is entertained.

The application of international human rights standards to social media regulation is not where the work stops; it is where it begins. For a start, the proximate actors involved in social media regulation (governments, social media platforms, other private actors, local and international civil society bodies, treaty-making bodies like the United Nations and the African Union, and NHRIs) must come up with a typology of harms, as well as of the actors actively involved in such regulation. To ensure that this addresses the challenges posed by these kinds of regulation, the responsibilities of such actors must be anchored in international human rights standards, but in such a way that these actors actively communicate and collaborate.

Tomiwa Ilori is currently a Doctoral Researcher at the Centre for Human Rights, Faculty of Law, University of Pretoria. He also works as a Researcher for the Expression, Information and Digital Rights Unit of the Centre.

Techdirt and EFF are collaborating on this Techdirt Greenhouse discussion. On October 6th from 9am to noon PT, we’ll have many of this series’ authors discussing and debating their pieces in front of a live virtual audience (register to attend here).



Comments on “Social Media Regulation In African Countries Will Require More Than International Human Rights Law”

ECA (profile) says:

Fun, isn't it?

https://www.un.org/en/about-us/universal-declaration-of-human-rights

These are the UN's stated rights for everyone. For those who signed it, anyway.

https://civilrights.org/edfund/resource/where-the-united-states-stands-on-10-international-human-rights-treaties/

These are the ones the USA has/had problems with.
It's interesting that some took over 40 years to ratify.

  1. Convention on the Elimination of All Forms of Discrimination Against Women (CEDAW) –United States signed in 1980, but has not yet ratified
sumgai (profile) says:

It is truly too bad that some people just refuse to "get it".

I’m sorry to be the one to break the bad news to you, Mr. Ilori, but until you come to a complete understanding that neither you, nor the UN, nor any other body of persons has any power whatsoever over these bad actors (in your case, African governments), you are just peeing in the wind.

To wit, your only weapon available for use to bring these bad actors into line with your thinking is actual war, with weaponry that does not discriminate good from bad, it just kills, period. You can’t shame these bastards, you can’t ostracize them, you can’t defund them, nor disturb their finances significantly, and you most certainly can’t just shake your finger at them and say "naughty boy" – none of those options work on people in power. They understand only one thing, and that is the credible threat of losing their personal life (lives). Only at that point can you "persuade them to see the light".

And for the record, we citizens of the US are having the same problem – assholes in both State and Federal governments who think that just because they are in a position of power, they have free rein to do whatever is best for them, and screw what everyone else thinks about them and their actions. What I said above about the African dictators applies in our own home as well.

Well, all except the bit about fearing for their lives. We citizens have some work to do on that part, about forcing them to straighten up and fly right, or get the hell out of Dodge. Civilized behavior, and all that, I’m sure you understand.
