In April 2025, the Human Rights Court in Kenya issued an unprecedented ruling that it has jurisdiction to hear a case about harmful content on one of Meta’s platforms. The lawsuit was filed in 2022 by Abraham Meareg, the son of an Ethiopian academic who was murdered after being doxxed and threatened on Facebook; Fisseha Tekle, an Ethiopian human rights activist who was also doxxed and threatened on Facebook; and Katiba Institute, a Kenyan non-profit that defends constitutionalism. They maintain that Facebook’s algorithm design and the content moderation decisions made in Kenya harmed two of the claimants, fuelled the conflict in Ethiopia and led to widespread human rights violations within and outside Kenya.

The content in question falls outside the protected categories of speech under Article 33 of the Constitution of Kenya and includes propaganda for war, incitement to violence, hate speech and advocacy of hatred that constitutes ethnic incitement, vilification of others, incitement to cause harm and discrimination.

Key to the Kenyan case is the question of whether Meta, a US-based corporation, can financially benefit from unconstitutional content, and whether the corporation has a positive duty to take down unconstitutional content that also violates its own Community Standards.

In affirming the Kenyan court’s jurisdiction in the case, the judge was emphatic that the Constitution of Kenya allows a Kenyan court to adjudicate over Meta’s acts or omissions regarding content posted on the Facebook platform that may impact the observance of human rights within and outside Kenya.

The Kenyan decision signals a paradigm shift towards platform liability, in which judges determine liability by asking a single question: do platform decisions observe and uphold human rights?

The ultimate goal of the Bill of Rights, a common feature in African constitutions, is to uphold and protect the inherent dignity of all people. Kenya’s Bill of Rights, for example, exists solely to preserve the dignity of individuals and communities and to promote social justice and the realisation of the potential of all human beings. The supremacy of the Constitution also guarantees that, even if Kenyan law contained safe harbour provisions, they would not be a sufficient liability shield for platforms whose business decisions do not ultimately uphold human rights.

That a case on algorithmic amplification has passed the jurisdiction stage in Kenya attests that human rights law and constitutionalism offer an avenue for those who have suffered harm as a result of social media content to seek redress.

Until now, the idea that a social media platform can be held accountable for content on its platform has been stifled by the blanket immunity offered under Section 230 of the Communications Decency Act in the US and, to a lesser extent, by the principle of non-liability in the European Union, subject to exceptions detailed in various laws.

For example, Section 230 was among the reasons a district judge in California cited when dismissing a case filed by Myanmar refugees, who made a similar claim that Meta had failed to curb hate speech that fuelled the Rohingya genocide.

The aspiration for platform accountability was further dampened by the US Supreme Court decision in Twitter v Taamneh, in which it ruled against plaintiffs who sought to establish that social media platforms carry responsibility for content posted on them.

The immunity offered to platforms has come at a high cost, especially for victims of harm in places where platforms do not have physical offices.

This is why a decision like the one by the Kenyan courts is a welcome development; it restores hope that victims of platform harm have an alternative route to recourse, one that refocuses human rights into the core of the discussion on platform accountability.

The justification for safe harbour provisions like Section 230 has always been to protect “nascent” technologies from being smothered by a multiplicity of lawsuits. By now, however, the dominant social media platforms are neither nascent nor in need of protection. They have both the monetary and technical wherewithal to prioritise people over profits, but they choose not to.

As the Kenyan cases make their way through the judicial process, there is cautious optimism that the constitutional and human rights law that has taken root in African countries can offer a necessary check on platform arrogance.

Mercy Mutemi represents Fisseha Tekle in the case outlined in the article. 

The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera’s editorial stance.
