Kenyan lawsuit against Meta will test Big Tech accountability
A Kenyan court will hear a $1.6 billion lawsuit alleging that Facebook helped incite genocide in Ethiopia. The case could become a model for efforts to hold platforms responsible for online hate and abuse that carry real-world consequences.

ANALYSIS By Tekendra Parmar | Contributing Editor
When Meta CEO Mark Zuckerberg announced in January that the company planned to end fact-checking and scale back content moderation, he said the decisions were motivated by a desire to return to the company’s pro-free speech “roots” and by “shifting political winds” in the United States.
But for most observers, this was an unmistakable attempt to prepare for a new world order—or at least a new American one—under the incoming Trump administration. What would it mean for the rest of the world? Zuckerberg didn’t say. The question has loomed large for people across the global majority ever since, particularly for those working to hold Meta accountable for the undeniable impact of its platforms in situations of violent conflict.
One such attempt at accountability cleared a major hurdle last week, when judges in Kenya ruled that a landmark $1.6 billion lawsuit against Meta, over its role in promoting hate speech and violent content that helped fuel a genocide in Ethiopia, could move forward. For more than two years, Meta argued that Kenyan courts had no jurisdiction over the firm and that the case should be dismissed. The Nairobi High Court disagreed.
From November 2020 to November 2022, the Ethiopian government waged a brutal war in the country’s northern region of Tigray. Throughout the conflict, Amnesty International, Human Rights Watch and other advocacy groups accused the Ethiopian military and its paramilitary allies of ethnic cleansing and war crimes against ethnic Tigrayans.
On Facebook, instigators called on Ethiopians to “get organized” and “take action” against Tigrayans, whom they labeled as “spies,” “snitches” and “traitors.” Among those targeted by this rhetoric was Meareg Amare, a prominent Tigrayan university professor who often appeared on local television shows to encourage children to enter the sciences—like an Ethiopian version of Bill Nye the Science Guy. In October 2021, Facebook users doxxed the professor and called on others to “take action” against him, on the basis of unfounded accusations that he’d participated in a massacre of Amhara civilians and embezzled funds from the university where he worked.
His son, Abraham—the lead plaintiff in the case against Meta—reported the threats through Facebook’s consumer channels, hoping the company would remove posts that violated its own policies against hate speech and imminent threats to individuals’ lives. He never got a response. On Nov. 3, 2021, Meareg Amare was shot to death outside his home.
The lawsuit, brought by the U.K.-based firm Foxglove Legal and its Kenyan partner, Nzili and Sumbi Advocates, alleges that Meta failed to remove harmful content that incited violence during the Ethiopian genocide; that the company’s algorithm amplified this violent material; and that it provided less oversight for its African users, violating equal rights and non-discrimination protections under Kenya’s constitution.
Although the events in question occurred in Ethiopia, the plaintiffs filed suit in Kenya, where the majority of Meta’s content moderation workforce for Africa was located.
In the media, Meta’s response to the lawsuit was always the same: It told journalists that it uses feedback from vetted civil society organizations—known as Trusted Partners—to guide its content moderation policies in Ethiopia. In 2022, I began reporting on this Trusted Partner network in Ethiopia and around the world and found that Meta rarely acted on the concerns of its so-called partners. My 2023 investigation for Business Insider, where I worked as an editor from 2021 to 2024, revealed that a Trusted Partner had flagged the disinformation and doxxing targeting Amare during two meetings with Meta’s Trust and Safety teams in the weeks leading up to his death. Meta knew that prominent Tigrayan intellectuals were being targeted on its platform but failed to heed its partners’ warnings.
When I asked about this inaction at the time, the organization that flagged posts targeting Amare told me that a Meta staffer responded with a familiar phrase: “We are not arbiters of truth.” But truth was not the only issue in Amare’s case. The principal concern was the threat to his safety. Another was that threats of violence and ethnic hate speech were explicitly forbidden under Meta’s own policies.
The issues these Trusted Partner groups faced were not isolated to Ethiopia or the African continent; they were felt worldwide. A study conducted by Internews, one of Meta’s biggest partner organizations, found that the majority of Trusted Partners faced “erratic” response times when flagging hate speech and other harmful content, even when they were trying to alert the company to imminent threats to people’s lives, like those made against Meareg Amare. Some partners waited months for responses from Meta. Others got no response at all. The one exception to the rule came after Russia’s full-scale invasion of Ukraine in February 2022, when Meta engaged closely with Ukrainian civil society and media organizations to respond to posts that could spark further violence.
The Ethiopia lawsuit is one of the few measures of accountability Meta currently faces in the Global South for how its platform has been weaponized by military and paramilitary groups to destabilize nations and target ethnic minorities. Last year, judges in the U.S. Ninth Circuit dismissed a lawsuit against the company brought by victims of the Myanmar genocide, in which the company’s platforms were used to target the Rohingya ethnic minority in 2017. (The plaintiffs have since appealed that decision.) Estimates suggest more than 40,000 people were killed by the Burmese junta and its allies, and more than 700,000 were forced to flee to neighboring Bangladesh. There too, in the years leading up to the military’s “clearance operations,” civil society groups that had Meta’s ear tried to get the platform to act. But the company chose not to listen.
Given this recent history, the Nairobi High Court ruling marks a significant milestone for those hoping the company will answer for the abuses of its platform. If the lawsuit succeeds, it could serve as a model for similar plaintiffs from the global majority seeking some modicum of accountability from one of the most consequential companies of the last two decades. But any resolution will likely take years, and the $1.6 billion in reparations the plaintiffs are seeking amounts to roughly a tenth of one percent of the company’s market capitalization. Even if they prevail, the monetary consequences of helping fuel a genocide will be little more than a rounding error for Meta.
The decision also comes as Meta dismantles company programs it once touted as industry-leading efforts to promote user safety and public accountability. Amid a flurry of political overtures to President Donald Trump, the company shuttered its third-party fact-checking program in the U.S. earlier this week, scaled back its content moderation staff and rewrote policies that were once meant to protect vulnerable users from abuse. It is a scramble to regain good standing with Trump, who just four years ago was banned from Facebook following the Jan. 6, 2021, attack on the U.S. Capitol.
In the U.S., Meta has instead launched a Community Notes-style approach to fact-checking, taking a page from X, which, since Elon Musk’s takeover, has become a haven for hate speech and far-right extremism. Meta says it intends to roll the feature out to other countries but has yet to specify a timeline.
In February, I asked a member of Meta’s African public policy team whether they had game-planned how the Community Notes feature would work in conflict zones, where bad-faith actors spread hate speech, disinformation and other instigating content through networks of troll farms designed to overwhelm Meta’s systems. They had not, the staffer said. The vibe, it seemed, was that this would be business as usual.
With the tenuous peace in Tigray at risk of unraveling once more as conflict between Ethiopia and neighboring Eritrea looms, business as usual for Meta is a bad sign for Ethiopian civilians. The U.S.-based Distributed AI Research Institute (DAIR), helmed by the prominent Eritrean computer scientist Timnit Gebru, put out a statement on the brewing conflict this week. The group admonished Meta for hosting unequivocal calls for violence in the region and for promoting viral videos that advocate turning Tigray “into dust.” “We are ringing the alarm bells for what is to come and urge you to act with us,” the group wrote.
Meanwhile, many of Meta’s Trusted Partners working in the region have become disillusioned with the program. At least one Ethiopian partner group, the Network Against Hate Speech, withdrew from the program in 2023. Other Trusted Partners told me their programs had been cut earlier this year; some relied on funding from USAID, one of a growing number of funding sources the Trump administration has dismantled by executive order.
DAIR recently published a set of recommendations on content moderation, advocating for Meta to use more robust deliberation structures, consult diverse experts, and improve the working conditions of the moderation workforce. But those recommendations are only useful when there’s somebody at the company to heed them.
If Meta’s domestic political interests continue to push it away from content moderation and toward a laissez-faire approach to speech, there is little chance it will invest more time or resources in moderating content across the global majority. The looming conflict in Ethiopia may become the test case for Meta’s renewed cynicism: Without proper accountability and action, we could see more tragedies like Meareg Amare’s around the world.