Facebook actively fueled ethnic violence in Ethiopia’s civil war by prioritizing hateful and dangerous content, then not moderating that content fast enough, or sometimes at all, says a new lawsuit filed against Meta, the social media giant’s parent company.
Two Ethiopian researchers and a Kenyan constitutional rights group are behind the legal action, which was filed this week in the High Court in Nairobi, Kenya. The city houses Facebook’s East African content moderation hub, which opened in 2019.
The hub was too little, too late for the region, the lawsuit says. Facebook treated users in African countries differently than those in Western countries, fostering a “culture of disregard for human rights” that ultimately led to the murder of one of the plaintiffs’ fathers, the suit alleges.
The plaintiffs are seeking a $1.6 billion victims’ fund and a bigger and better-supported moderation team.
They’re also asking the court to deliver what would be a legal first: forced changes to Facebook’s algorithm, which has long been blamed for not doing enough to limit the reach of incendiary content.
Meta says it has invested “heavily” in moderation improvements
A spokesperson for Meta said the company has “strict rules” about what’s allowed on its platforms and is continuing to develop its capabilities to catch violating content.
“Hate speech and incitement to violence are against these rules, and we invest heavily in teams and technology to help us find and remove this content,” the spokesperson said in a statement shared with NPR.
“Feedback from local civil society organizations and international institutions guides our safety and integrity work in Ethiopia. We employ staff with local knowledge and expertise,” the spokesperson said.
The lawsuit cites language as one key factor in the company’s inadequate moderation. Of the 85 languages spoken in Ethiopia, only three were covered by Facebook’s content moderation practices, the lawsuit says.
The Meta spokesperson declined to say how many moderation staff worked in the African hub, which serves a collective population of over 500 million.
Another lawsuit filed in Kenya this year alleges that the hub created an exploitative and unsafe work environment, exposing moderators to high volumes of traumatic content and paying less than promised.
One plaintiff says Facebook is directly responsible for his father’s murder
Abrham Meareg Amare, one of the plaintiffs, holds Facebook’s algorithm and poor moderation directly responsible for the death of his father, Meareg Amare Abrha.
Court filings say that in October 2021, militants followed Meareg home from work, shot him in the back and leg, and left him to bleed.
Meareg, a well-respected chemistry professor according to the lawsuit, was targeted after Facebook posts spread his name, his photo and false allegations that he was associated with a deadly rebel group because of his Tigrayan ethnicity. Tigrayans are an ethnic minority in Ethiopia.
“I knew it was a death sentence for my father the moment I saw it,” Abrham said in an interview with NPR.
“Facebook is a big gun in Ethiopia. […] People use Facebook as a reliable source of information because they don’t have trust in state media. Something posted on Facebook is considered a magic bullet — a valid thing.”
Meareg’s friends and neighbors, like Abrham, warned him about the posts, but the professor chose to return from an overseas trip anyway and was killed within weeks, his son said.
In an affidavit, Abrham says he asked Facebook multiple times to remove posts about his father — both before and after his death. Some posts still remained up as of this week.
In addition to restitution from the victim’s fund, he’s seeking a public apology from Facebook to him and other victims.
“To Facebook, it’s as if we’re idiots; we’re subhuman. […] Our family doesn’t matter if they’re making profits,” he said.
Ethiopia’s civil war has led to death and displacement
Researchers say media has played a key role in polarizing the country. Three of the country’s biggest outlets — ESAT, OMN and Tigrai TV — are divided along ethnic lines.
In a 2021 statement, Facebook said Ethiopia was on its list of countries “at the highest risk for conflict” but it was especially challenging to moderate given the number of languages spoken in the country.
“Between May and October 2021, we took action on more than 92,000 pieces of content in Ethiopia on Facebook and Instagram for violations of our Community Standards prohibiting hate speech, about 98% of which was detected before it was reported by people to us,” the company said at the time.
Facebook also said that less than 10% of Ethiopia’s population uses the platform — a low proportion relative to other countries, driven primarily by poor internet access.
Ethiopia has one of the lowest internet use rates in the world, with only 24% of residents logging on regularly, according to 2020 data from The World Bank.
Mercy Mutemi, the lawyer representing the two individual plaintiffs, says that Big Tech has been rapidly, but unethically, expanding in Africa.
“Not investing adequately in the African market has already caused Africans to die from unsafe systems,” she said in a statement shared with NPR by Foxglove, a nonprofit supporting the plaintiffs.
“We know that a better Facebook is possible — because we have seen how preferentially they treat other markets,” Mutemi said.
“African Facebook users deserve better. More importantly, Africans deserve to be protected from the havoc caused by underinvesting in protection of human rights.”
Meta’s oversight board previously recommended a human rights assessment in Ethiopia
Meta, which also owns Instagram and WhatsApp, reported nearly $40 billion in profits last year.
The lawsuit says that Facebook’s earnings are partially tied to how long users stay on the platform, which leaves the company with little incentive to remove eye-catching violent content. Years of internal and external studies suggest that Facebook’s algorithm, the platform’s engagement engine, spurs extremist beliefs.
In 2018, the United Nations said that Facebook played a key role in fueling violence in Myanmar.
In 2021, leaked documents revealed Facebook did the same in India, its largest market, even as employees spoke out.
The new lawsuit draws a parallel between Meta’s allegedly hands-off role in the Ethiopian conflict and its responsive one in the Jan. 6 attack on the U.S. Capitol. For the latter, Facebook implemented a crisis plan to promptly mute inflammatory content. The plaintiffs say that crisis plan could be replicated for conflicts in Africa.
Abrham told NPR that demands like these constitute straightforward ways to treat everyone with dignity, which should be a chief goal of a platform that claims to value human connection.
He said Facebook should start by publicly acknowledging its role in generating hate crimes and disinformation.
“There are hundreds of stories, thousands of people who’ve lost their loved ones because of Facebook,” he said.
“After all that they’ve lost, there is still life. There is a family which is in distress, which has been victimized by arrogant and irresponsible treatment.
“We need to act now or never.”
By Emily Olson