$2 billion suit filed against Meta over its role in spreading violence
The suit against Meta's Facebook argues that the platform allows users to expose others, putting them at grave risk.
A lawsuit demanding changes to Facebook's algorithm and a $2 billion fund for victims of hate crimes incited on the platform has been filed against Meta in Kenya's High Court, alleging that the company contributed to the spread of violence and hatred during the civil war in Ethiopia.
Abrham Meareg, the son of an Ethiopian scholar who was killed after being harassed in Facebook posts, is one of those bringing the case against Meta. The case was filed in Kenya because Meta's content-moderation hub for the region is based in Nairobi, the Kenyan capital.
Meta, however, says it has invested heavily in technology and moderation to combat hatred. A company representative responded to the allegations, noting that hate speech and incitement to violence are against Facebook's rules.
"Our safety-and-integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions," the representative said.
Professor Meareg Amare Abrha was shot at close range while attempting to enter the family house on November 3, 2021, after being followed home from his workplace by armed men on motorcycles.
His son reportedly said that as the professor lay bleeding, his attackers threatened bystanders to keep them from coming to his aid; he died on the ground seven hours later.
His son argued that in the period before the killing, Facebook posts had defamed his father and exposed his personal information. Complaints about the posts, which posed a security risk to the professor, were repeatedly filed through Facebook's reporting tool, the son said, but Facebook "left these posts up until it was far too late."
Meareg said, "If Facebook had just stopped the spread of hate and moderated posts properly, my father would still be alive."
He further asserted in a sworn statement submitted to the court that Facebook's algorithm promotes "hateful and inciting" content because it is more likely to generate user interaction.
Additionally, he asserted that Facebook's content moderation in Africa is "woefully inadequate," with too few moderators to handle posts in the key Amharic, Oromo, and Tigrinya languages.
In response, Meta, Facebook's parent company, told BBC News: "We employ staff with local knowledge and expertise and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali and Tigrinya."