
Posted: 2022-06-09T15:13:47Z | Updated: 2022-06-09T15:13:47Z

SAN FRANCISCO (AP) - The test couldn't have been much easier, and Facebook still failed.

Facebook and its parent company Meta flopped once again in a test of how well they could detect obviously violent hate speech in advertisements submitted to the platform by the nonprofit groups Global Witness and Foxglove.

The hateful messages focused on Ethiopia, where internal documents obtained by whistleblower Frances Haugen showed that Facebook's ineffective moderation is "literally fanning ethnic violence," as she said in her 2021 congressional testimony. In March, Global Witness ran a similar test with hate speech in Myanmar, which Facebook also failed to detect.

The group created 12 text-based ads that used dehumanizing hate speech to call for the murder of people belonging to each of Ethiopia's three main ethnic groups: the Amhara, the Oromo and the Tigrayans. Facebook's systems approved the ads for publication, just as they did with the Myanmar ads. The ads were not actually published on Facebook.

This time around, though, the group informed Meta about the undetected violations. The company said the ads shouldn't have been approved and pointed to the work it has done "building our capacity to catch hateful and inflammatory content in the most widely spoken languages, including Amharic."

A week after hearing from Meta, Global Witness submitted two more ads for approval, again with blatant hate speech. Both ads, written in Amharic, the most widely used language in Ethiopia, were approved.

Meta did not respond to multiple messages for comment this week.


"We picked out the worst cases we could think of," said Rosie Sharpe, a campaigner at Global Witness. "The ones that ought to be the easiest for Facebook to detect. They weren't coded language. They weren't dog whistles. They were explicit statements saying that this type of person is not a human or these type of people should be starved to death."

Meta has consistently refused to say how many content moderators it has in countries where English is not the primary language. This includes moderators in Ethiopia, Myanmar and other regions where material posted on the companys platforms has been linked to real-world violence.

In November, Meta said it removed a post by Ethiopia's prime minister that urged citizens to rise up and "bury" rival Tigray forces who threatened the country's capital.

In the since-deleted post, Abiy said the obligation to die for Ethiopia "belongs to all of us." He called on citizens to mobilize "by holding any weapon or capacity."


Abiy has continued to post on the platform, though, where he has 4.1 million followers. The U.S. and others have warned Ethiopia about dehumanizing rhetoric after the prime minister described the Tigray forces as "cancer" and "weeds" in comments made in July 2021.

"When ads calling for genocide in Ethiopia repeatedly get through Facebook's net, even after the issue is flagged with Facebook, there's only one possible conclusion: there's nobody home," said Rosa Curling, director of Foxglove, a London-based legal nonprofit that partnered with Global Witness in its investigation. "Years after the Myanmar genocide, it is clear Facebook hasn't learned its lesson."
