
Facebook adding 3,000 new moderators to curb violent videos

Mark Zuckerberg responded on Wednesday to a string of videos depicting murders and suicides that have been posted to the social network in recent months


Mark Zuckerberg says Facebook is 'working to make these videos easier to report so we can take the right action sooner.' (Kimihiro Hoshino/AFP/Getty Images)

Facebook will hire 3,000 more people over the next year to respond to reports of inappropriate material on the social media network and speed up the removal of videos showing murder, suicide and other violent acts, chief executive officer Mark Zuckerberg said on Wednesday.

The hiring spree is an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts. Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence.

Zuckerberg, the company's co-founder, said in a Facebook post that the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service.

Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook removed the video. Other videos from places such as Chicago and Cleveland have also shocked viewers with their violence.

Zuckerberg said: "We're working to make these videos easierto report so we can take the right action sooner whetherthat's responding quickly when someone needs help or taking apost down."

The 3,000 workers will fill new positions and will monitor all Facebook content, not just live videos, the company said. The company did not say where the jobs would be located.

Facebook is due to report quarterly revenue and earnings later on Wednesday after markets close in New York.

AI can only do so much

The world's largest social network, with 1.9 billion monthly users, has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material. In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week, and like other large Silicon Valley companies, it relies on thousands of human monitors to review the reports.

"Despite industry claims to the contrary, I don't know ofany computational mechanism that can adequately, accurately, 100percent do this work in lieu of humans. We're just not there yettechnologically," said Sarah Roberts, a professor of informationstudies at UCLA who looks at content monitoring.

The workers who monitor material generally work on contract in places such as India and the Philippines, and they face difficult working conditions because of the hours they spend making quick decisions while sifting through traumatic material, Roberts said in an interview.