
Viral mosque shooting video raises questions about social media firms' responsibilities


Proactive removal of extremist content should be technically possible, expert says

The attack during Friday prayers on March 15 was livestreamed on Facebook until police contacted the social media company to remove the video. Copies were widely shared on other platforms such as Twitter and YouTube. (Kirill Kudryavtsev/AFP/Getty Images)

The Facebook livestreaming and subsequent widespread sharing of a shooting that killed 50 people at two mosques in Christchurch, New Zealand, is raising questions about social media firms' abilities and responsibilities to stop their platforms from being used to propagate hate and inspire violence.

The attack during Friday prayers on March 15 was livestreamed on Facebook for about 17 minutes until police contacted the social media company to remove the video, Reuters news agency reported.

Philip Mai, director of business and communications at Ryerson University's Social Media Lab, said the original video does appear to have been taken down faster than in previous incidents of this kind.

But, he noted, it still took some time because it required police to intervene.

"By then, the damage has already been done," he said.

The video, 17 minutes long at its full length, was subsequently shared on various social media platforms, including Twitter, YouTube, WhatsApp and Instagram, despite police appeals not to share it and the platforms' reported attempts to stamp out circulating copies.

BuzzFeed tech reporter Ryan Mac reported that he was still seeing copies circulating 18 hours later.

Mai said social media sites are often able to remove content such as music videos that they believe violate someone's copyright far more proactively and automatically, using artificial intelligence.

And Facebook announced Friday it was planning to use artificial intelligence to automatically flag "revenge porn" for removal.

"The technology's there," Mai said. "But for whatever reason, this kind of thing is not being flagged as quickly as other types of content."

He acknowledged live videos are unique and may be harder to detect and flag than content like music videos, but said there have been enough such incidents to program a computer to watch for certain patterns.

The alleged gunman used a smartphone app called LIVE4, which allows users to broadcast live on Facebook directly from a GoPro or other action camera.

A still image from a video the alleged gunman filmed and streamed live online as the attack was unfolding shows him retrieving weapons from the trunk of his car. New Zealand says it's an offence to possess or share copies of videos showing Friday's mass shooting that killed at least 50 people at two mosques. (Twitter via Reuters)

The company that developed the app, VideoGorillas, told Reuters it does not view, analyse or store the content streamed using its app.

"The responsibility for content of the stream lies completely and solely on the person who initiated the stream," founder Alex Zhukovtold Reuters.

The company posted a statement on its website Friday condemning the "disgusting use" of its app in the Christchurch massacre and offering condolences to friends and family of the victims.

"LIVE4 has zero tolerance towards violence. We will do whatever is humanly possible for it to never happen again," the statement said.

Need for regulation?

Putting in place a system to flag violent and criminal content, along with a process for people to appeal removal, and hiring humans to make the final call, could be expensive, Mai said.

It's something companies may choose not to do if it isn't required by law, he said.

"I think that governments should be looking into laws and have a public debate as to what responsibility these companies have, what does society want from these companies and what do we need to impose on them?"

Stephanie Carvin, an assistant professor of international affairs who researches terrorism, told CBC News Network she thinks governments should be asking more questions about social media companies and their role in making it easy to share extremist information.

Omar Nabi holds up a picture of his father, Haji Daoud, who was killed in the mosque attacks in Christchurch, New Zealand, on Friday. (Edgar Su/Reuters)

While many social media companies have started taking Islamist extremist content more seriously and dealing with it, she said, "they've been far less willing" to do that with far-right extremism, and governments should ask why.

Policymakers and governments need to examine whether more regulations are needed to force social media companies to adhere to their standards on how to handle and prevent violent extremism and hateful rhetoric, said Carvin, of Carleton University's Norman Paterson School of International Affairs in Ottawa.

She said it was obvious the gunman in New Zealand "wanted to make a splash and bring us all into his very ... demented world view," and social media made it "easier to pass on this kind of information."

Corrections

  • An earlier version of this story gave the name of the Ottawa terrorism expert as Stephanie Carver. In fact, it's Stephanie Carvin.
    Mar 16, 2019 4:04 PM ET