
How a 9/11 Truther may be influencing which Ontario election videos you see

He's a YouTube user who puts salacious titles and grabby headlines on otherwise innocuous videos, but he gets more views than many professional news organizations.

Analysis of election-related YouTube searches shows unknown channel outranks many mainstream media outlets

The man believed to be behind the Steeper33 YouTube Channel has ties to a Kitchener, Ont., 9/11 Truther group. (YouTube)

If you seek out Ontario election coverage on YouTube, you've no doubt come across a recommended list of material the video-sharing service suggests you watch next.

A CBC News investigation found that a YouTube channel devoted to putting misleading headlines on TV stories from other stations is getting recommended more often than many mainstream news outlets.

The video by YouTube user Steeper33 that is most likely to appear when searching for Ontario election content shows PC MPP Sam Oosterhoff during Question Period.

The video is titled "Youngest Ever Ontario MPP Destroys Premier Wynne."

And while that headline might convince you to click play, the content you get isn't exactly as billed. The 19-year-old MPP asks a straightforward question about manufacturing job losses, and Wynne simply defers the answer to the minister responsible for economic development.

So how is that kind of content ending up alongside material from mainstream media outlets on YouTube's list of recommended videos?

It could be a result of the kind of sensational headlines Steeper33 puts on his videos.

Nina Jankowicz, a global fellow at the Wilson Centre in Washington, D.C., has studied how YouTube algorithms work as research for a book about Russian influence over election campaigns in Eastern Europe.

How YouTube chooses videos to promote

"People are more likely to interact with that kind of salacious, emotionally based material," she said, adding that YouTube's algorithm "will suggest more and more kinds of salacious content in order to keep people on the platform."

Ryerson University's Anatoliy Gruzd has also studied YouTube's recommendation system. He thinks the other content on Steeper33's YouTube channel may be boosting his performance as well.

"If they already have an existing following that is very engaged in content, by default the algorithm might think it is an authoritative user, and any content they post is popular."

Steeper33's account has almost 25,000 subscribers, with videos that have been viewed more than 25 million times.

His most recent videos are political in nature, critical of both Kathleen Wynne and Justin Trudeau, but his older content includes a variety of videos that support conspiracy theories around vaccine safety, U.S. "deep state" operatives, the Bilderberg group, and the destruction of the World Trade Centre.

A selection of conspiracy-related videos on the Steeper33 account. (YouTube)

Steeper33 has connections to a Kitchener, Ont.-based 9/11 Truther organization that routinely gave out DVDs to people on the street, trying to promote the theory that the World Trade Centre was taken down not by the jet planes, but by explosives planted inside the buildings.

CBC News has made several attempts to contact the man believed to be behind the Steeper33 account, but has never received a response.

CBC News started its analysis of election-related search terms on YouTube on March 13, employing an algorithm created by an ex-YouTube developer named Guillaume Chaslot.

Election-related searches

The algorithm is set up to mimic the behaviour of a YouTube user with a clear search history who searches for a topic, clicks on a video, then follows up by watching the choices recommended by YouTube.

Members of Kitchener Truth, a group peddling a conspiracy theory about 9/11, hand out pamphlets and DVDs in April 2010. (YouTube)

Twice a week, CBC would run election-related search terms (such as the names of the party leaders) through the algorithm and catalogue the videos that were suggested.

The algorithm would then select those suggested videos and catalogue the next round of suggestions. This was repeated through several levels.
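In outline, that process amounts to a breadth-first walk over YouTube's recommendations. The sketch below is not Chaslot's actual tool; it assumes a placeholder get_recommendations() helper standing in for whatever scraping or API layer retrieves the "up next" list shown to a viewer with no history.

```python
from collections import Counter, deque

def get_recommendations(video_id, limit=10):
    """Placeholder: stands in for whatever scraping or API layer retrieves
    the video IDs YouTube recommends next to a viewer with no history."""
    raise NotImplementedError("replace with a real recommendation fetcher")

def crawl_recommendations(seed_video_ids, depth=3):
    """Follow recommended videos from a set of seed search results,
    breadth-first, down to `depth` levels, tallying how often each
    video ID is suggested along the way."""
    suggested = Counter()
    seen = set(seed_video_ids)
    queue = deque((vid, 0) for vid in seed_video_ids)

    while queue:
        video_id, level = queue.popleft()
        if level >= depth:
            continue
        for rec in get_recommendations(video_id):
            suggested[rec] += 1           # catalogue every suggestion
            if rec not in seen:           # follow each video only once
                seen.add(rec)
                queue.append((rec, level + 1))
    return suggested
```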

When the data set generated by the end of April was analyzed, CBC found several days where Steeper33's content was among the most recommended videos, or where his content was recommended several times during a single search.

While a search for election material was most likely to come up with content provided by TVOntario's politics show The Agenda, CBC's analysis found Steeper33 was the fourth-most recommended YouTube page, and the only non-news source in the top 10.
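Ranking sources in this way is mainly a matter of rolling the per-video tallies up to the channel level. A brief sketch, assuming a hypothetical channel_of mapping from video ID to channel name built while cataloguing the suggestions:

```python
from collections import Counter

def rank_channels(suggested, channel_of, top_n=10):
    """Roll per-video suggestion counts up to the channel level and
    return the channels that were recommended most often."""
    per_channel = Counter()
    for video_id, count in suggested.items():
        per_channel[channel_of.get(video_id, "unknown")] += count
    return per_channel.most_common(top_n)
```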

YouTube disputes the methodology developed by Chaslot, and questions whether an average YouTube user would get the same results.

A spokesperson from Google, the company that owns YouTube, told CBC in an email: "In the last year our teams have worked hard to improve how YouTube handles queries and recommendations related to news. We made algorithmic changes to better surface clearly-labeled authoritative news sources in search results, particularly around breaking news events."

I absolutely think this is dangerous to the civil discourse. - Nina Jankowicz, global fellow at the Wilson Centre in Washington, D.C.

Jankowicz worries that channels appealing to the emotions of the audience may be better able to game YouTube's recommendation system, leaving viewers stuck in filter bubbles where "they're getting fed the same content over and over.

"I absolutely think this is dangerous to the civil discourse," she said. "This is just more and more polarizing, and making people exist on extremes where they didn't exist before."

She says YouTube has tweaked its recommendation system in light of some criticism, but she thinks it can do a better job curating the material it recommends its users watch.

Gruzd says the onus may be on the people clicking through to those suggested links.

"Social media users should perhaps realize how this algorithm works," he said, "and maybe question or think twice about whether it is coming from a reliable source."

Clarifications

  • Since this story was first published on May 26, 2018, it has been edited to include a statement from Google, the company that owns YouTube. The company says they have made changes to YouTube's algorithm to surface more clearly-labeled authoritative news sources in search results.
    May 27, 2018 1:54 PM ET

With files from Alexa Pavliuc