
Only Facebook knows how it spreads fake election news

Whether fake news stories shared on Facebook affected voter opinion is impossible to say, because of how little insight we have into how Facebook's myriad algorithms work.

Secret algorithms make it hard to judge how too-good-to-be-true stories influence voters

Hillary Clinton campaign chairman John Podesta addresses a crowd of supporters on election night in New York. (Jim Bourg/Reuters)

If Facebook is to be believed, Hillary Clinton has deep ties to satanic rituals and the occult.

The post in question has nearly 3,000 shares, and links to a story on a conspiracy-laden political site. It is most definitely fake. But like many of the stories posted to Facebook during this U.S. election cycle, it was written specifically with a right-leaning partisan audience in mind. For this particular group of voters, it just begged to be shared.

"Because the algorithms are a black box, there's no way to study them." - Frank Pasquale, law professor at the University of Maryland

And share they did. In an election dominated by the sexist, racist, and generally outrageous invective of America's president-elect Donald Trump, Facebook proved the perfect social platform for the sharing of fake, too-good-to-be-true style news.

At the end of August, The New York Times' John Herrman reported on the subtle shift in Facebook feeds across America, many of which were increasingly filled with questionable news sources and fake stories specifically designed to be shared. More recently, BuzzFeed's Craig Silverman took on the daunting task of debunking fake news stories in near-real time.

Democrats and Republicans alike clicked on and shared what they hoped was true, whether or not there was any underlying truth.

In both the run-up to the election and its immediate aftermath, there have been arguments that Facebook helped make a Trump presidency possible: that, by design, Facebook helps breed misinformation and encourage the spread of fake news, and that it can shape voter opinion based on the stories it chooses to show.

Whether or not this is true is practically impossible to say because of how little insight we have into how Facebook's myriad algorithms work.

High school students in San Francisco protest on Thursday against the election of Donald Trump. (Jeff Chiu/Associated Press)

"I think that if we were to learn how, for example, networks of disinformation form, that would give people a lot more information of how to create networks of information," said Frank Pasquale, a law professor at the University of Maryland, and author of The Black Box Society,a book on algorithms."But because the algorithms are a black box, there's no way to study them."

Facebook is notoriously tight-lipped about how its algorithms are designed and maintained, and has granted only a handful of carefully controlled interviews with journalists. We know that signals such as likes, comments, and shares all factor heavily into what Facebook shows its users, but not which signals contribute to a particular post's appearance in a user's feed, nor how those signals are weighted.

"Anything that gets clicks, anything that gets more engagement and more potential ad revenue is effectively accelerated by the platform, with very rare exceptions," Pasquale said.

Algorithmic transparency

Inevitably, posts that hewed to partisan beliefs proved especially popular, whether or not they were true. And how much of an impact these voices had on the voting public, only Facebook knows.

For us to have any insight would require a level of algorithmic transparency, or algorithmic accountability, into systems that few understand, though they increasingly shape the way we think.

"Election information is one of those domains where there's a pretty clear connection between information that people are being given access to and their ability to make a well informed decision," says Nicholas Diakopoulos, an assistant professor at the University of Maryland's journalism school.

He says algorithmic transparency is "one method to increase the level of accountability we have over these platforms."

Facebook board member Peter Thiel, who donated $1.25 million to Donald Trump's campaign, speaks at the Republican National Convention in July. (Mark J. Terrill/Associated Press)

Both Diakopoulos and Pasquale believe that Facebook is actually a media company, despite its repeated claims otherwise, and as such needs to take more responsibility for the quality of news that appears on its site.

One concern is that Facebook has so much power and influence over the content its nearly 1.2 billion daily users see that it could conceivably influence the outcome of an election. In fact, Facebook actually did something to this effect in 2012, assisting academic researchers with a "randomized controlled trial of political mobilization messages delivered to 61 million Facebook users during the 2010 U.S. congressional elections."

The study's authors concluded that, both directly and indirectly, the Facebook messages increased voter turnout by 340,000 votes. Without more insight into how Facebook places news stories in its users' feeds, no one would ever know if a viral political hoax site was responsible for doing the same.

Trusted sources

There is little insight into how Facebook identifies trustworthy sources of information and penalizes those that are not. But in light of past censorship squabbles, such as Facebook's removal and subsequent reinstatement of a Vietnam War photo of a young, napalm-burned Kim Phuc, the question is whether users will feel comfortable with Facebook having that role.

"Do we really want Facebook deciding what's misinformation or not?" asked Jonathan Koren, who previously worked at Facebook on the company's trending news algorithm, and is a software engineer at an artificial intelligence company called Ozlo.

"And that's why they don't want to do it, because they don't want to be responsible for it. But at the same time, there's nobody responsible."