Science · Analysis

Social media is blinding us to other points of view

In the recent U.S. election, everyone saw a different reality depending on their world views, which were amplified by social media's tendency to reinforce people's existing opinions.


Two examples of social media postings by Donald Trump's campaign staff, discovered by The Associated Press. Trump's paid campaign staffers declared on their personal social media accounts that Muslims are unfit to be U.S. citizens and said they were ready for a possible civil war. Such posts can reinforce supporters' existing opinions. (Associated Press)

Were you surprised by the results of last week's U.S. presidential election? If so, you're not alone. Many people found the results were a stark contrast to what they'd been seeing in their social media feeds.

But for everyone who was shocked by the election of Donald Trump, there were just as many people who weren't surprised.

That is what's notable about the election: Everyone saw a different reality, depending on their world views, which were amplified by social media's tendency to reinforce people's existing opinions.

This was the first U.S. presidential election in which the majority of voting adults got their news from social media. In North America, an estimated 170 million users log onto Facebook each day, and 44 per cent of U.S. adults get their news from the site.

But as Facebook keeps reminding us, it's not in the news business. Facebook and Twitter are in the business of clicks and data. Their mandate isn't to deliver balanced news or information that is representative of what is really happening in the world.

The recent slew of fake news stories suggests they're not even all that concerned with how accurate or truthful stories are. Their priority is to give users access to the media they want. It turns out the stories people want are the ones that align with their beliefs.

Often, a catchy headline is enough. According to a recent study from Columbia University, 59 per cent of links shared on social media are never actually clicked.

Users tend to click links that affirm their existing opinions. "Facebook is designed to prevent you from hearing others," says media scholar Douglas Rushkoff. "It creates a false sense of agreement, a 'confirmation bias' when you are only seeing stuff that agrees with you or makes the other side look completely stupid."

A man walks past a mural in an office on the Facebook campus in Menlo Park, Calif. Some Facebook users received a shock last week when an unexplained glitch caused the social networking service to post a false notice that implied they were dead. (Jeff Chiu/Associated Press)

The trouble is, when your pre-existing opinions shape the news you see, you're not getting an accurate picture of what is really happening.

Two nights before the election, on the CBS program 60 Minutes, Republican pollster Frank Luntz said the election was about people wanting to be heard. Unfortunately, while everyone wanted to be heard, no one wanted to listen. Thanks to social media, those who previously felt unheard now had a platform to share their views, but more often than not, the only ones listening were those who already agreed.

'A soapbox culture'

"We've developed a soapbox culture," says Elamin Abdelmahmoud, editor of news curation for BuzzFeed News. "We get to share what we want to share, and have the desire to be heardwhile forgetting that everyone else has that desire."

By design, platforms like Facebook and Twitter promote this kind of soapbox behaviour. As we encounter news and media through the self-selected group of friends that make up our social networks, we subject ourselves to what's become known as the filter bubble, whereby we come across information from people who think like us and, more often than not, vote like us.
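
To make that mechanism concrete, here is a minimal sketch in Python. It is purely hypothetical, not any platform's real code: it reduces every user to a single opinion score between -1 and 1 and assumes, as the paragraph above suggests, that we mostly connect with people who already think like us. Even before any algorithmic ranking, the average opinion of the resulting "feed" drifts toward our own.

```python
# Hypothetical sketch of a filter bubble forming from self-selection alone.
# No real platform data or algorithms are used; each user is reduced to
# one opinion score in [-1, 1].
import random

random.seed(42)  # reproducible example

def build_network(me, candidates, homophily=0.9):
    """Follow like-minded people with high probability, and people
    who disagree with us only rarely."""
    friends = []
    for other in candidates:
        alike = abs(me - other) < 0.5
        if random.random() < (homophily if alike else 1 - homophily):
            friends.append(other)
    return friends

me = 0.8  # a user with a strong opinion
population = [random.uniform(-1, 1) for _ in range(1000)]
friends = build_network(me, population)

# The "feed" is whatever friends share, so its average opinion
# skews toward our own even though the population is balanced.
print(f"population mean opinion: {sum(population) / len(population):+.2f}")
print(f"feed mean opinion:       {sum(friends) / len(friends):+.2f}")
```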

As a result, despite having more access to information than ever, we're not engaging with points of view that differ from our own. In a recent Pew Research study, 79 per cent of social media users said they have never changed their views on a social or political issue because of something they saw on social media.

Twitter headquarters in San Francisco bears this sign. As users encounter news and media through a self-selected group of friends that make up social networks, they subject themselves to what's become known as the filter bubble. (Jeff Chiu/Associated Press)

It's not just our self-selected social networks that create this echo chamber. Facebook also filters the news we see on the site, tailoring it to our preferences much as it tailors the ads we're shown. If you're curious to know what Facebook thinks you'll like (or what your political leanings are), you can go into your ad preferences and see how content is tailored to you based on your interests and opinions.
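
To illustrate what such tailoring can look like, here is another hypothetical Python sketch. It is not Facebook's actual algorithm: predicted_click is a made-up stand-in for the engagement models the article alludes to, and the headlines and stance scores are invented. The point is only that ranking by predicted clicks quietly drops opposing coverage out of view.

```python
# Hypothetical sketch of preference-based feed ranking; all names,
# headlines and scores are invented for illustration.

def predicted_click(user_leaning, story_stance):
    """Toy model: the closer a story's stance (in [-1, 1]) is to the
    user's leaning, the likelier the click."""
    return 1.0 - abs(user_leaning - story_stance) / 2.0

def rank_feed(user_leaning, stories, top_k=3):
    """Surface the stories the user is most likely to click, i.e. agree with."""
    return sorted(stories,
                  key=lambda s: -predicted_click(user_leaning, s["stance"]))[:top_k]

stories = [
    {"headline": "Candidate surges in new polls", "stance": +0.9},
    {"headline": "Candidate's record questioned", "stance": -0.8},
    {"headline": "Economy adds jobs in October", "stance": +0.1},
    {"headline": "Rival's rally draws record crowd", "stance": -0.9},
    {"headline": "Race tightens in swing states", "stance": 0.0},
]

# A user leaning +0.8 never sees the most critical coverage in their top three:
for story in rank_feed(+0.8, stories):
    print(story["headline"])
```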

In the election, filters blinded half the population to what was hidden in plain sight.

All the major U.S. newspapers endorsed Hillary Clinton, and that confirmed the world view of her supporters. But the comments under the articles in all those news outlets were full of opposing views.

'Don't read the comments'

Many of these comments were vile and hateful, and "don't read the comments" has become a coping mantra for dealing with online toxicity. But by doing so, we choose not to see what is right in front of us.

No doubt, the polls and predictions contributed to the surprise, too. Social media users were cushioned in their own world views, taking solace and confidence from knowing their perspective was backed up by data. But data is not infallible: ask the wrong question, and you get the wrong answers. So instead of the discomfort of breaking out of our bubbles, we opted for the comfort of agreement, of sameness, even if what we were hoping for, fighting for, was the preservation of diversity.

How do we begin to reconcile what Joshua Benton of Harvard's Nieman Lab calls "segregated social universes"?

This image made from a video posted on Donald Trump's official Facebook account on Oct. 8 shows the Republican presidential nominee apologizing for sexually charged comments caught on tape in 2005. (Associated Press)

danah boyd, a researcher of technology and society at Microsoft Research who spells her name without capital letters, says that while many critics think the answer is to get rid of social media, "we need to actively work to understand complexity, respectfully engage people where they're at, and build the infrastructure to enable people to hear and appreciate different perspectives. This is what it means to be truly informed."

Let's shape our tools

The xenophobic, racist and sexist voices that some found disturbing during the U.S. election campaign didn't come out of nowhere. They were there all along, but perhaps confined to someone else's bubble. If some people were surprised by them, it was probably because the filters and data designed to please us had weeded them out.

Even Mark Zuckerberg, who has adamantly maintained that Facebook is not a news provider, responded to the election by saying he is deeply concerned about how Facebook could affect democracy, and that the company could do a better job of distributing news.

"There's a real weight to the idea of letting people who are programmers and coders and engineers design how we interact with each other," says Abdelmahmoud."They will do it with a painful attention to strict logic that defies, at least to some extent, the social norms of being in person."

He suggests that we remember what a big role listening plays in the way we engage offline.

"We can listen to understand, not listen to respond. That's really difficult behaviour to encourage, since platforms do better when you contribute and respond, so of course they'll encourage the responding."