Like many Americans, entrepreneur and software developer Krishna Kaliannan was taken aback by the 2016 election.
“I was very shocked when Trump won,” Kaliannan said.
In the lead-up to Election Day, Kaliannan said he was fairly certain Hillary Clinton would win the White House, and the variety of news he was getting from his Facebook News Feed seemed to support that.
“I was almost completely left in the dark as far as how a very large number of Americans felt about Donald Trump and what direction they wanted to take the country in,” Kaliannan said.
Noting the discrepancy between what he thought was happening and what was actually happening, he developed Escape Your Bubble, a Google Chrome extension that inserts news stories into a user’s Facebook News Feed. The extension is designed to expose users to political views that are different from their own.
Once installed, Escape Your Bubble asks users to identify whether they want to learn about Republicans or Democrats and inserts stories into their feeds that align with the political leanings they wish to be exposed to. The extension then adds one clearly marked story that falls outside of the user’s established political viewpoint every time a user visits the social network.
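The selection mechanism described above can be sketched in a few lines of JavaScript. This is a hypothetical illustration, not the extension's actual code: the function name `pickStory`, the story-object shape, and the rotate-by-visit-count behavior are all assumptions made for the example.

```javascript
// Hypothetical sketch of the insertion logic described above: the user
// picks which party they want to learn about, and each visit one clearly
// labeled story from the curated pool is chosen for insertion.
// Rotates through matching stories so repeat visits see new ones.
function pickStory(pool, wantedPerspective, visitCount) {
  const matching = pool.filter(s => s.perspective === wantedPerspective);
  if (matching.length === 0) return null;
  const story = matching[visitCount % matching.length];
  // The extension clearly marks inserted stories; model that with a flag.
  return { ...story, insertedByExtension: true };
}

// Example: a liberal user who asked to learn about Republicans.
const pool = [
  { title: "5 Facts About Republicans", perspective: "republican" },
  { title: "A Democratic perspective", perspective: "democrat" },
];
const story = pickStory(pool, "republican", 0);
console.log(story.title); // "5 Facts About Republicans"
```

In a real extension, a content script would then build a feed-styled DOM node from the chosen story and insert it into the page on each visit.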
Facebook has recently been criticized for its role in spreading fake news – forms of disinformation and propaganda that some have speculated may have affected the election.
Last month a BuzzFeed study showed that several fake news stories on Facebook significantly outperformed stories from legitimate news sources in the days leading up to the election.
As Facebook has become an ever-larger source of news for some Americans, questions have arisen about how the social network will develop stronger editorial standards.
Kaliannan acknowledges that by curating news, Escape Your Bubble may be wading into the same issues plaguing Facebook. He says that’s why he is taking a cautious approach to how much the extension influences what users read, inserting only one opposing story at a time.
Currently, Kaliannan and a team of about 12 volunteers choose news stories and opinion pieces from a variety of major news outlets that are then shown to users to help balance out their regular diet of news. The selection process is informal, meaning Kaliannan and his volunteers use their own judgment when verifying and selecting reading material.
For instance, a user with liberal political leanings can be shown an article with the headline “5 Facts About Republicans” from the Pew Research Center, a nonpartisan think tank, that they might not otherwise have seen in their Facebook feed.
Kaliannan asserted that it’s “dangerous to show people things they disagree with,” suggesting that doing so could cause some people to dig their heels in and reject the information.
There is some evidence that appears to support this. Researchers at Dartmouth note that users who are exposed to media that contradicts their political views may question its credibility, thus reinforcing their own opinion in what is known as the “backfire effect.”
According to Kaliannan, Facebook should be held accountable for how it serves up news to users. But there is also an onus on users to be mindful of the news sources they choose to consume.
“This is not a problem that can be solved entirely by software,” Kaliannan said. “People need to understand and they need to be taught that seeking out a diverse set of opinions is a constructive way to broaden your world view.”