Facebook’s attempt to cut out fake or — as the social network puts it — “disputed” news has been in vain so far, according to a new study from Yale University.
It shows that flagging fake news on the platform does little to convince readers of its inauthenticity. David Rand, associate professor of psychology at Yale, told TheWrap his study found little evidence that the social network's flagging feature makes much of a difference.
“The big take home for us, in terms of getting people to not believe fake news or to be better at telling fake news from real news, is it seems like it’s probably not really effective,” Rand said of the Facebook feature.
The academic pointed to two reasons. First, the impact of flagging was "pretty small," a 3.7 percent decrease in "perceived accuracy" for articles carrying the disputed label. Second, the labeling process backfired in a way: articles without the disputed tag were assumed to be true by default, even when they were false.
“We call this the ‘implied truth effect,’” said Rand. “Because if you tag some stories, some people will assume all of the untagged stories — rather than being stories that haven’t been checked yet — they will assume they’re stories that have been checked and verified.”
Facebook started flagging posts in March, after the fallout of the 2016 election and claims of Russian meddling. Users can click the upper right-hand corner of a news article to flag it; Facebook also uses algorithms to spot sham posts. Questionable posts are sent to fact-checking organizations like The Poynter Institute for Media Studies. If a story is confirmed as fake, the post receives the "disputed" tag and an attached explanation, and it is bumped down in the News Feed, a death blow for eyeballs and monetization.
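That workflow reduces to a simple pipeline: reports and algorithmic detection feed a fact-checking queue, and a confirmed fake gets labeled and demoted. The Python sketch below is purely illustrative; the names (Post, needs_fact_check, apply_verdict) and the threshold and penalty values are assumptions for the sake of the example, not anything Facebook has published.

```python
from dataclasses import dataclass
from typing import Optional

DOWNRANK_FACTOR = 0.2  # hypothetical ranking penalty for disputed posts

@dataclass
class Post:
    url: str
    rank_score: float            # baseline News Feed ranking score
    user_flags: int = 0          # reader reports via the corner menu
    algo_flagged: bool = False   # caught by the platform's own detection
    label: Optional[str] = None  # no label until fact-checkers weigh in

def needs_fact_check(post: Post, flag_threshold: int = 10) -> bool:
    """A post enters the fact-checking queue if enough readers flag it
    or automated detection marks it as a likely sham."""
    return post.algo_flagged or post.user_flags >= flag_threshold

def apply_verdict(post: Post, confirmed_fake: bool) -> None:
    """If fact-checkers confirm the story is fake, attach the
    'disputed' tag and demote the post in the feed ranking."""
    if confirmed_fake:
        post.label = "disputed"
        post.rank_score *= DOWNRANK_FACTOR
```

Note that in this scheme a post that never reaches the queue, or is checked and cleared, ends up looking identical to readers: no label at all. That ambiguity is exactly what Rand's "implied truth effect" describes.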
(Facebook did not immediately respond to TheWrap’s request for comment on additional measures it takes to combat fake news.)
Rand and his partner on the study, Professor Gordon Pennycook, tested whether adding large banners beneath each article, emblazoned with the logos of news outlets, increased belief in a post. They compared responses to a National Public Radio logo against a fabricated outlet, "Freedom Daily." The effect, for both real and fake stories, was negligible.
"Fascinatingly and very surprisingly to us, to be honest, is it didn't seem to do anything," said Rand of inserting the banners. "Adding those logos didn't influence accuracy judgment at all."
This was telling, according to Rand, because it cut against psychological research showing that outlets perceived as more credible are usually judged more accurate. "The fact that we don't see that effect here suggests that people aren't really seeing traditional mainstream outlets as more credible than whatever random website."
Among the more than 7,500 people surveyed in the study, the problem was most pronounced for supporters of President Trump and for people ages 18 to 25. The data showed Trump supporters are more likely to believe fake news than Hillary Clinton supporters, and more likely to write off established news sites as deceptive or downright false.
"We find that Clinton supporters are more likely to stop and think, whereas Trump supporters are more likely to go with their gut," said Rand. "And this tendency to go with your gut explains part of why Trump supporters are worse at telling fake news from real news."
Rand said it also doesn’t help that Trump is “very explicitly on the offensive against real news.”
The results of the study do little to inspire confidence in the effectiveness of flagging fake news. But Rand doesn't think Facebook's attempts are futile. He argues that adding a third label, so that every story carries some tag, would mitigate the "implied truth effect," in which the absence of a label grants a post credence.
"One thing that would make this fact-checking more effective, I think, is if not only flagging disputed stories as 'disputed,' they basically put little flags on all stories that either said 'disputed by third-party fact checkers,' 'confirmed by third-party fact checkers,' or 'not yet checked.'"
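To make the proposal concrete, here is a minimal sketch of such a three-state scheme; the type and function names are hypothetical, and the label strings simply mirror Rand's wording. The key property is that no unlabeled state exists for readers to misread as an implicit endorsement.

```python
from enum import Enum

class CheckStatus(Enum):
    """One of three labels on every story, so the absence of a tag
    can no longer be mistaken for 'checked and verified.'"""
    DISPUTED = "disputed by third-party fact checkers"
    CONFIRMED = "confirmed by third-party fact checkers"
    UNCHECKED = "not yet checked"

def label_story(reviewed: bool, found_fake: bool = False) -> CheckStatus:
    """Default every story to UNCHECKED until a fact-checker reviews
    it, then resolve it to DISPUTED or CONFIRMED."""
    if not reviewed:
        return CheckStatus.UNCHECKED
    return CheckStatus.DISPUTED if found_fake else CheckStatus.CONFIRMED
```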
With the 2018 midterm elections on the horizon, not to mention the 2020 presidential race, readers will need to scan their news feeds with a discerning eye. That task will be easier if Facebook can continue to sharpen its weapons in the fight against fake news.