Better late than never
After an election season infused with hoax stories, more than a month of global outcry and at least one real-world incident of gun violence in the U.S., Facebook is finally stepping up to combat the spread of fake news.
The social network will work with fact-checking outlets to label fake stories, flagged by users, as “disputed,” Adam Mosseri, a vice president in charge of the news feed, announced Thursday in a press release.
Now before sharing a fake story on the site, you’ll get a warning that its accuracy has been “disputed.” To find out why, you’ll be able to click a link for a fact-check of the article.
The updates are rolling out Thursday, so you won’t see the disputed flags right away, but expect them shortly.
Facebook also announced several other steps that chief executive Mark Zuckerberg had hinted at in a post last month.
The moves come after some full-throated denials from Zuckerberg ― in the wake of Donald Trump’s Election Day victory ― that fake news was even a serious issue.
It’s too early to know whether these efforts will be effective in combating the problem, which is perhaps larger in scope than many initially realized.
“The company is finally coming to terms with the fake news problem,” Alexios Mantzarlis, director of the International Fact-Checking Network at Poynter, told The Huffington Post. Mantzarlis, who has been writing about this issue for months, says that in recent weeks Facebook told him of its intention to partner with members of Poynter’s network ― sites like Snopes and FactCheck.org ― who adhere to a clear fact-checking code of principles.
Those principles include a commitment to nonpartisanship, fairness and transparency of sources. The Poynter group is a nonprofit that receives funding from the Bill and Melinda Gates Foundation, Google, the National Endowment for Democracy and others.
The Thursday announcement comes after a rash of hoax stories shared by pro-Trump websites were widely read in the run-up to the presidential election. Indeed, a BuzzFeed analysis found that in the three months before the U.S. voted, election-related fake news stories generated higher rates of Facebook engagement (likes and shares) than the top election stories from 19 major news sites. Other reporting from BuzzFeed showed that it was the anti-Hillary Clinton, pro-Trump fake stories (she’s a murderer, she runs a child sex-trafficking ring out of a pizza shop) that grabbed the most attention.
And Facebook attention means a massive audience. While other sites, including Reddit and Twitter, have made some moves recently to combat fake news and vile trolling, Facebook is the juggernaut.
With its 1.8 billion monthly users, Facebook is now a critical platform for news distribution. News websites get much of their traffic from Facebook, as internet users switch from visiting home pages on desktops to using apps on their mobile devices. Forty-four percent of American adults get news on Facebook, according to Pew data.
“A fake story claiming Pope Francis — actually a refugee advocate — endorsed Mr. Trump was shared almost a million times, likely visible to tens of millions,” Zeynep Tufekci, an associate professor at the University of North Carolina, told The New York Times. “Its correction was barely heard. Of course Facebook had significant influence in this last election’s outcome.”
Reporters have found that fake stories propagated by pro-Trump websites went increasingly viral in the months leading to the U.S. presidential election, possibly influencing voters ― and widely shared posts have incited violence around the world, even leading to the beating death of one woman in Brazil, according to the Poynter group.
Initially, Zuckerberg dismissed out of hand the notion that fake news influenced the election, calling it a “pretty crazy idea.” He barely acknowledged that fake news was a problem at all.
However, Facebook employees themselves were taking the idea very seriously, The New York Times reported.
Later in November, Zuckerberg, under pressure from a rising public outcry, appeared to reverse himself, writing in a post that the company would start looking at ways to fight fake news.
The changes announced Thursday ― previewed in that post ― signal how far the company has evolved on this issue.
This marks the first time Facebook has worked with outside organizations on the content in its news feed.
It’s worth emphasizing that the social network is deliberately not making itself the arbiter of what’s real or fake news. The company has repeatedly insisted it’s not a media outlet ― and is not responsible for the content of the stories and statuses that appear in the feed.
Instead, Facebook considers itself an agnostic platform for sharing news. “We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties,” Zuckerberg said in his November post.
A spokesperson did acknowledge on Thursday that the company bears some responsibility for stories that spread through Facebook. However, the announcement makes clear just how hesitant the company is on the practice of judging content or being seen as restricting speech.
“We believe in giving people a voice and that we cannot become arbiters of truth ourselves,” Mosseri writes. “We’re approaching this problem carefully.”
The site will not ban links. “That’s not an area we want to get into,” the spokesperson said.
Members of Poynter’s fact-checking network will review articles that have been most frequently flagged by users as fake. The aim is to quash the stories getting the most views and traffic.
The fact-checking group wrote an open letter to Facebook last month urging it to take the problem of fake news seriously.
Last month, following Google’s lead, Facebook took the very minor step of restricting ads from dubious sources. However, ads aren’t the real problem on Facebook. It’s the stories people share freely, driving huge amounts of traffic to bogus websites that profit from the volume of readers.
Facebook said it’s primarily concerned with the flat-out made-up articles (like the pizza-place pedophile story) and pieces produced by hoax sites (think abcnews.com.co), rather than articles from outlets that may take a progressive or conservative angle on the news.
The company will block disputed stories from getting paid promotion ― a practice that allows websites to buy increased circulation, placing promoted stories in your news feed along with items shared by friends.
The site will also lower the rank of content that is clicked on but not shared ― a likely sign of poor quality. Think of it like trying a sample at Costco, but not buying the food, the spokesperson said.
Finally, Facebook said it will further crack down on dubious websites that try to advertise on the social network.
The fake news issue has roiled the internet and the real world.
President Barack Obama took time to assail fake news in speeches and in a late November interview with Rolling Stone’s Jann Wenner. It’s a problem when an article on climate change by a Nobel prize winner looks as credible as something written by a guy “in his underwear in the basement,” Obama said. “In an Internet era where we still value a free press and we don’t want censorship of the Internet, that’s a hard problem to solve.”
Indeed, the problem of hoax news seemed to reach a crisis point two weeks ago when a gunman walked into a Washington, D.C., pizza shop looking to bust up a (nonexistent) pedophile sex ring, supposedly run by the Clintons, that had been written about on hoax websites.
“This is not about politics or partisanship. Lives are at risk, lives of ordinary people just trying to go about their days to do their jobs, contribute to their communities,” Hillary Clinton said in a speech at the Capitol last week. “It is a danger that must be addressed and addressed quickly.”
Trump and his circle have not been vocal on this issue. Indeed, the son of Michael Flynn, Trump’s incoming national security adviser, himself tweeted the pizza shop conspiracy story.