By Sarah Frier, Bloomberg News
WWR Article Summary (tl;dr): A fundamental question will be asked this week when internet companies testify in front of congressional committees: How responsible should Facebook, Google and Twitter Inc. be for information others distribute through their systems?
Facebook Inc.'s strategy to stamp out fake news is struggling.
The company outsources the process to third-party fact checkers who can only tackle a small fraction of the bogus news that floods the social network, according to interviews with people involved in the process. And screenshots obtained by Bloomberg reveal a process that some partners say is too cumbersome and inefficient to stop misinformation duplicating and spreading.
"There is no silver bullet," Facebook said in a statement. "This is part of a multi-pronged approach to combating false news. We have seen real progress in our efforts so far, but are not nearly done yet."
The flaws highlight a fundamental question that will be asked this week when internet companies testify in front of congressional committees: How responsible should Facebook, Google and Twitter Inc. be for information others distribute through their systems?
Facebook started noticing fake stories trending on its network as early as the summer of 2016, but the company took a long time to accept any responsibility.
A few days after President Donald Trump's November election win, Facebook Chief Executive Officer Mark Zuckerberg said it was "crazy" to think fake news had swayed voters. But as it became clear that some fake political stories garnered more traffic on Facebook than work from traditional outlets, criticism of Zuckerberg's stance mounted.
After reflecting on the problem, he said he would prioritize fixing it. His main solution has been the fact-checking effort.
In early 2017, Facebook contracted for one year with PolitiFact, Snopes, ABC News, factcheck.org and the Associated Press to sniff out fake news on its social network.
The company argued that paying outside firms helped address the problem without making Facebook the arbiter of what is true or untrue. Some critics say the company wants to avoid this responsibility because that could make it subject to more regulation and potentially less profitable, like media firms.
A previous Facebook effort to hire people to curate articles was criticized as biased and the company's artificial intelligence systems aren't yet smart enough to determine what's suspicious on their own. However, an inside look at Facebook's fact-checking operation suggests that the small-scale, human approach is unlikely to control a problem that's still growing and spreading globally.
When enough Facebook users say an article may be false, the story ends up on a dashboard accessible by the fact-checking staff at the five organizations, according to screenshots obtained by Bloomberg that showed a rash of bogus news.
A list of questionable stories appears in Facebook's signature dark blue font, accessible only after the organizations' journalists log into their personal social-media accounts.
"LeBron James Will Never Play Again," according to Channel 23 News. "BOMBSHELL: Trey Gowdy Just Got What He Needed To Put OBAMA IN JAIL," said dailyworldinformation.com. "Four Vegas Witnesses Now Dead or Disappeared," claimed puppetstringnews.com.
A column to the right of the articles shows how popular they were among Facebook's 2 billion users, according to the screenshots. In the next column over, fact-checkers can mark each story "true," "false," or "not disputed," providing a link to a piece on their own websites that explains the reasoning behind the decision.
The fact-checking sites sometimes have to debunk the same story multiple times. There's no room for nuance, and it's unclear how effectively they're addressing the overall problem, workers for the fact-checking groups said in interviews. They have time to tackle only a small fraction of the articles in their Facebook lists, the people added. They asked not to be identified discussing private activity.
Once two of the fact-checking organizations mark an article as false, a "disputed" tag is added to the story in Facebook's News Feed. That typically cuts the number of people seeing the piece by 80 percent, Facebook said recently. But the process often takes more than three days, the company said.
"It might be even longer, honestly," said Aaron Sharockman, executive director of PolitiFact. "Everyone wishes for more transparency as to the impact of this tool." The group has marked about 2,000 links on Facebook as false so far, but he said he's never personally seen a "disputed" tag from this work on the social network.
PolitiFact, known for fact-checking politicians based on what they say in speeches, rates their comments on a scale from "true" to "pants on fire," as in "liar, liar."
Before the election, the organization mostly steered away from obviously false news or hoaxes, assuming reasonable people would see a story about, say, the Pope endorsing Donald Trump and understand that it was clickbait.
But when it became clear that fake stories were going viral and gaining traction with people who may have been predisposed to believe them, PolitiFact expanded its focus.
Non-political examples illustrate the new world of bogus news that PolitiFact is dealing with on Facebook.
In recent weeks, there's been a surge of stories about celebrities moving to small towns. In one, Bill Murray's car breaks down in Marion, Ohio; he's charmed by the locals and resolves to retire there. That story was repeated for many other towns, and there are similar stories about Tom Hanks and Harrison Ford.
PolitiFact wrote one article titled "No, a celebrity's car didn't break down in your hometown," then rated all those pieces "pants on fire." On the Facebook dashboard, a PolitiFact employee had to manually mark each of these stories as false.
"There are whole hosts of copycats that spread a story," Sharockman said. "By the time we've done that process it's probably living in 20 other places in some way, shape or form."
Handling the Facebook dashboard is a good job for the interns, he added. Sharockman declined to discuss the mechanics of the dashboard, saying PolitiFact's deal with the company limits what he can say.
Out of hundreds of potentially false stories a day, many of which are duplicates, the five fact-checking organizations only have time to address a fraction.
An employee of one group said they aim to debunk five a day; another person targets 10 a week and estimated that the entire program may debunk 100 stories a month, including duplicates.
Facebook confirmed it sends hundreds of dubious stories to fact checkers daily, but it wouldn't say how many get corrected.
Facebook expects this manual fact-checking work to help the company improve its algorithm over time, so it can get smarter at automatically spotting patterns and figuring out what stories might be worth showing to human partners, even before they're flagged by users.
Facebook also plans to extend its contracts beyond the first year. The deals currently offer about $100,000 annually to some sites, while others do it for free, according to a person familiar with the matter.
Facebook is also working on adding two new partners to help with the workload. One is the conservative magazine the Weekly Standard, said the person, who declined to be named because the information isn't public.
To be a fact checker, an organization has to sign a code of principles that includes a commitment to be neutral.
The Weekly Standard hadn't been verified as a signatory as of Friday, according to Alexios Mantzarlis, head of the International Fact-Checking Network at The Poynter Institute, which produced the code.
If Facebook truly wants to stamp out fake news, it should fund an in-house group of fact-checkers, Mantzarlis has argued. Facebook executives reject that idea.
Security chief Alex Stamos warned earlier this month that a technology company in charge of the facts would create a "Ministry of Truth," referring to the propaganda machine in George Orwell's novel "1984."