Taming Facebook's Fake News Problem

The Guardian:

Facebook has been accused of potentially swinging the election … by failing to acknowledge the fact that its algorithm was promoting fake news to millions of users. According to BuzzFeed News, more than 100 pro-Trump fake news sites were being run from a single Balkan town in the run-up to the election.

Cutting off the revenue to such sites by limiting the amount of money they can make from advertising may help limit their proliferation. But Facebook in particular faces a more fundamental issue given the ways in which its algorithm selects posts: if users engage more with fake news than real news, as seems possible, then Facebook’s algorithm will promote the fake news.


Tech2:

Called the B.S. Detector, this Chrome extension claims to identify and flag news that seems to be fake. The new project was released on Tuesday and can identify articles on Facebook that seem to be from a questionable source. When a user scrolls over an article that seems to be fake, a warning appears informing the user that the article may not be from a credible source.

“I built this in about an hour yesterday after reading [Mark Zuckerberg’s] BS about not being able to flag fake news sites. Of course you can. It just takes having a spine to call out nonsense. This is just a proof of concept at this point, but it works well enough,” said [extension creator Daniel] Sieradski.
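Sieradski's point is that the core mechanism is simple: compare each link's domain against a curated blacklist of questionable sources. Here is a minimal sketch of that check in JavaScript; the domain list, function name, and matching rules are illustrative assumptions, not the extension's actual code:

```javascript
// Illustrative blacklist -- B.S. Detector shipped a curated list of
// questionable domains; these entries are made-up placeholders.
const FLAGGED_DOMAINS = new Set([
  "example-fake-news.com",
  "totally-real-stories.net",
]);

// Return true if a link's hostname, or any parent domain of it,
// appears on the blacklist (so news.example-fake-news.com matches too).
function isQuestionableSource(url) {
  const hostname = new URL(url).hostname.replace(/^www\./, "");
  const parts = hostname.split(".");
  for (let i = 0; i < parts.length - 1; i++) {
    if (FLAGGED_DOMAINS.has(parts.slice(i).join("."))) return true;
  }
  return false;
}
```

In an extension, a content script would run this check over the page's anchors (re-scanning as Facebook's infinite scroll adds new posts, e.g. via a MutationObserver) and attach a warning element to any link that matches.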


Update: via The Verge:

Today, Google announced that its advertising tools will soon be closed to websites that promote fake news, a policy that could cut off revenue streams for publications that peddle hoaxes on platforms like Facebook. The decision comes at a critical time for the tech industry, whose key players have come under fire for not taking necessary steps to prevent fake news from proliferating across the web during the 2016 US election. It’s thought that, given the viral aspects of fake news, social networks and search engines were gamed by partisan bad actors intending to influence the outcome of the race.

"Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher's content, or the primary purpose of the web property," a Google spokesperson said in a statement given to Reuters. This policy includes fake news sites, the spokesperson confirmed. Google already prevents its AdSense program from being used by sites that promote violent videos and imagery, pornography, and hate speech.


Update 2: solid think piece from Slate:

People tend to read, like, and share stories that appeal to their emotions and play to their existing beliefs. Without robust countervailing forces favoring credibility and accuracy, Facebook’s news feed algorithm is bound to spread lies, especially those that serve to bolster people’s preconceived biases. And these falsehoods are bound to influence people’s thinking.

And yet, in the days following the election, as criticisms of the company mounted, Facebook CEO Mark Zuckerberg downplayed and denied the issue—a defensiveness that says even more about the company than the fake news scandal itself. Zuckerberg’s response points to a problem deeper than any bogus story, one that won’t be fixed by cutting some shady websites out of its advertising network. The problem is Facebook’s refusal to own up to its increasingly dominant role in the news media. It’s one that is unlikely to go away, even if the fake news does.