Facebook refuses to remove fake news, but will demote it

Forget about getting rid of fake news, Facebook said on Thursday. It might be raw sewage, but hey, even raw sewage has a right to flow, right?

In the name of free speech, Facebook said, it’s keeping all the bilge water, be it pumped out by the right or left… though the platform intends to push fakery down deeper into the holding tank by demoting it.

As Facebook explained in a tweet announcing the policy, demotion translates into an 80% loss of future views, and the punishment extends to Pages and domains that repeatedly share bogus news.

This latest fake-news spasm comes on the heels of an event Facebook held in New York on Wednesday that blew up in its face. Journalists got to feed on shrimp cocktail, listen to a short presentation, and then engage in a question-and-answer session, all in the name of convincing the press that the social media network has finally established some kind of beachhead in the war against disinformation.

Facebook’s effort fell apart when CNN reporter Oliver Darcy began to grill Facebook Head of News Feed John Hegeman about its decision to allow Alex Jones’ conspiracy news site InfoWars on its platform.

How, Darcy asked, can the company claim to be serious about tackling the problem of misinformation online while simultaneously allowing InfoWars to maintain a page with nearly one million followers on the platform?

Hegeman’s reply: the company does not take down false news.

CNN quoted Hegeman’s rationalization:

I guess just for being false that doesn’t violate the community standards. [InfoWars hasn’t] violated something that would result in them being taken down.

I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view.

InfoWars is the site where conspiracy theorist Alex Jones airs his notions: notions that include labelling as “liars” the grieving families of children gunned down in school shootings such as that at Sandy Hook Elementary School. In YouTube videos, Jones has over the years said that the Sandy Hook shooting has “inside job written all over it,” has called the shooting “synthetic, completely fake, with actors, in my view, manufactured,” has claimed “the whole thing was fake,” said that the massacre was “staged,” called it a “giant hoax,” and suggested that some victims’ parents lied about seeing their dead children.

Sandy Hook is only one of his many focuses: earlier this year, InfoWars smeared student survivors of the Parkland, Florida shooting with baseless attacks, portraying them in one video as actors, just as he’s classified Sandy Hook victims as child actors. Most recently, InfoWars has pushed an unfounded conspiracy theory about how Democrats, “infuriated” by President Trump “bringing America back,” planned to start a civil war on 4 July.

Facebook isn’t the only social media platform that hosts this type of gunk while declaring that it passes muster with regard to community standards. Google, just like Facebook, considers Jones’ YouTube rants to be kosher as far as community standards go.

That, in spite of multiple defamation lawsuits having recently been filed against Jones by Sandy Hook parents. Those parents claim that Jones’s “repeated lies and conspiratorial ravings” have led to death threats, among other trauma. Another lawsuit has been filed against Jones by a man whom InfoWars incorrectly identified as the Parkland school shooter.

A bit of recent history regarding Facebook and its wrangling with fake news: In April, Facebook started putting some context around the sources of news stories. That includes all news stories: the sources with good reputations, the junk factories, and the junk-churning bot armies making money from it.

You might also recall that in March 2017, Facebook started slapping “disputed” flags on what its panel of fact-checkers deemed fishy news.

As it happened, these flags just made things worse. The flags did nothing to stop the spread of fake news, instead only causing traffic to some disputed stories to skyrocket as a backlash to what some groups saw as an attempt to bury “the truth”.

When Darcy pressed Facebook’s reps with more questions about the company’s tolerance of InfoWars at the press event on Wednesday, Sara Su, a Facebook product specialist for News Feed, said that Facebook is choosing to focus on tackling posts that can be proven demonstrably false:

There’s a ton of stuff – conspiracy theories, misleading claims, cherry picking – that we know can be really problematic, and it bugs me, too. But we need to figure out a way to really define that in a clear way, and then figure out what our policy and our product positions are about that.

Facebook spokeswoman Lauren Svensson followed up with Darcy after the event, telling him that questions about InfoWars hit “on a very real tension” at Facebook, and that demoting fakery seems to strike the right kind of balance:

In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.

That said, while sharing fake news doesn’t violate our Community Standards set of policies, we do have strategies in place to deal with actors who repeatedly share false news. If content from a Page or domain is repeatedly given a ‘false’ rating from our third-party fact-checkers …we remove their monetization and advertising privileges to cut off financial incentives, and dramatically reduce the distribution of all of their Page-level or domain-level content on Facebook.

It’s not as if the platforms aren’t pushing back at all against outrageous material such as that doled out by InfoWars.

At YouTube, Jones’s channel got its first strike on 23 February for a video that suggested that David Hogg and other student survivors of the Parkland mass shooting were crisis actors. The video, “David Hogg Can’t Remember His Lines In TV Interview,” was removed for violating YouTube’s policies on bullying and harassment.

The second strike was on a video that was also about the Parkland shooting. The consequence of getting two strikes within three months was a two-week suspension during which the account couldn’t post new content. A third strike within three months would mean InfoWars would get banned from YouTube. At the time, InfoWars had more than two million YouTube subscribers.

It’s easy to argue that Facebook and Google are reluctant to poke the hornets’ nest when it comes to groups of users known to be volatile – that has certainly applied to InfoWars followers in the past – but at the end of the day, the social media companies are still just beginning to figure out how to police their massive amounts of user-generated content.

With any set of community guidelines that has to play catch-up with current events – after all, “Sandy Hook parents” aren’t a named category when it comes to protected groups in hate speech guidelines – we’re going to have to suffer the consequences of Facebook, et al., scrambling to make it up as they go along.

The will is undoubtedly there. But so are the ad dollars. Demoting content, Pages and domains is in service of the truth over reader engagement and marketing numbers. Is it enough to turn the tide?

Readers, your thoughts: does content demotion have any chance of making headway in this battle?