From Fake News to Real Murder: Facebook’s Incentive Problem

Fake news did not originate with Facebook, nor with the 2016 presidential campaign. Planting damaging stories of dubious provenance about a political opponent in the newspaper is a tradition nearly as old as newspapering itself. And spreading false rumors is as old as human society.

But as we saw in last year’s election, Facebook and other social media platforms have elevated merely spurious information into a weapon of mass dysfunction. During the final three months of the 2016 campaign, the top 20 fake news stories circulating on Facebook racked up 8,711,000 shares, reactions, and comments on the platform, including such classics as “Pope Endorses Donald Trump” (960,000), and “FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide” (560,000).

BuzzFeed, which compiled those data, notes that those 20 fake stories attracted nearly 1.5 million more instances of engagement than the 20 top-performing stories from 19 major news outlets over the same period. But the issue here isn’t so much real vs. fake as the role that Facebook’s massive scale played in encouraging the production of fake stories.

While some of the top fake news items were created by partisan outlets (or foreign intelligence agencies) for expressly political purposes, far more were created for profit, by people like Beqa Latsabidze, a 22-year-old computer science student in Tbilisi, Georgia.

“In Tbilisi, the two-room rented apartment Mr. Latsabidze shares with his younger brother is an unlikely offshore outpost of America’s fake news industry,” the New York Times reported shortly after the election. “They say they have no keen interest in politics themselves and initially placed bets across the American political spectrum and experimented with show business news, too.”

Their pro-Hillary Clinton site did not attract many readers and its made-up news stories rarely went viral, so the brothers shifted their focus to making up positive stories about Donald Trump, where they found a more avid and engaged audience. More engagement on Facebook meant more hits on Google, which translated into more ad impressions, which in turn translated into more revenue for the brothers.

“For me, this is all about income, nothing more,” Latsabidze told the Times. Had his pro-Clinton site taken off, he added, he would have pressed on with that.

Facebook is now taking steps to try to limit the amount of fake news on its platform. This week it shut down 30,000 fake accounts in France ahead of that country’s upcoming national elections, and is currently looking to hire a head of news products to help it deal with the problem. It’s also trying to teach its algorithm to better recognize fake news stories and either flag them or deprecate them in users’ news feeds.

But no algorithm can solve the underlying incentive problem created by Facebook’s sheer size. It’s so big, and its reach is so vast and indiscriminate, that reaching even a small percentage of Facebook users adds up to a substantial audience. You only need to fool some of the people some of the time on Facebook to make money peddling nonsense.

Facebook obviously didn’t set out to create a platform for peddling fake news. Nor is it the only social media platform with a fake news problem. But its size and scope make fake news profitable. And so long as that profit motive exists, entrepreneurs like Mr. Latsabidze will find ways to defeat whatever tweaks Facebook makes to its algorithm.

A far more chilling example of the unintended effects of Facebook’s ubiquity occurred on Easter Sunday, in Cleveland, when Steve Stephens broadcast himself on Facebook Live as he shot and killed a 74-year-old man, seemingly to make some sort of depraved point.

Stephens explained in a rambling narration to the video that he had just broken up with the “love of [his] life” and had recently lost everything gambling. “I’ve run out of options,” he could be heard saying. “Now I’m just doing some murder-type shit.” He then picked out his victim at random, forced the victim to say his ex’s name, and shot him dead.

Stephens eventually took his own life, after being cornered by police. Fortunately, he did not broadcast his suicide, but others have, along with beatings, torture, and rapes.

It is neither possible nor reasonable to try to pin responsibility for those horrific acts on Facebook; Facebook did not cause anyone to commit those crimes, any more than it compelled anyone to create fake news sites.

But for better or worse, Facebook is becoming the media platform of choice, by dint of its size, for the depraved as well as the decent. Act out on Facebook and you act out in front of the world. And it’s hard to see what Facebook can do to prevent that.

Facebook will try because it must. “We have a lot of work and we will keep doing all we can to prevent tragedies like this from happening,” CEO Mark Zuckerberg said at the company’s F8 developers conference this week. But scale is now the essence of Facebook’s business model, as Snapchat is now learning.

While good for Facebook’s share price in the short term, that scale has begun to generate forces that do not easily yield to algorithms. And in the wake of last year’s fake news controversy and this week’s very real horror show, it’s hard not to wonder whether Facebook is still in full control of its own platform.