In case you missed it, last night Mark Zuckerberg published a response to accusations that fake news on Facebook influenced the outcome of the U.S. election, and helped Donald Trump to win.
The CEO claimed that at least 99% of news content on Facebook was authentic. Zuckerberg wrote:
Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.
Facebook boasts 1.79 billion monthly active users, and it generated $7.01 billion in revenue in the third quarter of 2016.
The company has not disclosed the volume of posts that were categorized as news and distributed through Facebook's News Feed during the months leading up to the election.
In its earnings reports, Facebook does not break out how much of its revenue comes from political advertising or the promotion of news posts. That makes it hard for the public to evaluate what impact even 1% of hoax news could have had on Facebook users who were eligible to vote in the U.S. election.
Questions remain as to whether hoaxes could have been so well-targeted that they did, in fact, sway the opinions of U.S. voters on candidates and issues.
We know that Facebook has the power to influence emotions, and has tested its own abilities in this regard through its 2012 Emotion Manipulation Experiment.
As TechCrunch reported when news of that experiment first broke, to impact users' moods, Facebook showed them fewer positive posts in their News Feed. As a result, users included 0.1% fewer positive words in their own posts, Facebook found.
Last night, Zuckerberg emphasized that Facebook currently relies on the wisdom and involvement of its users to flag hoaxes and fake news. He admitted the company could do more to improve the quality of information shared via its News Feed.
However, he also warned that the company would not rush to release new solutions around fact-checking or quality-rating news content on the platform.
This is an area where I believe we must proceed very carefully though. Identifying the truth is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
Zuckerberg's comment draws a false equivalence between mainstream sources of news (including TechCrunch) and political groups masquerading as news brands.
The Denver Guardian was one site that posed as a news publisher, bombarding readers with content full of misinformation meant to sway their opinions about candidates and issues on the ballot. Another group, based in Macedonia, had been posting fake news to Facebook's News Feed simply to make money.
Unfortunately, fake news circulated virtually everywhere online, including on Facebook, at a time when voters needed facts to inform their decisions.
Facebook may not even want to become an arbiter of truth, because doing so could reduce engagement.
As former Facebook designer Bobby Goodlatte wrote on his own Facebook wall on November 8th, "Sadly, News Feed optimizes for engagement. As we've learned in this election, bullshit is highly engaging."
Other social media players under fire for helping to spread false stories ahead of the U.S. election include Twitter and Reddit.
But unlike other social networks, Facebook can proudly claim that it helped 2 million people register to vote in this most recent election. What good is that, though, if those voters aren't effectively informed?