Shortly before Facebook held its blockbuster public offering in 2012, one of its top execs attempted to refute a nagging question about the company.
No, it wasn’t about whether Facebook could make money off of smaller screen devices, or continue its streak of limitless user growth. The thorny issue was whether Facebook should be considered a media company.
“We actually define ourselves as a technology company,” Carolyn Everson, VP of global marketing solutions at Facebook, said at an event that year after an advertising exec had called it a media business. “Media companies are known for the content that they create.”
Four years later, that dividing line between media and tech is fuzzier than ever, as Facebook and its peers increasingly package, host and, yes, even create some news and entertainment content. Yet Facebook appears to be hellbent on clinging to its tech identity.
This week, we saw more clearly than ever how Facebook’s refusal to be seen as a media company comes at its own peril.
Facebook endured a prolonged outcry over allegations that contractors hired to help curate Trending Topics had inappropriately downgraded certain conservative news topics. That particularly explosive charge has yet to be proven, but in its attempts to ease concerns about systemic editorial bias, Facebook unintentionally kick-started a new line of criticism.
The company released its editorial guidelines for the first time, which clearly show that human editors play a greater role in selecting the most important news stories at Facebook. That quickly led to charges that Facebook has been misleading the public about the role that humans, not just algorithms, play in picking the news.
Much of this could have been avoided if Facebook had been more upfront and transparent years ago about its reliance on human editors. Few would have been shocked to hear Facebook needed real people to help surface interesting articles.
The problem wasn’t the approach, or even the guidelines for the approach, which were actually far more detailed and thought out than what you’d find for curators at many news outlets. The problem was Facebook’s caginess about its own methods, which suggested it had something to hide. Because it did.
Move fast and break news
Sources at Facebook have long been skittish and vague in conversations with me about the role of people in an editorial capacity. It was easier to focus on the whims of a cold, unfeeling algorithm. Saying more would have meant leaning in to its alter ego as a media company: one with editors, editorial guidelines and, yes, occasionally editorial biases, even if it’s simply a bias against certain types of garbage content.
Indeed, The Guardian reports that Facebook recognized the need for human editors in 2014 partly because “fluffy” viral stories about the Ice Bucket Challenge were gaining more exposure on the site than news about the riots in Ferguson.
Facebook wanted to prove that it, like Twitter, could be a destination for hard news. With enough transparency, the hiring of a public editor or other watchdog, and the right guidelines, which it seems to have had in place, that would have been viewed by the media as a difficult but commendable goal.
Now Facebook’s Trending Topics project is tarnished with controversy. Worse still: it has re-ignited trust issues with Facebook’s other news efforts, whether it be the much more influential News Feed, or more recent projects like Instant Articles and Facebook Live.
Facebook wants to build what founder Mark Zuckerberg has called “the best personalized newspaper in the world.” But until now it has done so while reflexively shying away from being viewed as a media company and putting in place the additional safeguards one would expect from a media company.
“They are the world’s #1 source of news. Therefore there is a unique set of responsibilities incumbent upon them,” David Kirkpatrick, author of The Facebook Effect, the definitive history of the company’s early days, said in an interview this week.
“I think to some degree they are still a little immature as a company to even know how to deal with that,” he continued. “Being a news source and having editors is a relatively new thing.”
Be like Snapchat and… Yahoo?
It may be a new problem for Facebook to solve, but this week’s events are a reminder that it needs to move fast in finding a solution.
On Thursday night, after staying silent on the controversy for days, Zuckerberg noted in a Facebook post that he cares about fixing the issue because it is “core” to the company. He meant it in the sense that Facebook must stay an “open” platform for conversation, but it also touches on what is increasingly becoming Facebook’s new core: media.
Facebook is devoting tremendous resources to pushing live video on its platform, including paying some publishers for video content and courting celebrities. It is partnering with an ad agency on a branded morning show. It is hosting more and more news in the form of Instant Articles. And it considered buying rights to stream NFL games, a deal ultimately won by another tech company that has claimed not to be a media company, Twitter.
The road ahead for Facebook is clear. It will be home to more exclusive video content, some of which it pays partners for and some of which it produces with partners. It wants to be a go-to destination for curated news selected and packaged by human editors to compete with the likes of Twitter. And it wants to do all that while pretending to be nothing more than a technology company.
Facebook would do well to go the route of Snapchat or Yahoo, both of which have embraced their dual roles as tech and media companies. Snapchat isn’t shy about exercising editorial discretion and picking favorites, whether it be its selective Discover media partners or its curation of user content in stories. Yahoo may be lampooned for not being able to decide whether it’s tech or media, but at least it doesn’t hide the latter.
Some unsolicited advice, Facebook: Make your editorial staff known with a masthead. Publish similar editorial guidelines for each of your other media products. And bring on a public editor to help keep the staff honest and the public informed.
If you’re lucky, it may just limit the amount of bad news in the future.