An embarrassing blunder in Facebook’s Trending section demonstrates the value of human editors and the shortcomings of algorithms in evaluating news and social media content.
Last Friday, Facebook terminated the human editors of its Trending topics section and replaced them with an algorithm overseen by engineers. Within days, Trending headlined a fake news story claiming Fox News personality Megyn Kelly had been fired for supporting Hillary Clinton. Driven by Twitter mentions, the fake article remained the top Trending post for several hours, according to the Washington Post.
Previously, editors wrote descriptions and short article summaries and enforced quality control, removing hoaxes and obviously false articles. Now, the algorithm pulls excerpts directly from news stories.
While Facebook maintains it always intended to move to automation, accusations that its editors were biased against conservative topics prompted the company to speed up the transition and lay off the editors, 15 to 18 in all, according to Quartz. Conservatives alleged that the Facebook editors replaced conservative posts with more liberal ones. Facebook said its investigation found no evidence of systematic bias.
Facebook Loses Face
Not surprisingly, commentators criticized Facebook and its engineers for not spotting the fake story and for allowing it to remain on the list for hours. The numerous typos in the post, its dubious source, and the outrageous headline would have alerted almost any editor to a potential hoax.
“There were so many problems with this story, ranging from plagiarism to falsity, that even a fairly simple-minded robot editor should have caught them,” comments Annalee Newitz, editor at Ars Technica. “The Trending algorithm is clearly not ready for prime time, or maybe Facebook is just trying to redefine what it calls ‘a breadth of ideas and commentary about a variety of topics.’”
Some commentators urged Facebook to rehire human editors. Others said it should eliminate the Trending section altogether. The Kelly story wasn’t the first Trending blunder; the section has suffered ongoing problems with spam, errors and links to questionable sources.
The Problem with Algorithms
So far, at least, Facebook shows no sign of abandoning its deep-seated faith in algorithms. But some observers noted that algorithms can also be biased. In fact, human bias can be embedded in the formulas themselves and is difficult to remove. Given that Facebook employed the editors to “train” the algorithm, bias may already be ingrained, Quartz warned.
The lack of journalistic review means more than the occasional embarrassing gaffe. Facebook has become the primary news source for millions of people, and spreading misinformation, a common tactic of partisan extremists, is becoming easier as a result of Facebook’s changes.
With fewer human gatekeepers, marketers will also try to game the algorithm, tying their messages to current events to gain exposure in Trending topics. “In the past, there’d be no chance of getting past the editorial gatekeepers, but now, with the new system in play, it seems that the gate is a little more open for marketers to creep through,” wrote Andrew Hutchinson at Social Media Today.
Note, however, that Facebook has said that the Trending feature would be more automated, not fully automated. “There are still people involved in this process to ensure that the topics that appear in Trending remain high-quality,” the Facebook announcement said. The implication is that marketers and PR folks will still find it difficult to manipulate the Trending feature, especially since the Megyn Kelly hoax has already embarrassed Facebook.
Nonetheless, Facebook, with its vast programming resources, has been unable to develop an algorithm that evaluates news and social media content with foolproof accuracy. Algorithms that assess sentiment in media mentions have even greater shortcomings. Human editors, while not flawless, remain the best “tool” for evaluating content.
Bottom Line: A fake news story in Facebook’s Trending section demonstrates in a very public way the fallibility of algorithms and the value of human editors. Because so many people obtain their news almost exclusively from social media, the hoax also highlights the disturbing power of algorithms in shaping public opinion.
William J. Comcowich founded and served as CEO of CyberAlert LLC, the predecessor of Glean.info. He is currently serving as Interim CEO and member of the Board of Directors. Glean.info provides customized media monitoring, media measurement and analytics solutions across all types of traditional and social media.