01 Sep 2016

Are algorithms always smarter than humans?

Human error is something we want to eliminate…right? Well, Facebook tried to do just that, and although they succeeded in removing the potential bias that accompanies any human involvement, they lost all sensibility and credibility in the process, as made apparent by their recent monumental Trending Topics screw-ups.

Facebook loves its algorithms and prides itself on ensuring users’ feeds are filled only with valuable content from relevant sources. So, in a bid to find the algorithm to rule them all, Facebook’s most recent change was to the algorithm curating Trending Topics (though it’s hard to speak of any curation to begin with). Until this Friday, Trending Topics was controlled by a human team, all of whom were swiftly fired following an earlier scandal claiming that the team had routinely suppressed conservative stories. Outraged at the misrepresentation, the public questioned the justice of having a select few people control topics with such immense reach and visibility as those on Facebook.

So what was Facebook’s solution? An algorithm, of course! Theoretically sound, it was meant to improve the Trending Topics section by getting rid of potential subjectivity. Unfortunately, what also vanished was the set of guidelines the previous team followed, such as limiting sources based on their credibility, ensuring the quality of the news, and keeping it ‘appropriate’. What they got was an algorithm; what they lost was human sensibility. Inevitably, a few days on, Facebook’s new algorithm became a sensation, though not exactly in the way they might have expected.

Random and wrong are two words that may spring to mind. So far, the Trending Topics algorithm has brought to the front of our news feeds incorrect information about the Fox News host Megyn Kelly being fired from the network (she wasn’t); a story about Rob Lowe calling Ann Coulter a ‘Racist C*nt’ in reference to the Comedy Central Roast; and a series of wrongly matched stories, such as Watch Dogs 2 linking us not to the video game but to a video where we can literally watch dogs. Facebook’s biggest McNightmare, however, came when #McChicken was brought to the world’s attention. If you happened to see it…I’m sorry. A video of a man masturbating with said McChicken sandwich doesn’t exactly represent the carefully sourced news Facebook claims to stand for.

With drastic changes such as replacing the entire team with bots, one might expect some mistakes. But what’s most interesting about the whole affair is that Facebook claims to have kept on some people to review Trending Topics and ensure they only refer to real-world news. Judging by #McChicken, I really don’t believe that’s the case.

On Friday, Facebook announced that the only people involved with Trending Topics from now on will be the engineers tweaking the algorithm (which might need more than tweaking).


A solution suggested by Harvard professor Jonathan Zittrain opted for a more ‘power to the people’ approach: letting users choose how their trending topics are picked, whether by an algorithm, by curation from ‘actual’ people, or by subject. Another approach would be to follow the flight path of Twitter’s Trending Topics in both form and, interestingly, lack of credibility, focusing only on the most talked-about subjects. Fake news, or even a man desecrating a sandwich: everything’s allowed and everything’s tolerated. Twitter has never attempted to provide people with relevant, accurate, or sensible trends, only popular ones…which is similar to what Facebook accidentally did.

Facebook made the mistake of promising its users accurate stories and then delivering the opposite. Hopefully it’ll be sorted out before we end up with #McChickenRound2 or an equally hideous Burger King equivalent. Ugh.