12 May 2015

Facebook says the political echo chamber is your fault


(Facebook lit up the London Eye as a pie chart during the General Election 2015)

If your news feed seemed full of support for, and content related to, the party of your choice in the lead-up to the UK General Election, then it's probably your own fault, according to Facebook.

A recent article by Michael Rundle for Wired talks about how, for years, there’s been controversy over the way Facebook have managed the content you see on your news feed. Once a basic chronological feed of posts from your friends, the news feed has become an algorithmically structured selection of content based on what you have and haven’t clicked on over time. The idea is to give us content that is more likely to be engaging.
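To make the shift from a chronological feed to an engagement-ranked one concrete, here is a minimal toy sketch in Python. Everything in it – the topics, the click counts, the scoring rule – is hypothetical and is not Facebook's actual algorithm; it simply illustrates the general idea of sorting posts by past engagement rather than by time.

```python
# Illustrative toy model only - not Facebook's actual algorithm.
# Posts are scored by how often the user has engaged with their topic
# before, then sorted by that score instead of by recency.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    timestamp: int  # higher = newer

# Hypothetical engagement history: how often this user clicked each topic.
click_history = {"party_a": 42, "party_b": 3, "sport": 17}

def engagement_score(post: Post) -> float:
    """Score a post by the user's past clicks on its topic."""
    return float(click_history.get(post.topic, 0))

posts = [
    Post("Alice", "party_a", 3),
    Post("Bob", "party_b", 2),
    Post("Carol", "sport", 1),
]

chronological_feed = sorted(posts, key=lambda p: p.timestamp, reverse=True)
curated_feed = sorted(posts, key=engagement_score, reverse=True)

# The curated feed pushes 'party_a' content to the top because that is what
# the user has clicked on most - the seed of the echo chamber effect.
print([p.topic for p in curated_feed])  # ['party_a', 'sport', 'party_b']
```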

A recent study by the social network highlights that a person's activity, and who they follow, determine the political tone of the content they see in their news feed. This, in turn, creates a bit of an 'echo chamber' of posts that reflect only a narrow selection of views, opinions and attitudes.

“…if algorithms are going to curate the world for us, then… we need to make sure that they also show us things that are uncomfortable or challenging or important.”

Eli Pariser, CEO of Upworthy

To get an idea of the scale of this bias, Facebook conducted the study with 10.1 million users. Only those who had stated a political affiliation were considered, and they made up less than one in ten of the segment.

What they found is that Facebook's algorithm does curate information, showing around 8% fewer posts from the opposite end of the political spectrum than a randomly selected feed would. If the algorithm did no curating at all, however, that difference would shrink by only about 1%.

In a nutshell, Facebook are saying they haven't done much at all – that we do most of the content curation ourselves by consistently picking and choosing what we consume.

But if there had never been any curation in the first place, would things not look vastly different? Would we hold the same opinions and views of the world that we do now, were it not for Facebook's news feed algorithm?

To read the full article by Michael Rundle, simply click here.