Excellent short TED talk from Eli Pariser on the dangers of allowing personalisation of information to go uncontrolled. Most of the information we access online is filtered invisibly. As users, we have no real awareness of what information is selected for us to see – nor of what is removed before it ever reaches us.
Pariser gives a range of examples of this – from his own Facebook feed, which over time removed links he clicked on less frequently, to everyday news services such as Yahoo News and the Huffington Post. Both will show different users different news items, in different orders. Something out there is making decisions, based on our past online behaviour, to present a personalised but restricted service.
He states that Google's search algorithms examine over 50 factors – what kind of computer we're on, where we are, which browser we use – as well as our past online activity – to generate a set of search results specific to us. Two people can run a search at the same time, using the same words, and get completely different results – not just in what is included, but in the order the results are ranked.
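To make the mechanism concrete, here is a toy sketch of signal-based personalisation. This is emphatically not Google's actual algorithm – every signal, weight, and URL below is invented for illustration – but it shows how the same query can produce different rankings for different users once per-user signals enter the scoring:

```python
# Toy illustration of signal-based personalisation: two users issue the
# same query, but per-user signals (location, click history) re-weight
# the ranking. All signals, weights, and URLs here are invented.

def personalised_rank(results, user_signals):
    """Re-order results by base relevance plus per-user signal boosts."""
    def score(result):
        s = result["relevance"]
        # Boost pages matching the user's inferred location.
        if result.get("region") == user_signals.get("region"):
            s += 0.3
        # Boost topics the user has clicked on before.
        s += 0.2 * user_signals.get("topic_clicks", {}).get(result["topic"], 0)
        return s
    return sorted(results, key=score, reverse=True)

# Two equally relevant pages for the same query.
results = [
    {"url": "egypt-travel.example",   "topic": "travel",   "region": "UK", "relevance": 0.8},
    {"url": "egypt-protests.example", "topic": "politics", "region": "US", "relevance": 0.8},
]

alice = {"region": "UK", "topic_clicks": {"travel": 2}}
bob   = {"region": "US", "topic_clicks": {"politics": 3}}

# Same query data, different orderings per user.
print([r["url"] for r in personalised_rank(results, alice)])  # travel page first
print([r["url"] for r in personalised_rank(results, bob)])    # protests page first
```

The point of the sketch is that neither user ever sees the scoring happen: the boosts are applied silently, and each receives a ranking that looks like "the" results.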
So the internet has now begun to show us what it thinks we need to see, but not necessarily what we actually need to see. Pariser provides this unnerving quote from Eric Schmidt of Google: ‘It will be very hard for people to watch or consume something that has not in some sense been tailored for them’.
He refers to this as a filter bubble – our own personal, unique universe of information that we live in online, which will depend on who we are and what we do. The problem with this is that we don’t get to decide what gets in, or see what gets edited out.
The challenge lies with the algorithms that create these information filters. Algorithmic gatekeepers don’t seem to have the embedded ethics that a human editor would bring – making sure that we get some information vegetables along with our information dessert. Algorithms need to weigh not only the relevance of information to our interests but also the importance of exposing us to challenging perspectives. We need code developers to embed an ethical perspective – some sense of civic responsibility – into the code that they write.
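The "vegetables with dessert" idea can itself be expressed in code. Here is a minimal sketch of a diversity constraint layered on top of an engagement ranker – a purely illustrative design, with invented items and thresholds, showing one way a filter could be forced to reserve space for material outside a user's favourite category:

```python
# Toy sketch of a diversity constraint on top of an engagement ranker:
# guarantee that the top slots are not all drawn from the single
# category the user already favours. Purely illustrative.

def rank_with_diversity(items, top_k=3, min_other=1):
    """Rank by engagement score, but reserve at least `min_other` of the
    top_k slots for items outside the dominant (top-scoring) category."""
    ranked = sorted(items, key=lambda i: i["score"], reverse=True)
    top = ranked[:top_k]
    dominant = top[0]["category"]
    pool = [i for i in ranked[top_k:] if i["category"] != dominant]
    slot = top_k - 1
    # Swap lower slots for "challenging" items until the quota is met.
    while sum(i["category"] != dominant for i in top) < min_other and pool and slot >= 0:
        if top[slot]["category"] == dominant:
            top[slot] = pool.pop(0)
        slot -= 1
    return top

items = [
    {"title": "celebrity gossip A", "category": "entertainment", "score": 0.90},
    {"title": "celebrity gossip B", "category": "entertainment", "score": 0.85},
    {"title": "celebrity gossip C", "category": "entertainment", "score": 0.80},
    {"title": "election analysis",  "category": "news",          "score": 0.50},
]

top = rank_with_diversity(items)
print([i["title"] for i in top])  # the news item displaces one gossip item
```

A pure engagement ranker would fill all three slots with gossip; the constraint forces the lower-scoring news item into the last slot. Making a rule like this explicit – and visible – is exactly the kind of transparency the next point argues for.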
And information filters should be transparent – so that we can see what the rules are, and so that users regain at least some control over what gets through and what doesn’t.