Beware online filters #LAK12


Excellent short TED talk from Eli Pariser on the dangers of allowing the personalisation of information to go unchecked. Most of the information that we access online is filtered invisibly. As users, we have no real awareness of what information is selected for us to see, nor of what is removed from what is made available to us.

Pariser gives a range of examples of this, from his own Facebook feed, which over time removed the links he clicked on less frequently, to everyday news services such as Yahoo News and the Huffington Post. Both will show different users different news items, and in a different order. Something out there is making decisions, based on our historical online behaviours, to present a personalised but restricted service.
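As a purely illustrative sketch (not Facebook's actual algorithm, whose details are not public), the effect Pariser describes can be reproduced in a few lines that quietly drop posts from the sources a user clicks least often:

```python
# A purely illustrative sketch of click-frequency filtering, in the spirit of
# what Pariser describes happening to his Facebook feed. This is NOT
# Facebook's actual algorithm; it only shows how quietly dropping rarely
# clicked sources narrows what a user sees.
from collections import Counter

def filter_feed(posts, click_history, keep_fraction=0.5):
    """Keep only posts from the sources the user clicks most often.

    posts         -- list of (source, headline) tuples
    click_history -- list of sources the user has clicked in the past
    keep_fraction -- share of sources that survive the filter (assumed value)
    """
    clicks = Counter(click_history)
    sources = {source for source, _ in posts}
    # Rank sources by how often the user has clicked them in the past.
    ranked = sorted(sources, key=lambda s: clicks[s], reverse=True)
    cutoff = max(1, int(len(ranked) * keep_fraction))
    kept_sources = set(ranked[:cutoff])
    # Everything from a rarely clicked source quietly disappears.
    return [post for post in posts if post[0] in kept_sources]

posts = [("progressive_friend", "Budget vote today"),
         ("conservative_friend", "Budget vote today"),
         ("progressive_friend", "New climate report")]
history = ["progressive_friend"] * 9 + ["conservative_friend"]
for source, headline in filter_feed(posts, history):
    print(source, "-", headline)   # the conservative friend's post never appears
```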

He states that Google's search algorithms examine over 50 factors, including the kind of computer we're on, where we are, the browser we use and our past online behaviour, to generate a set of search results specific to us. Two people can run a search at the same time, using the same words, and get completely different results, not just in terms of what is included, but also in the order in which the results are ranked.
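To see why the same words can yield different results, here is a toy sketch of signal-based re-ranking. The signal names and weights are invented purely for illustration and are not Google's actual factors:

```python
# A toy sketch of signal-based personalisation of search results. The signals
# and weights below are assumptions made for illustration only; the point is
# simply that the same query can be ordered differently for different users.
def personalised_rank(results, user_signals):
    """Re-rank results using per-user signals (all weights are illustrative)."""
    def score(result):
        s = result["base_relevance"]
        if result["region"] == user_signals.get("location"):
            s += 2.0                                  # boost local results
        if result["topic"] in user_signals.get("past_clicks", set()):
            s += 1.5                                  # boost previously clicked topics
        return s
    return sorted(results, key=score, reverse=True)

results = [
    {"title": "Egypt travel deals", "topic": "travel", "region": "US", "base_relevance": 1.0},
    {"title": "Protests in Egypt",  "topic": "news",   "region": "EG", "base_relevance": 1.0},
]

# Same query, two different users, two different orderings.
user_a = {"location": "US", "past_clicks": {"travel"}}
user_b = {"location": "EG", "past_clicks": {"news"}}
print([r["title"] for r in personalised_rank(results, user_a)])
print([r["title"] for r in personalised_rank(results, user_b)])
```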

So the internet has now begun to show us what it thinks we need to see, but not necessarily what we do need to see. Pariser provides this unnerving quote from Eric Schmidt of Google: ‘It will be very hard for people to watch or consume something that has not in some sense been tailored for them’.

He refers to this as a filter bubble – our own personal, unique universe of information that we live in online, which will depend on who we are and what we do. The problem with this is that we don’t get to decide what gets in, or see what gets edited out.

The challenge lies with the algorithms that create these information filters. Algorithmic gatekeepers don't seem to have the embedded ethics that a human editor might apply, making sure that we get some information vegetables with our information dessert. We need to ensure that algorithms weigh not only the relevance of information to our needs but also the importance of exposing us to challenging perspectives. We need code developers to begin to embed an ethical perspective into the code that they write and to capture some sense of civic responsibility.
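One way to picture what such an 'ethical' filter might do is a re-ranking step that reserves a share of slots for important or challenging items rather than ranking purely on predicted relevance. The item fields and the 25% quota below are assumptions for illustration, not a description of any real system:

```python
# A minimal sketch of a filter that reserves slots for challenging items
# instead of ranking purely by predicted relevance. The item fields and the
# 25% quota are assumptions made for illustration only.
def rank_with_diversity(items, slots=8, challenge_share=0.25):
    """Fill most slots by relevance, but reserve some for challenging views."""
    by_relevance = sorted(items, key=lambda i: i["relevance"], reverse=True)
    challenging = [i for i in by_relevance if i["challenges_user_view"]]

    reserved = max(1, int(slots * challenge_share))
    picked = challenging[:reserved]              # the "information vegetables"
    for item in by_relevance:                    # then the "information dessert"
        if len(picked) >= slots:
            break
        if item not in picked:
            picked.append(item)
    return picked

feed = [
    {"title": "Celebrity gossip",    "relevance": 0.9, "challenges_user_view": False},
    {"title": "Opposing-view op-ed", "relevance": 0.3, "challenges_user_view": True},
    {"title": "Local sports",        "relevance": 0.8, "challenges_user_view": False},
]
# The op-ed is surfaced even though it scores lowest on pure relevance.
print([item["title"] for item in rank_with_diversity(feed, slots=3)])
```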

And information filters should be transparent, so that we can see what the rules are, and users should be given back at least some control over what gets through and what doesn't.
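As a minimal sketch of what that transparency could look like, a filter might report a plain-language reason for every item it removes and honour a user setting that switches personalisation off altogether. All of the names here are hypothetical:

```python
# A hypothetical sketch of a more transparent filter: it returns a
# plain-language reason for every item it removes, and respects a user
# setting that turns personalisation off entirely. All names are invented.
def transparent_filter(items, rule, personalisation_on=True):
    """Apply a filter rule, but report exactly what was removed and why."""
    if not personalisation_on:
        return items, []                         # user opted out: show everything
    kept, removed = [], []
    for item in items:
        ok, reason = rule(item)
        (kept if ok else removed).append((item, reason))
    return [item for item, _ in kept], removed   # 'removed' can be shown to the user

def rule(item):
    if item["clicks"] < 2:
        return False, "hidden: you rarely click links from this source"
    return True, "shown: you click this source often"

feed = [{"source": "A", "clicks": 9}, {"source": "B", "clicks": 1}]
shown, hidden = transparent_filter(feed, rule)
for item, reason in hidden:
    print(item["source"], "->", reason)
```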


About sharonslade

Dr Sharon Slade is a senior lecturer in the Faculty of Business and Law at the Open University in the UK, working to support both tutors and students on Open University distance learning modules and programmes. Her research interests encompass the online delivery of learning and tutoring, online learning communities and ethical issues in learning analytics. Project work includes the development of a student support framework to improve retention and progression, and the development of a university-wide tool for tracking students and triggering relevant and targeted interventions. She is leading the development of new policy around the ethical use of learning analytics within the Open University, UK.