Learning analytics and higher education: ethical perspectives

Students leave behind lots of information about themselves, with little or no realisation of what Higher Education Institutions (HEIs) do with it. Does it matter? We ran a half-day workshop at the LAK12 conference in Vancouver in late April 2012 to explore some of the ethical complexities introduced by using learning analytics to categorise and predict student cohorts and behaviours.

HEIs regularly access and capture student data, for example around gender, login frequency, study goals, and assignment scores, in the hope that this will lead to a clearer and simpler means of understanding and driving student engagement and performance. That all sounds fine. So what’s the big concern?
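To make the concern concrete, here is a purely hypothetical sketch of how easily captured attributes like these collapse into a single score. The `StudentRecord` fields and the `naive_engagement_score` weighting are invented for this illustration and are not drawn from any real institutional model.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    # Hypothetical attributes of the kind an HEI might capture
    gender: str
    logins_per_week: float
    study_goal: str            # e.g. "degree" or "single module"
    assignment_scores: list    # percentages

def naive_engagement_score(s: StudentRecord) -> float:
    """Average assignment score, nudged upward by login frequency.

    Real models are far more complex; the point is how quickly raw
    data becomes a single number that then drives decisions.
    """
    avg = sum(s.assignment_scores) / len(s.assignment_scores)
    return round(avg + min(s.logins_per_week, 7), 1)

student = StudentRecord("F", 3.0, "degree", [62, 70, 58])
score = naive_engagement_score(student)  # one number standing in for a person
```

Everything downstream of such a score (who gets contacted, who gets extra support) inherits whatever assumptions were baked into it, which is where the ethical questions begin.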

Many studies reviewing the ethical considerations around learning analytics have focused on privacy as the main issue, but what more might there be? During the session, participants considered four themes:

- Transparency and robustness: Who can see the data collected? Who can see or influence the models? How reliable and robust are the models?
- Power: Who gets to decide what happens next? Who can choose which students get more support? Do teachers, learners, and administrators have the same authority and rights to determine what support is provided?
- Ownership: Who else can mine the data collected? Can learners opt out, and if so, what happens? How long do HEIs keep collated data?
- Responsibility: Is there a shared responsibility to ensure that information is accurate? Can students opt to disguise themselves online? Do we have a responsibility to ensure equitable treatment of students based on what we know (or despite what we know)?

We agreed that there are additional risks introduced by making assumptions which then determine and limit how HEIs behave toward and react to the student – both as individuals and as a member of a number of different cohorts. What are the rights of the student to remain an individual? Should the student have an awareness of their own label?

To explore some of the issues around how students might react in a world where support could be governed by learning analytics, we introduced the workshop participants to a game loosely based around snakes and ladders.

Each player was provided with a set of student attributes – gender, age, geographic location, study preference, study history, etc. – and embarked on their journey from the beginning of their studies toward (hopefully) successful completion. Various snakes and ladders around the board offered opportunities for rapid progress (students living close to campus can form self-help groups which aid progress: move on 5 spaces) as well as potential pitfalls (younger students have fewer funds available and need to work to supplement fees: if you are under 25, move back 10 spaces). One player took on the role of a self-sufficient student who received no interventions based on their characteristics and could ignore both snakes and ladders (opting out of the learning analytics system). The games ran over several tables and we stopped play after around 20 minutes. Although all participants realised that this was a game, their reactions to some of the differential treatment were interesting.
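The game mechanics amount to a tiny rule engine, which can be sketched as follows. The two rules mirror the examples above; the attribute names, move sizes, and players are invented for illustration only.

```python
# Rule-based interventions: each rule is a label, a condition on the
# player's attributes, and a board move (positive = ladder, negative = snake).
RULES = [
    ("lives near campus", lambda p: p["distance_km"] < 5, +5),
    ("under 25",          lambda p: p["age"] < 25,        -10),
]

def apply_rules(player: dict) -> dict:
    """Move a player according to every rule their attributes trigger.

    A player who has opted out of the analytics system is untouched:
    neither boosted by ladders nor knocked back by snakes.
    """
    if player.get("opted_out"):
        return player
    for label, condition, move in RULES:
        if condition(player):
            player["position"] += move
            player["labels"].append(label)
    return player

alice = {"age": 21, "distance_km": 2, "position": 0, "labels": [], "opted_out": False}
bob   = {"age": 21, "distance_km": 2, "position": 0, "labels": [], "opted_out": True}
apply_rules(alice)  # alice triggers both rules and ends at -5
apply_rules(bob)    # bob stays at 0, unlabelled
```

Note that identical students diverge purely on whether the rules are applied to them – exactly the differential treatment that provoked talk of 'fairness' in the room.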

It was clear that many felt there was an issue when others received additional support despite not being seen as any worse off to start with. One player had an attribute, or label, removed partway through the game and could see its obvious impact – although we’re not as transparent as this with our own students. There was lots of talk about ‘being punished’ and ‘harmed’ by a decision based on an assigned label. Positive boosts toward the end goal based on player characteristics were rarely questioned, although when these came at the expense of others (as they would in our resource-constrained times), the advantages of some balanced by the knockback of others became somehow less ‘fair’. ‘Student’ players felt that they lacked control over the game, and the absence of clear justification or any route of appeal was a cause of frustration. We concluded that there is a clear danger when learning analytics is heavily rule-based: the potential for unethical outcomes is obvious. In this world, the decision-makers are key and the rules should be fair (whatever that means). On the flip side, we asked whether there were potential dangers to opening up transparency around this – almost certainly, was the response.

Following on from the game, workshop participants were asked to take on the role of a particular stakeholder group (students, tutors, or institutions) and were given a range of questions to consider around the potential impact and issues of learning analytics.

The tutor group felt that existing models would give at best a rough representation of student progress, given concerns about their reliability and accuracy. There was a feeling that instructors are crying out for information about students at risk – but don’t want to be told what to do about it. Concerns were expressed around unhelpful assumptions and labelling, and whether students might perceive asking for help as a failure – this needs to be recast so that asking for help is seen as a successful behaviour. The big question emerging from this group related to how best to deploy support resources ethically and appropriately.

The institutional group found a lot of commonalities across their diverse institutions. There was a big issue around the locus of control. The sophistication of analysis isn’t quite there to make the focused teaching-and-learning interventions we might like. Questions were raised around for-profit institutions: what’s the institutional stake – improving learning, or is cash the bottom line? Early analytics are focusing on early alert, intervention, and helping students who may fall out of the mainstream – but what about the successful students, and using those analytics to uncover what makes them successful, or what makes a quality module?

The student group talked much about how the system might be set up. It seemed that most assumed that students wouldn’t expect to know what the system captures about them – after all, most of us don’t know (or care?) much about what Facebook does with our data. Perceptions and social issues seemed to be important in labelling. Are you your academic record? If college is a fresh start, should your record be wiped clean? People do change – are labels for life? Transparency has both pluses and minuses: it can change student behaviour. After all, if only one person can get the top marks, the drive to share and help others is weakened – because others’ misfortune can only improve your own outcomes…

What next? One hope is that HEIs will move toward the adoption of a code of conduct or framework which sets out some guidelines for the use of learning analytics. There are plenty of existing guidelines in place on, for example, data protection which might usefully be expanded. Many of the issues under consideration are usefully identified in Campbell, DeBlois and Oblinger (2007), EDUCAUSE Review 42(2). It’s safe to say that there’s sufficient difference between HEIs in terms of scale and operating model to make the adoption of a single code of conduct less likely, although there’s a great deal of interest in establishing a framework at least.


About sharonslade

Dr Sharon Slade is a senior lecturer in the Faculty of Business and Law at the Open University in the UK working to support both tutors and students on Open University distance learning modules and programmes. Her research interests encompass ethical issues in learning analytics and online learning and tuition. Project work includes the development of a student support framework to improve retention and progression and the development of a university wide tool for tracking students and triggering relevant and targeted interventions. She led the development of new policy around the ethical use of learning analytics within the Open University, UK.
