
Aaron Silvers discusses the xAPI specification

Category : Learning Analytics

Here at the Jisc / Apereo LAK16 Hackathon in Edinburgh we have a group of very talented and motivated people from around the world, beavering away to produce things like new visualisations of learning analytics, and a “personal learning locker” so students can take their learning records with them.

One of the key technologies which is facilitating the development of learning analytics systems is the Experience API – or xAPI. To find out more about this technology, I took time out from the conference to chat to Aaron Silvers, President of the Data Interoperability Standards Consortium, which is driving the development of the xAPI specification.

In this 15 minute podcast I ask Aaron about his organisation, what xAPI is, what the challenges are with it, and where it’s heading next.
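For readers new to the technology, the core idea of xAPI is that each learning record is a JSON “statement” with an actor, a verb and an object. The sketch below is purely illustrative (the learner, course URL and activity name are invented; the verb IRI follows the ADL convention the specification uses):

```python
import json

# A minimal illustrative xAPI statement: actor-verb-object.
# The learner details and object IRI here are hypothetical examples,
# not records from any real learning record store.
statement = {
    "actor": {
        "mbox": "mailto:student@example.ac.uk",   # hypothetical learner
        "name": "Example Student",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-GB": "experienced"},
    },
    "object": {
        "id": "http://example.ac.uk/courses/analytics-101/week-1",
        "definition": {"name": {"en-GB": "Week 1 lecture recording"}},
    },
}

print(json.dumps(statement, indent=2))
```

Statements like this are what a learning record store collects and what tools such as the “personal learning locker” would let students carry with them.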



Student app for learning analytics: functionality and wireframes

Category : Learning Analytics

Jisc’s Summer of Student Innovation in Birmingham

Our app to display learning analytics to students themselves is taking shape. After brainstorming requirements for the app with staff and students at a workshop in London, and seeking further input from students at the University of Lincoln, we obtained useful feedback on our draft designs from students at our Summer of Student Innovation (SOSI) session in Birmingham in early August.

Continuing my student app design tour of England, I joined colleagues from Jisc and Therapy Box last week in Manchester to apply the same methodology to app design that our SOSI students have been using. The technique, developed by live|work, includes devising personas and user journeys, carrying out competitor and SWOT analyses, defining business and technical requirements, walking through concepts with other teams, and the development and user testing of wireframes.

This was a highly effective process, enabling us in a day and a half of intensive work to narrow down a large number of potential requirements to a manageable feature set, and to tackle key aspects of presentation and navigation. The result is a set of wireframes for the designers at Therapy Box to get their hands on before they start the build.

A major influence on our thinking is the use of fitness apps such as MapMyRide and Google Fit: some of us are already avid users of these technologies. To emulate their addictive qualities in an app for learning is one of our aims. In developing the concepts we were informed by some assumptions which have emerged from earlier work:

  1. That students can be motivated by viewing data on how they’re learning and how they compare with others
  2. That making the app as engaging as possible is likely to result in greater uptake and more frequent access to it, with a corresponding positive impact on motivation
  3. That increased motivation is likely to lead to better learning – with positive impacts on retention and student success

We do of course recognise that the app may have negative effects on some students who find it demotivating to discover just how badly they’re performing in relation to others. However, there’s a strong argument that it’s not in these students’ interests to remain with their heads in the sand. Meanwhile, if data exists about them, shouldn’t we be helping students to see that data if they want to?

Moving on from ethical issues, which I’ve covered extensively in an earlier post, six principles which we want to embody in the app are now explicit.  We believe it should be:

  1. Comparative – seeing how you compare with class averages or the top 20% of performers for example may provide a competitive element or at least a reality check, driving you to increased engagement.
  2. Social – being able to select “friends” with whom to compare your stats may add another level of engagement.
  3. Gamified – an app which includes an element of gaming should encourage uptake by some students. This may be manifested in the use of badges such as a “Library Master” badge for students who have attended the library five times.
  4. Private by default – while data that the institution knows about you from other systems may be fed to the app, the privacy of any information you input in the app yourself will be protected. However anonymous usage information may be fed back to the app backend.
  5. Usable standalone – by students whose institutions are not using the Jisc architecture.
  6. Uncluttered – the app should concentrate for the time being on learning analytics data and not be bloated with functionality which replicates what is already present in the VLE/LMS or in other applications.
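The “Library Master” badge mentioned in point 3 suggests that gamification could start with simple threshold rules on activity counts. A minimal sketch (the library rule comes from the example above; the second badge and all thresholds beyond it are invented for illustration):

```python
# Badge rules: badge name -> (activity counted, threshold to earn it).
# "Library Master" follows the example in the post; "Early Bird" is
# a hypothetical addition to show the shape of the rule table.
BADGE_RULES = {
    "Library Master": ("library_visits", 5),
    "Early Bird": ("morning_logins", 10),
}

def earned_badges(activity_counts):
    """Return the badges earned given a dict of activity counts."""
    return [badge for badge, (activity, threshold) in BADGE_RULES.items()
            if activity_counts.get(activity, 0) >= threshold]

print(earned_badges({"library_visits": 6, "morning_logins": 3}))
```

A rule table like this keeps badge logic out of the app code, so new badges can be added without a rebuild.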

So let me now take you through the draft wireframes to show how these principles are being taken forward (click the images to enlarge).

Student App screenshot

When first logging in, the student is able to select their institution from a pre-populated list of UK universities. If the student’s institution is using other parts of Jisc’s learning analytics architecture, in particular the learning analytics warehouse, then much more data will be available to the app.

For simplicity we’re ignoring for the time being the use case of a student signed up with more than one institution.

But we’re incorporating functionality which we think will be of interest to students regardless of whether their institution has signed up. That’s setting targets and logging their learning activities, about which more later.

While what should go into an activity feed or timeline is likely to be the subject of much debate and future educational research, we plan to integrate this dynamic and engaging concept, so essential to applications such as Twitter and Facebook.

The wireframes are intentionally black and white and allow space to incorporate images but not the images themselves – in order to concentrate on concepts, layout and navigation at this stage.

Here the images may be of your friends or badges awarded. We include examples of possible notifications such as “Sue studied for 2 hours more than you!” but at this stage make no comment as to whether these would be motivational, irritating or otherwise. Future user testing will help clarify what should be included here and how the notifications should be worded.

The engagement and attainment overview mirrors what many fitness apps do: it provides an overview of your “performance” to date. Critically here we show how you compare to others. This will be based on data about you and others held in the learning analytics warehouse. It may include typical data used for learning analytics such as VLE/LMS accesses, library books borrowed, attendance records and of course grades.

We’ll research further how best to calculate and represent these comparators or metrics. At this stage we’ve chosen to avoid traffic light indicators for example as these would require detailed knowledge of the module and where the students should be at a particular point in time.
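To make the comparator idea concrete, here’s one possible sketch of the calculation: comparing a student’s weekly VLE access count against the class average and the top 20% of the cohort. The cohort and its numbers are entirely invented; how the real metrics are computed is exactly the open research question described above.

```python
from statistics import mean

# Hypothetical weekly VLE access counts for a small cohort.
cohort_accesses = {
    "you": 14, "ana": 30, "ben": 22, "carl": 9,
    "dee": 25, "eli": 17, "fay": 28, "gus": 11,
}

class_average = mean(cohort_accesses.values())

# The access count you'd need to match to sit in the top 20%.
counts = sorted(cohort_accesses.values(), reverse=True)
top_20_cutoff = counts[max(1, round(len(counts) * 0.2)) - 1]

you = cohort_accesses["you"]
print(f"You: {you}, class average: {class_average:.1f}, "
      f"top 20% from: {top_20_cutoff}")
```

Even this toy version shows why presentation matters: “14 vs a class average of 19.5” reads as a nudge, while “you need double your activity to reach the top 20%” could easily demotivate.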

Now let’s see what happens when you click the More button.

Student App screenshot

In the activity comparison screen you’ll see a graph of your engagement over time and how it compares with that of others. You can select a particular module or look at your whole course. We’ll populate the drop-down list with options for who you can compare yourself with, such as “people on my course”, “people on this module” and “top 20% of performers” (based on grades).

Comparing yourself to prior cohorts of students on a module might be of interest in the future too.

We may show a graph here with an overall metric for “activity” based on VLE/LMS usage, attendance etc. Or we may want to break this down into its components.
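If we do go with a single overall “activity” metric, the simplest shape for it is a weighted sum of normalised component scores. A minimal sketch, with weights and component names invented purely for illustration (choosing real weights would itself need research):

```python
# Hypothetical weights for combining component scores (each component
# assumed already scaled to 0-100) into one "activity" figure.
weights = {"vle_usage": 0.5, "attendance": 0.3, "library": 0.2}

def activity_score(components):
    """Weighted sum of component scores; weights sum to 1.0."""
    return sum(weights[name] * value for name, value in components.items())

week = {"vle_usage": 80.0, "attendance": 60.0, "library": 40.0}
print(activity_score(week))  # 0.5*80 + 0.3*60 + 0.2*40 = 66.0
```

Breaking the figure back down into its components, as the post suggests, would just mean showing the individual terms rather than their sum.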

Student App screenshot

The next feature of the app allows you to log your activities. This is some of the “self-declared” activity that we think students might want to input in order to gain a better understanding of what learning activities they’re undertaking and how much effort they’re putting into each.

Let’s click Start an activity.

Starting an activity allows you to select the module on which you’re working, choose an activity type from a drop-down list such as reading a book, writing an essay, or attending a lab, and select a time period you want to spend on the activity and whether you want a notification when that period is up.

A timer is displayed in the image box and you can hit the Stop button when you’ve finished.  The timer will continue even if you navigate away from the app.
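That “keeps running even if you navigate away” behaviour needs no background process: the app only has to persist the start timestamp and compute the elapsed time on demand. A minimal sketch of the idea (class and storage details are assumptions, not the actual Therapy Box implementation):

```python
import time

class ActivityTimer:
    """Timer that survives navigation: only the start time is stored."""

    def __init__(self):
        self.started_at = None  # would be saved to the app's local storage

    def start(self):
        self.started_at = time.time()

    def elapsed_seconds(self):
        # Correct even after the app is closed and reopened, provided
        # started_at was written to persistent storage.
        if self.started_at is None:
            return 0.0
        return time.time() - self.started_at

timer = ActivityTimer()
timer.start()
# ... user navigates away and comes back ...
print(f"{timer.elapsed_seconds():.0f}s elapsed")
```

The Stop button then just records `elapsed_seconds()` against the chosen module and activity type.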

Setting a target is the final bit of functionality we want to include in the app at this stage. Again this is building on the success of fitness tracking apps where you set yourself targets as a way of motivating yourself.

In this example the user has set a target of reading for 10 hours per week across all their modules. The image will show a graphic of how close they are to achieving that target based on the activity data they have self-declared in the app.
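The progress graphic in that example boils down to a very small calculation: sum the self-declared sessions for the week and express them as a percentage of the target. A sketch, using the 10-hour reading target from the example and invented session data:

```python
# The 10-hour weekly reading target from the example above.
reading_target_hours = 10.0

# Hypothetical self-declared reading sessions logged this week (hours).
sessions_hours = [1.5, 2.0, 0.75, 2.5]

# Cap at 100% so the graphic doesn't overflow once the target is met.
progress = min(100.0, 100.0 * sum(sessions_hours) / reading_target_hours)
print(f"{progress:.0f}% of weekly reading target")
```

Everything here comes from the activity-logging feature described earlier, which is one reason self-declared data and targets work as a standalone pairing.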

Navigation to your next target may be through swiping.


Setting a target involves selecting a learning activity from a pre-populated list and specifying how long you want to be spending on it.

We added a “because” free-text box so that learners can make it clear (to themselves) why they want to carry out the activity, e.g. “I want to pass the exam” or “my tutor told me I’m not reading enough”.

Users may be more likely to select a reason from a pre-populated list than to fill in a text field but we’ll monitor this to see whether it’s being used.

We’re also considering the use of mood indicators here to show how happy or otherwise you’re feeling about how you’re getting on with meeting your target. Lots of potential for comparing your mood with others, showing how it’s changed over time or even sentiment analysis by institutions if students want to share such information with them – but that’s one to tackle later.

This doesn’t include all the screens we’ll need, but we do now have a pretty good idea of the initial functionality to be incorporated in the app, its layout and navigation. There’ll no doubt be a fair bit of tweaking before v0.1 is built, but you should get the general idea of what’ll be available from the screens above. We make no attempt at this stage to incorporate predictive analytics, which might show, for example, whether you’re on track to succeed or to drop out. That will come in future iterations as, no doubt, will some of the other great ideas people have come up with that we’re leaving out of this first version of the app, scheduled for release in April 2016.

A model for the development of a code of practice

Paper submitted to Journal of Learning Analytics: “Developing a Code of Practice for Learning Analytics”

I’ve just submitted a paper to a forthcoming “Special Section on Ethics and Privacy” in the Journal of Learning Analytics (JLA).  The paper documents the development of Jisc’s Code of Practice for Learning Analytics through its various stages, incorporates the taxonomy of ethical, legal and logistical issues, and includes a model for developing a code of practice which could be used in other areas.

A model for the development of a code of practice

As an open journal the JLA suggests that authors publish their papers before or during the submission and review process – this results in the work getting out more quickly and can provide useful feedback for authors. So here’s the paper – and if you have any feedback it would be great to hear from you.

Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organisation which champions the use of digital technologies in UK education and research, has attempted to address this with the development of a Code of Practice for Learning Analytics. The Code covers the main issues institutions need to address in order to progress ethically and in compliance with the law. This paper outlines the extensive research and consultation activities which have been carried out to produce a document which covers the concerns of institutions and, critically, the students they serve. The resulting model for developing a code of practice includes a literature review, setting up appropriate governance structures, developing a taxonomy of the issues, drafting the code, consulting widely with stakeholders, publication, dissemination, and embedding it in institutions.



Code of Practice for Learning Analytics launched

Today Jisc is launching the Code of Practice for Learning Analytics at the UCISA Spotlight on Digital Capabilities event here in the amazing MediaCityUK at Salford Quays.

Media City UK, Salford

Developing this document was chosen by institutions as the number one priority for Jisc’s learning analytics programme.  The Code aims to help universities and colleges develop strategies for dealing with the various ethical and legal issues that may arise when deploying learning analytics.

It’s a brief document of four pages and is available in HTML or in PDF. The development of the Code was based on a literature review of the ethical and legal issues. From this a taxonomy of the ethical, legal and logistical issues was produced. The Code was drafted from this taxonomy and is grouped into seven areas:

  1. Responsibility – allocating responsibility for the data and processes of learning analytics within an institution
  2. Privacy – ensuring individual rights are protected and data protection legislation is complied with
  3. Validity – making sure algorithms, metrics and processes are valid
  4. Access – giving students access to their data and analytics
  5. Enabling positive interventions – handling interventions based on analytics appropriately
  6. Minimising adverse impacts – avoiding the various pitfalls that can arise
  7. Stewardship of data – handling data appropriately

The Code was developed in the UK context and refers to the Data Protection Act 1998; however, most of it is relevant to institutions wishing to carry out learning analytics anywhere, particularly in other European countries which have similar data protection legislation. It can be adopted wholesale or used as a prompt or checklist for institutions wishing to develop their own learning analytics policies and processes.

If you find the document helpful or feel that anything is unclear or missing please let us know. Keeping it concise was thought to be important but that meant leaving out more in-depth coverage of the issues. Over the coming months we’ll be developing an associated website with advice, guidance and case studies for institutions which wish to use the Code.

The process has been overseen by a Steering Group consisting of Paul Bailey (Jisc), Sarah Broxton (Huddersfield University), Andrew Checkley (Croydon College), Andrew Cormack (Jisc), Ruth Drysdale (Jisc), Melanie King (Loughborough University), Rob Farrow (Open University), Andrew Meikle (Lancaster University), David Morris (National Union of Students), Anne-Marie Scott (Edinburgh University), Steven Singer (Jisc), Sharon Slade (Open University), Rupert Ward (Huddersfield University) and Shan Wareing (London South Bank University).

It was particularly good to have the student voice represented in the development of the Code by David Morris of the NUS. I’m also especially grateful to Andrew Cormack and Rupert Ward for their perceptiveness and attention to detail on the final draft. I received additional helpful feedback, most of which I was able to incorporate, from the following people (some in a personal capacity, not necessarily representing the views of their organisations):

Helen Beetham (Higher Education Consultant), Terese Bird (University of Leicester), Crispin Bloomfield (Durham University), Alison Braddock (Swansea University), Annemarie Cancienne (City University London), Scott Court (HEFCE), Mike Day (Nottingham Trent University), Roger Emery (Southampton Solent University), Susan Graham (Edinburgh University), Elaine Grant (Strathclyde University), Yaz El Hakim (Instructure), Martin Hawksey (with other members, Association for Learning Technology), Ross Hudson (HEFCE), John Kelly (Jisc), Daniel Kidd (Higher Education Statistics Agency), Jason Miles-Campbell (Jisc), George Munroe (Jisc), Jean Mutton (Derby University), Richard Puttock (HEFCE), Katie Rakow (University of Essex), Mike Sharkey (Blue Canary), Sophie Stalla-Bourdillon (Southampton University), Sarah Stock (University of Essex) and Sally Turnbull (University of Central Lancashire).

Finally, many thanks to Jo Wheatley for coordinating the production of the print and HTML versions of the Code.


The group in Edinburgh

Notes from the Learning Analytics Network event in Edinburgh

Category : Learning Analytics

On a fabulous spring day last Friday, some 44 people made it to Edinburgh for the second meeting of the Learning Analytics Network, jointly organised by Edinburgh University and Jisc.



I suspect a number of us may have been somewhat bleary-eyed after witnessing the political landscape of the UK being redrawn during the previous night.  However the quality of the presentations and discussion throughout the day seemed to keep everyone awake.  It was particularly interesting to hear about the various innovations at Edinburgh itself, which is emerging as one of the pioneering institutions in learning analytics.

Edinburgh’s visionary approach is highlighted by the recent appointment of Dragan Gašević as Chair in Learning Analytics and Informatics, and it was great to have Dragan give the first presentation of the day: Doing learning analytics in higher education: Critical issues for adoption and implementation (PDF – 1.21MB).

Dragan outlined why learning analytics is increasingly necessary in education and examined some of the landmark projects so far such as Signals at Purdue and the University of Michigan’s development of predictive models for student success in science courses.

In an Australian study he was involved with, Dragan found a lack of a data-informed decision-making culture in universities and that, while researchers are carrying out lots of experimentation, they are not focused on scaling up their findings. Finally, Dragan looked at ethics and mentioned the Open University’s policy and Jisc’s (soon to be released) Code of Practice for Learning Analytics.

Next up was Sheila MacNeill on Learning Analytics Implementation Issues (Presentation on Slideshare). Sheila gained expertise in learning analytics while working for Cetis and has now been attempting to put this into practice at Glasgow Caledonian University. On arriving at the institution 18 months ago she found it difficult to get to grips with all the systems and data of potential use for learning analytics. She started by identifying the key areas: assessment and feedback, e-portfolios, collaboration and content. This data is hard to get at and needs a lot of cleaning before it can be used for learning analytics.

Sheila’s summary slide outlines the main issues she’s encountered:

  • Leadership and understanding is crucial – you need both a carrot and stick approach.
  • Data is obviously important – ownership issues can be particularly problematic.
  • Practice can be a challenge – cultural issues of sharing and talking are important.
  • Specialist staff time matters – learning analytics has to be prioritised for progress to be made.
  • Institutional amnesia can be an issue – people forget what’s been done before and why.

Zipping back to the East Coast, Wilma Alexander talked about Student Data and Analytics Work at the University of Edinburgh (PDF – 866kB). She discussed attempts to use Blackboard Learn and Moodle plug-ins for learning analytics, finding that neither of them were designed to provide data to students themselves. They then collected 92 user stories from 18 staff and 32 students. Much of what people wanted was actually already available if they knew where to look for it. Students wanted to understand how they compare with others, to track their progress, and to view timetables, submissions and assessment criteria.

The next presenter, also from Edinburgh, was Avril Dewar: Using learning analytics to identify ‘at-risk’ students within eight weeks of starting university: problems and opportunities (PPT – 351kB). Avril discussed her work at the Centre for Medical Education to develop an early warning system to identify disengaged students. 80% of at-risk students were identified by the system. Metrics included: engagement with routine tasks, completion of formative assessment, tutorial attendance, attendance at voluntary events, and use of the VLE.
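Early warning systems of the kind Avril described typically combine metrics like these into an engagement score and flag students who fall below a threshold. The sketch below is a hedged illustration of that general shape only: the weights, scores and threshold are invented, and the Centre for Medical Education's actual model is not published here.

```python
# Hypothetical weights over the metrics listed in the talk;
# each metric is assumed to be scored 0-100.
WEIGHTS = {
    "routine_tasks": 0.30,
    "formative_assessment": 0.25,
    "tutorial_attendance": 0.25,
    "voluntary_events": 0.05,
    "vle_use": 0.15,
}
THRESHOLD = 50.0  # invented cut-off: flag students scoring below this

def at_risk(metrics):
    """Flag a student whose weighted engagement score falls below the threshold."""
    score = sum(WEIGHTS[k] * v for k, v in metrics.items())
    return score < THRESHOLD

student = {"routine_tasks": 40, "formative_assessment": 30,
           "tutorial_attendance": 55, "voluntary_events": 10, "vle_use": 60}
print(at_risk(student))
```

The reported 80% identification rate is a property of the real model and its data; a toy rule like this only shows where such a figure comes from, by being tested against students whose outcomes are already known.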

Yet another Edinburgh resident, though this one working for Cetis rather than the University, was next.  Wilbert Kraan presented on The Feedback Hub – where qualitative learning support meets learning analytics (PPT – 1.86MB). The Feedback Hub is part of Jisc’s Electronic Management of Assessment project, working with UCISA and the Heads of eLearning Forum. It aims to provide feedback beyond the current module, looking across modules and years.  Wilbert proposed that feedback related data could be a very useful input to learning analytics.

My colleagues Paul Bailey and Michael Webb (most definitely neither from Edinburgh) and I (from Edinburgh originally!) then updated attendees on progress with Jisc’s Effective Learning Analytics programme (PDF – 318kB).  In particular we described the procurement process for the basic learning analytics system (which will be the subject of further publicity and another imminent post on this blog) to be made available freely to UK universities and colleges.  We also discussed the Discovery Stage where institutions can receive consultancy to assess their readiness for learning analytics. Paul concluded by mentioning the next Network Event at Nottingham Trent University on 24th June (Booking Form).

Later we had several lightning talks, the first from Prof Blazenka Divjak of the University of Zagreb, though currently a visiting scholar at, you guessed it, the University of Edinburgh. Blazenka presented on Assessment and Learning Analytics (PPTX – 385kB). She’s found the main challenge in learning analytics to be the management and cleansing of data. She discussed two projects undertaken at Zagreb. The first examined the differences in performance between groups based on gender, previous study etc. The second analysed the validity, reliability and consistency of peer assessment. She demonstrated visualisations which allow students to compare themselves with others.

Paula Smith from Edinburgh gave another interesting lightning presentation on The Impact of Student Dashboards. She reported on an innovation in their MSc in Surgical Sciences which expanded on existing tracking of students via an MCQ system to create a student dashboard. This allowed students to monitor their performance in relation to others, provided staff with information on at-risk students, and enabled evaluation of the impact of any interventions that took place as a result. Most students welcomed the dashboard and many thought they would want to view it monthly.

Finally, Daley Davis from Altis Consulting talked about what his company is doing in learning analytics (PDF – 663kB). Altis is an Australian company and Daley discussed how Australian institutions are extremely focused on retention due to the funding regime. Working with the University of New England, Altis cut attrition rates from 18% to 12%. A student “wellness engine” was developed to present data at different levels of aggregation to different audiences. The data used included students’ self-reported emotional state, collected by a system which asked them how they were feeling.

In the afternoon we split into groups to discuss the “burning issues” that had emerged for us during the day.  These were:

  • Make sure we start with questions first – don’t start with a technical framework
  • Data protection and when you should seek consent
  • When to intervene – triage
  • Is the data any use anyway?
  • Implementing analytics – institutional service versus course/personal service
  • Metrics and reliability
  • Institutional readiness / staff readiness
  • Don’t stick with deficit model – focus on improving learning not just helping failing students
  • Treating cohorts / subject disciplines / age ranges differently
  • Social media – ethics of using Facebook etc for LA
  • You can’t decline to interpret data just because there’s an issue you don’t want to deal with
  • Using LA at either end of the lifecycle
  • Ethics a big problem – might use analytics only to recruit successful people
  • Lack of sponsorship from senior management
  • Essex found through student surveys that students do want analytics

I’m immensely grateful to Nicola Osborne for her comprehensive live blog of the event from which this summary draws heavily. I wish there was a Nicola at every event I attended!


What do students want from a learning analytics app?

In February we ran a workshop in London with university staff and a couple of students to gather requirements for a student app.  I’m now carrying out sessions directly with students to find out what they would find most useful.  Yesterday I had the pleasure of visiting the University of Lincoln at the invitation of Marcus Elliott. The students were from a variety of levels and backgrounds, ranging from engineering to drama.

MAB Main Admin Building (Credit – University of Lincoln)

Most of them had little idea of what learning analytics was about so I introduced the session by describing a few things that were being done in the area – attempting not to influence their thinking too much. Marcus and I had agreed that we were better starting with a blank slate and then looking at whether there was any common ground with the conclusions of the London workshop.

As with the previous event it was a challenge to keep the group focussed on the applications of learning analytics without straying into all the useful things that apps could do for students.  I felt it was better though just to let the ideas flow, and not to impede the creativity in the room.

Students at workshop


The students came up with ideas for functionality, put them on stickies, and discussed them with a partner.  Then they all came together and spontaneously grouped the ideas into four categories: academic, process of learning, social integration and system monitoring / institutional data.

At this stage we didn’t want to look too much at presentational issues however we provided the students with blank smartphone screen templates to scribble on in order to focus them on what the functionality might involve.

student app 5



Inevitably there was a focus on assessment and, as with the London workshop, up to date data on grades was thought to be one of the most useful things a student app could provide.

Is this learning analytics?  I don’t know – but ideas such as showing your ranking in the class and being able to manage processes from this screen such as clicking to arrange a tutorial would certainly be useful.

Other ideas included calendar reminders of assessment due dates and exams.


The app could provide a one-stop shop for all of a student’s results.

It could also show what percentage of assessments the student has completed, and what grades they need to obtain in future assignments in order to receive different levels of degree award.
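That “what grade do I need” idea is a small weighted-average calculation: given the marks banked so far and their weights, work out the average needed on the remaining weight to reach a classification boundary (e.g. 70% for a first, 60% for a 2:1 in the UK system). A sketch with invented marks:

```python
def required_average(completed, boundary):
    """completed: list of (mark, weight) pairs, weights summing to <= 1.
    Returns the average mark needed on the remaining weight to hit
    the boundary; values over 100 mean the boundary is out of reach."""
    done_weight = sum(w for _, w in completed)
    banked = sum(mark * w for mark, w in completed)
    remaining = 1.0 - done_weight
    return (boundary - banked) / remaining

# Hypothetical student: half the module assessed so far.
completed = [(62.0, 0.25), (58.0, 0.25)]
print(f"Need {required_average(completed, 60.0):.1f}% average for a 2:1")
print(f"Need {required_average(completed, 70.0):.1f}% average for a first")
```

Real degree algorithms vary by institution (borderline rules, discounted modules and so on), so this is the general idea rather than any university’s actual calculation.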

Better feedback to students from their lecturers was also thought to be something the app could facilitate. This student neatly links personalised feedback to more detailed suggestions on how to improve particular skills (e.g. academic writing), and to options for self-development such as links to help sessions which could be placed directly into the student’s calendar.

Giving real-time feedback to lecturers rather than waiting till it’s too late via student surveys was another option. This could help speed up improvements to courses.

Providing reading list functionality was also popular with the attendees. Here students are presented with metrics showing how much they’re engaging with the reading list on each of their modules.

Reading list functionality could also include reviews, comments and recommendations from other students (perhaps building on the features of goodreads).

They also suggested Amazon-style recommendations for reading e.g. “if you liked x you may like y”.

student app 1

How you spend your time was another application which the students thought could be useful.  This example shows the percentage of time spent by the student on various activities. The data itself could be assembled from timetables, calendars, geo-location and self-declared activity.

Recommendations on how much time should be spent on different activities could be another helpful feature.


Managing event attendance was another popular option. The student could be contacted about societies and social events, workshops, guest lectures etc – all of which would be based on their interests, which they could also specify via the app. This would cut down on the amount of “spam” messages from the University which they say have led to many students not bothering to read their emails.

You could invite people to events you are organising, or push events to their app – again based on the interests they have specified.

Rating events would also be a useful feature.

If analytics determined that a student was becoming disconnected the app could introduce them to opportunities such as open day volunteering. There was a suggestion that University and Student Union data could be combined to suggest such opportunities based on career aspirations and interests.

Another option is to use the app to check-in to lectures, perhaps automatically using geo-location, and to enter reasons (such as illness) for non-attendance.  There could also be notifications on lecture cancellations.

The app could contribute the events you attend to a portfolio of attended lectures.

Finding other students with similar or complementary interests was a popular suggestion too. This idea came from a postgraduate student who recognised the value of interdisciplinary contact so that you could look for someone to help you in an area you were less familiar with. You could specify what skills you have on offer and what you’re looking for assistance with.


Though we didn’t ask her to do it, showing how the different functions of the app could be accessed was important to this student in order to understand how everything would fit together.

Another generic suggestion was that the app should keep you logged in all the time.

So all in all some great suggestions from this group of students in Lincoln. Some of them aren’t what are normally considered learning analytics applications, but they all rely on data – some of it new data, such as students being able to specify their interests in more detail in order to receive targeted materials and details of events.

There’s a lot of complementarity with what staff thought of in the London workshop.  It’ll be interesting to see now what students at other institutions come up with.




Code of Practice for Learning Analytics – public consultation

Jisc’s draft Code of Practice for Learning Analytics is now available for public consultation for the next two weeks: Code of Practice for Learning Analytics v04 [MS Word].  I’ve already had some very useful comments back from a number of invited organisations and individuals which will help to enhance the document and the accompanying online guidance.

The background to the Code is provided in an earlier blog post.  The deadline for comments from the public consultation is 5th May.  These will then be presented to the Code of Practice Advisory Group which will agree a final version.

I'd be very grateful for any further thoughts from individuals or organisations, either by commenting on this blog post or by email to me at niall.sclater {/at/} jisc.ac.uk.

We intend to release the final version of the Code in June and will be developing accompanying online guidance and case studies over the coming months.

  • -
Jisc's learning analytics architecture

Explaining Jisc’s open learning analytics architecture

Jisc is currently procuring the different elements of its architecture for a basic learning analytics system which we plan to make available to UK colleges and universities later this year.  In this video I explain how it all fits together.

The service will consist of the following components, and institutions will be able to opt in to use some or all of the components as required:

A learning analytics processor – a tool to provide predictions on student success and other analytics on learning data to feed into student intervention systems.

A staff dashboard – a presentation layer to be used by staff in institutions to view learning analytics on students. Initially this presentation layer will be focussed on the learner but dashboards for managers, librarians and IT managers could also be developed.

An alert and intervention system – a tool to provide alerts to staff and students and to allow them to manage intervention activity. The system will also be able to provide data, such as intervention methods and their success, to be fed into an exemplar “cookbook” on learning analytics.

A student app – based on requirements gathering with staff and students.  Integration with existing institutional apps will be supported.

A learning records warehouse – a data warehouse to hold learning records gathered from a range of institutional data sources. We will define an output interface and also support integration with a common set of institutional systems.
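To make the learning records warehouse more concrete: if activity data is stored as xAPI statements (the interoperability format discussed elsewhere on this blog), each record pairs an actor, a verb and an object. The sketch below builds one minimal statement in Python; the actor/verb/object structure and the "attended" verb come from the xAPI specification, but the student, the institutional URLs and the activity are invented for illustration.

```python
import json

# A minimal, illustrative xAPI statement of the kind a learning records
# warehouse (Learning Record Store) might hold. The actor/verb/object
# shape follows the xAPI specification; all names and URLs below are
# made up for this example.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Student",
        "account": {"homePage": "https://vle.example.ac.uk", "name": "s1234567"},
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/attended",
        "display": {"en-GB": "attended"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://vle.example.ac.uk/lectures/biology-101-week-3",
        "definition": {"name": {"en-GB": "Biology 101, week 3 lecture"}},
    },
    "timestamp": "2015-03-02T10:00:00Z",
}

# Serialise for sending to (or storing in) a learning record store.
print(json.dumps(statement, indent=2))
```

Because every source system emits the same actor/verb/object shape, the warehouse can aggregate VLE, library and attendance data without bespoke schemas for each.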

When will it be available?
A procurement process is underway, proposals from suppliers have now been received and we are at the selection stage to appoint suppliers to develop each of the components of the learning analytics solution. The agreements will be in place in early May.

The expectation is that a basic learning analytics system consisting of the processor, dashboard and warehouse will be in place to pilot with universities and colleges from September 2015. The other components will be developed over the next 6-12 months. If the pilots prove successful and popular, a full production service will be provided from September 2017.

  • -
London - view of the Thames

Gathering requirements for a student app for learning analytics

What data and analytics should be presented directly to students? This was the subject of a workshop on 27th Feb in Jisc’s London offices, attended by staff from across UK higher education.  We also had with us a couple of students with a keen interest in the area.

In advance of the meeting members of Jisc’s Student App Expert Group had anonymously provided a range of potential requirements, which I then grouped and used as the basis for the workshop (they’re included at the bottom of this post for information).

The first area is around information provision to students, and comprises functionality for:

  • Monitoring your academic progress
  • Comparing your progress to others or to your prior performance
  • Monitoring your engagement with learning
  • Providing useful information such as exam times and whether there are free computers available

The second area is concerned with action – the student actively entering information or doing something to enhance their learning. It consists of:

  • Prompts and suggestions
  • Communication facilities with staff and students
  • The upload of personal data
  • Providing consent to what data is used for learning analytics

Various other issues were suggested relating to the interface (e.g. ensuring it is easy to use), ethics (e.g. being aware that the app can only ever give a partial view of student progress), and data (e.g. accepting data from a wide range of sources).

During the day, groups discussed a number of these areas of functionality. For each we defined an idea, a purpose, benefits, drawbacks and risks, and presentational aspects. Some of these ideas are fairly wacky and might not survive further interrogation or prioritisation, but here they all are for the record. The next stage is to run the ideas past students themselves to find out what they want to see in an analytics app.

How engaged am I?
The most common application of learning analytics is measuring student engagement. Putting this information in the hands of the learners themselves could help to reassure those who feel they're on track and prompt those who aren't engaging. There's always the risk, of course, that students will game the system to achieve better engagement ratings without actually improving their learning. However, it could also result in them finding the library, attending more lectures, using the VLE more or reading more books.

An idea for presenting this information was to show overall engagement on a scale of 1 to 5. Clicking the indicator would bring up a further breakdown showing, for example, library usage, lecture attendance and VLE usage. VLE usage might be broken down further if required, showing forum participation perhaps if that was considered important. Data could be shown by module as well as across modules.
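As a sketch of how such a 1-to-5 indicator with a per-source drill-down might be computed (purely illustrative – the source names, weekly caps and weights below are invented, not part of any agreed design):

```python
# Hypothetical sketch: combine per-source activity counts into a single
# 1-5 engagement indicator, keeping the per-source breakdown available
# for drill-down. Caps, weights and source names are illustrative only.

WEEKLY_CAPS = {"library_visits": 5, "lectures_attended": 10, "vle_logins": 20}
WEIGHTS = {"library_visits": 0.2, "lectures_attended": 0.5, "vle_logins": 0.3}

def engagement(counts):
    """Return (overall 1-5 score, per-source 0-1 breakdown)."""
    # Normalise each source to 0-1, capping counts at the weekly maximum.
    breakdown = {
        source: min(counts.get(source, 0), cap) / cap
        for source, cap in WEEKLY_CAPS.items()
    }
    # Weighted sum of the normalised sources, rescaled to the 1-5 range.
    weighted = sum(WEIGHTS[s] * v for s, v in breakdown.items())
    return round(1 + 4 * weighted), breakdown

score, detail = engagement(
    {"library_visits": 2, "lectures_attended": 8, "vle_logins": 12}
)
# score → 4; detail holds the drill-down values shown on click
```

Keeping the raw breakdown alongside the headline score is what makes the "click to drill down" presentation possible without recomputing anything.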

Compare my engagement
Learners’ engagement could be compared with that of their peers or even their own past performance. Again this could be potentially motivating and inspire students to change their behaviour. The risks include being demotivating, falsely equating engagement with success, and privacy issues, e.g. the identification of individuals in small cohorts from anonymised data.

How am I progressing?
The aim here is to gather and surface academic progress indicators and to identify actionable insights for the student. Timely information would aim to change their behaviour and improve achievement. Having all the information in one place would be beneficial, but would there be enough information to enable students to take action? One risk is that this could “kill passion” for the subject and divert further effort into assessed or measured activities. Providing context would also be important – a grade without feedback may not be helpful. It could also be counterproductive for high-performing students. Meanwhile, raised and unfulfilled expectations could result in worse feedback for institutions in the National Student Survey.

Data could be presented on a sliding scale, showing whether they were likely to pass or fail and allowing them to drill down into more granular detail on their academic performance.

Compare my academic progress
This functionality would allow students to compare where they were in key activities with previous cohorts and with peers. It could aid those who lack confidence and help them to realise that they are doing better than they realised. Of course it could also damage their confidence. Another risk is that the previous cohort might be different from the current intake or the way the course is being taught might have changed.

My assessments
A possibility would be to show analytics on what successful students do and how your actions compare e.g. if students submit assessments just before the deadline are they more likely to fail? This might result in students being better prepared for their assessments.

My career aspirations
The aim here would be to help students understand whether they are on track to achieve their chosen career, based on the records of previous learners. This might include networking opportunities with students who have already followed a similar path. It might help to increase engagement and assist with module pathway planning. Students could talk about their skills and better understand how to quantify them.

Meanwhile suggestions such as “you need to know about teamwork” or “identify opportunities for voluntary work” could be provided. The app might also suggest alternative career paths or that a student is on the wrong one e.g. “your maths does not appear to be at the level required for a nuclear physicist”.

Risks include that the app could be overly deterministic, restricting learner autonomy – and that students would need to ensure that their data was up to date.

Plan my career path
A related possibility is showing what educational journey a student needs to take to achieve their intended career, helping them to avoid the wrong choices for them e.g. what does the life of a midwife look like and what was their educational journey to get there?

My competencies
Another idea discussed was to enable students to monitor their competencies and reflect on their skills development, perhaps through some sort of game. This could encourage them to engage better with the materials and with their cohort. Again this wouldn’t of course guarantee success.

My emotional state
Enabling students to give an idea of their emotional state in some way would allow them to gauge how they compared to their peer group, and to provide better feedback to the institution or to tutors. This is highly personal information, of course, and you might want it to be visible to you only, unless it is anonymised.

Why I didn’t attend
The app could allow students to input their reasons for non-attendance, e.g. “I didn’t attend this lecture because I had my tonsils out” and “but while recovering in hospital I watched the lecture video and read the lecture notes”. This might enable the adjustment of engagement scores so that students felt they reflected the real situation.

Communicate with staff and students
We looked at whether the app should include communications facilities around the analytics. This might be between students and tutors, or perhaps with peer mentors. There was concern that this might be mission creep for the app; however, integrating communications around the interventions taken on the basis of the analytics might be useful. The app could also provide information about opportunities for communications around student support, with personal tutors, study buddies, peer mentors or clubs.

There would be potential for communications based on the programme rather than just the module, and the functionality might for instance be used to facilitate the take-up for personal tuition. The tools available might depend on the level of the students e.g. encouraging those on a one year taught Masters. One issue raised was that there would be student expectations of a quick response, and this might result in even more email “tyranny” for academics.

Link app to my social media accounts
The idea here is to enable students to link the app to Twitter, LinkedIn or other social media accounts so that they can send status updates from the student app. This would enable the aggregation, for example, of Twitter feeds from all those on the module with Twitter accounts, allowing learners to connect better with others. The institution could use the data for sentiment mining, and updates could be fed to the lecturer, even while they’re giving the lecture.

Give my consent for learning analytics
In order to ensure the legal and ethical collection and use of student data for learning analytics, a key part of the learning analytics architecture Jisc is commissioning will be a consent system, which is likely to be controlled from the student app. This could be particularly important in some of the more personal applications, such as linking to your social media accounts or inputting your emotional state. It will also help users to understand what is being done with their data, feel a sense of control over the process, and help to reduce concerns that data could be misused. It would allow students to control any third-party access to their data, e.g. by future employers.
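One way such a consent system might gate data collection is sketched below. This is hypothetical – the source names and the API are invented, and a real implementation would need persistence, audit trails and a record of when consent was given or withdrawn – but it shows the basic idea of student-controlled, per-source opt-in flags checked before anything is stored.

```python
# Hypothetical sketch of per-source consent, controlled by the student.

class ConsentRecord:
    def __init__(self):
        self.opted_in = set()      # data sources the student has approved

    def grant(self, source):
        self.opted_in.add(source)

    def withdraw(self, source):
        self.opted_in.discard(source)

    def allows(self, source):
        return source in self.opted_in

def collect(event_source, payload, consent):
    """Only store events from sources the student has opted in to."""
    if not consent.allows(event_source):
        return None                # dropped, never stored
    return {"source": event_source, "data": payload}

consent = ConsentRecord()
consent.grant("vle_activity")
consent.grant("location")
consent.withdraw("location")       # e.g. the student changes their mind

collect("vle_activity", {"logins": 3}, consent)   # stored
collect("location", {"lat": 55.95}, consent)      # dropped: consent withdrawn
```

Putting the check in the collection pipeline, rather than filtering at display time, means withdrawn-consent data never enters the warehouse at all.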

My location
Providing geolocation data to the app could have a number of applications such as helping vulnerable students to feel safer, campus mapping and self-monitoring. It could help institutions by enabling the tracking of the use of services. Students might also be prompted to attend campus more or spend more time in the library. This does of course have privacy implications and access to location data would need to be strictly controlled (by the student). It would also generate large quantities of data.

Fun analytics
The aim here would be to motivate and engage, and to get students to use the app, by providing fun or amusing analytics. Options discussed included “calorie burner info” e.g. “you read 2 articles today and used 5 calories”; a campus induction game; weekly challenges based on activity and studies; and a badge system of rewards.

Where next?
A recommendations engine could be presented through the app, providing relevant offers, signposting and information to students. Again this could potentially result in increased engagement, driving students to helpful services. On the downside it could be intrusive, add to information overload, and be used for marketing rather than benefitting learning.
Information could be presented on what’s trending, forthcoming local events, and silly facts, e.g. “30% of students who eat here get a 1st class honours!” This could help students to be better informed and prompt them to do something they might not have done before.

My students’ union
Increased engagement with the students’ union can help learners to feel better connected, so the app could also be used to facilitate this by showing events and information – and potentially to engage them more in the democratic process.

Car park
We parked a number of ideas during the day to return to perhaps at a later stage, including: assessment regulations, tutor performance, data literacy, the naming of the app, and how we get disengaged students to use it.

Suggested functionality for the app
The following possible functions were suggested by members of the Student App Expert Group in advance of the session and then expanded on in the discussions, as summarised above. This provides a good checklist of what we might wish to consider including:

Monitoring academic progress

  1. Progress. What percentage of the course materials, activities, formative assessments etc. have you done?
  2. Students should be able to see their progress with a clear indication of whether they are at risk
  3. Show students their academic progress, at a granular level: what marks they have for each assignment and how that contributes to their overall progress
  4. Ability to track own academic progress – get marks, compare own marks across modules and years
  5. Monitor student progress (provide overall picture of student performance and alert to potential problems)
  6. Could there be an area showing their student performance?
  7. Real-time, or near real-time updates on progress
  8. At a glance views of progress against criteria (such as assessment), links to personal progression tracking, and ‘useful’ traffic-light style
  9. Overview of essay marks, including marks for research skills, writing skills, originality etc. -> ability to compare to previous essay marks
  10. Access to formative and summative marks, and feedback
  11. Performance data: grades
  12. Performance data against entry profile and previous performance
  13. An integrated view of a student’s study career, from the programme level to the course/module level
  14. What does the rating mean?

Comparing academic progress

  1. Academic “performance” in relation to others in the cohort, possibly to previous cohorts, and grade prediction
  2. Crucially, should be able to compare their data both to themselves over weeks/months/years of study, but also to the ‘average’ behaviour of the cohort with whom they study.
  3. Answering the question: “Am I in line with my cohort, both now and preferably historically too?”
  4. Comparison. How is your progress compared to others – in the class, best in class, last year’s class etc?
  5. Leaderboards? Actually I hate them but my research shows that for some classes of student they do encourage engagement.
  6. Benchmarking the student academic performance with peers
  7. Ability to compare essay marks to average marks of cohort
  8. Where would 1st class degree attainment be on the line – and 2nd class, 3rd class and so on?

Monitoring engagement

  1. Look at interactions/activity they have taken part in on VLE and/or other systems, number of journals accessed online/in the library
  2. Activity data on attendance, VLE usage and library…and if there are appropriate comparators then that

Useful information

  1. A ‘calendar plus’ function that tells you not just what your lectures are for the day, but what other activities there are around campus – sports classes, clubs, if certain lecturers have office hours, if there are free computers in the computer lab, etc. Needs to both respond to where you are on campus, as well as make suggestions based on how much time you have to spare as well as where you are at the moment. For example, ‘You have an hour until your next lecture – why not boost your ‘library score’ and visit there for a little while, or go talk to Professor Blogs since she has office hours’.
  2. Have information on the university’s important events and useful revision techniques
  3. Easy and better access to learning resources

Prompts and suggestions

  1. Student should know what to do next
  2. Provide a visual representation of the chosen metric at a granular enough level that student activity can clearly precipitate change
  3. Potential outcomes and necessary effort – predict 2:2 do this well here and here and get a 2:2
  4. Recommendations of training courses and resources based on essay marks
  5. Prompts to regular self-assessment of research skills, writing skills, presentation skills etc. -> allow students to take responsibility for learning/progress
  6. Provide students with a ‘to do’ list, showing what they have to do and by when. The difficulty here is making it all inclusive
  7. Right now immediately after reading this text, what do you expect students to actually do
  8. Give students access to people who can help them and identify the specific kinds of help that can be provided
  9. Tips to improve performance, what to do next
  10. Gives information on how to improve not just/only status
  11. Diagnostics. The system should be able to see where I’m not doing well and point me to support materials. E.g. you don’t seem to be doing well at this bit of the syllabus – or you don’t seem to be doing well at more analytic questions…
  12. A recommendations aspect based on past use (and how others behave) – based on this module/this paper/this time of studying, we recommend that you consider this topic/this other article/this prime study time
  13. Have information on ways they could improve their student engagement


Communication facilities
  1. A ‘question’ function to send concerns to the academic personal tutor or other intermediary
  2. Identify effective communication strategies
  3. Facilitating interactive and better communications with academic and admin staff
  4. Ability to communicate between staff-student, student-student

Upload of personal data

  1. Ability to load personally captured data to provide context and information
  2. Allow students to set their own notifications – which may be alerts, reminders or motivating messages – triggered by circumstances of their choosing. (Making good decisions about this would need facilitation, but would help towards metacognitive awareness and understanding of the data and the software themselves.)


Providing consent
  1. A way of opting in or out of sharing the data, or aspects of the data, with staff
  2. Granular control of who sees what – controlled by the student

Other issues

  1. The student app should be easy to use
  2. Easy access to visual information
  3. Provide a visual representation of the chosen metric at a granular enough level that student activity can clearly precipitate change
  4. Whatever the outcome is for the learning analytics app, I’d try and keep the core interface simple. I’d personally prefer one graphic ultimately, but I’m sure there are arguments for a range of options
  5. Cross-platform and brandable (students may know their institutional brand but perhaps won’t respond to something plastered in Jisc branding)
  6. Access to the underlying data, but also good conceptually-straightforward visualisation of that data.
  7. Analytics visualisations that will prove compelling for students to visit the app.
  8. Cross platform /device so all can access


Ethics
  1. Transparency about the gaps – the app should avoid over-determining – or giving the impression of over-determining – students’ progress and achievement based on data, which is inevitably an incomplete representation of learning but which may carry more weight than the ineffable or unrecorded moments of learning.

Data sources

  1. Accept data from a range of simple or aggregated end-points – I appreciate it is likely to accept a feed from the Jisc basic LA tool, but it would be useful if we could provide a feed from the basic data we have in Blackboard ASR

Impact on teaching

  1. Identify effective teaching and assessment practices

  • -

A taxonomy of ethical, legal and logistical issues of learning analytics v1.0

Jisc, Apereo and the LACE project held a workshop in Paris on 6th February to discuss the ethical and legal issues of learning analytics.  The focus of this meeting was the draft taxonomy of issues that I prepared previously.  It was extremely helpful to have comments from experts in the area to refine the list, which is forming the basis for Jisc’s Code of Practice for Learning Analytics.  I have subsequently reworked the taxonomy based on the group’s comments.


I’ve now re-ordered the table to reflect a slightly more logical lifecycle view of learning analytics moving from issues of ownership and control to seeking consent from students, ensuring transparency, maintaining privacy, ensuring validity in the data and the analytics, enabling student access to the data, carrying out interventions appropriately, minimising adverse impacts and stewarding the data.

I’ve added a “Type” column which states whether the issue is primarily one of ethics, legalities or logistics.  It’s become clear to me that many of the issues in the literature around ethics and privacy for learning analytics are more about the logistics of implementation than about doing what’s right or keeping within the law.  I’ve therefore renamed the taxonomy to reflect the fact it’s about logistics as well.

The Paris group suggested scoring the issues on the basis of their importance and we began to rate them on a scale of 1 to 5, highlighting the most important ones.  I have subsequently reduced the scale to three points, roughly equating to: 1 – Critical; 2 – Important; 3 – Less important / may not arise.  I have reflected the views of the group in the rankings but have had to make many choices as to their relative importance myself.  I’d like to find some more rigorous way of rating the issues though the ranking will always be dependent on the nature and priorities of the institution.

The group added a stakeholder column.  Subsequently I divided this into Stakeholders most impacted and Stakeholders responsible.  I then found that the most impacted stakeholders were almost always students so the column wasn’t particularly helpful and I’ve just included a Responsibility column which shows who is primarily responsible for dealing with the issue. Again there’s a level of subjectivity here on my part and these roles will be constituted differently depending on the institution. I’ve listed six stakeholders:

  1. Senior management – the executive board of the institution.
  2. Analytics committee – the group responsible for strategic decisions regarding learning analytics. This might be a learning and teaching committee, though some of the issues may be the responsibility of a senior champion of learning analytics rather than a more representative committee.
  3. Data scientist – while the analytics committee may decide on particular issues, there is a need for data scientists or analysts to advise on issues relating to the validity of the dataset and how to interpret it.
  4. Educational researcher – some issues would be best dealt with by staff with detailed knowledge of the educational issues who are able to monitor the impact of analytics on students.  This role may be carried out by teachers or tutors or those more dedicated to educational research.
  5. IT – the institutional information technology department will take primary responsibility for some aspects of the analytics processes.
  6. Student – while students are potentially impacted by almost every issue here, they are primarily responsible themselves for dealing with a few of them.
| Group | Name | Question | Type | Rank | Responsibility |
| --- | --- | --- | --- | --- | --- |
| Ownership & Control | Overall responsibility | Who in the institution is responsible for the appropriate and effective use of learning analytics? | Logistical | 1 | Senior management |
| | Control of data for analytics | Who in the institution decides what data is collected and used for analytics? | Logistical | 1 | Senior management |
| | Breaking silos | How can silos of data ownership be broken in order to obtain data for analytics? | Logistical | 2 | Analytics Committee |
| | Control of analytics processes | Who in the institution decides how analytics are to be created and used? | Logistical | 1 | Analytics Committee |
| | Ownership of data | How is ownership of data assigned across stakeholders? | Legal | 1 | Analytics Committee |
| Consent | When to seek consent | In which situations should students be asked for consent to collection and use of their data for analytics? | Legal / Ethical | 1 | Analytics Committee |
| | Consent for anonymous use | Should students be asked for consent for collection of data which will only be used in anonymised formats? | Legal / Ethical | 3 | Analytics Committee |
| | Consent for outsourcing | Do students need to give specific consent if the collection and analysis of data is to be outsourced to third parties? | Legal | 3 | Analytics Committee |
| | Clear and meaningful consent processes | How can institutions avoid opaque privacy policies and ensure that students genuinely understand the consent they are asked to give? | Legal / Ethical | 1 | Analytics Committee |
| | Right to opt out | Do students have the right to opt out of data collection and analysis of their learning activities? | Legal / Ethical | 1 | Analytics Committee |
| | Right to withdraw | Do students have the right to withdraw from data collection and analysis after previously giving their consent? | Legal | 3 | Analytics Committee |
| | Right to anonymity | Should students be allowed to disguise their identity in certain circumstances? | Ethical / Logistical | 3 | Analytics Committee |
| | Adverse impact of opting out on individual | If a student is allowed to opt out of data collection and analysis could this have a negative impact on their academic progress? | Ethical | 1 | Analytics Committee |
| | Adverse impact of opting out on group | If individual students opt out will the dataset be incomplete, thus potentially reducing the accuracy and effectiveness of learning analytics for the group? | Ethical / Logistical | 1 | Data scientist |
| | Lack of real choice to opt out | Do students have a genuine choice if pressure is put on them by the institution or they feel their academic success may be impacted by opting out? | Ethical | 3 | Analytics Committee |
| | Student input to analytics process | Should students have a say in what data is collected and how it is used for analytics? | Ethical | 3 | Analytics Committee |
| | Change of purpose | Should institutions request consent again if the data is to be used for purposes for which consent was not originally given? | Legal | 2 | Analytics Committee |
| | Legitimate interest | To what extent can the institution’s “legitimate interests” override privacy controls for individuals? | Legal | 2 | Analytics Committee |
| | Unknown future uses of data | How can consent be requested when potential future uses of the (big) data are not yet known? | Logistical | 3 | Analytics Committee |
| | Consent in open courses | Are open courses (MOOCs etc) different when it comes to obtaining consent? | Legal / Ethical | 2 | Analytics Committee |
| | Use of publicly available data | Can institutions use publicly available data (e.g. tweets) without obtaining consent? | Legal / Ethical | 3 | Analytics Committee |
| Transparency | Student awareness of data collection | What should students be told about the data that is being collected about them? | Legal / Ethical | 1 | Analytics Committee |
| | Student awareness of data use | What should students be told about the uses to which their data is being put? | Legal / Ethical | 1 | Analytics Committee |
| | Student awareness of algorithms and metrics | To what extent should students be given details of the algorithms used for learning analytics and the metrics and labels that are created? | Ethical | 2 | Analytics Committee |
| | Proprietary algorithms and metrics | What should institutions do if vendors do not release details of their algorithms and metrics? | Logistical | 3 | Analytics Committee |
| | Student awareness of potential consequences of opting out | What should students be told about the potential consequences of opting out of data collection and analysis of their learning? | Ethical | 2 | Analytics Committee |
| | Staff awareness of data collection and use | What should teaching staff be told about the data that is being collected about them, their students and what is being done with it? | Ethical | 1 | Analytics Committee |
| Privacy | Out of scope data | Is there any data that should not be used for learning analytics? | Ethical | 2 | Analytics Committee |
| | Tracking location | Under what circumstances is it appropriate to track the location of students? | Ethical | 2 | Analytics Committee |
| | Staff permissions | To what extent should access to students’ data be restricted within an institution? | Ethical / Logistical | 1 | Analytics Committee |
| | Unintentional creation of sensitive data | How do institutions avoid creating “sensitive” data e.g. religion, ethnicity from other data? | Legal / Logistical | 2 | Data scientist |
| | Requests from external agencies | What should institutions do when requests for student data are made by external agencies e.g. educational authorities or security agencies? | Legal / Logistical | 2 | Senior management |
| | Sharing data with other institutions | Under what circumstances is it appropriate to share student data with other institutions? | Legal / Ethical | 2 | Analytics Committee |
| | Access to employers | Under what circumstances is it appropriate to give employers access to analytics on students? | Ethical | 2 | Analytics Committee |
| | Enhancing trust by retaining data internally | If students are told that their data will be kept within the institution will they develop greater trust in and acceptance of analytics? | Ethical | 3 | Analytics Committee |
| | Use of metadata to identify individuals | Can students be identified from metadata even if personal data has been deleted? | Legal / Logistical | 2 | Data scientist |
| | Risk of re-identification | Does anonymisation of data become more difficult as multiple data sources are aggregated, potentially leading to re-identification of an individual? | Legal / Logistical | 1 | Data scientist |
| Validity | Minimisation of inaccurate data | How should an institution minimise inaccuracies in the data? | Logistical | 2 | Data scientist |
| | Minimisation of incomplete data | How should an institution minimise incompleteness of the dataset? | Logistical | 2 | Data scientist |
| | Optimum range of data sources | How many and which data sources are necessary to ensure accuracy in the analytics? | Logistical | 2 | Data scientist |
| | Validation of algorithms and metrics | How should an institution validate its algorithms and metrics? | Ethical / Logistical | 1 | Data scientist |
| | Spurious correlations | How can institutions avoid drawing misleading conclusions from spurious correlations? | Ethical / Logistical | 2 | Data scientist |
| | Evolving nature of students | How accurate can analytics be when students’ identities and actions evolve over time? | Logistical | 3 | Educational researcher |
| | Authentication of public data sources | How can institutions ensure that student data taken from public sites is authenticated to their students? | Logistical | 3 | IT |
| Access | Student access to their data | To what extent should students be able to access the data held about them? | Legal | 1 | Analytics Committee |
| | Student access to their analytics | To what extent should students be able to access the analytics performed on their data? | Legal / Ethical | 1 | Analytics Committee |
| | Data formats | In what formats should students be able to access their data? | Logistical | 2 | Analytics Committee |
| | Metrics and labels | Should students see the metrics and labels attached to them? | Ethical | 2 | Analytics Committee |
| | Right to correct inaccurate data | What data should students be allowed to correct about themselves? | Legal | 1 | Analytics Committee |
| | Data portability | What data about themselves should students be able to take with them? | Legal | 2 | Analytics Committee |
| Action | Institutional obligation to act | What obligation does the institution have to intervene when there is evidence that a student could benefit from additional support? | Legal / Ethical | 1 | Analytics Committee |
| | Student obligation to act | What obligation do students have when analytics suggests actions to improve their academic progress? | Ethical | 2 | Student |
| | Conflict with study goals | What should a student do if the suggestions are in conflict with their study goals? | Ethical | 3 | Student |
| | Obligation to prevent continuation | What obligation does the institution have to prevent students from continuing on a pathway which analytics suggests is not advisable? | Ethical | 2 | Analytics Committee |
| | Type of intervention | How are the appropriate interventions decided on? | Logistical | 1 | Educational researcher |
| | Distribution of interventions | How should interventions be distributed across the institution? | Logistical | 1 | Analytics Committee |
| | Conflicting interventions | How does the institution ensure that it is not carrying out multiple interventions with conflicting purposes? | Logistical | 2 | Educational researcher |
Staff incentives for intervention What incentives are in place for staff to change practices and facilitate intervention? Logistical 3 Analytics Committee
Failure to act What happens if an institution fails to intervene when analytics suggests that it should? Logistical 3 Analytics Committee
Need for human intermediation Are some analytics better presented to students via e.g. a tutor than a system? Ethical 2 Educational researcher
Triage How does an institution allocate resources for learning analytics appropriately for learners with different requirements? Ethical / Logistical 1 Analytics Committee
Triage transparency How transparent should an institution be in how it allocates resources to different groups? Ethical 3 Analytics Committee
Opportunity cost How is spending on learning analytics justified in relation to other funding requirements? Logistical 2 Senior management
Favouring one group over another Could the intervention strategies unfairly favour one group over another? Ethical / Logistical 2 Educational researcher
Consequences of false information What should institutions do if a student gives false information e.g. to obtain additional support? Logistical 3 Analytics Committee
Audit trails Should institutions record audit trails of all predictions and interventions? Logistical 2 Analytics Committee
Unexpected findings How should institutions deal with unexpected findings arising in the data? Logistical 3 Analytics Committee
Adverse impact Labelling bias Does labelling or profiling of students bias institutional perceptions and behaviours towards them? Ethical 1 Educational researcher
Oversimplification How can institutions avoid overly simplistic metrics and decision making which ignore personal circumstances? Ethical 1 Educational researcher
Undermining of autonomy Is student autonomy in decision making undermined by predictive analytics? Ethical 2 Educational researcher
Gaming the system If students know that data is being collected about them will they alter their behaviour to present themselves more positively, thus distracting them and skewing the analytics? Ethical 2 Educational researcher
Abusing the system If students understand the algorithms will they manipulate the system to obtain additional support? Ethical 3 Educational researcher
Adverse behavioural impact If students are presented with data about their performance could this have a negative impact e.g. increased likelihood of dropout? Ethical 1 Educational researcher
Reinforcement of discrimination Could analytics reinforce discriminatory attitudes and actions by profiling students based on their race or gender? Ethical 1 Educational researcher
Reinforcement of social power differentials Could analytics reinforce social power differentials and students’ status in relation to each other? Ethical 2 Educational researcher
Infantilisation Could analytics “infantilise” students by spoon-feeding them with automated suggestions, making the learning process less demanding? Ethical 3 Educational researcher
Echo chambers Could analytics create “echo chambers” where intelligent software reinforces our own attitudes and beliefs? Ethical 3 Educational researcher
Non-participation Will knowledge that they are being monitored lead to non-participation by students? Ethical 2 Educational researcher
Stewardship Data minimisation Is all the data held on an individual necessary in order to carry out the analytics? Legal 1 Data scientist
Data processing location Is the data being processed in a country permitted by the local data protection laws? Legal 1 IT
Right to be forgotten Can all data regarding an individual (expect that necessary for statutory purposes) be deleted? Legal 1 IT
Unnecessary data retention How long should data be retained for? Legal 1 Analytics Committee
Unhelpful data deletion If data is deleted does this restrict the institution’s analytics capabilities e.g. refining its models and tracking performance over multiple cohorts? Logistical 2 Data scientist
Incomplete knowledge of data sources Can an institution be sure that it knows where all personal data is held? Legal / Logistical 1 IT
Inappropriate data sharing How can data sharing be prevented with parties who have no legitimate interest in seeing it or who may use it inappropriately? Legal 1 IT