Engagement reporting tools for Blackboard and Moodle

In my last post I described four types of learning analytics products.  Here I’ll go into more detail around some of the VLE-based engagement reporting tools. These products for Blackboard and Moodle sit within the virtual learning environment (VLE/LMS), look at its data only, and provide simple indications of a student’s progress, raising flags when the student looks to be at risk.  Unlike many learning analytics tools these are aimed at teachers rather than the students themselves or managers within the institution.

Blackboard Retention Centre
Bundled with Blackboard Learn is Retention Centre, which provides a simple dashboard view of learner participation within a single course. The functionality evolved from earlier features in Learn such as an “early warning system”.

Retention Centre is primarily aimed at identifying students at risk of failure, based on rules set by teachers.  The key dashboard provides an overview of a single cohort on a course.  You can decide who you want to flag as at risk, add notes on individual learners, and contact them directly from the Retention Centre.  It’s also possible from here to change due dates for assignments.  There are four basic measurements of student engagement:

  • Course activity measures the time from when a student first clicks on something in the course until they click outside the course or log out.
  • Thresholds can be set for particular grades, flagging students who have fallen below that value – or those who have fallen a certain percentage below the class average.
  • A flag for course access is set when users fail to access the system within a defined number of days.
  • The final metric is missed deadlines, which can be triggered when a deadline is not met within a set number of days or when a specified number of deadlines have been missed.

There are default rules for each of these, all of which can be customised (a minimal sketch of how rules like these might be evaluated follows the list):

  • Course activity in the last week is 20% below average
  • Grade is 25% below class average
  • Last course access was more than 5 days ago
  • There has been 1 missed deadline
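
To make the rule logic concrete, here’s a minimal sketch in Python. The data structure and field names are invented for illustration – Blackboard doesn’t expose Retention Centre’s actual implementation – but the thresholds mirror the defaults above.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Student:
    name: str
    weekly_activity_minutes: float  # time spent in the course this week
    grade: float                    # current overall grade (percentage)
    days_since_last_access: int
    missed_deadlines: int

def flag_at_risk(students):
    """Apply rules resembling Retention Centre's defaults (all customisable)."""
    avg_activity = mean(s.weekly_activity_minutes for s in students)
    avg_grade = mean(s.grade for s in students)
    flags = {}
    for s in students:
        reasons = []
        if s.weekly_activity_minutes < 0.8 * avg_activity:  # 20% below average
            reasons.append("low course activity")
        if s.grade < 0.75 * avg_grade:                      # 25% below class average
            reasons.append("low grade")
        if s.days_since_last_access > 5:
            reasons.append("no recent course access")
        if s.missed_deadlines >= 1:
            reasons.append("missed deadline")
        if reasons:
            flags[s.name] = reasons
    return flags

cohort = [Student("Asha", 120, 68, 1, 0), Student("Ben", 30, 40, 9, 2)]
print(flag_at_risk(cohort))  # only Ben is flagged, on all four rules
```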

Another screen allows instructors to view a summary of their own participation in a course and to link directly to activities required of them, such as grading.  Blackboard suggests that instructors use this functionality to prioritise areas of their courses which need attention.  This is a kind of basic “teaching analytics”, where instructors can see an overview of their activity and link easily to tasks such as marking or blogging.

The data on what actions you’ve taken through the Retention Centre as a teacher is available only to you.  While this may give you confidence that no-one is snooping on your teaching activities, it limits the options for institutions which want to understand better how learning and teaching are taking place.  Another limitation is that, as Retention Centre works only at the level of a single course, you can’t get a view of activity at programme level or across all of a student’s courses.

Moodle options
Beyond deploying a generic business intelligence system such as QlikView, Cognos or Tableau, institutions have some Moodle-specific options: a few plugins for the system, in particular Engagement Analytics and Moodle Google Analytics, and a commercial option, Intelliboard.

As an open source system Moodle allows users to develop their own plugins, and a number of institutions have built analytics tools using the data from user log files.  Currently these appear to be considerably less developed than the analytics capabilities of other virtual learning environments, notably Desire2Learn Brightspace and Blackboard Learn.  The last release of Moodle Google Analytics was August 2013, though Engagement Analytics, initially released in August 2012, is still being maintained. As with Blackboard Retention Centre, the tools are primarily aimed at staff – not students as yet.

Documentation is limited for all these options.  Intelliboard’s product is clearly at an early stage of development, with an offer on its website that any paying customer can request additional reports for free. Moodle Google Analytics takes the data available from Google Analytics and the web server logs and presents it in Moodle and is thus more of a web analytics than a learning analytics tool, though analysing how learners navigate through a course website may be of interest.

Engagement Analytics presents risk percentages for students based on three indicators:

[Screenshot: Engagement Analytics risk indicators]

  • Login activity: how often and how recently are students logging in, and for how long?
  • Forum activity: are students reading, posting and replying?
  • Assessment activity: are students submitting their assessed work, and are they submitting on time?

You can configure the weighting of each indicator, e.g. 60% for logins, 30% for assessment and 10% for forums, depending on the relative importance of these activities in the course.  You can also add other indicators such as downloading files.
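
As a rough illustration of how such a weighted score might be combined – the plugin’s actual calculation and configuration options will differ – consider:

```python
def engagement_risk(indicator_risks, weights):
    """Combine per-indicator risk percentages (0-100) into an overall score.

    Both arguments are dicts keyed by indicator name; the weights
    should sum to 1.0, e.g. logins 0.6, assessment 0.3, forums 0.1.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(indicator_risks[name] * weights[name] for name in weights)

# A student who logs in rarely but submits assessed work on time
risks = {"login": 80, "assessment": 10, "forum": 50}
weights = {"login": 0.6, "assessment": 0.3, "forum": 0.1}
print(f"{engagement_risk(risks, weights):.1f}")  # 56.0 – driven mainly by the login indicator
```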

[Screenshot: Engagement Analytics indicator weighting settings]

Limitations and take-up
For institutions using Blackboard or Moodle these tools provide simple ways of viewing engagement by students.  It’s surprising that, given how long the VLEs have been in existence, it’s taken so long for such basic reporting facilities to emerge.

As I noted earlier, these systems use data from the VLE only; there’s no integration with student information or other systems.  None of them appear to facilitate any automated interventions, so teachers have to decide what action to take based on the dashboards. As Retention Centre comes bundled with Learn, no additional technical expertise is required to install and maintain this functionality – it merely has to be switched on for a particular course by the instructor.  It should be relatively easy for a Moodle administrator to install the plugins.

It’s unclear how widespread the use of these tools is; however, many institutions are no doubt experimenting with Retention Centre.  One university I spoke to found the interface “ugly” and the functionality not very useful, but I’m sure many teachers will find it gives them a better indication of students at risk.  Retention Centre allows users to try out some basic reporting and analysis, perhaps later leading institutions to consider purchasing the much more sophisticated Blackboard Analytics for Learn or some business intelligence software.

As far as the Moodle tools are concerned, Intelliboard claims a few corporate clients on its website – none so far in the UK.  It is not clear how many institutions are deploying the plugins, but initial response to Engagement Analytics on the Moodle forums is positive and it’s been downloaded nearly 10,000 times from the moodle.org site.

Indicators of engagement
What is of particular interest about these tools is the extent to which their data provides an accurate indication of student engagement – which we know can correlate with measures of success. Michael Feldstein points out that the four Retention Centre indicators for activity in the VLE are considered the most indicative of student success according to John Campbell, the inventor of Purdue’s Course Signals.

But how do we know that the Retention Centre indicators are more accurate than those measuring login, forum and assessment activity in Engagement Analytics?  Courses have different types and balances of content, communication and assessment – and this is recognised by the tools in allowing you to customise the indicators.  However, there are all sorts of other factors at play, such as the features of the software itself, the alternative ways that students have to communicate, the institutional context and nature of the student body, and the extent to which the teacher encourages students to use the tools.

Learning analytics is an inexact science and there will always be individuals who perform differently from how we think they will.  Monika Andergassen and her colleagues at the Vienna University of Economics and Business found that there were correlations between time spent in the VLE and final grade, and between self-assessment exercises undertaken and final grade. The correlations in both cases though were modest, and the repeated solving of the same exercises didn’t correlate with better results, implying, unsurprisingly, that what you do online may be more important than how long you spend in the VLE.
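
To illustrate the kind of analysis involved – this is not the Vienna team’s method or data, just a toy example with made-up figures – correlating VLE time with final grades is straightforward once per-student totals have been extracted from the logs:

```python
import numpy as np

# Made-up per-student figures, as might be extracted from VLE logs and the grade book
hours_in_vle = np.array([12, 40, 25, 60, 5, 33, 48, 20])
final_grade = np.array([52, 48, 61, 70, 50, 65, 58, 62])

r = np.corrcoef(hours_in_vle, final_grade)[0, 1]  # Pearson correlation coefficient
print(f"r = {r:.2f}")  # a modest positive correlation on this toy data
```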

Various people I’ve spoken to on my recent visits to UK institutions believe that the more information we have about students, the more accurately we’ll be able to predict their success or likelihood of dropout.  A key question is whether adding all the possible sources of data we have will make a significant difference to the accuracy of these predictions.  Will a few indicators from the VLE be sufficient to pick up the majority of your students who are at risk, or do we need more sophisticated predictive analytics software which also takes data from the student information system and elsewhere?

This post was first published on the Jisc Effective Learning Analytics blog, 3rd Oct 2014.

Images are from https://docs.moodle.org/22/en/report/analytics/index and are copyright Kim Edgar, available under the GNU General Public License.

Posted in Blackboard, Learning Analytics, Moodle

Learning analytics: what types of product are available?

Every educational institution wants its learners to reach their full potential.  Learning analytics can help us to measure and predict student success using data relating to engagement, grades, retention, graduation and employability.  But what products are out there to enable institutions to improve on the indicators of success, and to help visualise and analyse the increasing amounts of data being collected around our students? How well do the various tools facilitate interventions with learners and help us to monitor the effectiveness of those interventions?

I’ve been reviewing some of the main analytics solutions for Jisc, and visiting some of the institutions who are beginning to use them – I’ll be discussing my findings in a series of blog posts.

There’s a diverse range of learning analytics products available, most of which are still at a very early stage of development.  The marketplace is developing rapidly though, and I’m told that universities and colleges are being bombarded by vendors trying to get them to buy into their nascent analytics products.

Picking the wrong solution is likely to be an expensive mistake. On the other hand those institutions who seem to have got it right are already providing significantly enhanced information about their students to managers, academics, librarians and the learners themselves.  Crucially, they’re attributing initial improvements in some of the indicators of student success to the use of these tools and the interventions being taken as a result of the improved information.

In order to carry out comprehensive analytics, data is generally extracted from various institutional systems, in particular the virtual learning environment (VLE/LMS), the student information system (SIS – also known as student records system) and a variety of library systems.  So the development of analytics products is complicated by the range of underlying software being used by educational organisations.

To make things slightly simpler, in recent years the VLE market has consolidated, with most further and higher education institutions in the UK deploying either the commercial market leader Blackboard Learn or its open source competitor Moodle.  Meanwhile, though some universities have in-house developed student information systems, there is a limited number of commercial products in use, such as Tribal SITS:Vision and Ellucian Banner.  In further education these products tend to be called Management Information Systems (MIS) and include Capita UNIT-e Student Management Information System.

Vendors of VLEs and SISs are rapidly developing products to help institutions make sense of the large quantity of data that is being accumulated as students undertake their learning, drawing conclusions about their likely success by adding this to already established information such as ethnicity or prior educational attainment.  To try to make sense of the range of products out there I’ve classified them in four categories, based on the software they originate from:

  1. VLE-based engagement reporting tools – which sit within the VLE, generally look at VLE data only, and provide simple indications of a student’s progress.  Examples: Blackboard Retention Centre, Moodle Engagement Analytics plug-in.
  2. VLE-centric analytics systems – developed by VLE vendors, combining data from the VLE with data from the SIS to enable more extensive analysis.  Examples: Blackboard Analytics for Learn, Desire2Learn Insights.
  3. SIS-centric analytics systems – which sit alongside the SIS but may also draw in data from the VLE, providing learning analytics alongside wider business intelligence. Examples: Ellucian Student Retention Performance, Compass promonitor.
  4. Generic business intelligence systems – developed to provide better analysis in any business, not specifically for education, sitting outside both the VLE and SIS but drawing data from those and other systems, often in conjunction with a data warehouse. Examples: QlikView, Tableau, IBM Cognos, HP Autonomy.

Most of the VLE-centric and SIS-centric solutions are themselves developed on top of one of the generic business intelligence (BI) solutions, so these categories are not mutually exclusive.

I discussed this classification with Richard Maccabee of the University of London Computer Centre, which hosts Moodle for around a hundred institutions.  He suggested a fifth type of learning analytics system: one which examines data from across a range of institutions and provides analysis back to the organisations.  It might be that there is more to learn by comparing your modern language students with those in another institution than there is by comparing them with your own engineering students.

Assuming institutional agreement, appropriate anonymisation and compliance with data protection legislation, the sum of big data from multiple organisations may indeed be greater than the parts.  This concept has not been lost on companies which provide hosted VLE services such as MoodleRooms and Desire2Learn.  Richard and I concluded though that rather than being a separate type of analytics system, this is more of a dimension or additional set of services which could be applied to any of the above categories.

An institution’s choice of learning analytics products will be limited to some extent by the systems it has already deployed.  Thus if you’ve invested significantly in Blackboard Learn as your sole institutional VLE you’ll not be deploying analytics tools based on Moodle or Desire2Learn Brightspace.

Institutions can carry out limited analytics at low cost by deploying one of the VLE-based engagement reporting tools.  The choice for more sophisticated analytics based on multiple data sources may be whether to buy into an analytics system developed by a trusted vendor of your institutional VLE or SIS – or to purchase a generic BI tool.  Some of the latter can already be integrated relatively easily with underlying data sources and come pre-populated with education-specific dashboards.

A number of the products available are components of wider analytics offerings which cover activities such as student recruitment and fundraising (though some of these systems are currently quite US-specific). The learning analytics solution you choose may be best considered as part of a broader exercise to develop institutional business intelligence.  Some of the learning analytics initiatives report that they’ve prompted institutions to re-examine and clean up a wide range of data sources and structures, with positive knock-on effects for other aspects of the business.

The institution will also need to decide whether it has the expertise to go it alone, or requires assistance from one of the increasing number of vendors eager for business in this area, or consultancy from a third party.  Meanwhile communities of practice are springing up around some of the tools, which can provide excellent support and examples of successful implementations.

This post was first published on the Jisc Effective Learning Analytics blog, 2nd Oct 2014.

Posted in Learning Analytics

Code of practice “essential” for learning analytics

We had a lively session on ethics and legal issues at the Jisc Effective Learning Analytics workshop last week, kicking it off by outlining some of the key questions in this area:

  • Who should have access to data about students’ online activities?
  • Are there any circumstances when collecting data about students is unacceptable/undesirable?
  • Should students be allowed to opt out of data collection?
  • What should students be told about the data collected on their learning?
  • What data should students be able to view about themselves?
  • What are the implications of institutions collecting data from non-institutional sources e.g. Twitter?

Photo: Students Studying by Christopher Porter CC BY-NC-ND 2.0

Concern was also expressed about the metrics being used for analytics – how accurate and appropriate are they and could it be dangerous to base interventions by tutors on metrics which portray an incomplete picture of student activity?

A number of the participants had already been thinking in detail about how to tackle these issues. There was a consensus that learning analytics should be carried out primarily to improve learning outcomes and for the students’ benefit.  Analytics should be conducted in a way that would not adversely affect learners based on their past attainment, behaviour or perceived lack of chance for success. The group felt that the sector should not engage with the technical and logistical aspects of learning analytics without first making explicit the legal and ethical issues and understanding our obligations towards students.

Early conversations with students were thought to be vital so that there were no surprises. It was suggested that informed consent is key – not just expecting students to tick a box saying they’ve read the institution’s computing regulations.  Researchers elsewhere have begun to examine many of these areas too – see, for example, the paper by Sharon Slade and Paul Prinsloo: Learning analytics: ethical issues and dilemmas.

Mike Day at Nottingham Trent University found that students in the era of Google and Facebook already expect data to be being collected about their learning. His institution’s approach has been to make the analytics process a collaboration between the learner and the institution. They have for instance agreed with students that it’s appropriate and helpful for both themselves and their tutors to be able to view all the information held about them.

Another issue discussed at some length was around the ethics of learners’ data travelling with them between institutions. Progress is being made on a unique learner number, and the Higher Education Data and Information Improvement Programme (HEDIIP) is developing more coherent data structures for transferable learner records. It will be possible for data on the learning activities of individuals to be transferred between schools, colleges and universities. But what data is appropriate to transfer? Should you be tainted for the rest of your academic life by what you got up to at school? On the other hand could some of that information prove vital in supporting you as you move between institutions?

Data on disabilities might be one such area, where it could be helpful for a future institution to be able to cater for a learner’s special needs. Ethically this may best be under the control of the student, who can decide what information to present about their disability.  However the technology may be in place to detect certain disabilities, such as dyslexia, automatically – so the institution might have some of this information whether the student wants it to know or not.

Who owns the data about a student’s lifelong learning activity is another issue. Institutions may own it for a time, but once that student has left, is it appropriate to hold onto it? Perhaps the learner should take ownership of it, even if it is held securely by an institution or an outside agency. There may be a fundamental difference between attainment data and behavioural data: the latter is more transitory and potentially less accurate than assessment results and grades, and should therefore perhaps be disposed of after a certain period.

There are of course different ethical issues involved when data on learning activities is anonymised or aggregated across groups of students. One parallel we discussed was that of medicine. A learner might visit a tutor in the way that a patient visits a GP.

The doctor chats to the patient about their ailment, with the patient’s file, including their previous medical history, in front of them. Combining what the patient says with physical observations and insight from the patient’s records, the doctor then makes a diagnosis and prescribes some treatment or suggests a change in lifestyle.

Meanwhile:

A tutor chats to a student about an issue they’re having with their studies, with analytics on the student’s learning to date on a computer in front of them as they talk. The analytics provides additional insight into what the student is saying, so the tutor is able to make some helpful suggestions and provide further reading materials or some additional formative assessment.

In both scenarios the professional takes notes on the interaction which are themselves added to the individual’s records. All the information is held under the strictest confidentiality. However the data in both scenarios can also be anonymised for research purposes, enabling patterns to be discovered and better treatment/advice to be provided to others in the future.

So, in order to help institutions navigate their way around the complexities, would a code of practice or a set of guidelines be of interest? The consensus was yes, and this was borne out in voting by the wider group later in the day. The schools sector has already developed codes of practice, and the NHS is well advanced in the ethics of data collection, so there is much to be learned from these professions – and from the research ethics committees in many of our own institutions. There would need to be consultation with the various UK funding bodies – and working closely with the National Union of Students was seen as key to ensuring adoption.

A code of practice for learning analytics would have to be clearly scoped, easily understandable and generic enough to have wide applicability across institutions. The primary audiences are likely to be students, tutors and senior management. Mike at Nottingham Trent found the key to success was a four-way partnership between technology providers (who were required to adapt their products to meet ethical concerns), the IT department, students and tutors.

There was a strong consensus in the group that this work would significantly help to allay the fears of students – and, often just as vocally expressed, those of staff – enabling institutions to explore the real possibilities of using learning analytics to aid retention and student success.  In fact some stakeholders considered it an essential step at their universities and colleges before they could make progress.  Developing a code of practice for learning analytics will therefore be a key area of work for Jisc over the coming months.

This post was first published on the Jisc Effective Learning Analytics blog, 18th Sept 2014.

Posted in Ethics, Learning Analytics, Legal issues

Taking Learning Analytics to the next stage

How do higher and further education institutions in the UK best share their expertise in learning analytics?  Would a jointly developed code of practice for learning analytics help deal with the legal and ethical issues?  How can Jisc facilitate the development of tools and dashboards to help institutions develop their analytics capabilities to enhance student success?

These were some of the questions being addressed by 31 participants last week from UK universities, colleges and organisations including SCONUL, RUGIT, UCISA and the NUS at a workshop in Jisc’s offices in London.  Increasing amounts of data are being collected on students and their learning – but it’s clear that our understanding of how best to interpret that data and make use of it is still at an early stage.

Jisc effective learning analytics workshop

Back in April the Co-design Steering Group developed the idea of an effective learning analytics challenge.  Then over the summer we gathered 21 ideas from the community to address the challenge of learning analytics.  Voting from more than 100 people narrowed the list down to nine ideas and accompanying service scenarios that we discussed at the workshop.

One theme that emerged strongly from the discussions was a need to involve students in developing learning analytics products and services.  An app for students to monitor their own learning was seen as a critical requirement which Jisc could help provide.  While improving retention was regarded as important, the main long term goals would be to support learning attainment, assist decision making and pathway choice, and improve employability.

After some engaging discussions in smaller groups we came back together to vote on the top three themes, merging some of them in the process to form the following priorities:

Priority 1: Basic learning analytics solution and a student app
One of the most popular ideas was a “freemium” solution for further and higher education institutions, allowing them to gain experience and eventually progress to a more advanced toolset.  It would be based on an existing solution from an institution, vendor or open initiative that could be ready to provide a working product in early 2015.  The product would require the development of an open standard for analytics and an API enabling other compatible basic analytics solutions in the future.

Alongside the basic solution would be a student app which works with any learning analytics solution provider using a specified set of data inputs.  Students would be involved in scoping and designing their requirements for the app.

Finally we felt there was a need for a tool for tracking and recording interventions which take place as a result of analytics.  This will inform the development of a learning analytics cookbook (see below) which will suggest appropriate ways that staff and systems can intervene to enhance student success.

Priority 2: Code of practice for learning analytics
A huge priority for institutions is how to deal with concerns around data protection and privacy both from a legal and ethical perspective.  The potential benefits of learning analytics are well recognised but there are also possibilities for misuse. A guide to learning analytics practice is needed and will be informed by a comparative review of existing codes of practice in this and related areas.  Jisc will then develop the code of practice in partnership with the NUS and others.

Priority 3: Learning analytics support and networks
The group also prioritised the development of a support and synthesis service around the use of learning analytics to share expertise, working on:

  • Technical methods – the nuts and bolts of learning analytics such as what systems and data institutions are using
  • A learning analytics cookbook – with recipes for the use of data and metrics – documenting successful implementations at universities and colleges
  • Synthesis and analysis – giving a high level overview and showing trends across the sector
  • Networks – building networks of institutions keen to share experience both at a basic and advanced level

Next steps
A great deal of enthusiasm for the possibilities of learning analytics was expressed at the workshop – and we benefitted from the considerable experience that has already been gained by many of the participants. The priorities agreed will now be developed into a project plan that can be taken forward by Jisc over the next two years. Full details will be posted to http://analytics.jiscinvolve.org/wp/.

Educational institutions, vendors and other potential partners will be invited to comment and express interest in participating.  Meanwhile we’ll be looking for some expert critical friends to advise us on each of the projects as they progress.

This post was first published on the Jisc Effective Learning Analytics blog, 16th Sept 2014.

Posted in Learning Analytics

Snooping professor or friendly don? The ethics of university learning analytics


Is my professor watching every click? Mr_Stein, CC BY-NC-SA

Universities have been recording data digitally about their students for decades. No one would seriously question the necessity of collecting facts for administrative purposes, such as a student’s name and address, module choices and exam results.

But as teaching and learning increasingly migrate to the internet, huge amounts of data about individuals’ activities online are being accumulated. These include everything from postings on forums, to participation in video conferences, to every click on every university-hosted website.

Most of the records gather virtual dust in log files, never to be analysed by any computer system let alone viewed by a human. Universities have only recently started to realise the huge potential of using this data to help students succeed in their learning, or to improve the educational experience for others.

Privacy concerns

With these possibilities come dangers that the data could be used in ways undesirable to students. These include invading their privacy, exploiting them commercially by selling their data to third parties or targeted marketing of further educational products.

Meanwhile, well-intentioned pedagogical innovations which access the data may have unforeseen negative consequences, such as demotivating students who are told they are at risk of failure.

Institutions have clear legal responsibilities to comply with data protection legislation, restricting information from access by third parties and allowing students to view the data held about them when requested.

Universities are commercial organisations, but are also motivated by altruistic concerns such as enhancing the life chances of individuals through education. The multinational technology corporations which we unquestioningly allow to collect vast amounts of data about us have altogether different motivations.

For them, your data is of immense commercial value, enabling products to be targeted at you with increasing relevance. Most educational institutions need to act differently from for-profit organisations when dealing with users’ data.

What’s being done with the data?

Predictive modelling enables institutions to build a profile of a student. This can include information they have disclosed about themselves in advance of study, such as prior qualifications, age or postcode, which can then be mapped onto records of their online activity and assessment performance.

Predictions can then be made as to the likelihood of a student dropping out or what grade they can be expected to achieve. The Open University is developing models to target interventions at students thought to be at risk.

For example, a student who has no prior qualifications and has not participated in a key activity or assessment may be flagged for a telephone call by a tutor. Experience has shown that such a call may be what is required to motivate the student or help them overcome an issue which is preventing them studying.
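
A toy sketch of this kind of model follows – the binary features, training data and threshold are invented for illustration, and the Open University’s real models are far more sophisticated:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [has_prior_qualification, did_key_activity, submitted_assessment]
X = np.array([[1, 1, 1], [0, 1, 1], [1, 0, 1], [0, 0, 0],
              [1, 1, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]])
y = np.array([0, 0, 0, 1, 0, 1, 1, 1])  # 1 = dropped out

model = LogisticRegression().fit(X, y)

# A new student with no prior qualifications who missed the key activity
p = model.predict_proba([[0, 0, 1]])[0, 1]
print(f"Estimated dropout risk: {p:.0%}")
if p > 0.5:  # hypothetical intervention threshold
    print("Flag for a tutor phone call")
```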

Various ethical issues emerge here. If we establish early on that a student is highly likely to fail, should we advise them to withdraw or to re-enrol on a lower level course?

But what if we are limiting their opportunities by taking such an intervention? They might have continued successfully had we not intervened. Meanwhile, for those students thought not to be at risk, we are potentially denying them the possibility of beneficial additional contact with a tutor.

Opt out option

If the primary purpose of learning analytics is to benefit learners, then should a student be able to opt out of their data being collected?

There are two problems with this. We may be neglecting our responsibilities as education experts by allowing some students to opt out. This could deny them the assistance we can provide in enhancing their chances of success. The data collected can also be used to benefit other students, and every individual opting out potentially diminishes the usefulness of the dataset.

One environment where a student might reasonably assume they are free from data being collected about them is while accessing an e-book offline on a personal device such as an iPad or a Kindle.

Some US institutions are already providing students with e-reader software which captures data such as clicks and dwell times, storing them on the device and uploading it to a server for analysis. But unless users are made aware that this is happening, universities run the risk of being accused of unjustified snooping.

It is unclear to what extent the constant collection of data on online activity inhibits learning or even worries students. Do students care any more about what universities do with data on their educational activities than they do about the data collected by Google or Facebook on their personal interests, relationships and purchasing habits?

But the trust students place in universities elevates the importance of taking care of their data and establishing clear policies for what we do with it.

Transparency about the data we collect, and how and why we are using it, will help to avoid a backlash from learners worried about potential misuse. Institutions need to develop clear policies arguing why the collection and analysis of data on students and their learning is in their interest. This is a necessary step before being able to exploit the full potential of learning analytics to enhance the student experience.


Niall Sclater does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

This article was originally published on The Conversation.
Read the original article.

Posted in Learning Analytics

Making ebooks more interactive: logistics and ethics

Ebook reader image (c) Andrew Mason licensed under Creative Commons

I had an interesting discussion yesterday with Phil Butcher, the Open University’s e-assessment guru.  He wanted to talk about whether we should invest in embedding our wide range of interactive question types in ebooks.

Since the 1970s Open University course teams have attempted to get students to think more deeply about the content they are reading by embedding questions within the course texts.  Accessing such questions on digital devices has some clear advantages over paper: many different question types are available and you can receive instant feedback, tailored to your response.

This is of course quite possible with web-based learning.  We have OpenMark questions embedded at various points in some of our texts and are considering whether we should adapt Moodle to be able to present single questions within texts in the same way.

But what do we do about ebooks?  Almost all of our content is now available in ebook format on a range of platforms.  If the interactive questions work in print and are even better presented online then shouldn’t we be incorporating them in the ebooks too?

To do this at scale we would need an automated process to export the items (questions together with possible responses etc.) from Moodle into the ebooks.  Ebooks can render interactive questions using HTML5, but there is a variety of ebook formats, differing hardware and software platforms and a range of ebook reader apps.  The software for this export process would require continual, tricky and expensive maintenance to stay on top of all the various formats, and there would inevitably be features which worked on one platform and not another.  Another option would be to build our own ebook reader software so we could optimise the user experience, but that too would be complex, costly, have to work on multiple platforms and require ongoing maintenance.
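
As a rough illustration of what one step of such a pipeline might involve, here is a minimal sketch that turns a single multiple-choice question – in something resembling Moodle’s XML export format – into a self-contained HTML5 fragment. The element names approximate the real format, and a production converter would need to handle many more question types and attributes:

```python
import xml.etree.ElementTree as ET

MOODLE_XML = """
<quiz>
  <question type="multichoice">
    <questiontext format="html"><text>Which planet is largest?</text></questiontext>
    <answer fraction="100"><text>Jupiter</text></answer>
    <answer fraction="0"><text>Mars</text></answer>
  </question>
</quiz>
"""

def question_to_html(xml_text):
    """Emit an HTML5 radio-button form for the first multichoice question."""
    q = ET.fromstring(xml_text).find("question")
    stem = q.find("questiontext/text").text
    options = [(a.find("text").text, a.get("fraction") == "100")
               for a in q.findall("answer")]
    items = "\n".join(
        f'<label><input type="radio" name="q1" data-correct="{str(ok).lower()}"> {txt}</label><br>'
        for txt, ok in options)
    return f"<form><p>{stem}</p>\n{items}\n</form>"

print(question_to_html(MOODLE_XML))
```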

There’s something about the paradigm of a book as a way of presenting digital content in the confusing world of the Internet which gives it appeal (I expanded on this in Are ebooks better than virtual learning environments?): in particular you can download an ebook as a complete self-contained package, access it offline and feel a sense of ownership over it in a way which you can’t with the content on websites.  You can also quickly grasp how many pages it is, navigate easily and know where you are in it.  And of course the ability to alter the font size of an ebook and have it repaginate automatically, to hold it in your hands, and to not have your experience cluttered with the many icons and menus of a PC-based interface all add to its usability.

One of the advantages of ebooks may however be problematic for educational institutions: offline reading.  That means no opportunity to see how students are using any interactive questions.  A valuable source of data for learning analytics to monitor uptake and performance is never gathered – and opportunities for enhancing problematic questions and the associated learning content are lost.  Meanwhile the learner potentially loses out too: there is no chance for institutions to target interventions at students who might be at risk of dropping out or might benefit from being able to compare their performance with other students.

A way around this would be to incorporate recording of user activity into ebook reader software and send it to a server every time the user goes online.  And if we’re recording information about the use of interactive questions, why not record data such as how often the book is being read, dwell time on pages or the number of times pages are re-read?  Again that might tell us something about how effective our learning content is or what difficult concepts need to be explained better in future iterations, ultimately benefiting students.  This approach is already being taken by CourseSmart, a company which rents out textbooks and enables usage monitoring through its ebook reader software.
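
A minimal sketch of the client-side pattern being described – the event names and endpoint are hypothetical, and real e-reader telemetry would need batching, retries and, as discussed below, user consent:

```python
import json
import time
import urllib.request

class ReadingEventLog:
    """Buffer reading events on the device; flush them to a server when online."""

    def __init__(self, endpoint):
        self.endpoint = endpoint
        self.buffer = []  # events held locally while offline

    def record(self, event, **details):
        self.buffer.append({"event": event, "time": time.time(), **details})

    def flush(self):
        """Attempt an upload; keep events buffered if the device is offline."""
        if not self.buffer:
            return
        try:
            body = json.dumps(self.buffer).encode()
            req = urllib.request.Request(self.endpoint, data=body,
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req, timeout=5)
            self.buffer.clear()
        except OSError:
            pass  # still offline: events stay buffered until the next flush

log = ReadingEventLog("https://example.org/analytics")  # hypothetical endpoint
log.record("page_turn", page=42)
log.record("question_answered", question_id="q1", correct=True)
log.flush()  # uploads whenever a connection is available
```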

But what are the ethics of this?  While arguably most people are accepting, if not entirely comfortable, that anything they do online is potentially being monitored by those hosting the website this may not be true for ebooks.  Is there something fundamentally different about an ebook where we feel we own it, as we would a physical copy, and would resist the idea that we are being snooped on – even if the snooping was aimed at enhancing our learning?

Universities should be entirely transparent about what they do with any data gathered while students are accessing their systems and content.  It is quite easy to argue that most educational institutions will monitor usage primarily for the purpose of enhancing the educational experience for individuals and for future students.  This is in contrast to commercial entities and social networking sites which monitor usage in order to target marketing at you more effectively or to sell information about you to others.

But what if monitoring ebook usage has a negative effect on the learning experience?  If I’m spending a quiet evening at home reading an ebook and I know that every page turn, click or interaction is being monitored will that make me anxious and somehow less able to learn?   Assuming that we could build this monitoring facility into ebooks, or buy it from someone else, the best way forward from an ethical and pedagogical perspective may be to allow users to decide whether data about them can be collected and sent back to the institution or not.  Research is needed into what learners want out of their ebooks and whether they’re prepared to have data collected about their use of the interactive questions that are designed to promote deeper and more reflective learning.


Posted in Mobile Learning

Can mass synchronous events work with MOOCs?

[Photo: Dave Middleton]

MOOCs tend to involve consuming online content, taking automated assessments and peer networking. While students may feel some connection to the academics who create the courses by watching recorded videos of them, the opportunities for synchronous connection with subject experts are limited.

Dave Middleton is a tutor manager with the Open University and has been training tutors to use Elluminate effectively for several years. When online tutorials were offered to students in Wales on Exploring Psychology, one of our most popular modules with around 4,000 participants each year, students elsewhere began to complain that providing the events only to the Welsh was unfair.

So Dave spotted an opportunity to try something new. He opened up an Elluminate room to the entire module population, advertised a two-hour event, sat back and hoped that 3,850 registered students wouldn’t all turn up at once. In the event 200 did. The received wisdom is that online tutorials become unworkable when numbers exceed 20, but clearly the way this session was handled didn’t result in the expected chaos. Dave was able to use group work and problem-based learning rather than simply lecturing at the students. 75% of them responded that the session had met or exceeded their expectations.

Comments included:

“The tutorial exceeded my expectations! It was well organised, easy to understand and packed with useful information”

“This was truly amazing and inspirational. The concept is fantastic.”

“Very enjoyable – I initially thought 2 hours would seem a long time for an online tutorial but the time just flew by. Great to be taking part from the comfort of your own sofa! ”

“I think these tutorials should be available for all modules as many, like myself, cannot attend face to face ones for the same reason we cannot attend brick uni’s and have chosen to study with the OU. I would like to say well done to the tutors, the organisation and structure was a great improvement. I only wish there were more of them.”

Faculty policy on online tutorial provision was changed after Dave’s experiment. For the first time there was evidence not only that the tutorials could offer an excellent learning experience to large numbers of students but also that they would be highly popular with those who didn’t otherwise have the chance to attend face to face sessions.

The lesson for MOOCs is that mass synchronous online sessions with subject experts can be motivational and effective. The tools available in Elluminate (now Blackboard Collaborate) and similar systems enable effective interactive teaching with hundreds of students simultaneously. Such sessions have to be properly planned, of course, both logistically and technically, to avoid a “MOOC mess” such as the one which happened on Georgia Institute of Technology’s module with Coursera and resulted in the course being withdrawn.

Is Dave’s experience of dealing with a couple of hundred students at once the limit?  I suspect someone somewhere some time soon is going to push the technology and the logistics to accommodate many thousands of students in an engaging synchronous session.

Posted in Collaboration, Course design, MOOCs

MyOU: A seamless online experience for learners

Accessing online content and services has become a vital part of the OU experience. The virtual learning environment has been carefully designed over the last seven years and has some excellent features such as a custom-built forum tool and quiz engine.  Meanwhile we have other systems such as StudentHome, Open Learn and Library Services, full of useful content and tools.  These websites have grown up organically, are owned by different parts of the organisation, have different user interfaces and are not as well integrated as they could be.  Navigating through them to find the information or tools you need, particularly if you’re new to the University, can be a confusing experience.

Current OU systems are disparate

A new initiative called MyOU aims to put this right and will optimise the online experience for our students. The project is currently in the requirements gathering stage, and we are consulting heavily with our learners and with the various stakeholders across the University. MyOU will provide a new layer on top of existing systems, making the online experience much better for students.

Future vision for MyOU:

  • My OU online experience is seamless. If I’m at multiple stages of the journey at the same time it’s still seamless.
  • I get what I need at the right time.
  • What I see is adjusted according to my profile.
  • Content is presented in different blocks on the screen. The OU gives me a default set of content.
  • I have lots of control over what I see. I can make it look the way I want.
  • I don’t need to know which part of the OU is providing the content.
  • I can access the whole thing with a simple URL e.g. open.ac.uk/myou

Posted in Architecture, OU VLE

Where next for tuition at the Open University?

Should Open University students be entitled to particular types and amounts of tuition during their studies?  Should provision be consistent across tutor groups, regions and nations, and qualifications?  What are the most successful pedagogical strategies for online synchronous sessions?  How should we engage with Facebook as an institution?  What role do face to face sessions provide in an increasingly online world?

I’m just out of a workshop with an enthusiastic and experienced bunch of people from various parts of the University, examining some of these questions in greater detail.  The OU has never had a tuition strategy before; practice has developed organically across different faculties and regions.  This leads to inconsistencies of approach and inefficiencies, while also allowing great flexibility and responsiveness to local and individual requirements.

Various factors are coming together, however, which make the development of an overarching strategy for tuition a necessity for the University:

  • The necessity to optimise our use of tuition resource and methods in order to help retain students
  • The availability of a range of tools which can be used for tuition within the virtual learning environment – and understanding the many possibilities for how to deploy them
  • The greater use of the Internet in society as a whole and increased acceptance of technology (with the caveat that computer literacy and access to technology vary)
  • Students’ increased expectations in a world of higher fees

The group today brought together a variety of perspectives but achieved consensus on how to develop the tuition strategy and on a number of key issues, namely:

  • There is a lack of evidence on current practice in tutorials and on student perceptions and expectations.  We need to build up an evidence base for what is working in tutorial provision.
  • The default situation for the University should be the provision of online tuition.  We should then supplement this with face to face provision where appropriate.
  • Whatever ends up being in the strategy there needs to be some flexibility to organise provision at a local level to meet changing needs.
  • It may make sense to organise face to face tuition on a local basis while organising online tuition across all regions/nations.
  • We need a clear policy about how we engage with external environments such as Facebook where we have limited ability to take action on misuse.
  • Finally we need to think about tuition at the earliest levels of module production.  In the past our Fordist production methods led us to think of tuition as an add-on, quite separate to the development of learning content.

I’ll be drafting the first version of the tuition strategy and then passing it to my colleague Pat Atkins and others to refine.  It will then travel through the University’s governance processes for further enhancement.  The aim will be for the document to set the direction of travel for the University in tuition, to provide guidance for faculties, module teams and associate lecturers, and to ensure that we maintain excellence in and enhance our tutorial support for students.  The challenge is to produce a document that is concise enough that people are motivated to read it, pitched at a high enough and generic enough level to be acceptable across all faculties and regions/nations, yet concrete enough to actually achieve something.  It needs to help increase consistency in the student experience without being so prescriptive that it restricts the flexibility to respond to dynamic circumstances.  Fortunately I enjoy a challenge.

Posted in Course design, OU VLE, Policies, Strategy

Two paradoxes at the heart of MOOCs

My thinking on MOOCs has been consolidated after doing a fair bit of reading, chatting and thinking recently.  Much has been written on the disruptive potential of MOOCs and also about the problems associated with them such as lack of quality, plagiarism and lack of tutor support.   I have no desire to add to the noise and hype but want to set down two paradoxes that it seems to me are at the heart of the MOOC movement.

Paradox 1:  Most MOOCs are offered by elite institutions which don’t need to expand their student base

So why are they developing MOOCs?  Are they basically caught up in the hype and working on the proven Amazon business principle of build fast and worry about money later? Maybe, but here are some other reasons why they may be launching into MOOCs:

  1. Some providers have argued that MOOCs are aimed at helping them accumulate data on how students learn online which will then allow them to enhance their teaching for regular students.
  2. MOOCs, like open educational resources, provide a genuine opportunity to spread an institution’s educational mission outside the campus.  Call me old-fashioned but I believe that people in education are still frequently driven by altruistic motivations such as knowledge creation and a desire to spread the love of learning – as well as economics.
  3. It may help boost the profile of an individual professor and develop his or her international reputation.  When it comes to promotion, will saying you’ve successfully taught thousands of students via a MOOC boost your career prospects?
  4. It may provide additional revenue though for the foreseeable future this is likely to be minimal for the institution and is dependent on developing as yet undiscovered viable business models.

Putting aside issues such as quality assurance, plagiarism and lack of tutor support, let’s suppose that MOOCs develop coherent curricula, peer support mechanisms and robust assessment processes which lead to qualifications at very low cost from credible institutions – and employers begin to take them seriously.  That leads us to the next paradox.

Paradox 2: Highly successful MOOCs attack the core business of those who are offering them

Elite institutions offering MOOCs will therefore never allow them to become as credible as their regular fee-incurring provision.  If an equivalent experience can be had for free no-one will pay fees.  MOOCs therefore will by definition remain an inferior educational experience and have to be offered under a sub-brand or a completely different brand – presumably one reason why institutions are rushing to sign up to Udacity and Coursera so they can jump on the MOOC bandwagon without diluting their own brands.  But successful partnerships where institutions club together to offer modules which build up to full qualifications are fraught with difficulties and have led to some spectacular debacles such as the UK’s £62m e-university.

High quality assessed and accredited MOOCs from Ivy League institutions will not be allowed to disrupt their own core business but may ultimately provide viable alternatives to expensive qualifications from less prestigious institutions.  This is where MOOCs could begin to disrupt the higher education market.  Learners are becoming ever more discerning, and there is further evidence that the higher education bubble in the US has burst, particularly in the for-profit sector, with recent announcements such as the University of Phoenix closing half of its campuses.  MOOC-based qualifications will have to be very good and much cheaper to gain ground in an increasingly competitive market.

Posted in MOOCs