JISC Learning Analytics networking event

On 8th May the University of Edinburgh welcomed around 40 people from across the UK for this meeting. As with previous JISC networking days, there was lots of time for interaction and discussion, and the variety of the morning's speakers gave us plenty to talk about.

The notes here are my personal take-away points from the day. Niall Sclater has provided the presenters' slides and notes on the JISC Effective Learning Analytics blog, and my colleague Nicola Osborne created an excellent liveblog of the day.

I found it hugely reassuring to hear Dragan Gasevic, Professor of Learning Analytics at the University, assert that Learning Analytics (LA) must be rooted in pedagogy and that enquiry should be based on having clear and sensible questions to answer. His introductory keynote reviewed some recent research and development activity in this area, focused on establishing how LA data can be used to change behaviour to support and improve learning. He cited recent publications from work at the University of Michigan and in Australia which suggest that LA must be situated in the learning context to be effective in terms of providing guidance on appropriate actions, whether for the institution, programme, course tutor or student. I was especially interested in the notion that it could be useful to identify different patterns for courses, and to apply analytics algorithms differently depending on the type of course being reviewed. He also looked at the sophistication model for institutional adoption of LA (Siemens et al., below) and considered some of the research which aims to help institutions understand how to progress to a better institutional use of LA.

Sophistication model: Siemens, G., Dawson, S., & Lynch, G. (2014). Improving the Quality and Productivity of the Higher Education Sector – Policy and Strategy for Systems-Level Deployment of Learning Analytics. Canberra, Australia: Office of Learning and Teaching, Australian Government. Retrieved from http://solaresearch.org/Policy_Strategy_Analytics.pdf

For me, this set a context for all of the presentations and conversations for the rest of the day. It was especially helpful to reflect on the project work currently being undertaken by Information Services, which I was able to outline to the group at the event. This project looks at making student data from within our VLEs available to students themselves. Our conclusions from the work done so far chime completely with Professor Gasevic’s point: context is all – this sort of information will only be meaningful, and useful to the student, if the activity within the VLE is a reasonable proxy for engagement and activity on the course as a whole. So we find that, for example, online distance learning (ODL) courses may present a very different picture from “blended” courses on-campus, and the same data types will carry quite different meaning.

Similarly, our concerns about the maturity of institutional policy, and about staff understanding of the power and limitations of learning analytics, suggest that the University of Edinburgh, like many other institutions, is only now beginning to address the barriers to a more mature use of LA, according to the maturity model described above.

During the development of the Student Data from VLEs project, we have redefined our objectives as we improved our understanding of staff and student needs, and the options available to us. While some of the project outcomes will be improved understanding and use of LA tools, others relate to improved institutional understanding of the issues and opportunities around LA work.

It is also clear that institutional context can make a significant difference. Edinburgh's long history of linking student systems with its VLEs, and of automating processes wherever feasible, means that we have not encountered the issues described by Sheila MacNeill at Glasgow Caledonian University, where the multitude of different data owners and holders has made it difficult to get a useful overview of what data exists and where it is held.

But ultimately my conclusions, from this and from the other research projects described at the meeting, are the same:

  • there is a need for a coherent over-arching policy on what can and cannot, and should and should not, be done with student data;
  • there is a need to ensure staff and students have appropriate information and training in "analytics literacy";
  • there is a need to understand better which algorithms may usefully be applied to different types of course – a typology of courses, for each institution, might be a useful way forward here.

Another high point of the day was the opportunity to hear about and discuss JISC's project plans for LA and related developments in tools and guidelines. If you are interested in contributing to this work and hearing more about JISC's progress, you can sign up here for the next JISC LA SIG meeting in Nottingham on 24th June.
