Data-driven approaches to teaching excellence
Two recent government announcements put learning analytics centre stage, explains Phil Richards, Chief Innovation Officer, Jisc
First, universities and science minister Jo Johnson confirmed the election manifesto pledge to develop a teaching excellence framework that, he says, will incentivise universities to focus on teaching quality and so improve student experience. Then, the chancellor’s July budget outlined plans to link the fee cap to inflation from 2017-18 for institutions that can demonstrate high quality teaching.
For anyone who remembers the sometimes painful Research Excellence Framework (REF) process, a further framework to evidence teaching quality sounds potentially costly and time-consuming, but Mr Johnson has said he doesn’t want it to be either. Beyond that, information is sparse, but learning analytics offers institutions a way forward.
Many university and college systems already collect a huge amount of information about students, recording every swipe into the library or sports facilities and every VLE login and logout. They can show us what people read and how long they spend doing it, and can often also identify students who have attempted to ‘game’ the system by swiping in multiple times in an effort to avoid scrutiny.
Experiments within UK universities have focused on analysing patterns in this data to identify, for example, learners who are not accessing resources or who seem to be spending more time than is wise in the student bar. Nottingham Trent University has conducted a large-scale learning analytics experiment to investigate how the approach can help to optimise student attainment and retention by fostering a sense of engagement and belonging. There, if students seem to be avoiding learning resources, this is flagged in a dashboard for their tutor to see, and a friendly, personal intervention follows. It is a timely, light-touch approach that often results, students report, in appropriate behaviour changes before the problem spirals into something bigger.
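To make the idea concrete, the kind of rule behind such a dashboard can be sketched in a few lines of code. This is an illustration only, not Nottingham Trent’s actual method: the student IDs, dates and 14-day threshold are all assumptions.

```python
from datetime import date, timedelta

def flag_disengaged(activity_log, today, window_days=14):
    """Return the IDs of students with no recorded activity
    (VLE login, library or sports-facility swipe) in the last
    `window_days` days, for a tutor dashboard to surface."""
    cutoff = today - timedelta(days=window_days)
    recent = {sid for sid, when in activity_log if when >= cutoff}
    all_students = {sid for sid, _ in activity_log}
    return sorted(all_students - recent)

# Hypothetical activity log: (student ID, date of last recorded event)
log = [
    ("s001", date(2015, 9, 1)),   # only an old library swipe
    ("s002", date(2015, 9, 28)),  # recent VLE login
    ("s003", date(2015, 9, 30)),  # recent swipe-in
]
print(flag_disengaged(log, today=date(2015, 10, 1)))  # → ['s001']
```

A real service would of course draw on live institutional systems and a student roster rather than a static list, but the principle is the same: a simple, transparent rule that prompts a human, not an automated, intervention.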
Promising results such as these were also reported by other institutions at a recent learning analytics workshop. It is still too soon for conclusive findings, but it is easy to spot a correlation between improved student engagement and better outcomes.
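Spotting such a correlation need not be sophisticated. As a purely illustrative sketch, with made-up numbers rather than real workshop data, a Pearson coefficient between an engagement measure and module marks looks like this:

```python
from math import sqrt

# Hypothetical figures for five students: weekly VLE logins
# (an engagement proxy) and end-of-module marks.
engagement = [2, 5, 8, 11, 14]
marks = [48, 55, 60, 68, 72]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(engagement, marks), 2))  # a value near +1
```

A coefficient near +1 indicates strong positive correlation; it does not, on its own, show that engagement causes better outcomes, which is why conclusive findings will take longer.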
Over the next 12 months we’ll be developing a service enabling universities and colleges that don’t routinely collect and analyse such data to begin doing so, and we’re working on new uses for learning analytics techniques. Imagine how powerful it could be if institutions could identify each student’s preferred learning style from their digital footprint. Their learning could then be personalised so that they learn better and more easily, freeing them to spend time enriching their understanding and exploring the aspects of their learning that particularly interest them. How would that be for proof of teaching excellence?
Which brings us back to those government pronouncements. At Jisc, we are convinced that learning analytics offers a practical way to demonstrate student engagement now, and has the potential to bring further benefits in the future. Inevitably, any such use of personal data requires the data owners’ permission, and this would be contingent on students having a clear understanding both of how their data will be used and of who will have access to it. The ethical issues surrounding data use in this area are complex and there are no definitive answers, but we have developed a code of practice on the use of learning analytics that institutions can use to help them get started. We are also developing a learning analytics service for universities and colleges that will be fully ready in 2017, with some components available this coming autumn. If you’d like more detail, read Niall Sclater’s latest project update and visit our effective learning analytics project page.