Tim Linsey: universities should share feedback in real time

Tim Linsey, head of academic systems and evaluation at Kingston University, says his university is collecting and sharing more student feedback from under-represented groups

Across the higher education sector, approaches to student engagement are multi-dimensional, but the most productive of them treat students and staff as partners in improving teaching.

An example of how this close, creative relationship can work in practice is Kingston University’s student academic development research associate scheme (SADRAS).

SADRAS involves students and staff working together to design and carry out small research projects specifically focused on enhancing the academic experience of under-represented groups of students at the university.

These projects sometimes evolve from other types of student engagement activity, such as staff/student consultative committees or course evaluations.

The use of student feedback is part of Kingston University’s quality assurance and enhancement processes. How this feedback is used to support course teams – helping them deliver effective course design and enhance students’ experience of learning, teaching and assessment – is a key area of discussion.

Feedback should be collected in real time

Module evaluation questionnaires (MEQs) are an important component of an approach designed to be consistent, to ensure that decision-making is guided by evidence, and to make sure that good practice is recognised and shared.

At Kingston University, we use 12 standard questions – including two text-response questions – throughout the university to ensure consistency for students across modules, and to highlight change over time. To conduct the MEQs we use Blue, survey software from Explorance.

Students have flexible access to the surveys and can complete them in their own time using their personal devices (47% of our 2018-19 surveys were completed on mobile devices). Within hours of a survey closing, reports summarising the feedback are automatically generated and published in our virtual learning environment (VLE), with tailored versions of the report available for both staff and students. The scheme is also supported by trained course representatives, and an explanatory video has been created by our students’ union.

Timing is crucial to our process. MEQ outcome reports are published to students and staff while the module is still running, which allows the outcomes to be considered in class. This discussion adds value and generates further insight. The automatic scheduling and publishing of surveys to students, by email and through our VLE, helps to facilitate this process.
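For readers curious about the mechanics, the pattern is simple: a scheduled job looks for surveys whose window has closed and pushes out audience-specific reports. The Python sketch below is a hypothetical illustration only – it does not reflect Blue’s actual API, and every name in it (ModuleSurvey, build_report, the “publish to VLE” step) is invented.

    from dataclasses import dataclass, field
    from datetime import datetime

    # Hypothetical sketch: Blue's real API is not shown or assumed here.
    @dataclass
    class ModuleSurvey:
        module_code: str
        closes_at: datetime
        responses: list = field(default_factory=list)
        published: bool = False

    def build_report(survey, audience):
        # A real report would summarise scores and comments for each audience.
        return f"{survey.module_code} ({audience}): {len(survey.responses)} responses"

    def publish_due_reports(surveys, now=None):
        """Publish tailored staff and student reports for every survey
        that has closed but has not yet been published."""
        now = now or datetime.now()
        for survey in surveys:
            if not survey.published and survey.closes_at <= now:
                for audience in ("staff", "students"):
                    print("Publishing to VLE:", build_report(survey, audience))
                survey.published = True

    publish_due_reports([ModuleSurvey("CS1001", datetime(2019, 3, 1))])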

Data should be shared with staff and students

We recognise that making high-level decisions on the outcomes of a single MEQ, or a small number of them, would be problematic, particularly in the absence of ‘local’ module-level knowledge. We therefore ensure our decision-making is informed by multiple sources of evidence, including MEQ results over time, progression rates and external examiner reports.

We also take care with how we present module data in summary reports. For instance, comparisons and rankings provided to senior university committees compare like-for-like modules (eg by level and faculty) and include an additional indicator of statistical significance based on the response rate and sample size.
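Kingston has not published the formula behind that indicator, but one common choice is a margin of error with a finite population correction, which narrows as the response rate approaches the full cohort. A minimal Python sketch, assuming mean scores on a five-point scale (the numbers are illustrative):

    import math

    def margin_of_error(sample_sd, responses, cohort_size, z=1.96):
        """95% margin of error for a mean score, with a finite population
        correction because module cohorts are small."""
        if responses == 0:
            return float("inf")
        fpc = math.sqrt((cohort_size - responses) / (cohort_size - 1))
        return z * (sample_sd / math.sqrt(responses)) * fpc

    # 18 of 45 students respond; scores have a standard deviation of 0.9.
    print(round(margin_of_error(0.9, 18, 45), 2))  # ~0.33

On this reading, a mean built from 18 of 45 responses carries a margin of roughly ±0.33 on a five-point scale – enough precision to compare like-for-like modules, but a warning against over-reading small differences.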

To ensure that student feedback is effectively embedded in our quality assurance and enhancement processes, the MEQ data – along with the results of other surveys, such as our level five university student survey and the National Student Survey (NSS) – are also automatically published in university dashboards available to all staff.

Help course leaders to use data intelligently

The MEQ dashboard allows members of staff to view the quantitative data at both module and aggregate levels, and to compare it with previous years’ data.
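The aggregation behind such a dashboard is straightforward. Here is a hypothetical sketch in Python with pandas – the column names and figures are invented, not Kingston’s actual schema:

    import pandas as pd

    # One row per MEQ response, scored on a 1-5 scale (invented data).
    responses = pd.DataFrame({
        "year":   ["2017-18", "2017-18", "2018-19", "2018-19", "2018-19"],
        "module": ["CS1001", "CS1001", "CS1001", "CS1002", "CS1002"],
        "score":  [4, 3, 5, 4, 2],
    })

    # Module-level view: mean score and response count per module, per year.
    print(responses.groupby(["year", "module"])["score"].agg(["mean", "count"]))

    # Aggregate view with a year-on-year comparison.
    yearly = responses.groupby("year")["score"].mean()
    print(yearly.diff())  # change relative to the previous year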

Additional dashboards summarise other key module and course metrics such as progression rates (particularly at the first attempt), completion rates and value-added scores. This ready access to metrics and student feedback significantly supports the reflection, planning and decision-making carried out by module and course teams, and at a strategic level within the university.

This process is further enhanced through the automatic pre-population of the academic monitoring and enhancement process plans completed at module and course level – with MEQ quantitative responses and progression data, for example.

The module leader completes the pre-populated plan by reflecting on the performance of the module and identifying actions for the next year, informed by this data and other sources of evidence.
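In outline, pre-population simply means the quantitative fields arrive filled and the reflective fields arrive blank. A hypothetical Python sketch – the field names are invented, not Kingston’s actual plan schema:

    def prepopulate_module_plan(module_code, meq_means, progression_rate):
        """Return a plan skeleton with quantitative fields auto-filled,
        leaving reflection and actions to the module leader."""
        return {
            "module": module_code,
            "meq_mean_scores": meq_means,                         # auto-filled
            "progression_rate_first_attempt": progression_rate,   # auto-filled
            "reflection": "",             # completed by the module leader
            "actions_next_year": [],      # completed by the module leader
        }

    plan = prepopulate_module_plan("CS1001", {"Q1": 4.2, "Q2": 3.8}, 0.87)
    print(plan["meq_mean_scores"])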

Our course enhancement plan is also pre-populated with NSS data, Kingston student survey data and other metrics. Our course leaders then develop action plans citing the evidence used, including the individual module plans.

Through this strategy, Kingston University is using student feedback to address student retention and ensure all our students achieve their full potential.

Dr Tim Linsey is head of academic systems and evaluation in Kingston University’s directorate for student achievement

