Fixing a pain in the ass-essment

We all know why remote exams and assessments became a necessity. The big question is, what do universities want to see happen next? Cris Warren investigates.

Chatty and buzzing with anecdotes and ideas, Andrew Cormack falls silent, brows furrowed, to consider the question: can anything positive be found among the debris after exams and assessments were upended by the pandemic?

A few seconds, then: “Open Book!”

Jisc’s chief regulatory adviser is off again. The emergency pivot from exam hall memory tests to online, open-book exams could prove to be a catalyst for a shake-up in the way assessment is carried out: “It’s been a huge win for authenticity,” he beams.

A fan, then? “Oh yes,” replies Andrew, explaining that he is at a loss to recall a single instance in his working life where he’s been asked to perform a real-time, three-hour memory test. He can’t find a compelling reason why students should be expected to, either.

Assessment and exams are likely to continue online in some shape or form. If there’s a future for them, says Andrew, “they could do with being reimagined”. He’d start that process by taking a lot of the emphasis off recalling information.

“There may well still be things where we want a memory test,” he says, suggesting a short, 30-minute burst of closed questions as an alternative. “But, even then, do we really care if the individual goes to a book? Because if they can answer all the questions in 30 minutes by looking them up in the books, then they know the subject – and they know where to look in the book!”

Andrew cites his own 24-hour open-book exam for his law master’s as the “most terrifying yet fulfilling exam I’ve ever had”. He’d covered two of the questions during his course; the other two he researched from scratch during the exam period. “It was a fantastic experience to know that I could do that. I learned far more from that experience.”

The University of Bristol says it is keen to find more ways to assess collaborative work – and opportunities for ‘slow’ assessments.

 

Doing it by the books

Andrew says the noise he’s hearing in academic circles, professionally and anecdotally, is in favour of timed (usually a 24-hour window), remote, open-book assessment. A lot of academics say there is no alternative, he reflects. “For instance, there have been some really interesting conversations around inclusion, especially for groups who struggle with sitting for three hours at a desk, so it also seems to be winning out on that.”

Dr Mark Allinson is associate pro-vice-chancellor for learning and teaching at the University of Bristol. His university, he says, will continue with online examinations for the remainder of this academic year, which will accommodate overseas students unable to travel. Closed-book exams will still play a role because, he says, “some disciplines require students to be able to demonstrate that they can quickly access core information and work with it”. But Bristol recognises that this skill is not the most reliable indicator of student ability, and that this form of assessment neither replicates authentic situations nor stretches students to produce their best work.

“They are known to induce higher levels of stress in students,” Mark continues, “which is a concern when we, and other universities, are concerned to optimise student wellbeing. We are also interested in developing more authentic assessments, more opportunities for collaborative work, and more ‘slow’ assessments, which will allow students to develop more considered responses to more open-ended and potentially more authentic challenges.”

Bristol will run a few AI-aided invigilated exams this year – but it plans to phase them out entirely by the next academic year.

That would meet with Andrew Cormack’s approval. Examinees should, he thinks, be under the stewardship of a human invigilator, not an AI jackboot. And the entire system discriminates between the tech-and-bandwidth haves and have-nots. “They’re horrible for accessibility: you have to have a great broadband connection that stays ‘live’ for three hours. If your video link goes down, that’s a mark against you; you can’t have anyone walking past you in the background – so how many students does that cut out? I’m really hoping it [e-proctoring] is a very temporary approach.”

Self-styled experts coach students via YouTube videos on ways to evade the e-proctoring programs.

 

The proctor will see you now

There’s little doubt that Christmas came in a pandemic-shaped gift box for the e-proctoring business.

In 2019 the market was valued at US$354.37m. A 2021 report by researchandmarkets.com, called [take a deep breath] ‘Online Exam Proctoring Market Forecast to 2027 – Covid-19 Impact and Global Analysis By Type (Advanced Automated Proctoring, Recorded Proctoring, and Live Online Proctoring) and End User (Schools and Universities, Enterprises, and Government)’, projects it will reach US$1,187.57m by 2027.

Many universities – especially in the US – viewed the tech as something of a saviour, one that stepped up during an emergency that closed campuses across the globe. But there is growing disquiet in the US mainstream and student media that proctor plugins are, basically, spyware; consensual, yes, but in a ‘sign up or else’ way.

By and large, e-proctoring tools log keystrokes, take screenshots, monitor network traffic and commandeer laptop cameras and microphones to record the surrounding environment. All that data is collected for whom, exactly? And the systems themselves are far from infallible.

We are also interested in developing more authentic assessments, more opportunities for collaborative work, and more ‘slow’ assessments, which allow students to develop more considered responses

Parallel to the massive business growth of e-proctoring platforms runs a cottage industry dedicated to gaming them. A quick Google trawl of contract cheating services, aka essay mills (soon to be made illegal in the UK under the Skills and Post-16 Education Bill), reveals an abundance of guides on how to ‘game’ the e-proctors – services making hay while the sun shines, even as twilight looms. YouTube hosts a legion of young, by and large male, ‘influencers’ who front slickly produced, step-by-step guides to getting one over on the AI invigilators.

Among them is a cocksure, 20-something American: Thatonemohamed. In especially brazen style, he produces guides for students; one video walks them through outsmarting downloadable browser apps, such as Respondus, that lock computers while a test is being taken. Its Achilles heel: it does not have any recording capability.

Some of these YouTube videos have clocked over a million views. Perhaps reassuringly, most are in the low thousands. None of them beat around the bush, though; if you want to cheat, here’s how.

Of course, the vast majority of students don’t cheat. But even when a tiny minority do, it can cast doubt on legitimate examinees and profoundly damage institutional reputations.

Drs Alison Hill and Nic Harmer – from the University of Exeter – developed a computer program during the pandemic to ‘design out’ cheating.

 

A magic bullet?

It’s the ramifications of e-proctoring that prompted Drs Alison Hill and Nic Harmer – both senior bioscientists at the University of Exeter – to develop a novel computer program that ‘designs out’ cheating.

“Some universities have gone down the road of hijacking your camera – that’s the basics of e-proctoring,” says Alison, “but you can immediately find videos and information on how to get around that. Other universities take the Honour Code route of a signed declaration that the work is your own. I think Nic and I agree that’s just naive. So the task comes down to redesigning the whole exam and effectively making it Google-proof.”

Harmer and Hill’s program models lab equipment that produces realistic data. The really clever bit, though, is the introduction of elements of randomness that result in each dataset being different. That means each student, sitting the same test, takes a unique route to reach the correct conclusion. The trials were limited to 60 bespoke datasets, but Hill and Harmer say the approach could potentially generate thousands of unique datasets for assessments and exams across a range of disciplines.

“Essentially,” explains Nic, “where you’ve got a numerical component to the exam, there’s no reason why everyone has to have the same numbers. So if there’s a problem that you’re asking people to solve and there are many possible answers depending on the data, you set up a system where each student is going to get a different set of data.”

The unspoken assumption of e-proctoring is that, given half the chance, students will cheat – something Nic and Alison find dispiriting. As if exams weren’t already stressful enough, taking them with the feeling that you are under suspicion just adds pressure. “This is modelled so that the file is sent to the student, they open it and work through it. If there was an incentive to cheat, we’ve restructured the exam so it’s removed.”

“While every student gets their own paper, to ensure it’s fair they are all answering the same question, they’re just using different data,” explains Alison. “So, say there were four possible answers for the first part of the question – ‘H’, ‘M’, ‘N’ or ‘C’. No one answers ‘C’ because none of the data points that way, but even if they asked their friend, ‘Did you get this?’ it won’t help them because they might not have the same answer, the actual numerical value would be different for everybody. So all the people who had ‘M’ as the right answer would all have different numerical answers as well.”
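Hill and Harmer’s actual program isn’t published here, but the general idea can be sketched in a few lines of Python. In this hypothetical example – the mock ‘enzyme assay’ scenario, the parameter ranges and the function names are all illustrative assumptions, not details of the Exeter system – each candidate’s dataset is generated from a random seed derived from their ID, so every student receives different numbers while facing the same underlying question.

```python
import csv
import hashlib
import random


def generate_dataset(student_id: str, n_points: int = 30) -> list[dict]:
    """Produce a unique but reproducible mock enzyme-assay dataset for one student.

    The seed comes from the student ID, so each candidate gets different numbers,
    yet the marker can regenerate exactly the same file when checking answers.
    All parameter ranges here are illustrative, not taken from the Exeter program.
    """
    seed = int(hashlib.sha256(student_id.encode()).hexdigest(), 16) % (2**32)
    rng = random.Random(seed)

    # Randomised 'true' parameters: every student fits a different curve,
    # but the question (estimate Vmax and Km) is identical for all of them.
    vmax = rng.uniform(80.0, 120.0)   # maximal reaction rate
    km = rng.uniform(5.0, 15.0)       # Michaelis constant

    data = []
    for i in range(n_points):
        substrate = 0.5 * (i + 1)                    # substrate concentration
        rate = vmax * substrate / (km + substrate)   # Michaelis-Menten model
        noisy = rate + rng.gauss(0, 2.0)             # realistic measurement noise
        data.append({"substrate_mM": round(substrate, 2), "rate": round(noisy, 2)})
    return data


def write_exam_file(student_id: str, path: str) -> None:
    """Write one candidate's personal dataset to a CSV file for the exam."""
    rows = generate_dataset(student_id)
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["substrate_mM", "rate"])
        writer.writeheader()
        writer.writerows(rows)


# Each candidate receives a different file; markers can regenerate it on demand.
write_exam_file("candidate_0042", "candidate_0042_assay.csv")
```

Because the seed is tied to the candidate’s ID, no two students see the same numbers, yet the examiner can recreate any dataset to verify an individual’s working – the property Alison describes when she notes that two students with the same categorical answer still have different numerical values.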

The program has been designed for assessment in science and technology disciplines that use numerically based data, but there’s some scope for the humanities. Says Nic: “English and history tend to require creative interpretation of a question. It’s unique, so the opportunities for collusion go down, and plagiarism checkers, like Turnitin, are already well established.” But, adds Alison: “It could work with images and dates and, of course, aspects of geography are numerically based and very data-driven and could absolutely translate to the programme.”


Remote controls

Dr Mark Allinson, associate pro-vice-chancellor for learning and teaching, the University of Bristol

Will Bristol continue to employ online examinations even after Covid rules are lifted?

For the remainder of this academic year we will, even though Covid rules have been lifted, because we’re running a number of programmes in ‘hybrid’ mode, where some students are studying online from overseas locations and we’re concerned to ensure parity of experience. This would be difficult if some students were sitting a traditional exam in an exam hall, while others were online. In an uncertain environment for the remainder of this academic year, a decision now to retain online exams provides certainty for students about what they can expect.

For the academic year ahead, we are not planning to run hybrid programmes, providing the Covid situation is resolved far enough to allow overseas travel to the UK. Consequently, we are not planning to run closed-book exams online, and therefore also not to run proctored exams. We may run open-book exams online in cases where our schools present a compelling pedagogical rationale for this.

We are likely to run timed assessments online under our examination regulations. In Bristol, these have been assessments which require students to complete a piece of work over a number of days, rather than a number of hours, but with a fixed submission time.

It is worth noting that only a subset of our assessments are examinations in any year.

What are the strengths and weaknesses of remote assessment?

Without proctoring arrangements, closed-book exams cannot be run in ways which would fully ensure the academic integrity expected of, and by, our students. This means that online exams need to be carefully designed in open-book formats. Even here, academic considerations are key, since it is hard to be sure that students have not collaborated on their responses. Beyond that, exams which require students to submit their responses by a precise end point are susceptible to all manner of technical disruption.

We have found that significant staff workload is incurred by apparent (but rarely real) breaches of exam rules in proctored exams, requiring extensive manual checking. We need also to be mindful of the environments in which students complete their exams – if they are at home, they may not have a suitable space in which to work quietly, even if they have suitable equipment and internet connections. If they instead come to campus, the university will need to provide suitable spaces and to consider whether these need to be invigilated or facilitated.

What makes for a well-planned online assessment?

Ideally, avoid assuming that the traditional exam format can be lifted and shifted online. In open-book exams, question design needs to reflect – and see an opportunity in – students’ ability to work with online materials. Teaching and preparation time needs to emphasise different revision approaches, so that students consider what they need to know in advance and learn it, and how they will be able to make ready use of other materials in the (short) time available.

Academics should consider that online exams should not normally have three questions to complete in three hours, as in some traditional exam room formats, but instead imagine how students will be more likely to work in a different environment. There also need to be clear criteria for the time by which students must cease work, and for the time available to submit (if this is additional or requires additional processing). This must also take account of students with additional time allowances of variable lengths – bearing in mind that a long exam may prove unfeasibly long for a student with extended additional time allowances, a consideration also for on-campus exams.
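As a rough, purely illustrative calculation (the allowances below are hypothetical, not Bristol’s actual figures), the same percentage adjustment scales very differently for a three-hour paper and a 24-hour window:

```python
def adjusted_duration(base_hours: float, extra_time_pct: float) -> float:
    """Return the exam length once an extra-time allowance is applied."""
    return base_hours * (1 + extra_time_pct / 100)


# Hypothetical allowances, for illustration only.
for base in (3, 24):
    for pct in (25, 50):
        print(f"{base}h exam with {pct}% extra time -> {adjusted_duration(base, pct):.1f}h")

# A 3h paper with 50% extra time becomes 4.5h - still manageable within a day.
# A 24h window with 50% extra time becomes 36h - arguably unfeasible for the student.
```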

