What’s the future of chatbots in HE?
James Higgins looks at how universities are revolutionising chatbots and creating new, personalised tools for student engagement
What’s a chatbot?
Chatbots were – and perhaps still are – a frustration for anyone desperately trying to contact an insurance company or rearrange a delivery slot. Their inability to actually ‘chat’ has earned them a poor reputation. While a basic bot can only parse what a user types, a capable chatbot (or ‘chatterbot’, as they are also known) can engage in dialogue and help its user communicate what they need more effectively.
As the technology behind them rapidly improves – and universities take control of their own programs – there is a huge opportunity for these AIs to revolutionise student interfaces.
Gaps can emerge between educators’ needs, governments’ instruction and companies’ direction. But universities, which benefit from the sort of in-house expertise not enjoyed by schools, are better able to exploit technological developments for their own bespoke purposes. So, how are chatbots being developed, and what are they doing for universities, both today and on the campuses of tomorrow?
Where is the technology used?
Universities’ use of chatbots is, in many cases, in its infancy, but there are many notable and eye-catching examples of their versatility. In 2017, Leeds Beckett University announced chatbot technology to help prospective students find the right course for them through clearing.
Using Facebook Messenger’s chatbot technology, the LBU bot – which is supported on desktop and mobile – uses menus and keywords to help students.
Dougal Scaife, head of digital at LBU, said: “We know that our prospective students already use lots of messaging software for communicating with their friends, such as Snapchat, WhatsApp and texting, so developing a chatbot was a natural evolution in order to engage with our prospective students in a medium that’s ubiquitous, familiar, and comfortable for them.”
Chatbots sift through the students’ information before making a provisional offer.
In last year’s admissions cycle, Ucas revealed that more than 60,100 people gained a place through clearing. The scope for streamlining is huge, particularly given how scalable chatbot technology is.
But programs like the one at LBU are at the less sophisticated end of the spectrum.
Andy Feltham, vice-president of development at Filament AI, says:
“The status of deployment at universities varies: some are developing, others are still exploring their options. And that’s fairly common in the way people adopt this type of machine learning technology. It’s still at the stage where people need to explore it for themselves to understand what its strengths and weaknesses are.
“It is recognised as a fairly mature technology in terms of the way it works; what is not very mature is the understanding around how to drive it.”
Filament AI, Feltham says, is helping bridge the gap between the theory and practicality. How do you make a chatbot engaging, meaningful and well-structured?
What’s the next step for chatbots?
Lancaster University has recently launched a chatbot companion for students. Based on its widely used app, iLancaster, Ask LU’s voice interface can respond to questions about timetables, tutors and grades, but also suggest useful tips like where to find a free computer or washing machine.
Based on Amazon Web Services (AWS), Lancaster’s Information Systems Services team have created a chatbot that can have more complex interactions with students and even signpost them towards wellbeing and welfare support.
The university plans to use artificial intelligence (AI) and machine learning (ML) to identify patterns in the data generated by students’ usage. The app uses Amazon CloudWatch, Amazon Virtual Private Cloud and Amazon Elasticsearch Service for logging, data ingress and fuzzy searching. Every interaction with the chatbot could point to ways of improving the platform further.
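Lancaster has not published its implementation, but the kind of fuzzy searching described here can be illustrated in miniature with Python’s standard library: `difflib` scores near-matches, so a misspelled query can still resolve to a known topic. The topic list below is hypothetical, not taken from Ask LU.

```python
# Minimal illustration of fuzzy searching using only the standard library.
# A production system would query a search service; the topic list is made up.
from difflib import get_close_matches

TOPICS = ["timetable", "tutor contact details", "grades",
          "free computers", "washing machines"]

def fuzzy_lookup(query: str, cutoff: float = 0.6):
    """Return the closest known topic to a possibly misspelled query, or None."""
    matches = get_close_matches(query.lower(), TOPICS, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

A typo such as ‘timetabel’ still resolves to ‘timetable’, which is the behaviour a fuzzy search layer provides at scale.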
Chris Dixon, head of IT partnering and innovation, sees Lancaster’s innovations as key to its place in the HE sector: “We’re going to attract graduates coming to university to study computing or even languages. They’ll be interested in work we’ve done on chatbots, see we’re leading the way and be encouraged to come to the university.”
The project was built using the paid-for skills of undergraduates at the university, giving them valuable project management and coding experience. The university also benefited from student guidance from the very outset.
The chatbot currently links to intranet data, but Dixon wants to expand Ask LU’s welfare offering. “In theory, we’ve got the data to assist a student stuck writing an essay,” he says. “Based on what we know about them, and other students like them, we can recommend solutions and tips. And that will give you a better academic output.
“But if a student were really struggling, the chatbot needs to have a longer conversation. We’re working with the welfare team to develop coaching questions, ways to manage anxiety, and mindfulness tools.”
Dixon hopes Lancaster can commercialise its developments: “I’m aware some universities do not have the same development facilities. Perhaps in the future we can sell what we’ve done to other universities.”
Martin Hamilton, Jisc’s Futurist, thinks virtual tutors are around the corner. “Perhaps in five to ten years’ time, these chatbots will have evolved from helper to virtual tutor.
“If universities plug in more data, such as coursework, bots can help students achieve their very best, assisted by their own virtual adviser.
“We still tend to think of higher education in quite traditional terms, but AI could transform this model. Much depends on the extent of our risk appetite for technologies that could upend our whole concept of what HE means.
“We are working with members, through our Step Up programmes and our contributions to discussions around the ethical use of AI, to help mitigate the risks that come with embracing emerging technologies.”
What happens next?
While many universities can understand the basic mechanics of the technology, making that technology sophisticated enough to supplement human resources is an ongoing task. Feltham says a pitfall for chatbots is when users reach a ‘misunderstood state’: the point at which the chatbot has exhausted its programmed options and questions.
This leads to frustration and, in Feltham’s words, “very obvious clues that this is a bot and nothing more intelligent”. A chatbot that is intelligent enough to rephrase questions to seek new answers and options is a crucial breakthrough. But there is a trade-off.
“If you make the chatbot too humanlike,” Feltham says, “people do not open up as much. But you can’t ignore the fact that if people say something that is very negative, you need to reflect on that. If people contact a university and say, ‘I’m having trouble finding enough money to pay for food’, it can’t be a generic response.”
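Feltham’s two concerns – the ‘misunderstood state’ and the need to escalate distress rather than reply generically – can be sketched in a few lines of Python. The keyword lists, intents and responses below are purely illustrative assumptions, not drawn from any university’s bot.

```python
# Illustrative sketch of Feltham's points: keyword intent matching, a
# 'misunderstood state' that asks the user to rephrase, and a welfare
# escalation path. All keywords and responses here are hypothetical.
DISTRESS_KEYWORDS = {"struggling", "anxious", "trouble", "lonely", "overwhelmed"}

INTENTS = {
    "timetable": "Your timetable is in the app under 'My week'.",
    "library": "The library is open 24/7 during term time.",
}

def respond(message: str, failed_attempts: int = 0) -> str:
    text = message.lower()
    # Check for distress first: it must never receive a generic reply.
    if any(keyword in text for keyword in DISTRESS_KEYWORDS):
        return "That sounds difficult. Shall I put you in touch with the welfare team?"
    for keyword, answer in INTENTS.items():
        if keyword in text:
            return answer
    # The 'misunderstood state': invite a rephrase before handing off.
    if failed_attempts < 2:
        return "Sorry, I didn't catch that. Could you put it another way?"
    return "Let me pass you to a human adviser who can help."
```

The ordering is the point: the distress check runs before intent matching, and the bot hands off to a human once its rephrase attempts are exhausted rather than looping.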
Dr Ivan Mitchell, director of edtech startup Studious and a lecturer in organisational behaviour at the University of East Anglia (UEA), is working on the development of a new learning platform with colleagues from University College London (UCL).
He says: “We’re trying to develop a platform that can teach or provide tutor support through the chatbot.
“If a student, for example, has an inquiry about what transformational leadership is, we can give them an answer, but we can then take them through various interactive learning paths, for example, to discuss various analyses or evaluations or applications of the concept being studied.
“Through an interactive dialogue, we can start probing the student about their knowledge and curiosities, and tailor that learning specifically. This is what makes it a more adaptive and personalised learning experience.”
A chatbot that provides simple administrative support is one thing, but are there more complexities to confront if universities are to use chatbots as a pedagogical tool? How to deal with the myriad questions that might be posed is just one.
Mitchell agrees that chatbot technologies face challenges in this area.
“Ultimately, students will ask the same question in many different ways. The chatbot has to identify what that question is, and then give a relevant answer, which is the key challenge, and why we’re focusing on identifying key concepts to then take students down a fairly closed interactive dialogue path from that.”
Chatbots, as a learning aid, are still in their infancy because, in Mitchell’s view, “of the complexity of the information they have to match”.
“It is not,” Mitchell explains, “an open system at present that you can ask complex questions of. For example, what’s the difference between such and such, or compare and contrast these two things. That may come eventually, but we’re not there yet.”
And are there moral complexities to consider if AI programs teach students? Not least: where does the chatbot draw its information from? Herein lies the advantage for universities creating bespoke products.
“Having credible knowledge on the platform is critical,” Mitchell agrees. “We will be feeding the knowledge from academic writers and all the content for our platform is going to be drawn from that.”
Mitchell also sees the potential for chatbots to help students in the social sciences and arts “sift through the reading material they must digest”.
As emotional recognition develops and conversations grow more complex, chatbots will be able to offer not only emotional support but pedagogical tools, too.
Through the fusion of university-led startups, student-led in-house projects and the powerful edtech sector, chatbots present an opportunity for universities to make the tools of their trade, and to disseminate those tools across the entire education sector.