Watch out: there’s a robot about

Bristol Robotics Laboratory is joining other leading research institutions in a new robotics in open spaces project

Bristol Robotics Laboratory (BRL) is joining other leading research institutions in a new project looking at how remotely operated robots could enable people to take part in public spaces – without actually being there.

Along with the Universities of Bath, Exeter and Oxford and Queen Mary University of London, BRL researchers will examine how remotely operated robots might enable people to participate in public spaces – a key aspect of developing successful citizenship and public cohesion – when accessibility or geography prevents them from being physically present. BRL is a collaborative partnership between the University of the West of England (UWE Bristol) and the University of Bristol.

The £2 million three-year project, Being There: Humans and Robots in Public Spaces, funded by the EPSRC and led by Exeter University, will examine how robotics can help to bridge the gap between the way we communicate in person and online.

It will look at the social and technological aspects of being able to appear in public in proxy forms, via a range of advanced robotics platforms. The robots will be controlled remotely, a method called tele-operation, and a tele-operator will be able to see through the robot’s eyes and speak through its mouth, while directing where it looks and how it moves.

UWE Bristol Robotics Research Associate, Dr Paul Bremner, said: “Public spaces play a valuable role in providing shared understanding and common purpose, but if you are ill or disabled, or live too far away, this can be a barrier to participation. The aim of our research is for the robot to act as an avatar for a remote person, taking part in the same activities as those actually present in the venue.

“To investigate this we will use several robots, such as Engineered Arts’ Robothespian, Aldebaran’s NAO and MobileRobots’ PeopleBot. The robots will be tele-operated to produce speech, gestures and other non-verbal social behaviour so that we can study how robot avatars convey social presence – first via motion capture (using a Microsoft Kinect) and later via desktop control (a keyboard and mouse). Over the course of the project some autonomy will be added to the robots to enable better social interaction and allow simple desktop control. We will also investigate how different robot appearances and behaviours affect social interaction.

“An example of how this could eventually be used might be a NAO robot in a museum, acting as an avatar – looking round at the exhibits and interacting with other visitors – on behalf of someone in another part of the city who is unable to visit because of disability or illness. This research will allow us to develop the technology to make this happen, and to evaluate how humans interact with the robot. Developing robots that interact effectively with humans in different social situations is crucial if future robots are to be useful to human society.”

The research team will create a ‘living laboratory’, using state-of-the-art technologies to measure how people respond to, and interact with, other people who are acting through a robot representative. They can then compare the measurements for different robots and means of control with how people normally interact with one another in the living lab.

In the later stages of the research, the team will take the robots and their tele-operation control stations to public spaces around Bristol and Bath to measure human interaction with robots in the real world.

Supporting this process, digital creatives from Bristol’s iShed will work alongside the researchers, bringing their expertise in public engagement to help bring the research out of the lab and into a range of public spaces in Bristol.

Photo credit: Aldebaran’s NAO
