What if handheld tools knew what needed to be done, and could even guide and help inexperienced users to complete jobs that require skill? Researchers at the University of Bristol have developed, and begun studying, a novel concept in robotics: intelligent handheld robots.
Historically, handheld tools have been blunt, unintelligent instruments: unaware of the context they operate in, fully directed by the user and, critically, lacking any understanding of the task they are performing.
Dr Walterio Mayol-Cuevas and PhD student Austin Gregg-Smith, from the University’s Department of Computer Science, have been working on the design of robot prototypes as well as on understanding how best to interact with a tool that “knows and acts”. In particular, they have been comparing tools with increasing levels of autonomy.
Compared with other tools, such as power tools, which have a motor and perhaps some basic sensors, the handheld robots developed at Bristol are designed with more degrees of motion, giving them greater independence from the motions of the user, and, importantly, they are aware of the steps being carried out. This allows a new level of co-operation between user and tool: the user provides the gross, tactical motions or directions while the tool performs the detailed task.
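This division of labour can be pictured as a simple shared-control loop. The sketch below is purely illustrative and not the Bristol implementation; the function name tip_command, the reach parameter and the straight-line correction model are all assumptions.

```python
import numpy as np

def tip_command(hand_pos, task_target, reach=0.15):
    """Blend the user's gross aim with the tool's fine correction.

    The user points the handle roughly at the work; the robot's
    extra degrees of motion absorb the residual error, limited by
    the mechanical reach of the actuated tip (15 cm here, assumed).
    """
    error = task_target - hand_pos      # what the user left uncorrected
    norm = np.linalg.norm(error)
    if norm > reach:                    # correction capped by the tip's workspace
        error *= reach / norm
    return hand_pos + error             # pose the actuated tip should adopt

# Example: the user holds the handle a few centimetres off target;
# the actuated tip closes the remaining gap.
hand = np.array([0.40, 0.10, 0.95])
target = np.array([0.43, 0.08, 0.97])
print(tip_command(hand, target))
```

In a real tool the correction would come from onboard sensing and a task plan rather than a known target, but the split is the same: coarse motion from the person, fine motion from the tool.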
Handheld robots aim to share physical proximity with users, but they are neither fully independent, as a humanoid robot is, nor part of the user’s body, as an exoskeleton is. The aim with handheld robots is to exploit the intuitiveness of traditional handheld tools while adding embedded intelligence and action to allow for new capabilities.
Dr Mayol-Cuevas, Reader in Robotics, Computer Vision and Mobile Systems, said: “There are three basic levels of autonomy we are considering: no autonomy; semi-autonomous, where the robot advises the user but does not act; and fully autonomous, where the robot advises and acts, even by correcting or refusing to perform incorrect user actions.”
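Those three levels translate naturally into a gate on what the tool is allowed to do at each decision step. The following sketch is an assumption-laden illustration, not code from the project: the advise and act callbacks and the is_correct check are hypothetical stand-ins for the tool’s guidance, actuation and task model.

```python
from enum import Enum, auto

class Autonomy(Enum):
    NONE = auto()   # tool is passive; the user does everything
    SEMI = auto()   # tool advises the user but never actuates
    FULL = auto()   # tool advises, acts, and may refuse bad actions

def step(level, user_action, is_correct, advise, act):
    """One decision step of the tool, gated by its autonomy level.

    advise/act are hypothetical callbacks; is_correct stands in for
    the tool's task model judging the user's intended action.
    """
    if level is Autonomy.NONE:
        return user_action        # pass the user's action straight through
    advise(user_action)           # SEMI and FULL both offer guidance
    if level is Autonomy.SEMI:
        return user_action        # guidance only; the user still acts
    if not is_correct(user_action):
        return None               # FULL: refuse an incorrect action
    return act(user_action)       # FULL: execute (and correct) the action
```

The interesting design question, and the one the user studies below probe, is where on this spectrum a tool should sit for a given task.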
The Bristol team has been studying users’ task performance and preferences on two generic tasks: picking up and dropping different objects to form tile patterns, and aiming in 3D for simulated painting.
Austin Gregg-Smith, whose PhD is sponsored by the James Dyson Foundation, added: “Our results indicate that users tend to prefer a tool that is fully autonomous, and there is evidence of a significant impact on completion time and of reduced perceived workload with the autonomous handheld robot. However, users also sometimes remarked on how different it is to work with this type of novel robot.”
The researchers are currently investigating further topics in interaction, shared intelligence and new applications for field tasks. Because of the difficulties of starting out in a new area of robotics, their robot designs are open source and available via www.handheldrobotics.org/
A paper about their recent work, which has been nominated for the Best Cognitive Robotics Paper, Best Student Paper and Best Conference Paper awards, will be presented at this week’s IEEE International Conference on Robotics and Automation (ICRA).
A video of the first prototype in operation is on YouTube.
Paper
‘The design and evaluation of a co-operative handheld robot’ by Austin Gregg-Smith and Walterio W. Mayol-Cuevas at the IEEE International Conference on Robotics and Automation (ICRA) 2015.