Innovative human-robot interaction (HRI) methods make it possible to increase usability, simplify programming, and improve the safety of robotic arms.
Main fields of development:
- Gesture recognition for moving the robot
Using 3D sensors, the user's movements are tracked and used to guide the robot's motion in real time. In this way, the robot can easily learn the task to be performed.
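A minimal sketch of the mapping step this describes: a stream of 3D hand positions from a depth sensor is scaled and exponentially smoothed into robot motion targets. The function name and parameters are illustrative, not from any specific sensor SDK.

```python
def smooth_targets(hand_positions, alpha=0.3, scale=1.0):
    """Turn a stream of tracked 3D hand positions into smoothed robot
    targets. `alpha` is the exponential-smoothing weight of the newest
    sample; `scale` maps sensor units to the robot workspace.
    (Hypothetical sketch, not a real sensor API.)"""
    targets = []
    state = None
    for p in hand_positions:
        p = tuple(scale * c for c in p)
        if state is None:
            state = p  # first sample initializes the filter
        else:
            # blend new sample with previous state, axis by axis
            state = tuple(alpha * pc + (1 - alpha) * sc
                          for pc, sc in zip(p, state))
        targets.append(state)
    return targets
```

In a real system, the smoothed targets would be fed to the robot controller at the sensor frame rate; the smoothing suppresses tracking jitter at the cost of a small lag.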
- Manual guidance methods using a force/torque sensor
The user can manually move the robot within its workspace, by applying forces to its tool, in order to teach a new task. Depending on the needs, it is possible to teach the robot fixed positions or to record a whole trajectory, which the robot will then optimize before executing it.
The manual guidance algorithm has been developed so that it can be applied to any kind of robot, provided it can be controlled at a low level. The algorithm also guarantees high-precision movements.
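One common way to implement this kind of force-driven guidance is an admittance law: the measured tool force is mapped to a Cartesian velocity command, with a deadband to reject sensor noise. The sketch below assumes this approach; the damping and deadband values are placeholders, not the project's actual tuning.

```python
def guidance_velocity(force, damping=50.0, deadband=2.0):
    """Admittance-style manual guidance: map a measured tool force [N]
    to a Cartesian velocity command [m/s]. Forces inside the deadband
    are ignored so sensor noise does not move the robot.
    (Illustrative sketch; parameters are assumptions.)"""
    def axis(f):
        if abs(f) < deadband:
            return 0.0
        # shift by the deadband so velocity rises continuously from zero
        return (f - deadband if f > 0 else f + deadband) / damping
    return tuple(axis(f) for f in force)
```

A higher damping value makes the robot feel "heavier" under the user's hand; the deadband keeps it perfectly still when no one is touching the tool.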
- Robot real-time planning for obstacle avoidance using 3D environment reconstruction
The user can share the workspace with the robot, thanks to a 3D system that monitors the area. If the user gets too close, the robot will avoid the human (or other obstacles), temporarily interrupting its task if needed and giving priority to safety. Once the obstacles are removed, the robot resumes its task.
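The safety behavior described above can be sketched as a speed-scaling rule driven by the reconstructed 3D scene: below a stop distance the robot halts, beyond a slow-down distance it runs at full speed, and in between its speed scales linearly. The distances and function name are illustrative assumptions.

```python
import math

def speed_scale(robot_pos, obstacles, stop_dist=0.2, slow_dist=0.6):
    """Scale the robot's speed by proximity to the nearest obstacle
    point [m] from the 3D reconstruction: 0.0 means stop, 1.0 means
    full speed. (Hypothetical sketch; thresholds are assumptions.)"""
    if not obstacles:
        return 1.0
    d = min(math.dist(robot_pos, o) for o in obstacles)
    if d <= stop_dist:
        return 0.0  # obstacle too close: interrupt the task
    if d >= slow_dist:
        return 1.0  # workspace clear: continue at nominal speed
    # linear ramp between the stop and slow-down distances
    return (d - stop_dist) / (slow_dist - stop_dist)
```

Because the scale returns to 1.0 once the obstacle points disappear from the monitored volume, the robot automatically resumes its task, matching the behavior described above.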
- Simplified programming methods exploiting force and impedance control
To improve robot path programming, it is possible to apply forces directly to the robot's tool while it is executing a task, modifying the path online. The robot then memorizes the user's modifications and optimizes the path accordingly.