Giving a robot a sense of touch when it grasps an object greatly improves grasping performance. Researchers at ETH Zurich (the Swiss Federal Institute of Technology in Zurich) have announced a low-cost tactile sensor developed with the help of machine learning. The sensor measures force distribution with high resolution and precision, which enables a robot arm to handle delicate and fragile objects more dexterously.

Human touch allows us to pick up fragile or slippery objects with our hands without worrying about crushing or dropping them. If an object starts to slip from our fingers, we adjust our grip force accordingly. Scientists hope that robotic grippers can obtain similar feedback from touch. The researchers' new sensor is described as an important step toward "robot skin".

The sensor consists of an elastic silicone skin embedded with colored plastic beads, with an ordinary camera mounted underneath. This vision-based sensor detects contact when an object presses into and dents the silicone skin. The contact changes the pattern of the beads, which is recorded by a fisheye lens on the underside of the sensor. The change in the bead pattern is then used to calculate the distribution of forces acting on the sensor.
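The bead-tracking step described above can be sketched in simplified form. The function below, its nearest-neighbour matching, and the toy coordinates are illustrative assumptions, not the ETH team's actual implementation; in the real sensor the bead centroids would first be extracted from the fisheye camera image.

```python
# Sketch: estimate bead displacements between a reference (uncontacted)
# frame and a contact frame. Bead centroids are assumed to have been
# extracted from the camera image already; here they are (x, y) tuples.
# Nearest-neighbour matching is an illustrative assumption.

import math

def match_displacements(reference, deformed):
    """For each reference bead, find the closest bead in the deformed
    frame and return its displacement vector (dx, dy)."""
    displacements = []
    for rx, ry in reference:
        # Nearest neighbour in the deformed frame
        nx, ny = min(deformed, key=lambda p: math.hypot(p[0] - rx, p[1] - ry))
        displacements.append((nx - rx, ny - ry))
    return displacements

# Example: three beads, the middle one pushed 2 px sideways by contact
ref = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
cur = [(0.0, 0.0), (12.0, 0.0), (20.0, 0.0)]
print(match_displacements(ref, cur))  # [(0.0, 0.0), (2.0, 0.0), (0.0, 0.0)]
```

The resulting displacement field is the raw signal from which a force distribution can then be inferred.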

The robot skin can distinguish multiple forces acting on the sensor surface, calculate them with high resolution and precision, and determine the direction in which each force acts. To compute the forces and their directions from the beads, the team relied on a series of controlled experiments and the resulting data. This approach let them precisely control and systematically vary the contact position, the force distribution, and the size of the contacting object. Machine learning then allowed the researchers to record thousands of contact instances and accurately match them to changes in the bead pattern. The team is also working on larger sensors equipped with multiple cameras that can recognize objects with complex shapes, and on making the sensor thinner.
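The learned mapping from bead-pattern changes to forces can be illustrated, in highly simplified form, as a regression fitted to labelled contact examples. The one-feature least-squares model and the calibration numbers below are hypothetical stand-ins for the team's machine-learning model and their recorded contact data.

```python
# Sketch: learn a mapping from a bead-displacement feature to a contact
# force using labelled examples, analogous in spirit to training on
# thousands of recorded contact instances. A one-feature linear
# least-squares fit is an illustrative stand-in for the actual model.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Hypothetical calibration data: total bead displacement (px) vs. force (N)
displacement_px = [0.0, 1.0, 2.0, 3.0, 4.0]
force_n = [0.0, 0.5, 1.0, 1.5, 2.0]

a, b = fit_linear(displacement_px, force_n)
print(round(a, 3), round(b, 3))  # 0.5 0.0

# Force estimate for a new, unseen 2.5 px displacement
predicted = a * 2.5 + b
print(round(predicted, 3))  # 1.25
```

A real tactile sensor would regress from the full displacement field to a force distribution with direction, but the calibrate-then-predict structure is the same.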
