Measuring Object Rotation via Visuo-Tactile Segmentation of Grasping Region

Julio Castaño-Amorós, Pablo Gil,
Group of Automation, Robotics and Computer Vision (AUROVA)
University of Alicante

Examples of grasping force estimation

Abstract

When carrying out robotic manipulation tasks, objects occasionally fall as a result of the rotation caused by slippage. This can be prevented by obtaining tactile information that provides better knowledge of the physical properties of the grasp. In this paper, we estimate the rotation angle of a grasped object when slippage occurs. We implement a system made up of a neural network that segments the contact region and an algorithm that estimates the rotated angle of that region. This method is applied to DIGIT tactile sensors. Our system has additionally been trained and tested with our publicly available dataset, which is, to the best of our knowledge, the first tactile segmentation dataset built from non-synthetic images to appear in the literature, and with which we have attained Dice and IoU scores of 95% and 90% in the worst scenario. Moreover, we have obtained a maximum error of ≈ 3° when testing with objects not previously seen by our system across 45 different lifts. This proves that our approach is able to detect the slippage movement, thus enabling a reaction that prevents the object from falling.
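As a rough illustration of the two quantities mentioned in the abstract, the sketch below shows how the Dice/IoU segmentation metrics can be computed for binary contact masks, and how an in-plane rotation angle can be estimated from a segmented region via second-order image moments. This is a minimal, generic sketch assuming NumPy binary masks; it is not the paper's actual segmentation network or angle-estimation algorithm.

```python
import numpy as np

def dice_iou(pred, gt):
    """Dice and IoU between two binary masks (illustrative only)."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum())
    iou = inter / union
    return dice, iou

def mask_orientation_deg(mask):
    """Orientation (degrees) of the principal axis of a binary mask,
    from second-order central image moments."""
    ys, xs = np.nonzero(mask)
    xc, yc = xs.mean(), ys.mean()
    mu20 = ((xs - xc) ** 2).mean()
    mu02 = ((ys - yc) ** 2).mean()
    mu11 = ((xs - xc) * (ys - yc)).mean()
    return 0.5 * np.degrees(np.arctan2(2.0 * mu11, mu20 - mu02))

# The rotation of the grasped object between two tactile frames can then
# be approximated as the difference of the two mask orientations:
# angle = mask_orientation_deg(mask_t1) - mask_orientation_deg(mask_t0)
```

Under this sketch, tracking the orientation of the segmented contact region over consecutive DIGIT frames yields a rotation signal that could be thresholded to detect slippage.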

Scheme of our system combining both stages.

BibTeX


@ARTICLE{10149493,
  author={Castaño-Amorós, Julio and Gil, Pablo},
  journal={IEEE Robotics and Automation Letters}, 
  title={Measuring Object Rotation via Visuo-Tactile Segmentation of Grasping Region}, 
  year={2023},
  volume={8},
  number={8},
  pages={4537-4544},
  keywords={Image segmentation;Grasping;Task analysis;Estimation;Tactile sensors;Neural networks;Proposals;Grasping;force and tactile sensing;perception for grasping and manipulation},
  doi={10.1109/LRA.2023.3285471}}