This work presents a perception system for robotic manipulation that assists in navigation, household waste classification, and collection in outdoor environments. The system is made up of optical tactile sensors, RGB-D cameras and a LiDAR, integrated on a mobile platform carrying a robot manipulator and a robotic gripper. Our system is divided into three software modules: two are vision-based and the third is tactile-based. The vision-based modules use CNNs to localize and recognize solid household waste and to estimate grasping points. The tactile-based module, which also uses CNNs and image processing, adjusts the gripper opening to control the grasp from touch data. Our proposal achieves localization errors of around 6 %, a recognition accuracy of 98 %, and ensures grasping stability in 91 % of the attempts. The combined runtime of the three modules is under 750 ms.
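As a rough illustration of the tactile-based module, the sketch below adjusts the gripper opening from a tactile image using a simple intensity threshold as a stand-in for the paper's CNN. All names, thresholds, and step sizes are hypothetical placeholders, not the authors' implementation.

```python
# Hypothetical sketch of tactile grip adjustment: threshold the tactile
# image to find the contact region, then narrow or widen the gripper
# until the contact area falls inside a target band. The threshold here
# replaces the CNN-based processing used in the actual system.
import numpy as np

def contact_area(tactile_img: np.ndarray, threshold: int = 60) -> float:
    """Fraction of sensor pixels whose intensity indicates contact."""
    return float(np.count_nonzero(tactile_img > threshold)) / tactile_img.size

def adjust_opening(opening_m: float, tactile_img: np.ndarray,
                   target: float = 0.15, band: float = 0.05,
                   step_m: float = 0.001) -> float:
    """Close the gripper while contact is too light, open it if too heavy."""
    area = contact_area(tactile_img)
    if area < target - band:      # too little contact: grip tighter
        return opening_m - step_m
    if area > target + band:      # too much deformation: release slightly
        return opening_m + step_m
    return opening_m              # stable grasp: keep current opening

# Example with a synthetic tactile frame (random noise standing in for data)
rng = np.random.default_rng(0)
frame = rng.integers(0, 255, size=(64, 64), dtype=np.uint8)
print(adjust_opening(0.050, frame))
```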
The dataset used for detecting and recognising objects is publicly available at HOWA_dataset. The localization and tactile-perception datasets are not publicly available at the moment, but they may be obtained from the Institutional Repository of the University of Alicante or by contacting the authors.
@article{castano2023manipulacion,
title={Visual-Tactile Manipulation to Collect Household Waste in Outdoor},
author={Castaño-Amorós, Julio and Páez-Ubieta, Ignacio de Loyola and Gil, Pablo and Puente, Santiago Timoteo},
journal={Revista Iberoamericana de Autom&aacute;tica e Inform&aacute;tica Industrial (RIAI)},
volume={20},
number={2},
pages={163--174},
year={2023},
doi={10.4995/riai.2022.18534},
publisher={Universidad Polit&eacute;cnica de Valencia, Editorial UPV}
}