3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands

Sensing techniques are important for solving the problems of uncertainty inherent in intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks, supporting a robot controller when other sensor systems, such as tactile and force, cannot obtain data relevant to the grasping manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and the object when neither force nor pressure data are available. The approach is also used to measure changes in the shape of the object's surfaces, allowing us to detect deformations caused by inappropriate pressure applied by the hand's fingers. Tests were carried out on grasping tasks involving several flexible household objects, with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, our visual pipeline does not require deformation models of objects or materials, and it works well with both planar and 3D household objects in real time. In addition, our method does not depend on the pose of the robot hand, because the location of the reference system is computed by recognizing a pattern placed on the robot forearm. The presented experiments demonstrate that the proposed method achieves good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments.
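
To make the event-based supervision concrete: a minimal sketch of how a deformation event could be raised from consecutive depth frames. This is not the paper's implementation; the function name, the boolean object mask, and the 5 mm threshold are illustrative assumptions.

import numpy as np

# Assumed threshold: mean depth change (metres) on the object surface
# above which we treat the frame pair as showing a deformation.
DEFORMATION_THRESHOLD_M = 0.005

def deformation_event(prev_depth: np.ndarray,
                      curr_depth: np.ndarray,
                      object_mask: np.ndarray) -> bool:
    """Return True when the masked object surface changes by more than
    the threshold between two consecutive RGBD depth frames (metres).
    object_mask is a boolean array selecting the grasped object's pixels."""
    # Ignore sensor holes (zero depth) in either frame.
    valid = object_mask & (prev_depth > 0) & (curr_depth > 0)
    if not valid.any():
        return False
    # Mean absolute change in range over the visible object surface.
    mean_change = np.abs(curr_depth[valid] - prev_depth[valid]).mean()
    return bool(mean_change > DEFORMATION_THRESHOLD_M)

In a control loop, this boolean is what would be turned into the pulse and event message the abstract describes: whenever it flips to True, a message is sent to the robot controller so it can relax the fingers' pressure.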
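
The abstract also states that the method is independent of the robot hand's pose because the reference system comes from a pattern on the forearm. A hedged sketch of the underlying transform, assuming the marker pose (rotation R_cm, translation t_cm, camera frame) comes from any fiducial-marker pose estimator; the function name is hypothetical.

import numpy as np

def to_forearm_frame(points_cam: np.ndarray,
                     R_cm: np.ndarray,
                     t_cm: np.ndarray) -> np.ndarray:
    """Express camera-frame points (N, 3) in the forearm-marker frame.

    R_cm (3, 3) and t_cm (3,) give the marker pose in the camera frame,
    i.e. p_cam = R_cm @ p_marker + t_cm. Inverting that pose anchors the
    measurements to the forearm, so the result does not depend on where
    the hand and camera happen to be relative to each other.
    """
    # Row-vector form of p_marker = R_cm.T @ (p_cam - t_cm).
    return (points_cam - t_cm) @ R_cm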

Bibliographic Details
Main Authors: Carlos M. Mateo, Pablo Gil, Fernando Torres
Format: Article
Language: English
Published: MDPI AG, 2016-05-01
Series: Sensors
ISSN: 1424-8220
Subjects: visual perception; vision algorithms for grasping; 3D-object recognition; sensing for robot manipulation
Online Access: http://www.mdpi.com/1424-8220/16/5/640