Journal Article Details
International Journal of Advanced Robotic Systems
Multimodal grasp data set: A novel visual–tactile data set for robotic manipulation
Tao Wang
Keywords: Grasping data set; visual; tactile; robotic manipulation; slip detection; long short-term memory
DOI: 10.1177/1729881418821571
Subject classification: Automation Engineering
Source: InTech
【 Abstract 】

This article introduces a visual–tactile multimodal grasp data set, aiming to further research on robotic manipulation. The data set was built with a newly designed dexterous robot hand, Intel's Eagle Shoal robot hand (Intel Labs China, Beijing, China). It contains 2550 grasp samples, each comprising tactile readings, joint states, time labels, images, and RGB and depth video. By integrating visual and tactile data, researchers can better understand the grasping process and analyze deeper grasping issues. This article describes how the data set was built and how it is composed. To evaluate the quality of the data set, the tactile data were analyzed by short-time Fourier transform. Tactile-based slip detection was realized with a long short-term memory (LSTM) network and contrasted with detection from visual data. The experiments compared the LSTM with traditional classifiers and evaluated generalization across different grasp directions and different objects. The results demonstrate the data set's value for research in the robotic manipulation area, showing effective slip detection and the generalization ability of the LSTM. Future work will be devoted to further exploiting the visual and tactile data.
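The abstract states that the tactile data were analyzed by short-time Fourier transform (STFT), which exposes the frequency content of contact vibrations over time; slip typically shows up as a shift of energy toward higher frequencies. The sketch below is a minimal, illustrative STFT in pure Python, not the paper's actual analysis pipeline: the window length, hop size, sampling rate, and the synthetic "tactile" trace are all assumptions chosen for demonstration.

```python
import cmath
import math

def stft(signal, win_len=64, hop=32):
    """Short-time Fourier transform via a Hann-windowed DFT.

    Returns one magnitude spectrum per frame, each of length
    win_len // 2 + 1 (the non-negative frequency bins).
    """
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / (win_len - 1))
              for n in range(win_len)]
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        seg = [signal[start + n] * window[n] for n in range(win_len)]
        spectrum = []
        for k in range(win_len // 2 + 1):
            s = sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win_len)
                    for n in range(win_len))
            spectrum.append(abs(s))
        frames.append(spectrum)
    return frames

# Synthetic "tactile" trace whose vibration frequency jumps halfway
# through (5 Hz -> 40 Hz), loosely mimicking the onset of slip.
fs = 256  # assumed sampling rate, Hz
t = [n / fs for n in range(fs)]
trace = [math.sin(2 * math.pi * 5 * x) if x < 0.5
         else math.sin(2 * math.pi * 40 * x) for x in t]
spec = stft(trace)
```

In `spec`, the dominant frequency bin of the early frames sits near 5 Hz, while the last frames peak at the 40 Hz bin, so a classifier (such as the LSTM the paper uses) can pick up the spectral shift that accompanies slip.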

【 License 】

CC BY   

【 Preview 】
Attachments
File	Size	Format
RO201910253710811ZK.pdf	867KB	PDF