ROBOMECH Journal
View-based teaching/playback for robotic manipulation
Yusuke Maeda [1], Takahito Nakamura [2]
[1] Faculty of Engineering, Yokohama National University, Hodogaya-ku, Japan; [2] Nikon Corp., Tokyo, Japan
Keywords: Robot programming; View-based approach; Neural networks
DOI: 10.1186/s40648-014-0025-4
Subject classification: Artificial intelligence
Source: Springer
【 Abstract 】
In this paper, we study a new method for robot programming: view-based teaching/playback. It was developed to achieve more robustness against changes in task conditions than conventional teaching/playback, without losing general versatility. As a proof of concept, the method was implemented and tested in a virtual environment. The method consists of two phases: a teaching phase and a playback phase. In the teaching phase, a human operator commands a robot to perform a manipulation task. All movements of the robot are recorded, and all images of the teaching scenes are recorded by a camera. A mapping from the recorded images to the recorded movements is then learned as an artificial neural network. In the playback phase, the motion of the robot is determined by the output of the neural network computed from the current scene image. We applied this view-based teaching/playback to pick-and-place and pushing tasks performed by a robot hand with eight degrees of freedom in the virtual environment. The manipulation demonstrated by the human was successfully reproduced by the robot hand with the proposed method. Moreover, manipulation of the object from initial positions that differed from those in the demonstrations was also achieved.
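The image-to-movement mapping described above can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' implementation: it uses scikit-learn's MLPRegressor as the artificial neural network, random arrays as stand-ins for the recorded teaching images and movements, and hypothetical sizes for the image features and the 8-DOF hand.

```python
# Minimal sketch of view-based teaching/playback (illustrative only; all
# array shapes, feature sizes, and the choice of MLPRegressor are assumptions).
import numpy as np
from sklearn.neural_network import MLPRegressor

# --- Teaching phase -------------------------------------------------------
# Each teaching sample pairs a (downsampled, flattened) camera image with the
# robot movement commanded by the human operator at that step.
n_samples, image_dim, dof = 500, 32 * 24, 8            # hypothetical sizes
images = np.random.rand(n_samples, image_dim)          # recorded scene images
movements = np.random.rand(n_samples, dof)             # recorded robot movements

# Learn the mapping from images to movements as a feedforward neural network.
net = MLPRegressor(hidden_layer_sizes=(100,), max_iter=2000)
net.fit(images, movements)

# --- Playback phase -------------------------------------------------------
def playback_step(current_image: np.ndarray) -> np.ndarray:
    """Return the robot movement predicted from the current scene image."""
    return net.predict(current_image.reshape(1, -1))[0]

# One control step from a new (here, synthetic) camera image.
command = playback_step(np.random.rand(image_dim))
print(command.shape)  # -> (8,), a movement for the 8-DOF hand
```

In a playback loop, `playback_step` would be called repeatedly on fresh camera images, so the robot's motion is driven entirely by the learned view-to-movement mapping rather than by replaying a fixed trajectory.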
【 License 】
CC BY
【 Preview 】
Files | Size | Format | View
---|---|---|---
RO201904027397198ZK.pdf | 2320KB | PDF | download