Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation


Bibliographic Details
Main Authors: CHIEN, CHUN-YU, 簡駿宥
Other Authors: CHANG, WEN-YANG
Format: Others
Language: zh-TW
Published: 2019
Online Access: http://ndltd.ncl.edu.tw/handle/99z6bb
id ndltd-TW-107NYPI0689042
record_format oai_dc
spelling ndltd-TW-107NYPI06890422019-10-06T03:35:30Z http://ndltd.ncl.edu.tw/handle/99z6bb Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation 基於物件姿態估測導引機械手臂智慧取放與加工系統 CHIEN, CHUN-YU 簡駿宥 Master's 國立虎尾科技大學 (National Formosa University) 機械與電腦輔助工程系碩士班 (Master Program, Department of Mechanical and Computer-Aided Engineering) 107 With advances in industrial technology, robots have become an indispensable part of production lines in most industrial fields. In the past, robots were mostly used to repeat work along fixed paths; such tasks required time-consuming point-to-point teaching by an experienced operator using a teach pendant, which limited how far robot applications could develop. Machine-vision-assisted robots have therefore become a development trend. By identifying objects visually in advance, machine vision adds value and effectiveness to robot production: the robot reads the object information, completes the specified action, and gains flexibility in its motion path. For these reasons, this research develops a 3D object modeling and pose estimation system that recognizes changes in an object's six-degree-of-freedom pose, so that the robot can automatically generate a motion path and perform picking. The object is picked by another robot and placed in a pneumatic vise for machining with an offline-programmed path. The research is divided into four parts. The first is point cloud processing: the rotation and translation matrix between the sample point cloud and the target point cloud is obtained through point cloud pre-processing, feature point cloud identification, and point cloud matching. Experiments show that the estimation accuracy of the system is 0.57~1.95 mm and 0.5~1.37 degrees. The second is the robot intelligent pick-and-place system: the world and tool coordinate systems of the cameras and robots are integrated through the controller's PLC and MACRO programs, and the pose obtained by the estimation system is exchanged between the computer and the robot over TCP/IP to realize intelligent pick-and-place. The third is the offline programming system: the software establishes the system parameter files of the Fanuc robots, the machining path is planned and checked with a simulated collision test, and a post-processor then converts the tool path file into robot code and uploads it to the robot for milling; a Fanuc robot communication monitoring system and an image-based G54 correction system provide fast workpiece coordinate system positioning. The fourth is the robot cloud system, which gathers robot status information through a data-collection layer, manages the messages with MQTT cloud communication, and displays the robot status on a web page. This thesis integrates the object pose estimation system, the robot intelligent pick-and-place system, the offline programming system, and the cloud networking system to increase the production value and flexibility of the robot, with the goal of achieving automation, unmanned operation, and intelligence. CHANG, WEN-YANG CHEN, LI-WEI 張文陽 陳立緯 2019 學位論文 ; thesis 87 zh-TW
collection NDLTD
language zh-TW
format Others
sources NDLTD
description Master's === 國立虎尾科技大學 (National Formosa University) === 機械與電腦輔助工程系碩士班 (Master Program, Department of Mechanical and Computer-Aided Engineering) === 107 === With advances in industrial technology, robots have become an indispensable part of production lines in most industrial fields. In the past, robots were mostly used to repeat work along fixed paths; such tasks required time-consuming point-to-point teaching by an experienced operator using a teach pendant, which limited how far robot applications could develop. Machine-vision-assisted robots have therefore become a development trend. By identifying objects visually in advance, machine vision adds value and effectiveness to robot production: the robot reads the object information, completes the specified action, and gains flexibility in its motion path. For these reasons, this research develops a 3D object modeling and pose estimation system that recognizes changes in an object's six-degree-of-freedom pose, so that the robot can automatically generate a motion path and perform picking. The object is picked by another robot and placed in a pneumatic vise for machining with an offline-programmed path. The research is divided into four parts. The first is point cloud processing: the rotation and translation matrix between the sample point cloud and the target point cloud is obtained through point cloud pre-processing, feature point cloud identification, and point cloud matching. Experiments show that the estimation accuracy of the system is 0.57~1.95 mm and 0.5~1.37 degrees. The second is the robot intelligent pick-and-place system: the world and tool coordinate systems of the cameras and robots are integrated through the controller's PLC and MACRO programs, and the pose obtained by the estimation system is exchanged between the computer and the robot over TCP/IP to realize intelligent pick-and-place. The third is the offline programming system: the software establishes the system parameter files of the Fanuc robots, the machining path is planned and checked with a simulated collision test, and a post-processor then converts the tool path file into robot code and uploads it to the robot for milling; a Fanuc robot communication monitoring system and an image-based G54 correction system provide fast workpiece coordinate system positioning. The fourth is the robot cloud system, which gathers robot status information through a data-collection layer, manages the messages with MQTT cloud communication, and displays the robot status on a web page. This thesis integrates the object pose estimation system, the robot intelligent pick-and-place system, the offline programming system, and the cloud networking system to increase the production value and flexibility of the robot, with the goal of achieving automation, unmanned operation, and intelligence.
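The first and second parts of the abstract describe estimating the rotation and translation between a sample (model) point cloud and a target (scene) point cloud and then passing the result to the robot over TCP/IP. The sketch below shows one way such a pipeline could look in Python; Open3D, the identity initial guess for ICP, the Z-Y-X Euler convention, and the socket host, port, and message format are all assumptions, since the record does not name the libraries or protocol details used in the thesis.

    import socket
    import numpy as np
    import open3d as o3d

    def estimate_pose(sample_path, target_path, voxel=2.0, max_dist=5.0):
        """Return the 4x4 transform that maps the sample cloud onto the target cloud."""
        # Pre-processing: load both clouds and voxel-downsample them (units assumed mm).
        sample = o3d.io.read_point_cloud(sample_path).voxel_down_sample(voxel)
        target = o3d.io.read_point_cloud(target_path).voxel_down_sample(voxel)
        # Matching: point-to-point ICP refines an initial guess (identity here;
        # a feature-based coarse alignment would normally supply this guess).
        result = o3d.pipelines.registration.registration_icp(
            sample, target, max_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        return result.transformation  # rotation in [:3, :3], translation in [:3, 3]

    def euler_zyx_deg(R):
        """Convert a rotation matrix to Z-Y-X (yaw, pitch, roll) Euler angles in degrees."""
        pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
        roll = np.arctan2(R[2, 1], R[2, 2])
        yaw = np.arctan2(R[1, 0], R[0, 0])
        return np.degrees([yaw, pitch, roll])

    def send_pose(T, host="192.168.0.10", port=5000):
        """Send x, y, z and three Euler angles as one comma-separated line over TCP/IP."""
        # Host, port, and message layout are hypothetical placeholders.
        x, y, z = T[:3, 3]
        w, p, r = euler_zyx_deg(T[:3, :3])
        msg = f"{x:.3f},{y:.3f},{z:.3f},{w:.3f},{p:.3f},{r:.3f}\n"
        with socket.create_connection((host, port), timeout=5.0) as s:
            s.sendall(msg.encode("ascii"))

    if __name__ == "__main__":
        T = estimate_pose("sample.pcd", "scene.pcd")  # placeholder file names
        send_pose(T)

A similar sketch for the MQTT status publishing described in the fourth part appears after the record below.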
author2 CHANG, WEN-YANG
author_facet CHANG, WEN-YANG
CHIEN, CHUN-YU
簡駿宥
author CHIEN, CHUN-YU
簡駿宥
spellingShingle CHIEN, CHUN-YU
簡駿宥
Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation
author_sort CHIEN, CHUN-YU
title Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation
title_short Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation
title_full Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation
title_fullStr Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation
title_full_unstemmed Intelligent Pick-and-Place and Processing Systems of Vision Robot Based on 3D Object Modeling and Pose Estimation
title_sort intelligent pick-and-place and processing systems of vision robot based on 3d object modeling and pose estimation
publishDate 2019
url http://ndltd.ncl.edu.tw/handle/99z6bb
work_keys_str_mv AT chienchunyu intelligentpickandplaceandprocessingsystemsofvisionrobotbasedon3dobjectmodelingandposeestimation
AT jiǎnjùnyòu intelligentpickandplaceandprocessingsystemsofvisionrobotbasedon3dobjectmodelingandposeestimation
AT chienchunyu jīyúwùjiànzītàigūcèdǎoyǐnjīxièshǒubìzhìhuìqǔfàngyǔjiāgōngxìtǒng
AT jiǎnjùnyòu jīyúwùjiànzītàigūcèdǎoyǐnjīxièshǒubìzhìhuìqǔfàngyǔjiāgōngxìtǒng
_version_ 1719262626501885952
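The fourth part of the abstract describes gathering robot status information in a collection layer, managing the messages over MQTT, and displaying them on a cloud web page. The following is a minimal publishing sketch, assuming the paho-mqtt client (1.x API); the broker address, topic name, and status fields are illustrative placeholders rather than values from the thesis.

    import json
    import time
    import paho.mqtt.client as mqtt

    BROKER = "broker.example.com"      # hypothetical cloud broker
    TOPIC = "factory/fanuc1/status"    # hypothetical topic

    client = mqtt.Client()             # paho-mqtt 1.x style constructor
    client.connect(BROKER, 1883, keepalive=60)
    client.loop_start()                # handle network traffic in the background

    while True:
        # In the thesis the collection layer reads these values from the robot
        # controller; the fields here are placeholders.
        status = {
            "robot": "FANUC-1",
            "mode": "AUTO",
            "alarm": False,
            "timestamp": time.time(),
        }
        client.publish(TOPIC, json.dumps(status), qos=1)
        time.sleep(1.0)

A web dashboard subscribing to the same topic could then render the status page mentioned in the abstract.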