Cut Roses Grading with Machine Vision and Neural Network

Bibliographic Details
Main Authors: Tsay, Yue Fen, 蔡玉芬
Other Authors: Fun Fen Lee
Format: Others
Language: zh-TW
Published: 1996
Online Access: http://ndltd.ncl.edu.tw/handle/23244800943784275557
Summary: Master's thesis === National Chung Hsing University === Department of Agricultural Machinery Engineering === 84 === The purpose of this thesis is to develop digital image processing techniques to extract feature parameters of cut roses, and to use a neural network to simulate manual grading experience for cut rose grading.

Two color images were grabbed for each rose: one was the whole cut rose image, used to analyze the morphological features of the stem; the other was the bud image, used to analyze the bud features. The stem segmentation method was first to define the stem image characteristics, then to search the image column by column based on the characteristics defined, and finally to label the stem segments. To segment the bud image, color segmentation and dilation and erosion techniques were utilized, and the color information of the bud was not changed. Ten feature parameters were extracted for each cut rose. The stem straightness parameters were the maximum crooked angle, the maximum deviated distance, and the average deviated distance. The stem diameter parameters were the bottom diameter, the middle diameter, and the top diameter. The bud maturity parameters were the projected area, the perimeter, the compactness, and the principal axes.

Part of the 10 features were selected and input to an error back-propagation neural network to simulate human quality grading operations for cut roses. Length grading was performed by the image processing program alone. The cut rose length grading accuracy was 93%, and the identification rate of the best neural network model obtained in this study was 70.7% compared with human grading results.
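
As an illustration of the bud maturity features named in the abstract (projected area, perimeter, compactness, and principal axes), the following is a minimal Python/OpenCV sketch that computes them from a binary bud mask. The thesis does not publish its source code, so the function name, the compactness definition (perimeter squared over 4*pi*area), and the moment-based principal-axis estimate are assumptions for illustration only, not the author's implementation.

    # Hypothetical sketch: bud maturity features from a binary bud mask.
    # Names, thresholds, and formulas are assumptions; the thesis code is not available.
    import cv2
    import numpy as np

    def bud_maturity_features(mask: np.ndarray) -> dict:
        """Compute shape features from a binary (0/255) bud mask."""
        # Take the largest external contour as the bud outline.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        if not contours:
            raise ValueError("no bud region found in mask")
        bud = max(contours, key=cv2.contourArea)

        area = cv2.contourArea(bud)          # projected area (pixels)
        perimeter = cv2.arcLength(bud, True) # boundary length (pixels)

        # One common compactness definition: perimeter^2 / (4*pi*area);
        # equals 1 for a circle and grows as the shape becomes less compact.
        compactness = perimeter ** 2 / (4.0 * np.pi * area)

        # Principal axes from the covariance of the region's pixel coordinates
        # (square roots of the eigenvalues give the axis scales).
        ys, xs = np.nonzero(mask)
        coords = np.stack([xs, ys], axis=1).astype(np.float64)
        cov = np.cov(coords, rowvar=False)
        eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
        major_axis = 4.0 * np.sqrt(eigvals[0])
        minor_axis = 4.0 * np.sqrt(eigvals[1])

        return {
            "area": area,
            "perimeter": perimeter,
            "compactness": compactness,
            "major_axis": major_axis,
            "minor_axis": minor_axis,
        }

    if __name__ == "__main__":
        # Toy example: an elliptical "bud" drawn into an empty mask.
        mask = np.zeros((200, 200), dtype=np.uint8)
        cv2.ellipse(mask, (100, 100), (60, 35), 30, 0, 360, 255, -1)
        print(bud_maturity_features(mask))

In a grading pipeline of the kind the abstract describes, feature dictionaries like this one would be assembled per rose and a selected subset fed to the back-propagation network as its input vector.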