Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network

Bibliographic Details
Main Authors: Muhammad Hamza Asad, Abdul Bais
Format: Article
Language: English
Published: KeAi Communications Co., Ltd. 2020-12-01
Series: Information Processing in Agriculture
Online Access: http://www.sciencedirect.com/science/article/pii/S2214317319302355
Description
Summary: Herbicide use is rising globally to enhance food production, causing harm to the environment and ecosystems. Precision agriculture suggests variable-rate herbicide application based on weed densities to mitigate the adverse effects of herbicides. Accurate weed density estimation using advanced computer vision techniques such as deep learning requires large labelled agricultural datasets. Labelling large agricultural datasets at the pixel level is time-consuming and tedious. In this paper, a methodology is developed to accelerate manual pixel labelling using a two-step procedure. In the first step, the background and foreground are segmented using maximum likelihood classification; in the second step, the weed pixels are manually labelled. The labelled data are used to train semantic segmentation models that classify crop and background pixels as one class and all other vegetation as a second class. This paper evaluates the proposed methodology on high-resolution colour images of canola fields and compares deep learning meta-architectures such as SegNet and UNet with encoder blocks such as VGG16 and ResNet-50. The ResNet-50-based SegNet model achieves the best results, with a mean intersection over union of 0.8288 and a frequency-weighted intersection over union of 0.9869.
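The first labelling step can be sketched in a few lines. The Python snippet below is a minimal illustration, not the authors' implementation: the Gaussian class-conditional model over raw RGB values, the function names, and the two-class setup are assumptions. It classifies each pixel as vegetation or background by comparing log-likelihoods under per-class multivariate Gaussians fitted to a handful of hand-picked sample pixels, which is one standard reading of maximum likelihood classification:

```python
import numpy as np

def fit_gaussian(samples):
    """Fit a multivariate Gaussian to per-class pixel samples (N x 3 RGB rows)."""
    mean = samples.mean(axis=0)
    # Small diagonal term keeps the covariance invertible for tiny sample sets.
    cov = np.cov(samples, rowvar=False) + 1e-6 * np.eye(samples.shape[1])
    return mean, cov

def log_likelihood(pixels, mean, cov):
    """Per-pixel Gaussian log-likelihood for an (M x 3) array of pixels."""
    d = pixels - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    mahal = np.einsum('ij,jk,ik->i', d, inv, d)  # squared Mahalanobis distance
    return -0.5 * (mahal + logdet + pixels.shape[1] * np.log(2 * np.pi))

def ml_segment(image, veg_samples, bg_samples):
    """Label each pixel True (vegetation) or False (background), taking the
    class whose fitted Gaussian assigns the higher likelihood."""
    h, w, c = image.shape
    pixels = image.reshape(-1, c).astype(np.float64)
    veg = log_likelihood(pixels, *fit_gaussian(veg_samples))
    bg = log_likelihood(pixels, *fit_gaussian(bg_samples))
    return (veg > bg).reshape(h, w)
```

The reported scores are standard semantic segmentation metrics and can be computed from a confusion matrix; the helper below (also an illustrative sketch with a hypothetical function name) returns both mean intersection over union and frequency-weighted intersection over union:

```python
import numpy as np

def iou_metrics(pred, target, num_classes=2):
    """Mean IoU and frequency-weighted IoU from integer label maps."""
    conf = np.zeros((num_classes, num_classes), dtype=np.int64)
    np.add.at(conf, (target.ravel(), pred.ravel()), 1)  # confusion matrix
    inter = np.diag(conf)
    union = conf.sum(axis=0) + conf.sum(axis=1) - inter
    iou = inter / np.maximum(union, 1)     # per-class IoU
    freq = conf.sum(axis=1) / conf.sum()   # class pixel frequencies
    return iou.mean(), (freq * iou).sum()
```

Applied to predictions in the paper's two-class setup (crop plus background as one class, all other vegetation as the other), `iou_metrics` would produce figures comparable to the mean IoU and frequency-weighted IoU quoted above.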
ISSN:2214-3173