Identifying the vegetation type in Google Earth images using a convolutional neural network: a case study for Japanese bamboo forests

Bibliographic Details
Main Authors: Shuntaro Watanabe, Kazuaki Sumi, Takeshi Ise
Format: Article
Language: English
Published: BMC 2020-11-01
Series: BMC Ecology
Online Access: https://doi.org/10.1186/s12898-020-00331-5
Description
Summary:
Background: Classifying and mapping vegetation are crucial tasks in environmental science and natural resource management. However, these tasks are difficult because conventional methods such as field surveys are highly labor-intensive. Identifying target objects in visual data with computer techniques is one of the most promising ways to reduce the cost and labor of vegetation mapping. Although deep learning with convolutional neural networks (CNNs) has recently become a standard approach to image recognition and classification, detecting objects with ambiguous boundaries, such as vegetation, remains difficult. In this study, we investigated the effectiveness of adopting the chopped picture method, a recently described protocol for CNNs, and evaluated the efficiency of CNNs for plant community detection in Google Earth images.
Results: We selected bamboo forests as the target and obtained Google Earth images from three regions in Japan. The best-trained CNN model correctly detected over 90% of the targets. Our results showed that the identification accuracy of the CNN is higher than that of conventional machine learning methods.
Conclusions: Our results demonstrate that CNNs combined with the chopped picture method are potentially powerful tools for high-accuracy automated detection and mapping of vegetation.
ISSN: 1472-6785
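
The abstract describes the chopped picture method, in which large Google Earth images are divided into small tiles that a CNN classifies one by one. The following is a minimal Python sketch of that workflow, not the authors' implementation: the tile size, the two-class (bamboo vs. other) setup, the PyTorch framework, the TinyCNN architecture, and the "scene.png" path are all illustrative assumptions.

    # Illustrative sketch of a chopped-picture workflow: a large aerial image
    # is divided into small square tiles, and each tile is classified
    # (e.g., bamboo vs. other) by a small CNN. All design choices here are
    # assumptions for illustration, not the paper's actual code.
    import torch
    import torch.nn as nn
    from PIL import Image
    from torchvision import transforms

    TILE = 56  # assumed tile size in pixels

    class TinyCNN(nn.Module):
        """A minimal two-class CNN; the authors' architecture may differ."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * (TILE // 4) ** 2, 2)  # bamboo / other

        def forward(self, x):
            x = self.features(x)
            return self.classifier(x.flatten(1))

    def chop_image(img, tile=TILE):
        """Yield (col, row, tile_image) for non-overlapping square tiles."""
        w, h = img.size
        for y in range(0, h - tile + 1, tile):
            for x in range(0, w - tile + 1, tile):
                yield x // tile, y // tile, img.crop((x, y, x + tile, y + tile))

    def classify_tiles(img_path, model):
        """Return a dict mapping (col, row) -> predicted class index."""
        to_tensor = transforms.ToTensor()
        model.eval()
        result = {}
        with torch.no_grad():
            for col, row, tile_img in chop_image(Image.open(img_path).convert("RGB")):
                logits = model(to_tensor(tile_img).unsqueeze(0))
                result[(col, row)] = int(logits.argmax(dim=1))
        return result

    if __name__ == "__main__":
        # "scene.png" is a placeholder path for a Google Earth screenshot;
        # a real run would first train the model on labeled tiles.
        predictions = classify_tiles("scene.png", TinyCNN())
        print(f"classified {len(predictions)} tiles")

The per-tile predictions can then be re-assembled on the tile grid to produce a coarse coverage map of the target vegetation, which is the mapping step the abstract refers to.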