Deep Learning for Feature Extraction in Remote Sensing: A Case-Study of Aerial Scene Classification

Bibliographic Details
Main Authors: Biserka Petrovska, Eftim Zdravevski, Petre Lameski, Roberto Corizzo, Ivan Štajduhar, Jonatan Lerga
Format: Article
Language: English
Published: MDPI AG 2020-07-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/20/14/3906
Description
Summary: Scene classification from images is essential in many systems and applications related to remote sensing. Scientific interest in scene classification from remotely collected images is increasing, and many datasets and algorithms are being developed. The introduction of convolutional neural networks (CNN) and other deep learning techniques has contributed to vast improvements in the accuracy of image scene classification in such systems. To classify scenes from aerial images, we used a two-stream deep architecture. We performed the first part of the classification, the feature extraction, with pre-trained CNNs that extract deep features of aerial images from different network layers: the average pooling layer or one of the preceding convolutional layers. Next, we concatenated the features extracted from the various networks, after performing dimensionality reduction on the very large feature vectors. We experimented extensively with different CNN architectures to obtain optimal results. Finally, we used a Support Vector Machine (SVM) to classify the concatenated features. The competitiveness of the examined technique was evaluated on two real-world datasets: UC Merced and WHU-RS. The obtained classification accuracies demonstrate that the considered method is competitive with other cutting-edge techniques.
ISSN: 1424-8220
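
As a rough illustration of the pipeline described in the summary, the following Python sketch extracts average-pooling features from two pre-trained CNNs, reduces their dimensionality, fuses them by concatenation, and trains an SVM on the result. The backbone choices (ResNet50 and InceptionV3), PCA as the dimensionality-reduction step, the component counts, and the linear SVM kernel are illustrative assumptions, not the configuration reported in the paper.

import numpy as np
from tensorflow.keras.applications import ResNet50, InceptionV3
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Headless backbones with global average pooling: these emit the
# "average pooling layer" features mentioned in the summary.
stream_a = ResNet50(weights="imagenet", include_top=False, pooling="avg")
stream_b = InceptionV3(weights="imagenet", include_top=False, pooling="avg")

def extract_features(model, images):
    # Run images through a headless CNN and flatten the pooled output.
    # Real aerial images should first be resized and passed through each
    # backbone's own preprocess_input; the random placeholders below skip that.
    feats = model.predict(images, verbose=0)
    return feats.reshape(len(images), -1)

# Placeholder data standing in for a labeled aerial-scene training set
# (e.g., 21 classes, as in UC Merced).
X_train = np.random.rand(32, 224, 224, 3).astype("float32")
y_train = np.random.randint(0, 21, size=32)

f_a = extract_features(stream_a, X_train)  # (32, 2048) from ResNet50
f_b = extract_features(stream_b, X_train)  # (32, 2048) from InceptionV3

# Reduce each stream before fusing, since the raw deep-feature vectors
# are very large (PCA with 16 components is an assumed choice).
pca_a = PCA(n_components=16).fit(f_a)
pca_b = PCA(n_components=16).fit(f_b)
fused = np.concatenate([pca_a.transform(f_a), pca_b.transform(f_b)], axis=1)

# Linear SVM on the concatenated, reduced features.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(fused, y_train)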