Code-Aligned Autoencoders for Unsupervised Change Detection in Multimodal Remote Sensing Images

Bibliographic Details
Main Authors: Anfinsen, S.N. (Author), Bianchi, F.M. (Author), Hansen, M.A. (Author), Jenssen, R. (Author), Kampffmeyer, M. (Author), Luppino, L.T. (Author), Moser, G. (Author)
Format: Article
Language: English
Published: Institute of Electrical and Electronics Engineers Inc., 2022
Description
Summary: Image translation with convolutional autoencoders has recently been used as an approach to multimodal change detection (CD) in bitemporal satellite images. A main challenge is the alignment of the code spaces by reducing the contribution of change pixels to the learning of the translation function. Many existing approaches train the networks by exploiting supervised information about the change areas, which, however, is not always available. We propose to extract relational pixel information, captured by domain-specific affinity matrices at the input, and use it to enforce alignment of the code spaces and to reduce the impact of change pixels on the learning objective. A change prior is derived in an unsupervised fashion from pixel-pair affinities that are comparable across domains. To achieve code space alignment, we require that pixels with similar affinity relations in the input domains are also correlated in code space. We demonstrate the utility of this procedure in combination with cycle consistency. The proposed approach is compared with state-of-the-art machine learning and deep learning algorithms. Experiments conducted on four real and representative datasets show the effectiveness of our methodology.
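
The summary compresses the method into a few sentences. As a concrete illustration, the following minimal Python/NumPy sketch shows one plausible realization of the two unsupervised ingredients it describes: a change prior computed from the disagreement between domain-specific affinity matrices, and a code-space alignment term that down-weights likely-changed pixels. The Gaussian kernel, the 1 - prior weighting, and all function names are assumptions made here for illustration, not the authors' released implementation.

import numpy as np

def gaussian_affinity(feats, sigma):
    # Pairwise Gaussian-kernel affinities for one modality.
    # feats: (n, d) array of per-pixel features; returns an (n, n) matrix in (0, 1].
    sq_dists = np.sum((feats[:, None, :] - feats[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def change_prior(feats_x, feats_y, sigma_x=1.0, sigma_y=1.0):
    # Unsupervised per-pixel change prior: affinities are computed separately in
    # each domain (making them comparable across modalities), and pixels whose
    # affinity relations disagree between the two domains receive a high prior.
    a_x = gaussian_affinity(feats_x, sigma_x)
    a_y = gaussian_affinity(feats_y, sigma_y)
    return np.abs(a_x - a_y).mean(axis=1)  # one value in [0, 1) per pixel

def weighted_alignment_loss(code_x, code_y, prior):
    # Pull the two code spaces together while reducing the influence of pixels
    # that the prior flags as likely changed (assumed 1 - prior weighting).
    w = 1.0 - prior
    per_pixel = np.sum((code_x - code_y) ** 2, axis=1)
    return np.sum(w * per_pixel) / np.sum(w)

# Toy usage with random features standing in for, e.g., optical and SAR pixels:
rng = np.random.default_rng(0)
feats_x = rng.normal(size=(200, 3))  # domain X: 3 bands per pixel
feats_y = rng.normal(size=(200, 1))  # domain Y: 1 band per pixel
prior = change_prior(feats_x, feats_y)

In a full pipeline, the alignment term would be one objective alongside the reconstruction and cycle-consistency losses mentioned in the summary, applied to encoder outputs rather than to raw pixel features.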
ISSN: 2162-237X
DOI: 10.1109/TNNLS.2022.3172183