High‐resolution optical‐to‐SAR image registration using mutual information and SPSA optimisation

Bibliographic Details
Main Authors: Sourabh Paul, Umesh C. Pati
Format: Article
Language: English
Published: Wiley 2021-05-01
Series: IET Image Processing
Online Access:https://doi.org/10.1049/ipr2.12107
Description
Summary: High‐resolution optical and synthetic aperture radar (SAR) images are widely used in remote sensing applications such as image fusion and change detection, for which image registration is a fundamental step. The latest high‐resolution optical and SAR satellites and airborne systems provide geometrically corrected images that are free of global deformations. Although global deformations are absent, residual registration differences still exist between these optical and SAR images, and they should be minimised by an automatic registration method before the images are used in the aforementioned applications. Automatic optical‐to‐SAR registration is challenging, however, because of significant nonlinear intensity differences and local geometric distortions between the images. To address these problems, an automatic optical‐to‐SAR image registration method is proposed that effectively handles the registration differences between globally corrected high‐resolution images. In the proposed method, a coarse registration is first performed using a discrete simultaneous perturbation stochastic approximation (SPSA) optimisation; a smooth continuous SPSA optimisation is then used for the fine registration. Experiments on six high‐resolution optical‐SAR image pairs demonstrate the effectiveness of the proposed method.
ISSN: 1751-9659; 1751-9667
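
For intuition about the registration strategy summarised above, the following is a minimal, hypothetical sketch of SPSA-driven maximisation of mutual information between an optical and a SAR image. It is not the authors' implementation: the translation-only transform, histogram-based mutual information, standard SPSA gain sequences, parameter values, and synthetic test images are all assumptions, and only a coarse, integer-pixel stage is illustrated (the paper's smooth continuous SPSA fine stage with subpixel resampling is omitted).

```python
# Hedged sketch: SPSA maximisation of mutual information (MI) for
# optical-to-SAR registration over a translation (tx, ty).
# Transform model, MI estimator, gains, and test data are assumptions.

import numpy as np


def mutual_information(a, b, bins=64):
    """Histogram-based MI between two equally sized grayscale images."""
    hist_2d, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist_2d / hist_2d.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))


def shift_image(img, tx, ty):
    """Integer-pixel shift with zero padding (coarse, discrete step only)."""
    out = np.zeros_like(img)
    h, w = img.shape
    tx, ty = int(round(tx)), int(round(ty))
    src = img[max(0, -ty):h - max(0, ty), max(0, -tx):w - max(0, tx)]
    out[max(0, ty):max(0, ty) + src.shape[0],
        max(0, tx):max(0, tx) + src.shape[1]] = src
    return out


def spsa_register(optical, sar, iters=100, a=1.0, c=2.0, seed=0):
    """SPSA: estimate the MI gradient from two perturbed evaluations per step."""
    rng = np.random.default_rng(seed)
    theta = np.zeros(2)  # (tx, ty)
    for k in range(1, iters + 1):
        ak = a / k ** 0.602              # standard SPSA gain sequences
        ck = c / k ** 0.101
        delta = rng.choice([-1.0, 1.0], size=2)  # Rademacher perturbation
        f_plus = mutual_information(optical, shift_image(sar, *(theta + ck * delta)))
        f_minus = mutual_information(optical, shift_image(sar, *(theta - ck * delta)))
        grad = (f_plus - f_minus) / (2.0 * ck * delta)
        theta = theta + ak * grad        # ascend MI
    return theta


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    optical = rng.random((128, 128))
    sar = shift_image(optical, -4, 3) + 0.05 * rng.random((128, 128))
    print("estimated (tx, ty):", spsa_register(optical, sar))
```

The appeal of SPSA here is that each iteration needs only two evaluations of the similarity measure regardless of the number of transform parameters, which keeps the cost low when mutual information is expensive to compute on large high-resolution images.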