FastDerainNet: A Deep Learning Algorithm for Single Image Deraining


Bibliographic Details
Main Authors: Xiuwen Wang, Zhiwei Li, Hongtao Shan, Zhiyuan Tian, Yuanhong Ren, Wuneng Zhou
Format: Article
Language: English
Published: IEEE 2020-01-01
Series: IEEE Access
Subjects:
Online Access: https://ieeexplore.ieee.org/document/9139246/
Description
Summary: Existing neural network-based methods for de-raining single images produce unsatisfactory results because features propagate inefficiently when an image contains objects whose sizes and shapes resemble rain streaks. Furthermore, existing methods do not account for the fact that the abundant information in rain-streaked images can interfere with training. To overcome these limitations, this paper proposes a deep residual learning algorithm, FastDerainNet, for removing rain streaks from single images. We design a deep convolutional neural network architecture based on a deep residual network, the share-source residual module (SSRM), in which all shortcut connections originate from a single shared point. To further improve de-raining performance, we adopt the SSRM as the parameter layers of FastDerainNet and use image decomposition to modify the loss function. Finally, we train FastDerainNet on a synthetic dataset. By learning the residual mapping between the detail layers of rainy and clean images, the network reduces the mapping range and simplifies training. Experiments on both synthetic and real-world images demonstrate that the proposed method outperforms other state-of-the-art methods in rain removal while preserving the original image details.
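
The following is a minimal PyTorch sketch of the two ideas named in the summary: a residual module whose shortcut connections all add the same shared source (the module input), and residual learning on a detail layer obtained by image decomposition. The layer counts, channel widths, and the averaging-based decomposition are illustrative assumptions, not the authors' exact architecture or loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ShareSourceResidualModule(nn.Module):
    """Stack of conv blocks whose shortcuts all start from one shared source
    (the module input) instead of from the preceding block's output."""

    def __init__(self, channels=16, num_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, kernel_size=3, padding=1),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for _ in range(num_blocks)
        ])

    def forward(self, x):
        source = x                     # single shared shortcut origin
        out = x
        for block in self.blocks:
            out = block(out) + source  # every shortcut adds the same source
        return out


class FastDerainNetSketch(nn.Module):
    """Maps a rainy detail layer to a predicted rain residual."""

    def __init__(self, channels=16):
        super().__init__()
        self.head = nn.Conv2d(3, channels, kernel_size=3, padding=1)
        self.ssrm = ShareSourceResidualModule(channels)
        self.tail = nn.Conv2d(channels, 3, kernel_size=3, padding=1)

    def forward(self, detail_layer):
        return self.tail(self.ssrm(self.head(detail_layer)))


def decompose(rainy, kernel_size=15):
    """Illustrative decomposition: base = local average, detail = rainy - base."""
    base = F.avg_pool2d(rainy, kernel_size, stride=1, padding=kernel_size // 2)
    return base, rainy - base


if __name__ == "__main__":
    rainy = torch.rand(1, 3, 64, 64)           # toy rainy image
    base, detail = decompose(rainy)
    residual = FastDerainNetSketch()(detail)   # predicted rain component
    derained = base + (detail - residual)      # reconstruct the clean estimate
    print(derained.shape)                      # torch.Size([1, 3, 64, 64])
```

In training, the loss would compare the de-rained detail layer (detail minus predicted residual) against the clean detail layer from paired ground truth; the decomposition filter and loss terms used in the paper may differ from this sketch.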
ISSN:2169-3536