Summary: | Advanced driver assistance systems (ADAS) are an important vehicle safety technology that can effectively reduce traffic accidents. Such a system perceives the surrounding environment through in-vehicle cameras. However, these cameras are easily affected by severe weather such as fog, rain, and snow: the quality of the acquired images is degraded, and the functionality of the ADAS is weakened accordingly. To address this problem, we propose a comprehensive imaging model that represents the image features of fog, rain streaks, raindrops, and snowflakes. We then propose an algorithm, RASWNet, that removes all of these severe weather features from a degraded image. Built on a generative adversarial network, RASWNet combines the focusing ability of a visual attention mechanism, the memory of a recurrent neural network, and the feature extraction ability of dense blocks. We verify the network structure through several ablation studies and evaluate it on a variety of synthetic and real images. The experimental results show that our algorithm not only surpasses commonly used algorithms in clarity enhancement but is also applicable to all of these severe weather conditions.
|
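The summary describes the generator only at a high level (a GAN backbone that combines a visual attention mechanism, a recurrent "memory" component, and dense blocks). The sketch below is a minimal, illustrative PyTorch example of how such components could be wired together; all module names, channel widths, and the number of recurrent steps are assumptions made for clarity, not the authors' RASWNet implementation, and the discriminator and loss terms of the adversarial setup are omitted.

```python
# Illustrative sketch (not the authors' code): an attention-guided generator that
# combines a convolutional recurrent attention module with dense blocks.
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Dense block: each conv sees the concatenation of all previous feature maps."""
    def __init__(self, in_ch, growth=16, layers=4):
        super().__init__()
        self.convs = nn.ModuleList()
        ch = in_ch
        for _ in range(layers):
            self.convs.append(nn.Sequential(
                nn.Conv2d(ch, growth, 3, padding=1), nn.ReLU(inplace=True)))
            ch += growth
        self.fuse = nn.Conv2d(ch, in_ch, 1)  # compress back to in_ch channels

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        return self.fuse(torch.cat(feats, dim=1))

class RecurrentAttention(nn.Module):
    """Refines a single-channel attention map over several recurrent steps,
    using a simple convolutional hidden state as the 'memory' component."""
    def __init__(self, in_ch=3, hid=32, steps=4):
        super().__init__()
        self.steps = steps
        self.update = nn.Conv2d(in_ch + 1 + hid, hid, 3, padding=1)
        self.to_att = nn.Conv2d(hid, 1, 3, padding=1)

    def forward(self, img):
        b, _, h, w = img.shape
        att = torch.zeros(b, 1, h, w, device=img.device)
        hidden = torch.zeros(b, self.update.out_channels, h, w, device=img.device)
        for _ in range(self.steps):
            hidden = torch.tanh(self.update(torch.cat([img, att, hidden], dim=1)))
            att = torch.sigmoid(self.to_att(hidden))
        return att

class Generator(nn.Module):
    """Attention-guided generator: the attention map highlights weather-degraded
    regions, which a dense-block body then restores to a clean image."""
    def __init__(self):
        super().__init__()
        self.attention = RecurrentAttention()
        self.head = nn.Conv2d(4, 64, 3, padding=1)   # degraded image + attention map
        self.body = nn.Sequential(DenseBlock(64), DenseBlock(64))
        self.tail = nn.Conv2d(64, 3, 3, padding=1)

    def forward(self, img):
        att = self.attention(img)
        x = torch.relu(self.head(torch.cat([img, att], dim=1)))
        return torch.tanh(self.tail(self.body(x))), att

if __name__ == "__main__":
    clean, att = Generator()(torch.randn(1, 3, 64, 64))
    print(clean.shape, att.shape)  # (1, 3, 64, 64) and (1, 1, 64, 64)
```

In an adversarial training setup, the returned clean image would be scored by a discriminator, while the attention map could additionally be supervised against weather-feature masks; both choices are left open here.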