A Lightweight Dense Connected Approach with Attention on Single Image Super-Resolution

In recent years, neural networks for single image super-resolution (SISR) have adopted increasingly deep network structures to extract additional image details, which makes model training difficult. To address this, researchers use dense skip connections to strengthen the model's feature representation by reusing deep features from different receptive fields. Benefiting from its dense connection blocks, SRDenseNet has achieved excellent performance in SISR. However, although the densely connected structure provides rich information, it also introduces redundant and useless information. To tackle this problem, this paper proposes a Lightweight Dense Connected Approach with Attention for Single Image Super-Resolution (LDCASR), which employs an attention mechanism to extract useful information along the channel dimension. In particular, the authors propose a recursive dense group (RDG), consisting of Dense Attention Blocks (DABs), which obtains more significant representations by extracting deep features with the aid of both dense connections and an attention module, so that the whole network focuses on learning more advanced feature information. Additionally, group convolution is introduced in the DABs, which reduces the number of parameters to 0.6 M. Extensive experiments on benchmark datasets demonstrate the superiority of the proposed method over five chosen SISR methods.

Bibliographic Details
Main Authors: Lei Zha, Yu Yang, Zicheng Lai, Ziwei Zhang, Juan Wen
Affiliations: Lei Zha, Yu Yang, Ziwei Zhang, and Juan Wen (College of Information and Electrical Engineering, China Agricultural University, Beijing 100083, China); Zicheng Lai (School of Advanced Materials and Nanotechnology, Xidian University, Xi'an 710061, China)
Format: Article
Language: English
Published: MDPI AG, 2021-05-01
Series: Electronics
ISSN: 2079-9292
DOI: 10.3390/electronics10111234
Subjects: dense skip connections; single image super-resolution; deep features; channel attention mechanism
Online Access: https://www.mdpi.com/2079-9292/10/11/1234
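The abstract describes the building blocks of LDCASR only at a high level. Below is a minimal, hypothetical PyTorch sketch of how a Dense Attention Block (DAB) could combine the three ingredients named in the record: dense skip connections, a channel attention module, and group convolution. The layer widths, growth rate, number of layers, and the squeeze-and-excitation-style attention design are assumptions for illustration, not details taken from the published model.

# Hypothetical sketch of a Dense Attention Block (DAB); layer sizes are assumed,
# not taken from the paper described in this record.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (assumed design)."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global spatial average
        self.fc = nn.Sequential(                     # excite: per-channel weights in (0, 1)
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))             # rescale each channel


class DenseAttentionBlock(nn.Module):
    """Dense skip connections + group convolution + channel attention (sketch)."""

    def __init__(self, channels: int = 64, growth: int = 32, layers: int = 4, groups: int = 4):
        super().__init__()
        self.convs = nn.ModuleList()
        in_ch = channels
        for _ in range(layers):
            # group convolution: weight count shrinks roughly by a factor of `groups`
            self.convs.append(
                nn.Sequential(
                    nn.Conv2d(in_ch, growth, kernel_size=3, padding=1, groups=groups),
                    nn.ReLU(inplace=True),
                )
            )
            in_ch += growth                           # dense: every layer sees all earlier features
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)   # 1x1 bottleneck back to base width
        self.attention = ChannelAttention(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for conv in self.convs:
            feats.append(conv(torch.cat(feats, dim=1)))
        out = self.attention(self.fuse(torch.cat(feats, dim=1)))
        return out + x                                # local residual connection


if __name__ == "__main__":
    block = DenseAttentionBlock()
    y = block(torch.randn(1, 64, 48, 48))
    print(y.shape)  # torch.Size([1, 64, 48, 48])

As a quick illustration of why group convolution keeps the model light: a 3x3 convolution from 64 to 32 channels has 3*3*64*32 = 18,432 weights, while the same layer with groups=4 has 3*3*(64/4)*32 = 4,608, roughly a quarter.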