Tamper Detection Network Using Global Discriminative Enhancement and Boundary Supervision
| Published in: | Jisuanji gongcheng |
|---|---|
| Main author: | |
| Format: | Article |
| Language: | English |
| Published: | Editorial Office of Computer Engineering, 2023-06-01 |
| Subjects: | |
| Online access: | https://www.ecice06.com/fileup/1000-3428/PDF/65565.pdf |
| Summary: | Tamper detection networks based on deep learning often fail to effectively exploit global correlations or to discriminate between global channel features, leading to high false and missed detection rates. To address this issue, a new tamper detection network is proposed. The feature extraction backbone is built from a dual residual network and a restricted convolutional layer to extract dual-view, multi-scale features from the detection target. A global information enhancement module uses a non-local attention computation to obtain a low-dimensional global correlation degree for each channel at every scale; this correlation degree then serves as an enhancement parameter in a discriminative enhancement of the global features (see the first sketch after this record). Furthermore, a new boundary-supervision approach extracts boundary information from the prediction results to generate boundary mask images and compute a boundary-assisted loss, using backward learning to guide the global features toward tampered regions (see the second sketch after this record). Experimental results on the CASIA, COVER, NIST16, and Columbia datasets demonstrate that the network effectively reduces false and missed detection rates, improving the pixel-level F1 score by an average of 2.3 percentage points over the Multi-View multi-Scale Supervised Network (MVSS-Net), the best-performing comparable network. |
| ISSN: | 1000-3428 |
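
The summary describes a global information enhancement module in which non-local attention yields a low-dimensional global correlation degree per channel, used to reweight the features. The paper's exact design is not given in this record, so the following PyTorch sketch is only one plausible reading of that description: the class name `GlobalEnhancement`, the reduction ratio, and the sigmoid-scaled channel weighting are all assumptions.

```python
import torch
import torch.nn as nn

class GlobalEnhancement(nn.Module):
    """Hypothetical sketch of a global information enhancement module:
    non-local (dot-product) attention over a low-dimensional embedding
    produces a per-channel global correlation score that reweights the
    feature map (a 'discriminative enhancement' of global features)."""

    def __init__(self, channels: int, reduced: int | None = None):
        super().__init__()
        # 1x1 convs project to a low-dimensional embedding (ratio is an assumption)
        reduced = reduced or max(channels // 8, 1)
        self.query = nn.Conv2d(channels, reduced, kernel_size=1)
        self.key = nn.Conv2d(channels, reduced, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).flatten(2)                              # (b, r, hw)
        k = self.key(x).flatten(2)                                # (b, r, hw)
        feat = x.flatten(2)                                       # (b, c, hw)
        # non-local affinity between all spatial positions
        affinity = torch.softmax(q.transpose(1, 2) @ k, dim=-1)   # (b, hw, hw)
        # aggregate global context, then score each channel's correlation
        context = feat @ affinity                                 # (b, c, hw)
        corr = torch.sigmoid(context.mean(dim=2, keepdim=True))   # (b, c, 1)
        # reweight channels by their global correlation degree
        return x * corr.unsqueeze(-1)

# Usage: enhance one scale of the backbone's feature pyramid.
# feats = GlobalEnhancement(256)(torch.randn(1, 256, 64, 64))
```

In practice such a module would be applied at each scale of the multi-scale backbone output, as the summary indicates.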
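The boundary-supervision step, extracting a boundary mask from the prediction to compute a boundary-assisted loss, can likewise be sketched. A morphological gradient (dilation minus erosion, implemented with max pooling) is a common way to pull a boundary out of a soft mask; whether the paper uses this exact operator is an assumption, as are the function names and the binary cross-entropy formulation.

```python
import torch
import torch.nn.functional as F

def boundary_mask(mask: torch.Tensor, kernel: int = 3) -> torch.Tensor:
    """Hypothetical boundary extraction: the morphological gradient
    (dilation minus erosion) of a soft mask approximates its boundary."""
    pad = kernel // 2
    dilated = F.max_pool2d(mask, kernel, stride=1, padding=pad)
    eroded = -F.max_pool2d(-mask, kernel, stride=1, padding=pad)
    return dilated - eroded

def boundary_assisted_loss(pred: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
    """BCE between the boundaries of the predicted and ground-truth masks;
    gradients from this term steer global features toward the edges of
    tampered regions, as described in the summary."""
    pb = boundary_mask(pred).clamp(0, 1)
    tb = boundary_mask(target).clamp(0, 1)
    return F.binary_cross_entropy(pb, tb)

# Usage: pred and target are (b, 1, h, w) probability maps in [0, 1];
# add this term to the main segmentation loss with some weight.
```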
