PAM: Pyramid Attention Mechanism Based on Contextual Reasoning

Recent work has shown that self-attention modules improve the performance of convolutional neural networks (CNNs); in these modules, global operations are conventionally used to generate descriptors from the feature context for attention calculation and feature recalibration. However, the performance gai...
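For context on the conventional approach the abstract describes (a global operation produces a context descriptor, which then drives attention calculation and feature recalibration), the following minimal PyTorch sketch shows a squeeze-and-excitation-style channel attention block. It is illustrative only and does not reproduce the paper's pyramid attention mechanism; the class name `GlobalContextAttention` and the reduction ratio are assumptions.

```python
import torch
import torch.nn as nn


class GlobalContextAttention(nn.Module):
    """Channel recalibration from a globally pooled descriptor
    (squeeze-and-excitation style); the reduction ratio is an assumed
    hyperparameter, not taken from the paper."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Global operation: collapse spatial context into one descriptor per channel.
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Attention calculation from the descriptor.
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        descriptor = self.pool(x).view(b, c)             # (B, C) context descriptor
        weights = self.fc(descriptor).view(b, c, 1, 1)   # per-channel attention
        return x * weights                               # feature recalibration


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)
    out = GlobalContextAttention(64)(feats)
    print(out.shape)  # torch.Size([2, 64, 32, 32])
```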


Bibliographic Details
Main Authors: Bohua Chen, Hanzhi Ma, Junjie He, Yinzhang Ding, Lianghao Wang, Dongxiao Li, Ming Zhang
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8854077/