Rectified Exponential Units for Convolutional Neural Networks

The rectified linear unit (ReLU) plays an important role in today's convolutional neural networks (CNNs). In this paper, we propose a novel activation function called the Rectified Exponential Unit (REU). Inspired by two recently proposed activation functions, the Exponential Linear Unit (ELU) and Swish, t...
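The truncated abstract references REU's definition without stating it here. As a rough illustration only, the sketch below assumes a piecewise form consistent with the paper's stated inspirations (ELU's exponential decay for negative inputs, ReLU's identity for positive inputs): the function name reu and this exact formula, x for x > 0 and x * exp(x) otherwise, are assumptions rather than details taken from this record; consult the full text at the Online Access link for the actual definition.

    import numpy as np

    def reu(x):
        # Assumed piecewise form: pass positive inputs through unchanged;
        # damp non-positive inputs with x * exp(x), which saturates
        # smoothly toward 0 as x -> -inf (ELU-like behavior).
        x = np.asarray(x, dtype=float)
        # Clamp the exponent's argument at 0 so the unused branch of
        # np.where cannot trigger overflow warnings for large positive x.
        return np.where(x > 0, x, x * np.exp(np.minimum(x, 0.0)))

    # Example:
    # reu(np.array([-2.0, -0.5, 0.0, 1.5]))
    # -> [-0.2707, -0.3033,  0.0,  1.5]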


Bibliographic Details
Main Authors: Yao Ying, Jianlin Su, Peng Shan, Ligang Miao, Xiaolian Wang, Silong Peng
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8762191/