Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning

Activation functions are essential for deep learning methods to learn and perform complex tasks such as image classification. The Rectified Linear Unit (ReLU) has been widely used and has become the default activation function across the deep learning community since 2012. Although ReLU has been popular, however...
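As a rough illustration of the idea in the title, the sketch below shows one plausible form of a thresholded ReLU-Swish-like activation: the Swish product x * sigmoid(x) shifted by a threshold T for non-negative inputs, with negative inputs flattened to T. The function name flatten_t_swish, the exact formula, and the default threshold value used here are assumptions for illustration only; the article gives the authoritative formulation.

    import numpy as np

    def flatten_t_swish(x, T=-0.20):
        # Assumed form (not taken verbatim from the article):
        # apply the Swish-like product x * sigmoid(x) plus a threshold T
        # to non-negative inputs, and flatten negative inputs to T.
        swish = x / (1.0 + np.exp(-x))          # x * sigmoid(x)
        return np.where(x >= 0, swish + T, T)   # negatives map to T, not 0

    # Example: compare against ReLU on a few sample inputs
    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(flatten_t_swish(x))   # negative inputs give T instead of a hard zero
    print(np.maximum(x, 0.0))   # ReLU, for reference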


Bibliographic Details
Main Authors: Hock Hung Chieng, Noorhaniza Wahid, Ong Pauline, Sai Raj Kishore Perla
Format: Article
Language: English
Published: Universitas Ahmad Dahlan, 2018-07-01
Series: IJAIN (International Journal of Advances in Intelligent Informatics)
Online Access: http://ijain.org/index.php/IJAIN/article/view/249