Large-Margin Regularized Softmax Cross-Entropy Loss

Softmax cross-entropy loss with L2 regularization is widely adopted in the machine learning and neural network communities. However, the traditional softmax cross-entropy loss focuses only on fitting or classifying the training data accurately and does not explicitly encourage a large decision margin between classes.
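The abstract contrasts plain softmax cross-entropy with a variant that enforces a decision margin. As a rough illustration only (the paper's exact formulation is not given here), the sketch below implements a generic additive-margin softmax cross-entropy with L2 weight regularization: the true-class logit is reduced by a fixed margin, so the classifier must beat the other classes by at least that margin to achieve the same loss. The function name and the additive-margin form are assumptions for illustration, not the authors' method.

```python
import numpy as np

def margin_softmax_ce(logits, labels, weights, margin=0.5, l2=1e-4):
    """Generic large-margin softmax cross-entropy sketch (an assumption,
    not necessarily the paper's formulation).

    logits:  (N, C) raw class scores
    labels:  (N,)   integer true-class indices
    weights: model weights for the L2 regularization term
    """
    z = logits.astype(float).copy()
    # Penalize the true-class logit by `margin`, so the correct class must
    # outscore the others by at least that amount to reach the same loss.
    rows = np.arange(len(labels))
    z[rows, labels] -= margin
    # Numerically stable log-softmax.
    z -= z.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    ce = -log_probs[rows, labels].mean()
    # Standard L2 regularization on the weights.
    return ce + l2 * np.sum(weights ** 2)
```

With `margin=0` and `l2=0` this reduces to the ordinary softmax cross-entropy; a positive margin strictly increases the loss whenever predictions are not already separated by that margin.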

Bibliographic Details
Main Authors: Xiaoxu Li, Dongliang Chang, Tao Tian, Jie Cao
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8635450/