Making Deep Neural Networks Robust to Label Noise: Cross-Training With a Novel Loss Function

Deep neural networks (DNNs) have achieved remarkable results on a variety of supervised learning tasks, owing to large amounts of well-labeled training data. However, as recent research has pointed out, the generalization performance of DNNs is likely to deteriorate sharply when the training data contains label noise. To address this problem, a novel loss function is proposed that guides DNNs to pay more attention to clean samples by adaptively weighting the traditional cross-entropy loss. Under the guidance of this loss function, a cross-training strategy is designed that leverages two synergistic DNN models, each of which both updates its own parameters and generates curriculums for the other. In addition, the paper proposes an online data filtration mechanism and integrates it into the final cross-training framework, which simultaneously optimizes the DNN models and filters out noisy samples. The proposed approach is evaluated through extensive experiments on several benchmark datasets with synthetic or real-world label noise, and the results demonstrate its robustness to different noise types and noise levels.

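The abstract does not spell out the weighting formula, but the general idea of adaptively re-weighting the cross-entropy loss toward clean samples can be sketched. The following is a minimal, hypothetical PyTorch sketch, not the paper's actual loss: it uses the model's own confidence in the given label as the per-sample weight, one common way to down-weight likely mislabeled samples. All names are illustrative.

    import torch
    import torch.nn.functional as F

    def weighted_cross_entropy(logits, labels):
        # Per-sample cross-entropy, kept unreduced so each sample
        # can receive its own weight.
        per_sample_ce = F.cross_entropy(logits, labels, reduction="none")
        with torch.no_grad():
            # Model confidence in the provided (possibly noisy) label.
            confidence = (
                F.softmax(logits, dim=1)
                .gather(1, labels.unsqueeze(1))
                .squeeze(1)
            )
        # Low-confidence (likely mislabeled) samples contribute less.
        return (confidence * per_sample_ce).mean()

Early in training such confidences are uninformative, so schemes like this are usually phased in after a warm-up period; that is a design choice of this sketch, not something stated in the abstract.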
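The cross-training strategy itself, in which each model "generates curriculums" for the other, is reminiscent of co-teaching-style small-loss selection. The hypothetical training step below assumes that interpretation; the selection ratio, the filtration rule, and all function names are assumptions, since the abstract gives no specifics.

    import torch
    import torch.nn.functional as F

    def cross_training_step(net_a, net_b, opt_a, opt_b, x, y, keep_ratio=0.7):
        # Score every sample with both networks (no gradients needed here).
        with torch.no_grad():
            losses_a = F.cross_entropy(net_a(x), y, reduction="none")
            losses_b = F.cross_entropy(net_b(x), y, reduction="none")
        k = max(1, int(keep_ratio * len(y)))
        # Each network nominates its small-loss (likely clean) samples
        # as a curriculum for its peer.
        idx_from_a = torch.argsort(losses_a)[:k]
        idx_from_b = torch.argsort(losses_b)[:k]
        # Update each network only on the samples selected by the other.
        opt_a.zero_grad()
        F.cross_entropy(net_a(x[idx_from_b]), y[idx_from_b]).backward()
        opt_a.step()
        opt_b.zero_grad()
        F.cross_entropy(net_b(x[idx_from_a]), y[idx_from_a]).backward()
        opt_b.step()
        # Assumed form of online data filtration: samples that both
        # networks assign a large loss are flagged so that later epochs
        # can drop them from the training pool.
        suspect = (losses_a > losses_a.median()) & (losses_b > losses_b.median())
        return suspect

Using two networks guards against a single model confirming its own mistakes: a sample must look suspicious to both before it is filtered, and each model trains only on samples its peer trusts.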

Bibliographic Details
Main Authors: Zhen Qin, Zhengwen Zhang, Yan Li, Jun Guo
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access, Vol. 7, pp. 130893-130902
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2940653
Subjects: Deep neural networks; label noise; cross-training; loss function; data filtration
Online Access: https://ieeexplore.ieee.org/document/8834773/
Author Affiliations:
Zhen Qin: School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing, China
Zhengwen Zhang: School of Information and Electronics, Beijing Institute of Technology, Beijing, China
Yan Li (ORCID: https://orcid.org/0000-0001-9562-9634): School of Information and Electronics, Beijing Institute of Technology, Beijing, China
Jun Guo: School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing, China