Deep Self-Supervised Diversity Promoting Learning on Hierarchical Hyperspheres for Regularization

Bibliographic Details
Published in: IEEE Access
Main Authors: Youngsung Kim, Yoonsuk Hyun, Jae-Joon Han, Eunho Yang, Sung Ju Hwang, Jinwoo Shin
Format: Article
Language: English
Published: IEEE 2023-01-01
Online Access: https://ieeexplore.ieee.org/document/10373009/
Description
Summary: In this paper, we propose a novel approach to enhance the generalization performance of deep neural networks. Our method employs a hierarchical hypersphere-based constraint that organizes weight vectors hierarchically based on observed data. By diversifying the parameter space of hyperplanes in the classification layer, we aim to encourage discriminative generalization. We introduce a self-supervised grouping method designed to unveil hierarchical structures in scenarios with unknown hierarchy information. To maximize distances between weight vectors on multiple hyperspheres, we propose a novel metric that combines discrete and continuous measures. This regularization encourages diverse orientations, consequently leading to improved generalization. Extensive evaluations on datasets including CUB200-2011, Stanford-Cars, CIFAR-100, and TinyImageNet consistently demonstrate enhancements in classification performance compared to baseline settings.
ISSN: 2169-3536
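
The record above only summarizes the method, so the following is a minimal, hypothetical sketch (Python/PyTorch) of the general idea behind diversity-promoting regularization on a hypersphere: the classification-layer weight vectors are projected onto the unit sphere and their pairwise cosine similarities are penalized so that the class hyperplanes spread apart. The paper's actual regularizer operates on hierarchical, self-supervised groups of hyperspheres and combines discrete and continuous measures, which this sketch does not reproduce; the function name hypersphere_diversity_penalty and the model.fc attribute in the usage comment are illustrative assumptions, not the authors' code.

import torch
import torch.nn.functional as F

def hypersphere_diversity_penalty(weight: torch.Tensor) -> torch.Tensor:
    # weight: (num_classes, feature_dim) matrix of the final classification layer.
    # Project each class weight vector onto the unit hypersphere.
    w = F.normalize(weight, p=2, dim=1)
    # Pairwise cosine similarities between the normalized weight vectors.
    gram = w @ w.t()
    # Zero out the diagonal (self-similarity); penalizing the remaining entries
    # pushes the weight vectors toward diverse orientations on the sphere.
    off_diag = gram - torch.eye(gram.size(0), device=gram.device, dtype=gram.dtype)
    return off_diag.pow(2).sum() / (gram.size(0) * (gram.size(0) - 1))

# Usage: add the penalty to the task loss with a small coefficient, e.g.
#   logits = model(x)
#   loss = F.cross_entropy(logits, y) + 0.1 * hypersphere_diversity_penalty(model.fc.weight)

In practice such a penalty is added to the classification loss with a small weighting coefficient; the hierarchical version described in the abstract would apply an analogous diversity term within and across the self-supervised groups of weight vectors.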