PC-Match: Semi-Supervised Learning With Progressive Contrastive and Consistency Regularization

Bibliographic Details
Published in: IEEE Access
Main Authors: Mikyung Kang, Sooyon Seo, Moohong Min
Format: Article
Language: English
Published: IEEE 2025-01-01
Online Access: https://ieeexplore.ieee.org/document/10918676/
Description
Summary: As artificial intelligence has developed rapidly, deep learning models have been applied in various domains. While labeling is crucial for training models in fields that demand specific knowledge, producing such labeled datasets is expensive. Semi-supervised learning (SSL) has emerged as a potential solution to this problem, utilizing data without label information. However, conventional SSL methods focus solely on predicting data classes while minimizing classification loss, which can lead to ambiguous decision boundaries, especially for classes with similar characteristics. In this study, we propose enhancing the baseline SSL method by incorporating a contrastive loss. We emphasize that samples near the decision boundaries have a negative impact on the model’s performance. Our method addresses the issue of uncertain boundaries in the representation space by focusing on the unique characteristics of each class. We conducted experiments to validate the effectiveness of our proposed method using a limited number of labeled samples. The results demonstrate that our method effectively enhances performance, particularly in environments with limited labeled data, as evidenced by visual analysis.
ISSN: 2169-3536
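
The abstract describes combining a consistency-based SSL baseline with a contrastive term so that samples of the same class are drawn together in the representation space, sharpening otherwise ambiguous decision boundaries. The sketch below is a minimal illustration of how such a combined objective can be written in PyTorch, assuming a FixMatch-style baseline (weak/strong augmentation, confidence-masked pseudo-labels) and a supervised-contrastive term over confidently pseudo-labeled embeddings; the function name, parameters, and default values are illustrative assumptions, not the authors' released implementation.

# Illustrative sketch only: a consistency + contrastive SSL objective.
# All names (ssl_contrastive_loss, threshold, temperature, lambda_u, lambda_c)
# are hypothetical and not taken from the paper.
import torch
import torch.nn.functional as F


def ssl_contrastive_loss(logits_x, labels_x,
                         logits_w, logits_s, feats_s,
                         threshold=0.95, temperature=0.1,
                         lambda_u=1.0, lambda_c=1.0):
    """Combine supervised, consistency, and contrastive terms.

    logits_x : predictions on labeled samples
    labels_x : ground-truth labels of the labeled samples
    logits_w : predictions on weakly augmented unlabeled samples
    logits_s : predictions on strongly augmented unlabeled samples
    feats_s  : embeddings (e.g. projection-head outputs) of the strong views
    """
    # 1) Standard supervised cross-entropy on the labeled batch.
    loss_sup = F.cross_entropy(logits_x, labels_x)

    # 2) Consistency loss: pseudo-label from the weak view, train the strong
    #    view on it, keeping only confident predictions (FixMatch-style mask).
    probs_w = torch.softmax(logits_w.detach(), dim=-1)
    conf, pseudo = probs_w.max(dim=-1)
    mask = (conf >= threshold).float()
    loss_cons = (F.cross_entropy(logits_s, pseudo, reduction="none") * mask).mean()

    # 3) Contrastive loss on confident unlabeled samples: embeddings sharing
    #    a pseudo-label are positives, all other samples are negatives.
    keep = mask.bool()
    loss_con = feats_s.new_tensor(0.0)
    if keep.sum() > 1:
        z = F.normalize(feats_s[keep], dim=-1)
        y = pseudo[keep]
        sim = z @ z.t() / temperature                      # pairwise similarities
        sim = sim - sim.max(dim=1, keepdim=True).values.detach()  # stability
        same = (y.unsqueeze(0) == y.unsqueeze(1)).float()
        eye = torch.eye(len(y), device=z.device)
        same = same - eye                                  # drop self-pairs
        exp_sim = torch.exp(sim) * (1 - eye)
        log_prob = sim - torch.log(exp_sim.sum(dim=1, keepdim=True) + 1e-12)
        pos_count = same.sum(dim=1)
        valid = pos_count > 0
        if valid.any():
            loss_con = -((same * log_prob).sum(dim=1)[valid]
                         / pos_count[valid]).mean()

    return loss_sup + lambda_u * loss_cons + lambda_c * loss_con

In this sketch the contrastive term only acts on samples whose weak-view confidence exceeds the threshold, which is one plausible way to keep uncertain, near-boundary samples from pulling the representation in the wrong direction, in the spirit of the abstract's argument.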