Selecting Images With Entropy for Frugal Knowledge Distillation


Bibliographic Details
Published in: IEEE Access
Main Authors: Michail Kinnas, John Violos, Nikolaos Ioannis Karapiperis, Ioannis Kompatsiaris
Format: Article
Language: English
Published: IEEE 2025-01-01
Online Access: https://ieeexplore.ieee.org/document/10878975/
Description
Summary: Frugal knowledge distillation is becoming increasingly important as it enables the distillation process to function effectively in resource-constrained environments. A key aspect of achieving this efficiency lies in minimizing the amount of training data required. To address this, we propose an entropy-based data selection method that identifies smaller subsets from the original dataset, focusing on images that retain the highest informational content. We explore the effectiveness of the entropy-based method in combination with five different image representations to determine the subsets most effective for transferring knowledge to the student model. Our experimental evaluation on benchmark datasets, including CIFAR-10, MNIST, and FashionMNIST, shows that our approach outperforms other state-of-the-art image selection methods in most scenarios. It achieves over 3% higher accuracy compared to random selection methods while maintaining similar knowledge distillation time and energy efficiency.
ISSN: 2169-3536
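The core idea in the abstract, ranking images by the informational content of their pixel distribution and keeping only the top-scoring subset, can be illustrated with a minimal sketch. This is not the authors' implementation: the paper pairs entropy with five image representations that are not detailed here, so the sketch below assumes the simplest case of Shannon entropy over a grayscale intensity histogram and a hypothetical `select_top_entropy` helper.

```python
import numpy as np

def image_entropy(img: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (in bits) of an image's pixel-intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins to avoid log(0)
    return float(-(p * np.log2(p)).sum())

def select_top_entropy(images: np.ndarray, k: int) -> np.ndarray:
    """Return indices of the k images with the highest entropy."""
    scores = np.array([image_entropy(img) for img in images])
    return np.argsort(scores)[::-1][:k]

# Demo: a flat image (zero entropy) vs. uniform random noise (high entropy).
rng = np.random.default_rng(0)
flat = np.full((28, 28), 128, dtype=np.uint8)
noisy = rng.integers(0, 256, size=(28, 28), dtype=np.uint8)
idx = select_top_entropy(np.stack([flat, noisy]), k=1)
print(idx)  # the noisy image ranks first
```

The selected subset would then replace the full training set when distilling into the student model; the abstract reports that such entropy-ranked subsets beat random subsets of the same size by over 3% accuracy.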