Reverse Knowledge Distillation: Training a Large Model using a Small One for Retinal Image Matching on Limited Data
Published in WACV, 2023
We propose a novel approach based on reverse knowledge distillation for retinal image registration, which trains large models on limited data while preventing overfitting.