[1]
Jean-Pierre Berger. 2022. Knowledge Distillation – Methods and Implementations: Studying knowledge distillation methods for transferring knowledge from large, complex models to smaller, more efficient ones. African Journal of Artificial Intelligence and Sustainable Development. 2, 1 (Jun. 2022), 46–52.