1.
Dr. Jean-Pierre Berger. Knowledge Distillation - Methods and Implementations: Studying knowledge distillation methods for transferring knowledge from large, complex models to smaller, more efficient ones. African J. of Artificial Int. and Sust. Dev. 2022;2(1):46-52. Accessed July 3, 2024. https://africansciencegroup.com/index.php/AJAISD/article/view/46