1.
Dr. Jean-Pierre Berger. Knowledge Distillation - Methods and Implementations: Studying knowledge distillation methods for transferring knowledge from large, complex models to smaller, more efficient ones. African J. of Artificial Int. and Sust. Dev. [Internet]. 2022 Jun. 20 [cited 2024 Nov. 21];2(1):46-52. Available from: https://africansciencegroup.com/index.php/AJAISD/article/view/46