(1) Berger, J.-P. Knowledge Distillation - Methods and Implementations: Studying Knowledge Distillation Methods for Transferring Knowledge from Large, Complex Models to Smaller, More Efficient Ones. African Journal of Artificial Intelligence and Sustainable Development 2022, 2 (1), 46-52.