Project author: junfeizhuang

Project description:
Simple PyTorch code for knowledge distillation
Language: Python
Project URL: git://github.com/junfeizhuang/Knowledge-distillation-example.git


Knowledge-distillation-example

Simple PyTorch code implementing several knowledge-distillation methods.

Support

Backbone

For KD (knowledge distillation) and AT (attention transfer), ResNet20 is the student network and ResNet56 is the teacher network.
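The repository's exact loss formulation is not shown here; the sketch below follows the standard KD objective (Hinton et al.) and AT objective (Zagoruyko & Komodakis). The temperature `T` and weight `alpha` are assumed hyperparameters, not values taken from this repo.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between temperature-softened
    # distributions, scaled by T^2 to keep gradient magnitudes stable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

def at_map(feature_map):
    # Spatial attention map: mean of squared activations over channels,
    # flattened and L2-normalized per sample.
    return F.normalize(feature_map.pow(2).mean(dim=1).flatten(1))

def at_loss(student_fm, teacher_fm):
    # Match the student's attention map to the teacher's at one stage;
    # in practice this is summed over several ResNet stages.
    return (at_map(student_fm) - at_map(teacher_fm)).pow(2).mean()
```

In training, `at_loss` is typically computed from intermediate feature maps of corresponding ResNet20/ResNet56 stages and added to the classification loss.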

For DML (deep mutual learning), both student networks are ResNet20.
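For DML, each peer network is trained with cross-entropy plus a KL term that pulls its predictions toward the other peer's distribution. A minimal sketch of this standard formulation (the detach on the peer's logits and the temperature `T` are assumptions, not details confirmed by this repo):

```python
import torch
import torch.nn.functional as F

def dml_loss(logits_a, logits_b, labels, T=1.0):
    # KL term for peer A: match A's distribution to B's (treated as a
    # fixed target for this term, hence the detach).
    kl_a = F.kl_div(
        F.log_softmax(logits_a / T, dim=1),
        F.softmax(logits_b.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Symmetric KL term for peer B.
    kl_b = F.kl_div(
        F.log_softmax(logits_b / T, dim=1),
        F.softmax(logits_a.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Each peer gets its own supervised loss plus its mimicry term;
    # the two losses are backpropagated into their respective networks.
    loss_a = F.cross_entropy(logits_a, labels) + kl_a
    loss_b = F.cross_entropy(logits_b, labels) + kl_b
    return loss_a, loss_b
```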

Train

  1. python train.py -m student -gpu 1

Dataset

Cifar10

Result

Metric   Raw ResNet20   Raw ResNet56   KD       AT       DML
Top-1    91.030         92.257         91.723   91.822   91.574