Similar repositories to lhyfst/knowledge-distillation-papers:
hanxiao/bert-as-service
danistefanovic/build-your-own-x
AberHu/Knowledge-Distillation-Zoo
dkozlov/awesome-knowledge-distillation
FLHonker/Awesome-Knowledge-Distillation
lenscloth/RKD
kon9chunkit/GitHub-Chinese-Top-Charts
peterliht/knowledge-distillation-pytorch
tuvtran/project-based-learning
qijiezhao/M2Det
memoiry/Awesome-model-compression-and-acceleration
yuanli2333/Teacher-free-Knowledge-Distillation
HobbitLong/RepDistiller
chester256/Model-Compression-Papers
imirzadeh/Teacher-Assistant-Knowledge-Distillation
MingSun-Tse/EfficientDNNs
meituan/YOLOv6
guan-yuan/awesome-AutoML-and-Lightweight-Models
clovaai/overhaul-distillation
lab-ml/annotated_deep_learning_paper_implementations
twangnh/Distilling-Object-Detectors
he-y/Awesome-Pruning
SsnL/dataset-distillation
huawei-noah/Data-Efficient-Model-Compression
cgnorthcutt/cleanlab
sun254/awesome-model-compression-and-acceleration
irfanICMLL/structure_knowledge_distillation
SforAiDl/KD_Lib
szagoruyko/attention-transfer
Eric-mingjie/rethinking-network-pruning
antspy/quantized_distillation
JiahuiYu/slimmable_networks
sseung0703/KD_methods_with_TF
yoshitomo-matsubara/torchdistill
bhheo/AB_distillation
megvii-research/mdistiller
NVIDIA/TRTorch
csyhhu/Awesome-Deep-Neural-Network-Compression
ericsun99/Shufflenet-v2-Pytorch