Pytorch implementation of various Knowledge Distillation (KD) methods.
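As background for the distillation repositories in this list, here is a minimal sketch of the classic soft-target KD loss (Hinton et al., 2015) in PyTorch. The temperature `T` and mixing weight `alpha` are illustrative defaults, not values taken from any of the listed implementations.

```python
# Minimal sketch of the soft-target KD loss (Hinton et al., 2015).
# `student_logits` and `teacher_logits` are raw class scores of shape
# (batch, num_classes); T and alpha are illustrative hyperparameters.
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soften both distributions with temperature T and compare with KL divergence.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude matches the unsoftened loss
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```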
[ICLR 2025] LLaVA-MoD: Making LLaVA Tiny via MoE-Knowledge Distillation
[ECCV 2022] Factorizing Knowledge in Neural Networks
Training ImageNet / CIFAR models with state-of-the-art training strategies and techniques such as ViT, KD, Rep, etc.
Official implementation of the paper "Masked Distillation with Receptive Tokens" (ICLR 2023).
Matching Guided Distillation (ECCV 2020)
Rotated Localization Distillation (CVPR 2022, TPAMI 2023)
A simple script to convert Agilent 845x Chemstation UV-Vis files (.KD or .SD formats) to .csv format. Fast and easy!
Visualizing the Equilibrium and Kinetics of Protein-Ligand Binding and Competitive Binding
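In this entry, Kd is the equilibrium dissociation constant rather than knowledge distillation. Its standard definition for a binding reaction P + L ⇌ PL (general chemistry, not taken from that repository) is

$$ K_d = \frac{[P]\,[L]}{[PL]} = \frac{k_{\mathrm{off}}}{k_{\mathrm{on}}}, $$

where a smaller K_d indicates tighter binding.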
This project implements knowledge distillation from DINOv2 (Vision Transformer) to convolutional networks, enabling efficient visual representation learning with reduced computational requirements.
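A minimal sketch of one generic way to set up such ViT-to-CNN feature distillation: project the student's feature map to the teacher's embedding dimension and match tokens with an MSE loss. The module names, shapes, and the 1×1-projection-plus-MSE choice are illustrative assumptions, not necessarily what this repository does.

```python
# Generic feature-map distillation from a frozen ViT teacher to a CNN student.
# Shapes and module interfaces below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistiller(nn.Module):
    def __init__(self, student: nn.Module, teacher: nn.Module,
                 student_dim: int, teacher_dim: int):
        super().__init__()
        self.student = student
        self.teacher = teacher.eval()              # teacher is kept frozen
        for p in self.teacher.parameters():
            p.requires_grad_(False)
        # 1x1 conv projects student channels to the teacher embedding dimension.
        self.proj = nn.Conv2d(student_dim, teacher_dim, kernel_size=1)

    def forward(self, images):
        with torch.no_grad():
            # Assumption: the teacher returns patch tokens of shape (B, N, teacher_dim).
            t_tokens = self.teacher(images)
        s_feat = self.proj(self.student(images))   # (B, teacher_dim, H, W)
        B, C, H, W = s_feat.shape
        s_tokens = s_feat.flatten(2).transpose(1, 2)            # (B, H*W, teacher_dim)
        # Resample teacher tokens to the student's token count if they differ.
        t_tokens = F.interpolate(t_tokens.transpose(1, 2), size=H * W,
                                 mode="linear").transpose(1, 2)  # (B, H*W, teacher_dim)
        return F.mse_loss(s_tokens, t_tokens)
```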
A script that fetches and plots the average team KD ratio across a player's games.