Here are 18 public repositories matching this topic.
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
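As background for the survey's taxonomy, a minimal sketch of the black-box pipeline it describes: knowledge elicitation (sampling a teacher's outputs for seed prompts) followed by a distillation algorithm (here, plain supervised fine-tuning on those outputs). `teacher_generate` and the Hugging-Face-style `student`/`tokenizer` are hypothetical stand-ins, not the survey's code.

```python
# A hedged sketch of black-box, sequence-level KD for LLMs. The teacher is
# treated as an opaque text generator; the student is an HF-style causal LM.
def distill_step(student, tokenizer, prompt, teacher_generate):
    response = teacher_generate(prompt)               # knowledge elicitation
    batch = tokenizer(prompt + response, return_tensors="pt")
    # Distillation algorithm: supervised fine-tuning on the elicited output.
    out = student(**batch, labels=batch["input_ids"])
    return out.loss
```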
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" (https://arxiv.org/abs/1905.08094).
Updated Jan 29, 2022 · Python
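A minimal sketch of the paper's core objective, assuming a network that exposes logits from several intermediate exits plus the deepest exit (the paper's additional feature-hint L2 term is omitted for brevity). The function names and default weights are illustrative, not this repo's API.

```python
import torch
import torch.nn.functional as F

def self_distillation_loss(exit_logits, final_logits, targets, T=3.0, alpha=0.3):
    """exit_logits: list of [B, C] logits from shallow classifiers;
    final_logits: [B, C] logits from the deepest classifier (the "teacher")."""
    # The deepest classifier is trained on ground truth only.
    loss = F.cross_entropy(final_logits, targets)
    soft_teacher = F.softmax(final_logits.detach() / T, dim=1)
    for logits in exit_logits:
        # Each shallow exit mixes hard-label CE with KL to the deepest exit.
        ce = F.cross_entropy(logits, targets)
        kd = F.kl_div(F.log_softmax(logits / T, dim=1), soft_teacher,
                      reduction="batchmean") * T * T
        loss = loss + (1 - alpha) * ce + alpha * kd
    return loss
```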
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
Updated Nov 2, 2024 · Shell
Deep Hash Distillation for Image Retrieval - ECCV 2022
Updated Jul 16, 2024 · Python
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
Updated Oct 12, 2021 · Jupyter Notebook
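A minimal sketch of self-distillation with weighted ground-truth targets for the kernel ridge regression case: each round refits on a convex mix of the original labels and the previous model's predictions. The mixing weight and RBF kernel are illustrative choices, not taken from the repository.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def self_distill_krr(X, y, rounds=3, alpha=0.5, reg=1e-2):
    targets = y.astype(float)
    model = None
    for _ in range(rounds):
        model = KernelRidge(alpha=reg, kernel="rbf").fit(X, targets)
        # Next-round targets: weighted mix of ground truth and own predictions.
        targets = alpha * y + (1 - alpha) * model.predict(X)
    return model
```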
(Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)
Updated May 12, 2021 · Python
Self-Distillation and Knowledge Distillation Experiments with PyTorch.
Updated Feb 9, 2022 · Python
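For context, the standard knowledge-distillation objective (Hinton et al., 2015) that experiments like these typically build on: a weighted sum of hard-label cross-entropy and temperature-softened KL to a teacher. The temperature and weighting below are common defaults, not this repo's settings.

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    # Soft term: KL between temperature-softened student and teacher outputs,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * T * T
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```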
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
Updated Dec 17, 2023 · Python
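A minimal sketch of DINO's core update: the student matches the teacher's centered, sharpened output distribution, and the teacher is an exponential moving average of the student. The temperatures and momentum are typical defaults, not necessarily this repo's values.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, m=0.996):
    # Teacher weights track the student via exponential moving average.
    for pt, ps in zip(teacher.parameters(), student.parameters()):
        pt.mul_(m).add_(ps, alpha=1 - m)

def dino_loss(student_out, teacher_out, center, t_s=0.1, t_t=0.04):
    # Teacher targets are centered (to prevent collapse) and sharpened.
    targets = F.softmax((teacher_out.detach() - center) / t_t, dim=1)
    return -(targets * F.log_softmax(student_out / t_s, dim=1)).sum(dim=1).mean()
```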
Bayesian Optimization Meets Self-Distillation, ICCV 2023
Updated Aug 28, 2023 · Python
A minimalist unofficial implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization"
Updated Apr 6, 2022 · Python
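A minimal sketch of the idea behind the paper: soft targets produced in the previous iteration for the same samples regularize the current one. Here a cache keyed by sample index stands in for the paper's overlapping-batch sampler; all names and weights are illustrative.

```python
import torch
import torch.nn.functional as F

soft_label_cache = {}  # sample index -> softened prediction from the last pass

def dlb_loss(model, x, y, idx, T=3.0, beta=1.0):
    logits = model(x)
    loss = F.cross_entropy(logits, y)
    cached = [soft_label_cache.get(int(i)) for i in idx]
    if all(c is not None for c in cached):
        # Consistency with last iteration's softened predictions on these samples.
        prev = torch.stack(cached)
        loss = loss + beta * T * T * F.kl_div(
            F.log_softmax(logits / T, dim=1), prev, reduction="batchmean")
    for i, l in zip(idx, logits.detach()):
        soft_label_cache[int(i)] = F.softmax(l / T, dim=0)
    return loss
```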
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
Updated Dec 23, 2024 · Python
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
Updated Jul 1, 2023 · Python
Official implementation of Self-Distillation for Gaussian Processes
Updated May 16, 2023 · Python
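A minimal sketch of data-centric self-distillation for GP regression: refit the GP on its own posterior-mean predictions at the training inputs. The kernel, noise level, and toy data are illustrative, not the paper's exact setup.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.random.rand(50, 1)
y = np.sin(6 * X).ravel() + 0.1 * np.random.randn(50)

gp = GaussianProcessRegressor(kernel=RBF(), alpha=1e-2).fit(X, y)
for _ in range(3):
    # Each round: replace the targets with the current posterior mean.
    y = gp.predict(X)
    gp = GaussianProcessRegressor(kernel=RBF(), alpha=1e-2).fit(X, y)
```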
A simple and efficient implementation of Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture (I-JEPA)
Updated Aug 16, 2024 · Python
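A minimal sketch of the I-JEPA objective: a predictor maps context-patch representations to the representations an EMA target encoder assigns to masked target patches, with the loss computed in representation space rather than pixel space. The encoder/predictor modules and mask indices are placeholders.

```python
import torch
import torch.nn.functional as F

def ijepa_loss(context_encoder, target_encoder, predictor, patches, ctx_idx, tgt_idx):
    with torch.no_grad():
        # Target representations come from the EMA encoder over the full image.
        targets = target_encoder(patches)[:, tgt_idx]
    ctx = context_encoder(patches[:, ctx_idx])           # encode visible context only
    preds = predictor(ctx, tgt_idx)                      # predict target-block features
    return F.smooth_l1_loss(preds, targets)
```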
SEED: A Transformers-Based Autoencoder Enhanced by Masking and Self-Distillation for Business Process Anomaly Detection
Updated Mar 21, 2024 · Python
Self-distillation with no labels
Updated Jan 27, 2025 · Jupyter Notebook
Self-supervised learning through self-distillation with no labels (DINO) with Vision Transformers on the PCAM dataset.
Updated Jan 20, 2025 · Python