(Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression (a minimal sketch of this kind of weighted loss appears after this list)
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", https://arxiv.org/abs/1905.08094
Self-Distillation and Knowledge Distillation Experiments with PyTorch.
A minimalist unofficial implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization"
Official implementation of Self-Distillation for Gaussian Processes
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
Bayesian Optimization Meets Self-Distillation, ICCV 2023
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO); an EMA-teacher update sketch appears after this list
Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
Deep Hash Distillation for Image Retrieval - ECCV 2022
Self-supervised learning through self-distillation with no labels (DINO), using Vision Transformers on the PCAM dataset.
A simple and efficient implementation of Self-Supervised Learning from Images with a Joint-Embedding Predictive Architecture (I-JEPA)
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
Enhancing LLMs with LoRAs
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
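
Several entries above (the weighted ground-truth targets repository, "Be Your Own Teacher", and the generic PyTorch distillation experiments) share the same basic training signal: the student is optimized against a weighted combination of the hard ground-truth labels and the soft, temperature-scaled predictions of a teacher that is the model itself or one of its own branches. The sketch below is a minimal PyTorch illustration of that loss under assumed names and values (`student_logits`, `teacher_logits`, `alpha`, `T`); it is not the exact recipe of any listed repository.

```python
import torch
import torch.nn.functional as F

def self_distillation_loss(student_logits, teacher_logits, labels, alpha=0.7, T=4.0):
    """Weighted combination of hard-label cross-entropy and soft-target distillation.

    alpha, T, and all tensor names are illustrative assumptions, not values
    taken from any of the repositories listed above.
    """
    # Hard ground-truth term.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-target term: KL divergence to the (detached) teacher at temperature T.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # alpha weights the ground-truth labels; (1 - alpha) weights the distilled targets.
    return alpha * ce + (1 - alpha) * kd

# Dummy usage on a random batch of 8 samples and 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
self_distillation_loss(student_logits, teacher_logits, labels).backward()
```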
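For the label-free (DINO-style) entries, the teacher is not a separate pretrained network but an exponential moving average (EMA) of the student's own weights. Below is a minimal sketch of that update, assuming the teacher starts as a deep copy of the student; the function name and momentum value are illustrative assumptions, not taken from the listed implementations.

```python
import copy
import torch
import torch.nn as nn

@torch.no_grad()
def update_ema_teacher(student: nn.Module, teacher: nn.Module, momentum: float = 0.996):
    # Teacher parameters track an exponential moving average of the student's,
    # so the student distils knowledge from a slowly evolving copy of itself.
    for p_s, p_t in zip(student.parameters(), teacher.parameters()):
        p_t.mul_(momentum).add_(p_s, alpha=1.0 - momentum)

# Example setup: the teacher begins as a frozen copy of the student.
student = nn.Linear(16, 8)
teacher = copy.deepcopy(student)
for p in teacher.parameters():
    p.requires_grad_(False)

update_ema_teacher(student, teacher)  # call once per training step
```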