GitHub 中文社区

©2025 GitHub中文社区论坛


Topic: knowledge-distillation

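The repositories below implement many variants of knowledge distillation. The common core is the classic softened-softmax formulation: the student is trained to match the teacher's temperature-softened output distribution. A minimal sketch (function names are illustrative, pure Python for clarity):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields a softer distribution."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    """Distillation loss KL(teacher_soft || student_soft), scaled by T^2
    so gradient magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # soft predictions from the student
    return T * T * sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))

# Identical logits give zero loss; a mismatch gives a positive loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))      # 0.0
print(kd_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1]) > 0)  # True
```

In practice this term is combined with the ordinary cross-entropy on hard labels, and frameworks such as torchdistill or mmrazor (listed below) wrap this idea with many refinements.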
PaddlePaddle / PaddleClas

PaddleClas is an image recognition toolkit for industry and academia.

Tags: image-classification, knowledge-distillation, autoaugment, randaugment, gridmask, cutmix, deit, repvgg, swin-transformer, pretrained-models, image-recognition
Python · 5.67k stars · updated 1 month ago
dkozlov / awesome-knowledge-distillation

Awesome Knowledge Distillation

Tags: knowledge-distillation, teacher-student, distillation, model-compression, deep-learning
3.68k stars · updated 5 days ago
huawei-noah / Pretrained-Language-Model

Pretrained language models and related optimization techniques developed by Huawei Noah's Ark Lab.

Tags: knowledge-distillation, model-compression, quantization, pretrained-models
Python · 3.1k stars · updated 1 year ago
jingyi0000 / VLM_survey

Collection of awesome vision-language models for vision tasks

Tags: computer-vision, deep-learning, knowledge-distillation, survey, transfer-learning, vision-language-model, clip
2.77k stars · updated 22 days ago
IDEA-Research / DWPose

"Effective Whole-body Pose Estimation with Two-stages Distillation" (ICCV 2023, CV4Metaverse Workshop)

Tags: controlnet, knowledge-distillation, pose-estimation, stable-diffusion-webui
Python · 2.47k stars · updated 2 years ago
intel / neural-compressor

SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) and sparsity; leading model compression techniques for TensorFlow, PyTorch, and ONNX Runtime

Tags: low-precision, pruning, sparsity, auto-tuning, knowledge-distillation, quantization, quantization-aware-training, post-training-quantization, smoothquant, large-language-models, gptq, int8
Python · 2.43k stars · updated 2 days ago
alibaba / EasyNLP

EasyNLP: A Comprehensive and Easy-to-use NLP Toolkit

Tags: transformers, bert, nlp, pretrained-models, deep-learning, pytorch, fewshot-learning, knowledge-distillation, knowledge-pretraining, text-image-retrieval, text-to-image-synthesis, machine-learning, text-classification, transfer-learning
Python · 2.14k stars · updated 7 months ago
haitongli / knowledge-distillation-pytorch

A PyTorch implementation for flexibly exploring deep and shallow knowledge distillation (KD) experiments

Tags: pytorch, knowledge-distillation, deep-neural-networks, cifar10, model-compression, computer-vision
Python · 1.94k stars · updated 2 years ago
microsoft / Cream

A collection of Microsoft's NAS and Vision Transformer work.

Tags: nas, automl, vision-transformer, rpe, vit-compression, efficiency, knowledge-distillation
Python · 1.77k stars · updated 1 year ago
horseee / Awesome-Efficient-LLM

A curated list for Efficient Large Language Models

Tags: compression, knowledge-distillation, language-model, llm, model-quantization, efficient-llm
Python · 1.71k stars · updated 6 days ago
AberHu / Knowledge-Distillation-Zoo

PyTorch implementations of various Knowledge Distillation (KD) methods.

Tags: knowledge-distillation, teacher-student, model-compression, distillation
Python · 1.7k stars · updated 4 years ago
open-mmlab / mmrazor

OpenMMLab Model Compression Toolbox and Benchmark.

Tags: nas, pruning, knowledge-distillation, spos, darts, autoslim, detection, segmentation, classification, pytorch, quantization
Python · 1.6k stars · updated 1 year ago
yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. PyTorch Ecosystem. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented.

Tags: knowledge-distillation, pytorch, image-classification, imagenet, object-detection, coco, semantic-segmentation, cifar10, cifar100, colab-notebook, google-colab, pascal-voc, nlp, transformer, glue, text-classification
Python · 1.52k stars · updated 1 month ago
microsoft / NeuronBlocks

NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego

Tags: question-answering, deep-learning, pytorch, nlp, text-classification, artificial-intelligence, dnn, qna, text-matching, knowledge-distillation, model-compression, sequence-labeling
Python · 1.45k stars · updated 2 years ago
szagoruyko / attention-transfer

Improving Convolutional Networks via Attention Transfer (ICLR 2017)

Tags: pytorch, knowledge-distillation, attention, deep-learning
Jupyter Notebook · 1.45k stars · updated 7 years ago
lxztju / pytorch_classification

A complete PyTorch image-classification codebase: training, prediction, TTA, model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.

Tags: pytorch, image-classification, deployment, svm, knn, cnn, densenet, resnext, resnet, flask, random-forest, knowledge-distillation
Jupyter Notebook · 1.43k stars · updated 2 years ago
huawei-noah / Efficient-Computing

Efficient computing methods developed by Huawei Noah's Ark Lab

Tags: knowledge-distillation, model-compression, binary-neural-networks, pruning, quantization, self-supervised
Jupyter Notebook · 1.28k stars · updated 7 months ago
Tebmer / Awesome-Knowledge-Distillation-of-LLMs

This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vert...

Tags: data-augmentation, instruction-following, knowledge-distillation, llm, survey, compression, feedback, multi-modal, alignment
1.06k stars · updated 3 months ago
MingSun-Tse / Efficient-Deep-Learning

Collection of recent methods on (deep) neural network compression and acceleration.

Tags: model-compression, network-pruning, knowledge-distillation, deep-learning, deep-neural-networks, efficient-deep-learning
950 stars · updated 2 months ago
alibaba / EasyTransfer

EasyTransfer is designed to make the development of transfer learning in NLP applications easier.

Tags: bert, nlp-applications, knowledge-distillation, transfer-learning
Python · 861 stars · updated 3 years ago