GitHub 中文社区 (GitHub Chinese Community)

©2025 GitHub中文社区

# pretraining
LlamaFamily / Llama-Chinese

#LLM# Llama Chinese community: aggregates the latest Llama learning resources in real time to build the best open-source Chinese Llama LLM ecosystem; fully open source and commercially usable

Tags: llama, large-language-model, pretraining, agent, llama4, rl
Python · 14.61k stars
2 months ago
microsoft / LMOps

#NLP# General technology for enabling AI capabilities with LLMs and MLLMs

Tags: nlp, agi, gpt, large-language-model, lm, pretraining, prompt, lmops, promptist, x-prompt, language-model
Python · 4.02k stars
17 days ago
OFA-Sys / OFA

Official repository of OFA (ICML 2022). Paper: "OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework"

Tags: multimodal, pretraining, image-captioning, text-to-image-synthesis, visual-question-answering, referring-expression-comprehension, vision-language, pretrained-models, prompt, prompt-tuning, chinese
Python · 2.5k stars
1 year ago
X-PLUG / mPLUG-Owl

#LLM# mPLUG-Owl: The Powerful Multi-modal Large Language Model Family

Tags: chatbot, ChatGPT, large-language-models, llama, multimodal, damo, mplug, instruction-tuning, pretraining, mplug-owl, huggingface, PyTorch, transformer, alpaca, visual-recognition, gpt, gpt4, gpt4-api, dialogue, video
Python · 2.48k stars
2 months ago
ChandlerBang / awesome-self-supervised-gnn

#Computer Science# Papers about pretraining and self-supervised learning on Graph Neural Networks (GNNs)

Tags: graph-neural-networks, pretraining, self-supervised-learning, deep-learning, machine-learning, pre-training
Python · 1.68k stars
1 year ago
keyu-tian / SparK

#Computer Science# [ICLR'23 Spotlight🔥] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch impl. of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling...

Tags: bert, convnet, convolutional-neural-networks, masked-image-modeling, pre-trained-model, self-supervised-learning, sparse-convolution, ssl, cnn, iclr, deep-learning, object-detection, PyTorch, instance-segmentation, mask-rcnn, pretrain, pretraining
Python · 1.35k stars
1 year ago
yuewang-cuhk / awesome-vision-language-pretraining-papers

Recent Advances in Vision and Language Pre-Trained Models (VL-PTMs)

Tags: vision-and-language, pretraining, multimodal-deep-learning, bert
1.15k stars
3 years ago
qqlu / Entity

#Computer Science# EntitySeg Toolbox: Towards Open-World and High-Quality Image Segmentation

Tags: image-segmentation, segmentation, PyTorch, instance-segmentation, panoptic-segmentation, semantic-segmentation, object-detection, fcos, condinst, detectron2, pretrained-weights, pretrained-models, computer-vision, deep-learning, cnn, pretraining
Jupyter Notebook · 1.02k stars
2 years ago
YehLi / xmodaler

X-modaler is a versatile and high-performance codebase for cross-modal analytics (e.g., image captioning, video captioning, vision-language pre-training, visual question answering, visual commonsense r...

Tags: image-captioning, video-captioning, vision-and-language, pretraining, cross-modal-retrieval, visual-question-answering, tden
Python · 969 stars
2 years ago
deepmodeling / Uni-Mol

#Computer Science# Official Repository for the Uni-Mol Series Methods

Tags: pre-trained-model, pretraining, deep-learning
Python · 879 stars
18 days ago
PKU-YuanGroup / LanguageBind

[ICLR 2024🔥] Extending Video-Language Pretraining to N-modality by Language-based Semantic Alignment

Tags: multi-modal, pretraining, zero-shot
Python · 815 stars
1 year ago
seal-rg / recurrent-pretraining

Pretraining code for a large-scale depth-recurrent language model

Tags: large-language-model, pretraining, reasoning
Python · 778 stars
17 days ago
Alibaba-MIIL / ImageNet21K

Official PyTorch implementation of the paper "ImageNet-21K Pretraining for the Masses" (NeurIPS 2021)

Tags: pretraining, multi-label-classification, vision-transformer, mixer
Python · 766 stars
2 years ago
zubair-irshad / Awesome-Robotics-3D

#LLM# A curated list of 3D vision papers related to the robotics domain in the era of large models (LLMs/VLMs), inspired by awesome-computer-vision; includes papers, code, and related websites

Tags: 3d, benchmarks, computer-vision, gaussian-splatting, large-language-model, manipulation, nerf, policy-learning, pretraining, robotics, scene-graph, simulation, vision-language-model, vlm, diffusion-models, foundation-models, navigation
710 stars
7 months ago
AGI-Arena / MARS

The official implementation of "MARS: Unleashing the Power of Variance Reduction for Training Large Models"

Tags: fine-tuning, large-language-models, optimization-algorithms, optimizer, pretraining
Python · 702 stars
6 days ago
cxcscmu / Craw4LLM

#Web Crawler# Official repository for "Craw4LLM: Efficient Web Crawling for LLM Pretraining"

Tags: crawler, crawling, large-language-models, pre-training, pretraining, web-crawler, web-crawling
Python · 626 stars
4 months ago
PITI-Synthesis / PITI

PITI: Pretraining is All You Need for Image-to-Image Translation

Tags: computer-vision, image-generation, image-synthesis, image-to-image-translation, pretraining
Python · 499 stars
1 year ago
PaddlePaddle / PaddleFleetX

PaddlePaddle large-model development suite: end-to-end toolchains for large language models, cross-modal large models, biocomputing large models, and more

Tags: paddlepaddle, benchmark, large-scale, model-parallelism, data-parallelism, pipeline-parallelism, cloud, elastic, lightning, pretraining, self-supervised-learning, unsupervised-learning
Python · 470 stars
1 year ago
westlake-repl / SaProt

SaProt: Protein Language Model with Structural Alphabet (AA+3Di)

Tags: alphafold2, pretraining, protein, protein-structure, representation-learning
Python · 461 stars
1 month ago
Coobiw / MPP-LLaVA

Personal project: MPP-Qwen14B & MPP-Qwen-Next (Multimodal Pipeline Parallel based on Qwen-LM). Supports [video/image/multi-image] {sft/conversations}. Don't let poverty limit your imagination! Train...

Tags: multimodal-large-language-models, deepspeed, pipeline-parallelism, mllm, qwen, fine-tuning, pretraining
Jupyter Notebook · 452 stars
3 months ago