GitHub Chinese Community

©2025 GitHub Chinese Community Forum


multihead-attention

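Every repository under this topic implements some variant of the same core computation. As a reference point only (not taken from any of the listed projects), here is a minimal NumPy sketch of multi-head scaled dot-product attention; the function name and the toy weight matrices are hypothetical:

```python
import numpy as np

def multihead_attention(x, num_heads, Wq, Wk, Wv, Wo):
    """Minimal (batchless) multi-head scaled dot-product attention."""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads

    # Project input and split into heads: (num_heads, seq_len, d_head)
    def project(W):
        return (x @ W).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    q, k, v = project(Wq), project(Wk), project(Wv)

    # Scaled dot-product scores, then softmax over the key axis
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)

    out = weights @ v                                       # (num_heads, seq_len, d_head)
    out = out.transpose(1, 0, 2).reshape(seq_len, d_model)  # concatenate heads
    return out @ Wo                                         # final output projection

rng = np.random.default_rng(0)
d_model, heads, seq = 8, 2, 5
Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(4))
y = multihead_attention(rng.standard_normal((seq, d_model)), heads, Wq, Wk, Wv, Wo)
print(y.shape)  # → (5, 8)
```

The "efficient attention" projects in this list (reformer, longformer, linformer, flash-attention) all approximate or restructure the quadratic `scores` matrix above, which is the cost bottleneck at long sequence lengths.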
Separius / awesome-fast-attention

A list of efficient attention modules.

transformer · attention · Awesome Lists · reformer · longformer · linformer · multihead-attention · self-attention · attention-is-all-you-need · transformer-network
Python · 1.01k · 4 years ago
tlatkowski / multihead-siamese-nets

#NLP# Implementation of Siamese Neural Networks built upon a multi-head attention mechanism for the text semantic-similarity task.

multihead-attention · semantic-similarity · deep-neural-networks · attention · deep-learning · text-similarity · NLP · sentence-similarity · Tensorflow · Python
Jupyter Notebook · 182 · 2 years ago
datnnt1997 / multi-head_self-attention

A Faster Pytorch Implementation of Multi-Head Self-Attention

self-attention · attention-mechanism · attention · multihead-attention
Jupyter Notebook · 74 · 3 years ago
tensorops / TransformerX

#NLP# A flexible Python library providing building blocks (layers) for reproducible Transformers research (Tensorflow ✅, Pytorch 🔜, and Jax 🔜).

attention · attention-mechanism · deep-learning · vit · multihead-attention · NLP · self-attention · transformers
Python · 53 · 2 years ago
jk96491 / Advanced_Models

Provides implementations of several well-known neural network models (DCGAN, VAE, ResNet, etc.).

Generative Adversarial Network · resnet-50 · vae · PyTorch · dcgan · cgan · multihead-attention · gpt-2
Python · 50 · 4 years ago
Syeda-Farhat / awesome-Transformers-For-Segmentation

Semantic segmentation is an important task in computer vision, and its applications have grown in popularity over the last decade. We grouped the publications that used various forms of segmentation in ...

computer-vision · encoder-decoder · instance-segmentation · multihead-attention · segmentation · self-attention · semantic-segmentation · transformer
35 · 4 months ago
akurniawan / pytorch-transformer

Implementation of the "Attention Is All You Need" paper.

PyTorch · attention · attention-is-all-you-need · multihead-attention
Python · 33 · 1 year ago
changwookjun / Transformer

A Korean (ko) chatbot built with Tensorflow (the model is a transformer).

transformer · bert · chatbot · Tensorflow · self-attention · multihead-attention
Python · 30 · 7 years ago
MirunaPislar / multi-head-attention-labeller

Joint text classification on multiple levels with multiple labels, using a multi-head attention mechanism to wire two prediction tasks together.

sentence-classification · multi-task-learning · multihead-attention · transformer · attention-mechanism · zero-shot-learning · semi-supervised-learning · conll-2003 · error-detection · sentiment-analysis
Python · 16 · 5 years ago
iafarhan / causal-synthesizer-multihead-attention

Synthesizer self-attention is a recent alternative to causal self-attention that offers potential benefits by removing the query-key dot product.

Python · PyTorch · attention · multihead-attention
Python · 13 · 7 months ago
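The idea behind the entry above — producing attention weights without a query-key dot product — can be sketched as follows. This is a simplified, hypothetical illustration of the "dense" Synthesizer variant (logits come from a per-token feed-forward map), not the repository's actual code; all weight names are made up:

```python
import numpy as np

def dense_synthesizer_attention(x, W1, W2, Wv, causal=True):
    """Dense-Synthesizer-style attention: the (seq_len, seq_len) logit
    matrix is produced by an MLP on each token, with no Q·K^T anywhere."""
    seq_len, _ = x.shape
    logits = np.maximum(x @ W1, 0.0) @ W2   # ReLU MLP -> one logit per key slot
    logits = logits[:, :seq_len]            # truncate to the current length
    if causal:
        # Forbid attending to future positions
        mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
        logits = np.where(mask, -1e9, logits)
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over key slots
    return weights @ (x @ Wv)

rng = np.random.default_rng(1)
seq, d, max_len = 4, 6, 16
x = rng.standard_normal((seq, d))
W1 = rng.standard_normal((d, d)) * 0.1
W2 = rng.standard_normal((d, max_len)) * 0.1   # maps to max sequence length
Wv = rng.standard_normal((d, d)) * 0.1
out = dense_synthesizer_attention(x, W1, W2, Wv)
print(out.shape)  # → (4, 6)
```

Because `W2` has a fixed `max_len` output dimension, the attention pattern is learned per position rather than computed from content similarity, which is the trade-off the Synthesizer paper explores.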
shawnhan108 / AutoTruckX

An experimental project in autonomous-vehicle driving perception, with steering-angle prediction and semantic segmentation using a combination of UNet, attention, and transformers.

autonomous-vehicles · autonomous-driving · udacity-self-driving-car · resnet-50 · transfer-learning · transformer · attention · semantic-segmentation · unet · multihead-attention
Python · 10 · 4 years ago
qdLMF / LightGlue-with-FlashAttentionV2-TensorRT

A CUTLASS/CuTe implementation of a head-dim-64 FlashAttention-2 TensorRT plugin for LightGlue. Runs on a Jetson Orin NX 8GB with TensorRT 8.5.2.

cute · cutlass · tensorrt · feature-matching · CUDA · flash-attention · multihead-attention · transformer · superpoint
Cuda · 9 · 5 months ago
bkhanal-11 / transformers

An implementation of the transformer from scratch, as presented in the paper "Attention Is All You Need".

attention-is-all-you-need · attention-mechanism · multihead-attention · self-attention · transformers
Python · 9 · 2 years ago
hrithickcodes / transformer-tf

This repository contains the code for the paper "Attention Is All You Need", i.e. the Transformer.

attention-is-all-you-need · multihead-attention · neural-machine-translation · self-attention · transformer-architecture · transformers
Jupyter Notebook · 9 · 3 years ago
yflyzhang / AnnotatedTransformer

encoder-decoder · multihead-attention · Parsing · transformer
Jupyter Notebook · 6 · 6 months ago
jaydeepthik / Nano-GPT

A simple GPT with multi-head attention over character-level tokens, inspired by Andrej Karpathy's video lectures: https://github.com/karpathy/ng-video-lecture

gpt · pytorch-implementation · multihead-attention · PyTorch · transformers
Jupyter Notebook · 5 · 2 years ago
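The "char-level tokens" mentioned in the entry above are the simplest possible tokenization scheme: each distinct character becomes one vocabulary id. A minimal sketch of that setup (the helper name is made up, but the scheme matches Karpathy-style nano GPTs):

```python
def make_char_codec(text):
    """Build character-level encode/decode maps from a training corpus."""
    vocab = sorted(set(text))                    # every distinct character
    stoi = {ch: i for i, ch in enumerate(vocab)} # char -> integer id
    itos = {i: ch for ch, i in stoi.items()}     # integer id -> char
    encode = lambda s: [stoi[c] for c in s]
    decode = lambda ids: "".join(itos[i] for i in ids)
    return encode, decode, len(vocab)

enc, dec, vocab_size = make_char_codec("hello world")
ids = enc("hello")
print(ids, dec(ids), vocab_size)  # → [3, 2, 4, 4, 5] hello 8
```

Character-level vocabularies stay tiny (often under 100 ids), which keeps the embedding and output layers small at the cost of much longer sequences than subword tokenizers produce.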
Bhazantri / EvoLingua

#NLP# EvoLingua: A Scalable Mixture-of-Experts Language Model Framework

attention · gpu-computing · LLM · mixture-of-experts · multihead-attention · NLP · Parsing
Python · 5 · 4 months ago
antonio-f / GPT_from_scratch

Very simple implementation of GPT architecture using PyTorch and Jupyter.

easy-to-use · gpt · Jupyter Notebook · Python · PyTorch · tutorial · from-scratch · noob-friendly · simple · newbie · transformer · multihead-attention · self-attention
Jupyter Notebook · 4 · 2 years ago
abhilash1910 / GraphAttentionNetworks

This package is a Tensorflow 2/Keras implementation of Graph Attention Network embeddings and also provides a trainable layer for multi-head graph attention.

tf2 · graph-attention-networks · multihead-attention · self-attention · keras-tensorflow
Python · 3 · 4 years ago
whsqkaak / attentions_pytorch

A repository of attention-mechanism implementations in PyTorch.

attention · PyTorch · attention-mechanism · multihead-attention
Python · 1 · 3 years ago