#Computer Science# Implementations of 59 deep learning papers with detailed annotations. Includes transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), GANs (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
#Computer Science# 🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than Adam(W), in PyTorch
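For context, the Lion update rule itself is tiny: the step direction is the sign of an interpolation between the momentum buffer and the current gradient. Below is a minimal NumPy sketch of one step as described in the paper; the function name and hyperparameter defaults are illustrative, not this repository's API.

```python
import numpy as np

def lion_step(param, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, weight_decay=0.0):
    """One Lion update: take the sign of an interpolated momentum/gradient
    direction, apply decoupled weight decay, then refresh the momentum EMA."""
    update = np.sign(beta1 * m + (1.0 - beta1) * grad)    # signed step direction
    param = param - lr * (update + weight_decay * param)  # decoupled weight decay
    m = beta2 * m + (1.0 - beta2) * grad                  # momentum EMA of the gradient
    return param, m
```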
🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
A New Optimization Technique for Deep Neural Networks
RAdam implemented in Keras & TensorFlow
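As a reference for what RAdam computes, here is a hedged NumPy sketch of one step following the Rectified Adam paper: the adaptive (Adam-style) step is used only once the variance estimate is considered tractable, otherwise the update falls back to plain bias-corrected momentum. Names and defaults are illustrative, not this Keras/TensorFlow implementation's API.

```python
import numpy as np

def radam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Rectified Adam step (t starts at 1)."""
    m = beta1 * m + (1.0 - beta1) * grad         # first-moment EMA
    v = beta2 * v + (1.0 - beta2) * grad ** 2    # second-moment EMA
    m_hat = m / (1.0 - beta1 ** t)               # bias-corrected momentum

    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)

    if rho_t > 4.0:
        # Variance is tractable: rectified adaptive step.
        v_hat = np.sqrt(v / (1.0 - beta2 ** t))
        r_t = np.sqrt(((rho_t - 4.0) * (rho_t - 2.0) * rho_inf) /
                      ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t))
        param = param - lr * r_t * m_hat / (v_hat + eps)
    else:
        # Early steps: un-adapted SGD with momentum.
        param = param - lr * m_hat
    return param, m, v
```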
Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers
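The defining feature of AdamW/SGDW is decoupled weight decay: the decay is applied directly to the weights rather than folded into the gradient as classic L2 regularization. A minimal NumPy sketch of one AdamW-style step follows; the function name and defaults are assumptions for illustration, not this library's API.

```python
import numpy as np

def adamw_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One Adam step with decoupled weight decay (t starts at 1)."""
    m = beta1 * m + (1.0 - beta1) * grad                 # first-moment EMA
    v = beta2 * v + (1.0 - beta2) * grad ** 2            # second-moment EMA
    m_hat = m / (1.0 - beta1 ** t)                       # bias corrections
    v_hat = v / (1.0 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)  # adaptive step
    param = param - lr * weight_decay * param            # decay applied to weights directly
    return param, m, v
```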
#Computer Science# Implementation of the proposed Adam-atan2 from Google DeepMind, in PyTorch
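The core idea is to replace Adam's m̂ / (√v̂ + ε) division with an atan2, which is bounded and removes the ε hyperparameter altogether. Below is a hedged NumPy sketch of one such step; the scale constants `a` and `b`, the function name, and the defaults are assumptions for illustration, not values or API taken from the paper or this repository.

```python
import numpy as np

def adam_atan2_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.99,
                    a=1.0, b=1.0, weight_decay=0.0):
    """One Adam-style step with the eps-free atan2 update (t starts at 1)."""
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)
    update = a * np.arctan2(m_hat, b * np.sqrt(v_hat))   # bounded, no eps needed
    param = param - lr * (update + weight_decay * param)
    return param, m, v
```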
#Computer Science# Fast, Modern, and Low Precision PyTorch Optimizers
#Computer Science# Instantly improve the training performance of your TensorFlow models with just 2 lines of code!
#Computer Science# FrostNet: Towards Quantization-Aware Network Architecture Search
#Computer Science# Prodigy and Schedule-Free, together at last.
#Natural Language Processing# Neutron: A PyTorch-based implementation of the Transformer and its variants.
Integration to get optimizer information from the SolarEdge portal
#Computer Science# Toy implementations of some popular ML optimizers using Python/JAX
#Computer Science# A high-level deep learning library for Convolutional Neural Networks, GANs, and more, built from scratch (NumPy/CuPy implementation).
A collection of optimizers, some arcane, others well known, for Flax.
Optim4RL is a JAX framework for learning to optimize for reinforcement learning.
Code for "Accelerating Training with Neuron Interaction and Nowcasting Networks" [to appear at ICLR 2025]