transfer-nlp: NLP library designed for flexible research and development
texar-pytorch: Toolkit for Machine Learning and Text Generation, in PyTorch texar.io
pytorch-kaldi: pytorch-kaldi is a project for developing state-of-the-art DNN/RNN hybrid speech recognition systems. The DNN part is managed by PyTorch, while feature extraction, label computation, and decoding are performed with the Kaldi toolkit.
NeMo: Neural Modules: a toolkit for conversational AI nvidia.github.io/NeMo
pytorch-struct: A library of vectorized implementations of core structured prediction algorithms (HMM, dependency trees, CKY, etc.).
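As an illustration of the kind of structured prediction such libraries vectorize, here is a minimal pure-Python Viterbi decoder for an HMM — a toy example with made-up parameters, not pytorch-struct's API:

```python
# Minimal Viterbi decoding for an HMM, in plain Python.
# States and parameters below are toy values, not from pytorch-struct.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely state sequence for the observations."""
    # best[t][s] = probability of the best path ending in state s at time t
    best = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        best.append({})
        back.append({})
        for s in states:
            prob, prev = max(
                (best[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states
            )
            best[t][s] = prob
            back[t][s] = prev
    # Trace back the best path from the final time step.
    last = max(states, key=lambda s: best[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

states = ("Rainy", "Sunny")
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(viterbi(["walk", "shop", "clean"], states, start, trans, emit))
# -> ['Sunny', 'Rainy', 'Rainy']
```

Libraries like pytorch-struct express exactly this max-sum recursion as batched tensor operations so it runs on the GPU and is differentiable.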
espresso: Espresso: A Fast End-to-End Neural Speech Recognition Toolkit
transformers: huggingface Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. huggingface.co/transformers
functional zoo: PyTorch, unlike Lua Torch, has autograd at its core, so using the modular structure of torch.nn modules is not necessary; one can simply allocate the needed Variables and write a function that uses them, which is sometimes more convenient. This repo contains model definitions written in this functional style, with pretrained weights for some models.
torch-sampling: This package provides a set of transforms and data structures for sampling from in-memory or out-of-memory data.
torchcraft-py: Python wrapper for TorchCraft, a bridge between Torch and StarCraft for AI research.
aorun: Aorun intends to be a Keras-like framework with PyTorch as the backend.
pytorch-extension: This is a CUDA extension for PyTorch which computes the Hadamard product of two tensors.
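For reference, the Hadamard product is just the elementwise product of two same-shaped tensors; a plain-Python sketch of what the CUDA extension computes on the GPU:

```python
# Hadamard (elementwise) product of two same-shaped "tensors",
# sketched with plain Python lists; the CUDA extension does the
# same operation on the GPU for real PyTorch tensors.

def hadamard(a, b):
    assert len(a) == len(b), "shapes must match"
    return [x * y for x, y in zip(a, b)]

print(hadamard([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # [4.0, 10.0, 18.0]
```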
tensorboard-pytorch: This module saves PyTorch tensors in tensorboard format for inspection. Currently supports scalar, image, audio, and histogram features in tensorboard.
gpytorch: GPyTorch is a Gaussian Process library, implemented using PyTorch. It is designed for creating flexible and modular Gaussian Process models with ease, so that you don't have to be an expert to use GPs.
PyTorch-Encoding: PyTorch Deep Texture Encoding Network http://hangzh.com/PyTorch-Encoding
pytorch-ctc: PyTorch-CTC is an implementation of CTC (Connectionist Temporal Classification) beam search decoding for PyTorch. C++ code borrowed liberally from TensorFlow with some improvements to increase flexibility.
Tor10: A generic tensor-network library designed for quantum simulation, based on PyTorch.
Catalyst: High-level utils for PyTorch DL & RL research. It was developed with a focus on reproducibility, fast experimentation, and reuse of code and ideas, so you can research and develop something new rather than write yet another regular training loop.
higher: higher is a pytorch library allowing users to obtain higher order gradients over losses spanning training loops rather than individual training steps.
Torchelie: Torchélie is a set of utility functions, layers, losses, models, trainers and other things for PyTorch. torchelie.readthedocs.org
CrypTen: CrypTen is a Privacy Preserving Machine Learning framework written using PyTorch that allows researchers and developers to train models using encrypted data. CrypTen currently supports Secure multi-party computation as its encryption mechanism.
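The core trick behind secure multi-party computation is secret sharing; a toy additive-secret-sharing sketch (an illustration of the idea, not CrypTen's API) shows how a value can be split, computed on, and reconstructed without any single party seeing the plaintext:

```python
# Toy additive secret sharing over a prime field: a value is split into
# random shares that individually reveal nothing; sums of shares
# reconstruct the sum of the secrets. Illustration only, not CrypTen's API.
import random

P = 2**61 - 1  # a Mersenne prime used as the field modulus

def share(secret, n_parties=2):
    """Split a secret into n random additive shares mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

a_shares = share(42)
b_shares = share(100)
# Each party adds its own shares locally; no one ever sees 42 or 100.
sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 142
```

Addition is "free" under this scheme; the hard (and interesting) part of MPC frameworks like CrypTen is multiplication and nonlinearities, which require extra protocol rounds.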
cvxpylayers: cvxpylayers is a Python library for constructing differentiable convex optimization layers in PyTorch
RepDistiller: Contrastive Representation Distillation (CRD), and benchmark of recent knowledge distillation methods
kaolin: PyTorch library aimed at accelerating 3D deep learning research
PySNN: Efficient Spiking Neural Network framework, built on top of PyTorch for GPU acceleration.
sparktorch: Train and run Pytorch models on Apache Spark.
pytorch-metric-learning: The easiest way to use metric learning in your application. Modular, flexible, and extensible. Written in PyTorch.
flambe: An ML framework to accelerate research and its path to production. flambe.ai
pytorch-optimizer: A collection of modern optimization algorithms for PyTorch, including AccSGD, AdaBound, AdaMod, DiffGrad, Lamb, RAdam, and Yogi.
PyTorch-VAE: A Collection of Variational Autoencoders (VAE) in PyTorch.
ray: A fast and simple framework for building and running distributed applications. Ray is packaged with RLlib, a scalable reinforcement learning library, and Tune, a scalable hyperparameter tuning library. ray.io
Poutyne: A Keras-like framework for PyTorch that handles much of the boilerplate code needed to train neural networks.
Pytorch-Toolbox: A toolbox project for PyTorch, aiming to make PyTorch code easier to write, more readable, and more concise.
Pytorch-contrib: It contains reviewed implementations of ideas from recent machine learning papers.
EfficientNet PyTorch: It contains an op-for-op PyTorch reimplementation of EfficientNet, along with pre-trained models and examples.
PyTorch/XLA: PyTorch/XLA is a Python package that uses the XLA deep learning compiler to connect the PyTorch deep learning framework and Cloud TPUs.
webdataset: WebDataset is a PyTorch Dataset (IterableDataset) implementation providing efficient access to datasets stored in POSIX tar archives.
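The underlying idea — streaming training samples straight out of a POSIX tar archive — can be sketched with the standard library alone (hypothetical file names; WebDataset layers sharding, decoding, and shuffling on top of this):

```python
# Sketch of streaming samples from a tar archive with only the stdlib.
# File names here are hypothetical; WebDataset adds sharding, decoding,
# and shuffling over this sequential-access idea.
import io
import tarfile

# Build a small in-memory tar with two "samples".
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    for name, payload in [("0001.txt", b"hello"), ("0002.txt", b"world")]:
        info = tarfile.TarInfo(name)
        info.size = len(payload)
        tar.addfile(info, io.BytesIO(payload))
buf.seek(0)

def iterate_samples(fileobj):
    """Yield (name, bytes) pairs sequentially -- pure streaming access."""
    with tarfile.open(fileobj=fileobj) as tar:
        for member in tar:
            yield member.name, tar.extractfile(member).read()

for name, data in iterate_samples(buf):
    print(name, data)
```

Sequential reads of a tar archive are far friendlier to spinning disks and object stores than the random small-file access pattern of a conventional image-folder dataset.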
volksdep: volksdep is an open-source toolbox for deploying and accelerating PyTorch, ONNX and TensorFlow models with TensorRT.
PyTorch-StudioGAN: StudioGAN is a Pytorch library providing implementations of representative Generative Adversarial Networks (GANs) for conditional/unconditional image generation. StudioGAN aims to offer an identical playground for modern GANs so that machine learning researchers can readily compare and analyze a new idea.
pytorch containers: This repository aims to help former Torchies more seamlessly transition to the "Containerless" world of PyTorch by providing a list of PyTorch implementations of Torch Table Layers.
Deep Learning with PyTorch: Deep Learning with PyTorch teaches you how to implement deep learning algorithms with Python and PyTorch. The book includes a case study: building an algorithm capable of detecting malignant lung tumors in CT scans.
neuraltalk2-pytorch: An image captioning model in PyTorch (the CNN is fine-tunable in the with_finetune branch).
vnet.pytorch: A Pytorch implementation for V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation.
pytorch-fcn: PyTorch implementation of Fully Convolutional Networks.
WideResNets: WideResNets for CIFAR10/100 implemented in PyTorch. This implementation requires less GPU memory than what is required by the official Torch implementation: https://github.com/szagoruyko/wide-residual-networks .
pytorch-NeuCom: Pytorch implementation of DeepMind's differentiable neural computer paper.
captionGen: Generate captions for an image using PyTorch.
AnimeGAN: A simple PyTorch Implementation of Generative Adversarial Networks, focusing on anime face drawing.
Cnn-text classification: This is the implementation of Kim's Convolutional Neural Networks for Sentence Classification paper in PyTorch.
deepspeech2: Implementation of DeepSpeech2 using Baidu Warp-CTC. Creates a network based on the DeepSpeech2 architecture, trained with the CTC loss.
seq2seq: This repository contains implementations of Sequence to Sequence (Seq2Seq) models in PyTorch
Asynchronous Advantage Actor-Critic in PyTorch: This is a PyTorch implementation of A3C as described in Asynchronous Methods for Deep Reinforcement Learning. Since PyTorch provides an easy way to control shared memory across multiple processes, asynchronous methods such as A3C are straightforward to implement.
densenet: This is a PyTorch implementation of the DenseNet-BC architecture as described in the paper Densely Connected Convolutional Networks by G. Huang, Z. Liu, K. Weinberger, and L. van der Maaten. This implementation gets a CIFAR-10+ error rate of 4.77 with a 100-layer DenseNet-BC with a growth rate of 12. Their official implementation and links to many other third-party implementations are available in the liuzhuang13/DenseNet repo on GitHub.
nninit: Weight initialization schemes for PyTorch nn.Modules. This is a port of the popular nninit for Torch7 by @kaixhin.
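As an example of such a scheme, Glorot/Xavier uniform initialization draws weights from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)); a plain-Python sketch of the rule (an illustration, not nninit's actual API):

```python
# Glorot/Xavier uniform initialization sketched in plain Python:
# weights drawn from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)).
# Illustration of the scheme itself, not nninit's API.
import math
import random

def xavier_uniform(fan_in, fan_out):
    a = math.sqrt(6.0 / (fan_in + fan_out))
    return [[random.uniform(-a, a) for _ in range(fan_out)]
            for _ in range(fan_in)]

w = xavier_uniform(64, 32)
bound = math.sqrt(6.0 / (64 + 32))
assert all(-bound <= x <= bound for row in w for x in row)
```

The bound is chosen so that activation variance stays roughly constant across layers in both the forward and backward passes.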
faster rcnn: This is a PyTorch implementation of Faster RCNN. This project is mainly based on py-faster-rcnn and TFFRCNN. For details about R-CNN please refer to the paper Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks by Shaoqing Ren, Kaiming He, Ross Girshick, Jian Sun.
doomnet: A PyTorch version of Doom-net, implementing several RL models in the ViZDoom environment.
flownet: Pytorch implementation of FlowNet by Dosovitskiy et al.
sqeezenet: Implementation of SqueezeNet in PyTorch. The plan is to train the model on CIFAR-10 and add block connections, with pretrained models to come.
optnet: This repository, by Brandon Amos and J. Zico Kolter, contains the PyTorch source code to reproduce the experiments in their paper OptNet: Differentiable Optimization as a Layer in Neural Networks.
qp solver: A fast and differentiable QP solver for PyTorch. Crafted by Brandon Amos and J. Zico Kolter.
bandit-nmt: This is the code repo for the EMNLP 2017 paper "Reinforcement Learning for Bandit Neural Machine Translation with Simulated Human Feedback", which implements the A2C algorithm on top of a neural encoder-decoder model and benchmarks the combination under simulated noisy rewards.
pytorch-a2c-ppo-acktr: PyTorch implementation of Advantage Actor Critic (A2C), Proximal Policy Optimization (PPO) and Scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation (ACKTR).
skip-gram-pytorch: A complete pytorch implementation of skipgram model (with subsampling and negative sampling). The embedding result is tested with Spearman's rank correlation.
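For context, the subsampling step mentioned above is usually the word2vec frequency-based discard rule; one common form of the keep probability, sketched in plain Python:

```python
# One common form of word2vec-style subsampling: a word is kept with
# probability sqrt(t / f(w)), where f(w) is the word's relative
# frequency and t is a small threshold (typically 1e-3 .. 1e-5).
import math

def keep_probability(word_freq, threshold=1e-3):
    """Probability of keeping a word with relative frequency word_freq."""
    return min(1.0, math.sqrt(threshold / word_freq))

# A rare word is always kept; a very frequent word is heavily dropped.
print(keep_probability(1e-6))            # 1.0
print(round(keep_probability(0.1), 3))   # 0.1
```

Discarding very frequent words this way both speeds up training and improves the quality of embeddings for rarer words.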
stackGAN-v2: Pytorch implementation for reproducing StackGAN_v2 results in the paper StackGAN++: Realistic Image Synthesis with Stacked Generative Adversarial Networks by Han Zhang, Tao Xu, Hongsheng Li, Shaoting Zhang, Xiaogang Wang, Xiaolei Huang, Dimitris Metaxas.
self-critical.pytorch: Unofficial pytorch implementation for Self-critical Sequence Training for Image Captioning.
yolo2-pytorch: YOLOv2 is one of the most popular one-stage object detectors. This project adopts PyTorch as the development framework to increase productivity, and uses ONNX to convert models into Caffe2 for easier deployment.
reseg-pytorch: PyTorch Implementation of ReSeg (arxiv.org/pdf/1511.07053.pdf)
Detectron.pytorch: A pytorch implementation of Detectron. Both training from scratch and inferring directly from pretrained Detectron weights are available.
R2Plus1D-PyTorch: PyTorch implementation of the R2Plus1D convolution based ResNet architecture described in the paper "A Closer Look at Spatiotemporal Convolutions for Action Recognition"
StackNN: A PyTorch implementation of differentiable stacks for use in neural networks.
translagent: Code for Emergent Translation in Multi-Agent Communication.
ban-vqa: Bilinear attention networks for visual question answering.
pytorch-openai-transformer-lm: This is a PyTorch implementation of the TensorFlow code provided with OpenAI's paper "Improving Language Understanding by Generative Pre-Training" by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever.
T2F: Text-to-Face generation using deep learning. This project combines two recent architectures, StackGAN and ProGAN, for synthesizing faces from textual descriptions.
pytorch-fid: A port of the Fréchet Inception Distance (FID score) to PyTorch.
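FID compares two Gaussians fitted to feature activations: FID = ||mu1 - mu2||^2 + Tr(S1 + S2 - 2(S1 S2)^{1/2}). With diagonal covariances the matrix square root becomes elementwise, which makes a plain-Python sketch possible (a toy illustration of the formula, not the library's API):

```python
# Fréchet distance between two Gaussians with *diagonal* covariances,
# where the matrix square root reduces to an elementwise sqrt.
# Toy illustration of the FID formula, not the pytorch-fid API,
# which works with full covariance matrices of Inception features.
import math

def fid_diagonal(mu1, var1, mu2, var2):
    mean_term = sum((a - b) ** 2 for a, b in zip(mu1, mu2))
    cov_term = sum(v1 + v2 - 2.0 * math.sqrt(v1 * v2)
                   for v1, v2 in zip(var1, var2))
    return mean_term + cov_term

# Identical distributions -> distance 0.
print(fid_diagonal([0.0, 0.0], [1.0, 1.0], [0.0, 0.0], [1.0, 1.0]))  # 0.0
# Shifted mean and scaled variance both contribute.
print(fid_diagonal([1.0, 0.0], [1.0, 1.0], [0.0, 0.0], [4.0, 1.0]))  # 2.0
```

The real metric fits the two Gaussians to Inception-v3 pool features of generated and reference images, so the full (non-diagonal) matrix square root is needed in practice.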
vae_vpflows: Code in PyTorch for the convex combination linear IAF and the Householder Flow, by J.M. Tomczak & M. Welling. jmtomczak.github.io/deebmed.html
CoordConv-pytorch: Pytorch implementation of CoordConv introduced in 'An intriguing failing of convolutional neural networks and the CoordConv solution' paper. (arxiv.org/pdf/1807.03247.pdf)
SDPoint: Implementation of "Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks", published in CVPR 2018.
pnn.pytorch: PyTorch implementation of CVPR'18 - Perturbative Neural Networks http://xujuefei.com/pnn.html.
FaceAttentionNetwork: Pytorch implementation of face attention network as described in Face Attention Network: An Effective Face Detector for the Occluded Faces.
waveglow: A Flow-based Generative Network for Speech Synthesis.
deepfloat: This repository contains the SystemVerilog RTL, C++, HLS (Intel FPGA OpenCL to wrap RTL code) and Python needed to reproduce the numerical results in "Rethinking floating point for deep learning"