

PyTorch implementation of char-rnn (character-level language model)




A PyTorch implementation of char-rnn for character-level text generation. This is copied from the Practical PyTorch series.


Download this Shakespeare dataset (from the original char-rnn) as shakespeare.txt. Or bring your own dataset; it should be a plain text file (preferably ASCII).
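The Practical PyTorch helpers this repo is based on treat the dataset as one long string over a fixed character vocabulary (Python's string.printable). A minimal sketch of reading a dataset and encoding characters as integer indices, assuming that vocabulary:

```python
import string

# Character vocabulary assumed by the Practical PyTorch helpers:
# the 100 printable ASCII characters (digits, letters, punctuation, whitespace).
all_characters = string.printable
n_characters = len(all_characters)

def read_file(filename):
    """Read the whole dataset into one string."""
    with open(filename) as f:
        return f.read()

def char_indices(s):
    """Encode a string as a list of vocabulary indices, one per character."""
    return [all_characters.index(c) for c in s]
```

Characters outside the vocabulary would raise a ValueError here, which is why a plain ASCII text file is the safest input.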

Run train.py with the dataset filename to train and save the network:
> python train.py shakespeare.txt

Training for 2000 epochs...
(... 10 minutes later ...)
Saved as shakespeare.pt

After training, the model will be saved as shakespeare.pt.


Training options

Usage: train.py [filename] [options]

Options:
--model          Whether to use LSTM or GRU units (default: gru)
--n_epochs       Number of epochs to train (default: 2000)
--print_every    Log learning rate at this interval (default: 100)
--hidden_size    Hidden size of GRU (default: 50)
--n_layers       Number of GRU layers (default: 2)
--learning_rate  Learning rate (default: 0.01)
--chunk_len      Length of training chunks (default: 200)
--batch_size     Number of examples per batch (default: 100)
--cuda           Use CUDA
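The --chunk_len option controls how training examples are drawn: char-rnn-style training repeatedly samples a random slice of the text and uses that slice shifted by one character as the target, so the network learns to predict each next character. A rough pure-Python sketch of that pairing (function names are illustrative, not the repo's actual code):

```python
import random

def random_chunk(text, chunk_len):
    """Pick a random contiguous slice of the dataset, chunk_len + 1 chars long."""
    start = random.randint(0, len(text) - chunk_len - 1)
    return text[start:start + chunk_len + 1]

def training_pair(text, chunk_len):
    """Input is the chunk minus its last char; the target is the same chunk
    shifted left by one, so position i's target is the character at i + 1."""
    chunk = random_chunk(text, chunk_len)
    return chunk[:-1], chunk[1:]
```

With --batch_size, the trainer would draw many such pairs per step; the overlap between input and target is what makes this next-character prediction.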


Run generate.py with the saved model from training, and a "priming string" to start the text with:
> python generate.py shakespeare.pt --prime_str "Where"

Where, you, and if to our with his drid's Weasteria nobrand this by then.

AUTENES: It his zersit at he

Generation options

Usage: generate.py [filename] [options]

Options:
-p, --prime_str    String to prime generation with
-l, --predict_len  Length of prediction
-t, --temperature  Temperature (higher is more chaotic)
--cuda             Use CUDA
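Temperature rescales the network's output scores before sampling: dividing by a temperature below 1 sharpens the distribution toward the most likely character, while values above 1 flatten it (hence "more chaotic"). A minimal sketch of that sampling step, independent of the model itself:

```python
import math
import random

def sample_with_temperature(logits, temperature=0.8):
    """Sample an index from unnormalized scores after temperature scaling."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max before exp for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw from the resulting categorical distribution.
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1
```

At a very low temperature this collapses to always picking the highest-scoring character; at a high temperature it approaches uniform sampling over the vocabulary.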
