Project author: pskrunner14

Project description: Character Level RNN language model in PyTorch

Primary language: Python

Repository: git://github.com/pskrunner14/char-level-rnn.git

Created: 2018-10-07T19:01:01Z

Project homepage: https://github.com/pskrunner14/char-level-rnn

License: MIT License


Character Level RNN

This project contains a character-level RNN (vanilla RNN, LSTM and GRU) language model implemented in PyTorch. It can be used to generate novel text one character at a time. This is a PyTorch example that generates new names from scratch and can be a useful resource for learning how to easily handle sequential data with the framework.
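The core of such a model is an embedding layer feeding a recurrent layer whose hidden states are projected back onto the character vocabulary. The sketch below is an illustrative assumption, not the repository's actual code; the class name `CharRNN` and the default hyperparameters (mirroring the CLI defaults) are hypothetical.

```python
# Minimal sketch of a character-level RNN language model:
# embedding -> LSTM -> linear projection over the character vocabulary.
# Hyperparameter defaults mirror the CLI options below; names are assumptions.
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    def __init__(self, vocab_size, emb_size=64, hidden_size=256,
                 num_layers=2, dropout=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_size)
        self.rnn = nn.LSTM(emb_size, hidden_size, num_layers,
                           dropout=dropout, batch_first=True)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, hidden=None):
        # x: (batch, seq_len) tensor of character indices
        emb = self.embedding(x)                # (batch, seq_len, emb_size)
        out, hidden = self.rnn(emb, hidden)    # (batch, seq_len, hidden_size)
        return self.fc(out), hidden            # logits over the vocabulary

model = CharRNN(vocab_size=60)
logits, _ = model(torch.randint(0, 60, (32, 10)))
print(logits.shape)  # torch.Size([32, 10, 60])
```

Training then minimizes cross-entropy between each position's logits and the next character in the sequence.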

Getting Started

  $ python train.py --help
  Usage: train.py [OPTIONS]

    Trains a character-level Recurrent Neural Network in PyTorch.

    Args: optional arguments [python train.py --help]

  Options:
    -f, --filename PATH          path for the training data file  [data/names]
    -rt, --rnn-type TEXT         type of RNN layer to use  [LSTM]
    -nl, --num-layers INTEGER    number of layers in RNN  [2]
    -dr, --dropout FLOAT         dropout value for RNN layers  [0.5]
    -es, --emb-size INTEGER      size of each embedding  [64]
    -hs, --hidden-size INTEGER   number of hidden RNN units  [256]
    -n, --num-epochs INTEGER     number of epochs for training  [50]
    -bz, --batch-size INTEGER    number of samples per mini-batch  [32]
    -lr, --learning-rate FLOAT   learning rate for the adam optimizer  [0.0002]
    -se, --save-every INTEGER    epoch interval for saving the model  [10]
    -ns, --num-samples INTEGER   number of samples to generate after epoch interval  [5]
    -sp, --seed-phrase TEXT      seed phrase to feed the RNN for sampling  [SOS_TOKEN]
    -sa, --sample-every INTEGER  epoch interval for sampling new sequences  [5]
    --help                       Show this message and exit.
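The `--seed-phrase` and `--num-samples` options relate to autoregressive sampling: the seed is fed through the network to build up a hidden state, then characters are drawn one at a time from the softmax over the logits. The sketch below illustrates that loop under stated assumptions; the `sample` function, the `char2idx`/`idx2char` mappings, and the uniform stand-in model are all hypothetical, not the repository's implementation.

```python
# Hedged sketch of autoregressive character sampling. The model is assumed
# to have the signature model(x, hidden) -> (logits, hidden); a trained
# char-level RNN would be used in place of the uniform stand-in below.
import torch
import torch.nn as nn

def sample(model, char2idx, idx2char, seed_phrase="^", max_len=20):
    model.eval()
    indices = [char2idx[c] for c in seed_phrase]
    hidden = None
    x = torch.tensor([indices])              # feed the whole seed first
    with torch.no_grad():
        while len(indices) < max_len:
            logits, hidden = model(x, hidden)
            probs = torch.softmax(logits[0, -1], dim=-1)
            nxt = torch.multinomial(probs, 1).item()  # sample next char
            indices.append(nxt)
            x = torch.tensor([[nxt]])        # then feed one char at a time
    return "".join(idx2char[i] for i in indices)

class _UniformModel(nn.Module):
    """Stand-in that returns uniform logits over the vocabulary."""
    def __init__(self, vocab_size):
        super().__init__()
        self.vocab_size = vocab_size
    def forward(self, x, hidden=None):
        return torch.zeros(x.size(0), x.size(1), self.vocab_size), hidden

chars = "^abc"                               # "^" plays the role of SOS_TOKEN
char2idx = {c: i for i, c in enumerate(chars)}
idx2char = list(chars)
out = sample(_UniformModel(len(chars)), char2idx, idx2char, "^", max_len=8)
print(len(out))  # 8
```

Carrying the hidden state across steps is what lets the network condition each new character on everything generated so far.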

References