Project author: mahdiabdollahpour

Project description:
SEQ2SEQ model with an attention mechanism for QA and NMT, plus text generation using an LSTM network
Language: Python
Repository: git://github.com/mahdiabdollahpour/Neural-Dialogue-System.git
Created: 2018-09-14T15:09:33Z
Project community: https://github.com/mahdiabdollahpour/Neural-Dialogue-System

License: MIT



RNN-Language-Model + SEQ2SEQ model for question answering

Two main approaches:

1- Character level

2- Word level

LSTM layers are used to build a recurrent neural network that predicts the next word or character given the previous ones.
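
A minimal sketch of the word-level setup, assuming a tf.keras stack; the sizes and variable names (vocab_size, embedding_dim, lstm_units, seq_len) are placeholders rather than this repository's actual configuration:

```python
# Hypothetical word-level LSTM language model sketch (placeholder sizes, not this repo's code).
import numpy as np
import tensorflow as tf

vocab_size = 10000   # placeholder vocabulary size (far smaller for the character-level variant)
embedding_dim = 128  # placeholder embedding size
lstm_units = 256     # placeholder LSTM hidden size
seq_len = 20         # placeholder length of the input sequence

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embedding_dim),
    tf.keras.layers.LSTM(lstm_units, return_sequences=True),
    tf.keras.layers.Dropout(0.2),                              # dropout, as listed in TO DO
    tf.keras.layers.LSTM(lstm_units),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),   # distribution over the next token
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy data: each sample is seq_len token indexes, the label is the index of the next token.
x = np.random.randint(0, vocab_size, size=(64, seq_len))
y = np.random.randint(0, vocab_size, size=(64,))
model.fit(x, y, batch_size=32, epochs=1)
```

The character-level variant has the same structure; the vocabulary is simply the set of characters, so vocab_size is much smaller.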

How To Use This Code

Creating QA Model

(will be completed soon)

Using a Pretrained Model on the DailyDialog Dataset

(will be completed soon)

Your Dataset Format

(will be completed soon)

TO DO

  • [ ] Using state_is_tuple in the character-level model

  • [x] Printing the number of trainable parameters

  • [ ] Training the model

  • [ ] Using Estimator

  • [x] Converting string data to indexes takes long; cache it so it is not done on every run

  • [x] Clean up the code

  • [x] Handle saving in the seq2seq model

  • [x] Plotting loss

  • [x] Adding Dropout

  • [ ] Last batches that do not fill batch_size are ignored; handle them (see the sketch after this list)

  • [ ] Handle early stopping (also covered in the sketch after this list)
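
For the two remaining items on partial batches and early stopping, one possible direction (an assumption, not this repository's implementation) is to batch the data with tf.data so the final, smaller batch is kept, and to stop training with a Keras EarlyStopping callback:

```python
# Hypothetical sketch for the partial-batch and early-stopping items (not the repo's code).
import numpy as np
import tensorflow as tf

vocab_size, seq_len = 10000, 20  # placeholders matching the sketch above
x = np.random.randint(0, vocab_size, size=(1000, seq_len))
y = np.random.randint(0, vocab_size, size=(1000,))

# drop_remainder=False keeps the last batch even when it has fewer than 32 examples.
dataset = tf.data.Dataset.from_tensor_slices((x, y)).shuffle(1000).batch(32, drop_remainder=False)

# Stop training once the monitored loss has not improved for `patience` epochs.
early_stop = tf.keras.callbacks.EarlyStopping(monitor="loss", patience=3, restore_best_weights=True)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 128),
    tf.keras.layers.LSTM(256),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(dataset, epochs=50, callbacks=[early_stop])
```

Monitoring a validation loss instead of the training loss would be the usual choice once a validation split exists.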