# Neural Dialogue System (v1.0)

An RNN language model plus a seq2seq model for question answering.
The language model uses LSTM layers to predict the next word or character given the previous ones.
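To illustrate the idea of predicting the next character from the previous ones, here is a minimal single-layer LSTM forward pass in plain NumPy. All names, shapes, and weights below are illustrative assumptions, not the repository's actual code:

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gates computed from input x and previous state (h, c)."""
    z = W @ x + U @ h + b             # stacked pre-activations for i, f, o, g
    H = h.shape[0]
    i = 1 / (1 + np.exp(-z[:H]))      # input gate
    f = 1 / (1 + np.exp(-z[H:2*H]))   # forget gate
    o = 1 / (1 + np.exp(-z[2*H:3*H])) # output gate
    g = np.tanh(z[3*H:])              # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Tiny char-level example: vocabulary of 4 chars, hidden size 8, random weights.
rng = np.random.default_rng(0)
V, H = 4, 8
W = rng.normal(0, 0.1, (4 * H, V))    # input-to-gates weights
U = rng.normal(0, 0.1, (4 * H, H))    # hidden-to-gates weights
b = np.zeros(4 * H)
W_out = rng.normal(0, 0.1, (V, H))    # hidden-to-vocab projection

h, c = np.zeros(H), np.zeros(H)
for ch in [0, 2, 1]:                  # feed a sequence of char indexes
    x = np.eye(V)[ch]                 # one-hot input
    h, c = lstm_step(x, h, c, W, U, b)

logits = W_out @ h                    # scores for the next character
print(logits.shape)                   # (4,)
```

In the real model these weights would be trained so that the logits assign high scores to the character that actually follows the context.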
(will be completed soon)
## TODO

- Use `state_is_tuple` in the char-level model
- Print the number of trainable parameters
- Train the model
- Use the `Estimator` API
- Cache the string-to-index conversion so it is not redone on every run (it is slow)
- Clean up the code
- Handle model saving in the seq2seq part
- Plot the training loss
- Add dropout
- Fix batching: the last partial batch (smaller than `batch_size`) is currently dropped
- Implement early stopping
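For the string-to-index caching item, one simple approach is to encode the corpus once and pickle the result to disk; later runs load the cache instead of re-encoding. The cache path and char-level vocabulary scheme here are illustrative assumptions:

```python
import os
import pickle

def encode_corpus(text, cache_path="indexed_corpus.pkl"):
    """Map each character to an integer index, caching the result on disk.

    `cache_path` and the char-level vocab construction are assumptions
    for illustration, not this repository's actual layout.
    """
    if os.path.exists(cache_path):
        with open(cache_path, "rb") as f:
            return pickle.load(f)          # reuse the cached conversion
    vocab = {ch: i for i, ch in enumerate(sorted(set(text)))}
    indexes = [vocab[ch] for ch in text]
    with open(cache_path, "wb") as f:
        pickle.dump((vocab, indexes), f)   # cache for the next run
    return vocab, indexes
```

The first call pays the conversion cost; every subsequent call with the same cache path is a single file read.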
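For the dropped-last-batch item, a sketch of one possible fix is to pad the final partial batch up to `batch_size` instead of discarding it (the `pad_id` parameter is an assumption; a padded position would also need masking in the loss):

```python
def batches(indexes, batch_size, pad_id=0):
    """Yield fixed-size batches; pad the final partial batch instead of dropping it."""
    for start in range(0, len(indexes), batch_size):
        batch = indexes[start:start + batch_size]
        if len(batch) < batch_size:
            # Pad the trailing batch so no training data is silently ignored.
            batch = batch + [pad_id] * (batch_size - len(batch))
        yield batch

print(list(batches([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5, 0]]
```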
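For the early-stopping item, a common patience-based scheme stops training when the validation loss has not improved for a fixed number of checks. This class and its parameter names are a hypothetical sketch, not the repository's implementation:

```python
class EarlyStopper:
    """Stop when validation loss has not improved for `patience` checks (illustrative)."""

    def __init__(self, patience=3):
        self.patience = patience
        self.best = float("inf")
        self.bad_checks = 0

    def should_stop(self, val_loss):
        if val_loss < self.best:
            self.best = val_loss      # new best: reset the counter
            self.bad_checks = 0
        else:
            self.bad_checks += 1      # no improvement this check
        return self.bad_checks >= self.patience

stopper = EarlyStopper(patience=2)
losses = [1.0, 0.8, 0.9, 0.85, 0.87]
stops = [stopper.should_stop(l) for l in losses]
print(stops)  # [False, False, False, True, True]
```

Typically `should_stop` would be called once per validation pass, and the best checkpoint (here, the one with loss 0.8) would be the one saved.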