Unofficial implementations of attention models on the SNLI dataset.
Papers currently implemented:

- Reasoning about Entailment with Neural Attention (Rocktäschel et al., 2016)
- Learning Natural Language Inference with LSTM (Wang & Jiang, 2016)

Implemented with Lasagne.
From the source root directory, first extract the preprocessed SNLI data:

```shell
./extract_data.sh
```

Then run:

```shell
python3 ./snli_reasoning_attention.py [condition|attention|word_by_word]
```

or:

```shell
python3 ./snli_match_lstm.py
```
Learning curve of word-by-word attention (best test accuracy at epoch 41); plots cover epochs 1-20, 20-39, and 40-59.
Learning curve of match-LSTM with word embeddings:
About word-by-word attention:
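As a rough illustration of what the word-by-word attention model computes, here is a NumPy sketch following the formulas of Rocktäschel et al.: for each hypothesis state, an attention distribution over the premise states is produced and folded into a running representation. The parameter matrices are randomly initialised here purely for illustration; the repository's Lasagne code learns them during training.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def word_by_word_attention(Y, H, seed=0):
    """Sketch of word-by-word attention (Rocktaschel et al., 2016).

    Y: (L, d) premise LSTM outputs; H: (N, d) hypothesis LSTM outputs.
    Returns the final attention-weighted representation r_N, shape (d,).
    Weights are random stand-ins, not the trained model's parameters.
    """
    L, d = Y.shape
    rng = np.random.default_rng(seed)
    W_y, W_h, W_r, W_t = (rng.standard_normal((d, d)) * 0.1 for _ in range(4))
    w = rng.standard_normal(d) * 0.1
    r = np.zeros(d)  # r_0
    for h_t in H:
        # M_t = tanh(W_y Y + (W_h h_t + W_r r_{t-1}) (x) e_L), one row per premise token
        M = np.tanh(Y @ W_y.T + (W_h @ h_t + W_r @ r))  # (L, d)
        alpha = softmax(M @ w)                          # attention over premise, (L,)
        r = Y.T @ alpha + np.tanh(W_t @ r)              # updated representation, (d,)
    return r
```

The final `r` is what the paper combines with the last hypothesis state before the softmax classifier.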
About match-LSTM:
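For intuition, the match-LSTM of Wang & Jiang can be sketched as: at each hypothesis position, attend over the premise states, then feed the concatenation of the attended premise vector and the current hypothesis state into a second ("match") LSTM. The NumPy sketch below uses random weights and a plain LSTM step for illustration only; it is not the repository's trained Lasagne implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def lstm_step(x, h, c, W, U, b):
    """One vanilla LSTM step; gates stacked along z as [i, f, o, g]."""
    d = h.size
    z = W @ x + U @ h + b
    i, f, o = (1.0 / (1.0 + np.exp(-z[j * d:(j + 1) * d])) for j in range(3))
    g = np.tanh(z[3 * d:])
    c = f * c + i * g
    return np.tanh(c) * o, c

def match_lstm(Hs, Ht, seed=0):
    """Sketch of match-LSTM (Wang & Jiang, 2016).

    Hs: (L, d) premise states; Ht: (N, d) hypothesis states.
    Returns the final match-LSTM state, used for classification.
    All weights are random stand-ins for illustration.
    """
    L, d = Hs.shape
    rng = np.random.default_rng(seed)
    W_s, W_t, W_m = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
    w = rng.standard_normal(d) * 0.1
    W = rng.standard_normal((4 * d, 2 * d)) * 0.1  # input weights for [a_k; h_k]
    U = rng.standard_normal((4 * d, d)) * 0.1
    b = np.zeros(4 * d)
    h_m, c = np.zeros(d), np.zeros(d)
    for h_k in Ht:
        # attention over premise, conditioned on the current match state
        e = np.tanh(Hs @ W_s.T + (W_t @ h_k + W_m @ h_m)) @ w  # (L,)
        a = softmax(e) @ Hs                                    # attended premise, (d,)
        h_m, c = lstm_step(np.concatenate([a, h_k]), h_m, c, W, U, b)
    return h_m
```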
Reasoning Attention [3]: