Author: guillaume-chevalier

Description:
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN cell. (LARNN)
Language: Jupyter Notebook
Repository: git://github.com/guillaume-chevalier/Linear-Attention-Recurrent-Neural-Network.git
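To illustrate the core idea of the description above, here is a minimal, hypothetical NumPy sketch of windowed multi-head attention where the query comes from the current step and the keys/values are a window of past cell states. All function names and the choice of scaled dot-product attention with keys equal to values are assumptions for illustration; this is not the author's exact LARNN implementation, which follows the BN-LSTM and Transformer formulas.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def windowed_attention(query, past_cells, num_heads=2):
    """Attend over a window of past cell states with multiple heads.

    query:      (d,)   current-step query vector (hypothetical)
    past_cells: (k, d) window of the k most recent cell states
    Returns a context vector of shape (d,).
    """
    k, d = past_cells.shape
    assert d % num_heads == 0, "d must be divisible by num_heads"
    dh = d // num_heads

    q = query.reshape(num_heads, dh)          # split query into heads: (h, dh)
    v = past_cells.reshape(k, num_heads, dh)  # past states per head:   (k, h, dh)

    # Scaled dot-product scores per head over the window: (h, k)
    scores = np.einsum('hd,khd->hk', q, v) / np.sqrt(dh)
    weights = softmax(scores, axis=-1)

    # Attention-weighted combination of past cell states, merged back to (d,)
    context = np.einsum('hk,khd->hd', weights, v).reshape(d)
    return context, weights
```

In a full cell, the resulting context would be mixed into the LSTM update at each step of the loop on the cell state, with the window advancing as new states are produced.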