Recurrent
A recurrent neural network (RNN) is a class of artificial neural network in which connections between units form a directed cycle.
SimpleRNN
Fully-connected RNN in which the output is fed back to the input.
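One step of such a fully-connected RNN can be sketched in NumPy as follows (a minimal sketch of the recurrence, not this library's implementation; the weight names W, U, b are illustrative):

```python
import numpy as np

def simple_rnn_step(x, h_prev, W, U, b):
    """One fully-connected RNN step: the previous output h_prev
    is fed back in alongside the current input x."""
    return np.tanh(x @ W + h_prev @ U + b)

# Toy dimensions: 3 input features, 2 hidden units.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 2))   # input-to-hidden weights
U = rng.normal(size=(2, 2))   # hidden-to-hidden (feedback) weights
b = np.zeros(2)

h = np.zeros(2)                     # initial state
for x in rng.normal(size=(5, 3)):   # unroll over 5 time steps
    h = simple_rnn_step(x, h, W, U, b)
print(h.shape)  # (2,)
```

The same state vector `h` is both the layer's output at each step and the feedback input to the next step.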
GRU
The gated recurrent unit (GRU) is a gating mechanism for recurrent neural networks, introduced in 2014.
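The gating mechanism combines an update gate and a reset gate; a minimal NumPy sketch of one step (the 2014 formulation, with illustrative weight names, not this library's code):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step."""
    z = sigmoid(x @ Wz + h_prev @ Uz)              # update gate
    r = sigmoid(x @ Wr + h_prev @ Ur)              # reset gate
    h_cand = np.tanh(x @ Wh + (r * h_prev) @ Uh)   # candidate state
    return (1 - z) * h_prev + z * h_cand           # blend old and new state

rng = np.random.default_rng(1)
dim_in, dim_h = 3, 2
Wz, Wr, Wh = (rng.normal(size=(dim_in, dim_h)) for _ in range(3))
Uz, Ur, Uh = (rng.normal(size=(dim_h, dim_h)) for _ in range(3))

h = np.zeros(dim_h)
for x in rng.normal(size=(5, dim_in)):
    h = gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh)
print(h.shape)  # (2,)
```

The update gate `z` interpolates between keeping the old state and adopting the candidate, which is what lets gradients flow over long spans.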
LSTM
Long short-term memory (LSTM) is a recurrent neural network (RNN) architecture proposed in 1997 by Sepp Hochreiter and Jürgen Schmidhuber [R7] and further improved in 2000 by Felix Gers et al. [R8]. Like most RNNs, an LSTM network is universal in the sense that, given enough network units and a suitable weight matrix (which may be viewed as its program), it can compute anything a conventional computer can compute.
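One LSTM step can be sketched in NumPy as follows (a sketch of the standard cell with input, forget, and output gates, not this library's implementation; the fused weight layout and names W, U, b are illustrative):

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. Column blocks of the fused weights hold the
    input gate i, forget gate f, candidate g, and output gate o;
    the forget gate is the Gers et al. (2000) addition."""
    z = x @ W + h_prev @ U + b
    i, f, g, o = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    c = f * c_prev + i * np.tanh(g)   # cell state carries long-term memory
    h = o * np.tanh(c)                # gated output / short-term state
    return h, c

rng = np.random.default_rng(2)
dim_in, dim_h = 3, 2
W = rng.normal(size=(dim_in, 4 * dim_h))
U = rng.normal(size=(dim_h, 4 * dim_h))
b = np.zeros(4 * dim_h)

h, c = np.zeros(dim_h), np.zeros(dim_h)
for x in rng.normal(size=(5, dim_in)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # (2,)
```

The learned weight matrices W and U are what the text calls the network's "program": they determine how the gates route information through the cell state.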
BatchLSTM
Long short-term memory (LSTM) is a special kind of RNN.