RNN and LSTM
  

Start from "Rohan & Lenny #3: Recurrent Neural Networks & LSTMs" (an article with almost no omission):

https://ayearofai.com/rohan-lenny-3-recurrent-neural-networks-10300100899b
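
For orientation while reading, here is a minimal numpy sketch of the vanilla RNN recurrence that the article builds up, h_t = tanh(W_xh x_t + W_hh h_{t-1} + b); the names and shapes below are illustrative assumptions, not taken from the article.

    import numpy as np

    def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
        # One vanilla RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_prev + b_h)
        return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

    # Illustrative toy shapes: 3-dim inputs, 4-dim hidden state.
    rng = np.random.default_rng(0)
    W_xh = 0.1 * rng.standard_normal((4, 3))
    W_hh = 0.1 * rng.standard_normal((4, 4))
    b_h = np.zeros(4)

    h = np.zeros(4)
    for x_t in rng.standard_normal((5, 3)):  # a toy sequence of 5 inputs
        h = rnn_step(x_t, h, W_xh, W_hh, b_h)
    print(h)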

Then move on to TensorFlow's RNN tutorial:

https://www.tensorflow.org/tutorials/recurrent

That page recommends reading Colah's introductory article first:

http://colah.github.io/posts/2015-08-Understanding-LSTMs/
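
As a companion to Colah's diagrams, here is a minimal numpy sketch of one LSTM step with the forget, input, and output gates; the stacked-parameter layout (W, U, b) is an assumption made for brevity.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        # W: (4*hidden, input), U: (4*hidden, hidden), b: (4*hidden,)
        z = W @ x_t + U @ h_prev + b
        f, i, o, g = np.split(z, 4)
        f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)  # forget, input, output gates
        g = np.tanh(g)                                # candidate cell update
        c_t = f * c_prev + i * g                      # keep some old memory, add some new
        h_t = o * np.tanh(c_t)                        # gated readout of the cell state
        return h_t, c_t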

Colah's article refers to two papers explaining why a basic RNN cannot handle long-term dependencies.
The 1991 paper by Hochreiter is in German; the 1994 paper by Bengio et al. is available here:

http://www-dsi.ing.unifi.it/~paolo/ps/tnn-94-gradient.pdf
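
A toy numpy illustration of the argument in those papers: back-propagation through time multiplies the gradient by the recurrent Jacobian at every step, so with a spectral norm below 1 it vanishes exponentially and above 1 it explodes. A linear recurrence is used here to keep the sketch short; the constants are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.standard_normal((4, 4))
    W_small = 0.9 * W / np.linalg.norm(W, 2)  # spectral norm 0.9
    W_large = 1.1 * W / np.linalg.norm(W, 2)  # spectral norm 1.1

    grad_small = grad_large = np.eye(4)
    for t in range(1, 51):
        # d h_T / d h_0 for a linear RNN is W multiplied in T times
        grad_small = grad_small @ W_small
        grad_large = grad_large @ W_large
        if t % 10 == 0:
            print(t, np.linalg.norm(grad_small), np.linalg.norm(grad_large))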

The TensorFlow RNN tutorial above implements Zaremba et al. (2014), linked below;
read the paper before looking at the implementation:

https://arxiv.org/abs/1409.2329
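
The main regularization idea in Zaremba et al. is to apply dropout only to the non-recurrent (layer-to-layer) connections, so the memory carried along the time axis is not corrupted at every step. Below is a rough numpy sketch of that idea, using plain tanh cells instead of the paper's LSTM cells to keep it short.

    import numpy as np

    def dropout(x, keep_prob, rng):
        # Inverted dropout: drop units with probability 1 - keep_prob, rescale the rest.
        mask = rng.random(x.shape) < keep_prob
        return np.where(mask, x / keep_prob, 0.0)

    def stacked_rnn_step(x_t, hs_prev, Ws_xh, Ws_hh, bs, keep_prob, rng):
        # One time step of a stacked RNN with dropout on the vertical
        # connections only; the recurrent h_{t-1} -> h_t path is untouched.
        hs_next = []
        inp = dropout(x_t, keep_prob, rng)               # input -> first layer: dropped
        for h_prev, W_xh, W_hh, b in zip(hs_prev, Ws_xh, Ws_hh, bs):
            h = np.tanh(W_xh @ inp + W_hh @ h_prev + b)  # recurrent path: no dropout
            hs_next.append(h)
            inp = dropout(h, keep_prob, rng)             # layer l -> layer l+1: dropped
        return hs_next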

The following is an LSTM implementation using deeplearning4j:

https://deeplearning4j.org/kr/lstm