Long short-term memory (LSTM) is an artificial recurrent neural network (RNN) architecture used in the field of deep learning.
LSTM networks are well suited to classifying, processing, and making predictions based on time series data.
LSTM networks have internal contextual state cells that act as long-term or short-term memory. The output of the LSTM network is modulated by the state of these cells. This is a very important property when we need the prediction of the neural network to depend on the historical context of inputs, rather than only on the very last input.
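As a minimal sketch of this state-carrying behaviour (assuming PyTorch as the framework; the layer sizes are arbitrary), an LSTM cell passes its hidden state and cell state forward from one time step to the next, so the output at each step reflects the whole history of inputs, not just the current one:

```python
import torch
import torch.nn as nn

# One LSTM cell processing a 1-dimensional input at each time step.
lstm_cell = nn.LSTMCell(input_size=1, hidden_size=8)

h = torch.zeros(1, 8)  # hidden (short-term) state
c = torch.zeros(1, 8)  # cell (long-term) state

sequence = torch.tensor([[8.0], [9.0], [10.0]])
for x in sequence:
    # Each step's output depends on the current input AND the accumulated state.
    h, c = lstm_cell(x.unsqueeze(0), (h, c))

print(h)  # the final hidden state summarises the whole sequence, not just "10"
```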
Consider that we want to predict the next number in the sequence 8->9->10->?. Now compare this with the sequence 6->8->10->?.
Although in both cases the last input is the number 10, the predicted outcome should be different once we take into account the contextual information of the previous values and not only the last one.
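The sketch below (again assuming PyTorch; the model size, scaling, and training settings are illustrative assumptions) shows a small LSTM learning different continuations for the two sequences, 11 for 8->9->10 and 12 for 6->8->10, even though both end in 10:

```python
import torch
import torch.nn as nn

class NextValue(nn.Module):
    def __init__(self, hidden=16):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        out, _ = self.lstm(x)          # out: (batch, seq_len, hidden)
        return self.head(out[:, -1])   # predict from the final time step

# Two sequences with the same last input but different expected continuations.
x = torch.tensor([[[8.0], [9.0], [10.0]],
                  [[6.0], [8.0], [10.0]]]) / 10.0   # simple rescaling
y = torch.tensor([[11.0], [12.0]]) / 10.0

model = NextValue()
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

for _ in range(500):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(model(x) * 10.0)  # close to [[11.], [12.]]: the context changes the prediction
```

Because the LSTM's state accumulates information from the earlier values, the two inputs reach the final step with different internal states, which is what allows the predictions to differ.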