In this post I will share an experiment on time series prediction with LSTM and Keras. An LSTM neural network is used in this experiment to predict multiple steps ahead on stock price data. The experiment is based on the paper [1]. The authors of the paper examine the independent value prediction approach, in which a separate model is built for each prediction step. This approach helps to avoid the error accumulation problem that arises with multi-stage step prediction.
Following this approach, I decided to use a Long Short-Term Memory network, or LSTM network, for daily stock price prediction. LSTM is a type of recurrent neural network used in deep learning, and LSTMs have been used to advance the state of the art on many difficult problems.
For this time series prediction I selected the number of steps to predict ahead = 3 and built 3 LSTM models with Keras in Python. For each model I used a different variable (fit0, fit1, fit2) to avoid any "memory leakage" between models.
The model initialization code is the same for all 3 models except for the parameters (the number of neurons in the LSTM layer).
The architecture of the system is shown in the figure below.
Here we have 3 LSTM models that get the same X input data but different target Y data. The target data is shifted by the number of steps: if a model is forecasting the stock price for day 2, then Y is shifted by 2 elements.
This happens in the following line when i=1:
yt_ = yt.shift(i - 1)
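To illustrate the idea of building a separate target per step, here is a minimal sketch (not the post's exact code; it uses a hypothetical price series and pandas shift with a negative offset so that each row is paired with a future price):

```python
import pandas as pd

# Hypothetical closing prices for 6 days
prices = pd.Series([10.0, 11.0, 12.0, 13.0, 14.0, 15.0])

# For model i (predicting i+1 steps ahead), shift the target
# so that row t is paired with the price at t + i + 1.
targets = {i: prices.shift(-(i + 1)) for i in range(3)}

# Row 0 of each target: the price 1, 2 and 3 steps ahead of day 0
print([targets[i].iloc[0] for i in range(3)])  # [11.0, 12.0, 13.0]
```

Rows at the end of each shifted series become NaN and would be dropped before training.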
The data are daily stock prices obtained from the Internet.
The number of units was chosen by running several variants and comparing MSE, as follows:
if i == 0:
    units = 20
    batch_size = 1
if i == 1:
    units = 15
    batch_size = 1
if i == 2:
    units = 80
    batch_size = 1
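The selection itself can be sketched as a simple search over candidate unit counts, keeping the one with the lowest test MSE. Here evaluate_units and its MSE values are hypothetical stand-ins for training and evaluating a model with that many units:

```python
def evaluate_units(units):
    # Hypothetical stand-in: in practice this would build, train and
    # evaluate an LSTM with the given number of units and return test MSE.
    mse_by_units = {10: 0.021, 15: 0.026, 20: 0.016, 40: 0.019, 80: 0.018}
    return mse_by_units[units]

candidates = [10, 15, 20, 40, 80]
# Pick the candidate whose evaluation gives the lowest MSE
best_units = min(candidates, key=evaluate_units)
print(best_units)  # 20
```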
If you want to run more than 3 steps / models, you will need to add parameters to the above code. Additionally, you will need to add model initialization code like that shown below.
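Rather than duplicating the initialization code by hand, the per-step settings can be kept in one table and the models held in a list. This is only a sketch; build_model is a hypothetical stand-in for the actual Sequential/LSTM construction:

```python
# Per-step hyperparameters; extend this dict to add more steps / models.
step_params = {0: {'units': 20, 'batch_size': 1},
               1: {'units': 15, 'batch_size': 1},
               2: {'units': 80, 'batch_size': 1}}

def build_model(units):
    # Hypothetical stand-in for building a Keras model;
    # here it just records the configuration.
    return {'units': units}

# One independent model per prediction step
models = [build_model(step_params[i]['units']) for i in sorted(step_params)]
print(len(models), models[2]['units'])  # 3 80
```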
Each LSTM network was constructed as follows:
if i == 0:
    fit0 = Sequential()
    fit0.add(LSTM(units, activation='tanh',
                  recurrent_activation='hard_sigmoid',
                  input_shape=(len(cols), 1)))
    fit0.add(Dropout(0.2))
    fit0.add(Dense(1, activation='linear'))
    fit0.compile(loss='mean_squared_error', optimizer='adam')
    fit0.fit(x_train, y_train, batch_size=batch_size, epochs=25, shuffle=False)
    train_mse[i] = fit0.evaluate(x_train, y_train, batch_size=batch_size)
    test_mse[i] = fit0.evaluate(x_test, y_test, batch_size=batch_size)
    pred = fit0.predict(x_test)
    pred = scaler_y.inverse_transform(np.array(pred).reshape((len(pred), 1)))
    # saving the full predictions is done only for i == 0
    for j in range(len(pred)):
        prediction_data[j] = pred[j]
For each model, the code saves the last forecasted value.
Additionally, at step i=0 the predicted data is saved for comparison with the actual data:
prediction_data = np.asarray(prediction_data)
prediction_data = prediction_data.ravel()
# shift back by one step
for j in range(len(prediction_data) - 1):
    prediction_data[len(prediction_data) - j - 1] = prediction_data[len(prediction_data) - j - 2]
# combine prediction data from the first model and the last predicted value from each model
prediction_data = np.append(prediction_data, forecast)
The full Python source code for time series prediction with LSTM is shown here.
The data can be found here.
The LSTM neural network achieved the following performance:
train_mse = [0.01846262458218137, 0.009637593373373323, 0.0018845983509225203]
test_mse = [0.01648362025879952, 0.026161141224167357, 0.01774421124347165]
Below is the graph of the actual data vs. the predicted test data, including the last 3 forecasted stock prices (one from each model).
The prediction accuracy, calculated on the last 3 stock prices (one from each model), was 98%.
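The post does not state the accuracy formula. One common choice, used here purely as an assumption, is 100 * (1 - mean absolute percentage error) over the last 3 prices; the actual and predicted values below are hypothetical:

```python
import numpy as np

# Hypothetical actual and predicted last-3 prices (illustration only)
actual = np.array([100.0, 102.0, 101.0])
predicted = np.array([98.0, 104.0, 100.0])

# Accuracy as 100 * (1 - MAPE)
mape = np.mean(np.abs((actual - predicted) / actual))
accuracy = 100 * (1 - mape)
print(round(accuracy, 1))  # 98.3
```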
The experiment confirmed that using separate models (one model for each step) for multistep-ahead time series prediction has advantages. With this method we can tune the parameters of the LSTM for each step independently. For example, the number of neurons for i=2 was modified to decrease the prediction error for that step, and it did not affect the predictions for the other steps. This is one of the machine learning techniques for stock prediction described in the references below.
1. Multistep-ahead Time Series Prediction
2. LSTM: A Search Space Odyssey
3. Deep Time Series Forecasting with Python: An Intuitive Introduction to Deep Learning for Applied Time Series Modeling