MATLAB: Predict future values of a PRNN


My data set consists of 4 input variables and 1 target variable with 5 categories. My input and target variables are the same length (~2000 values each). I followed the tutorials for the Pattern Recognition Neural Network and then used my data in the network with okay results. I would like to use the network to predict the next 5-20 values, but I'm having trouble. When I use the 'sim' function, I get a matrix the size of my target (5x2200) instead of the desired size of my prediction (5x20). How do I correct this?
load data.mat % Contains 'inputs' & 'targets' variables
% Create a Pattern Recognition Network
hiddenLayerSize = 17;
net = patternnet(hiddenLayerSize);
% Set up Division of Data for Training, Validation, Testing
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Train the Network
[net,tr] = train(net,inputs,targets);
% Test the Network
outputs = net(inputs);
errors = gsubtract(targets,outputs);
performance = perform(net,targets,outputs);
% Predict next 20 values
y = sim(net,inputs);
Any help is greatly appreciated!

Best Answer

  • Explanation for the output you’re getting:
    Your patternnet learns a mapping between the inputs and the targets. When you call sim, you are predicting a value for every data point in inputs. Since inputs is, I presume, of size 4x2200, your predictions are of size 5x2200: the network produces one 5-element output for every 4-element input column.
    The requirement:
    What you want is for the network to predict the future values. You haven’t mentioned whether there is temporal information in your dataset. That is, you haven’t mentioned whether these data points are sequential, in some sense, and have enough information among them to predict future target values. I am going to assume they do. There are, at least, two ways to achieve this.
    The first way is to train your network to predict the k-th future target, which can be done with the following statement for some constant 'k' (note the column indexing, since inputs and targets are matrices with one column per sample):
    [net, tr] = train(net, inputs(:, 1:end-k), targets(:, 1+k:end));
    In this approach, you're not learning or using any temporal relation among the data points. The network only learns a mapping between the input at a given time instant and the target k steps into the future. This has the shortcoming that you'll have to train a separate network for each value of 'k' if you're constructing your networks with patternnet. You could improve on this by constructing an equivalent network using fullyConnectedLayer objects of appropriate sizes. That has the advantage that you can drop the need for one-hot target vectors. You could then replace the targets with vectors of length 5-20 containing the categorical target information for the next 5-20 time steps. Here's an example to help you understand how fullyConnectedLayer is used:
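    As a starting point, here is a minimal sketch of a patternnet-equivalent built from layers, trained to predict the target k steps ahead. It assumes your inputs are 4xN, your targets are 5xN one-hot columns, and a recent release (featureInputLayer needs R2020b or later); the layer sizes and training options are illustrative, not tuned:
    ```matlab
    k = 5;  % how many steps ahead to predict

    layers = [
        featureInputLayer(4)        % 4 input variables
        fullyConnectedLayer(17)     % same hidden size as the patternnet
        tanhLayer                   % patternnet's default hidden transfer fcn is tansig
        fullyConnectedLayer(5)      % one unit per category
        softmaxLayer
        classificationLayer];

    % One row per observation; labels shifted k steps into the future.
    % vec2ind converts the one-hot columns to class indices 1..5.
    X = inputs(:, 1:end-k)';
    Y = categorical(vec2ind(targets(:, 1+k:end)))';

    opts = trainingOptions('adam', 'MaxEpochs', 50, 'Plots', 'none');
    net2 = trainNetwork(X, Y, layers, opts);

    % Predict the class k steps after the last observed input:
    futureClass = classify(net2, inputs(:, end)');
    ```
    To predict several horizons at once rather than a single k, you would widen the output (or train one such network per horizon); the sketch above shows the single-horizon case only.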
    The second approach is to use an LSTM. This is a type of network that can learn relations between consecutive samples. It can be set up like the previous approach, except it also exploits the temporal information: you train the network to map the input sequence to the categorical targets some number of steps in the future. Here's a simple example you can model your network on:
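    The following is a minimal sequence-to-sequence sketch under the same assumptions as before (inputs 4xN, targets 5xN one-hot); the hidden size, shift k, and training options are placeholders to adapt to your data:
    ```matlab
    k = 5;            % predict the class k steps ahead
    numHidden = 50;   % illustrative hidden size

    layers = [
        sequenceInputLayer(4)                       % 4 features per time step
        lstmLayer(numHidden, 'OutputMode', 'sequence')
        fullyConnectedLayer(5)                      % 5 categories
        softmaxLayer
        classificationLayer];

    % Single training sequence: inputs up to time t, labels from time t+k.
    Xseq = inputs(:, 1:end-k);
    Yseq = categorical(vec2ind(targets(:, 1+k:end)));

    opts = trainingOptions('adam', 'MaxEpochs', 60, 'Plots', 'none');
    lstmNet = trainNetwork(Xseq, Yseq, layers, opts);

    % Classify the shifted sequence; the last entries are the predictions
    % for the time steps beyond your observed data.
    futureClasses = classify(lstmNet, inputs);
    ```
    Because the LSTM carries state across time steps, a single trained network can be run forward to produce the 5-20 future predictions you're after, rather than needing one network per horizon.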