# MATLAB: Strange neural network output

Tags: algebraic representation, Deep Learning Toolbox, neural network

Hi, I am trying to use the Neural Network Toolbox, but I am having trouble calculating the output of a network. Let me explain my problem: I have defined a very simple ANN with one hidden layer and linear activation functions. So if I have an input x, I expect the output of the hidden layer to be
h = w1 * x + b1
where w1 are the input weights and b1 the biases. Then I expect my output to be
o = w2 * h + b2
where w2 are the weights between the hidden layer and the output and b2 the biases.
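In MATLAB terms, what I expect is roughly the following (a sketch; it assumes x is a single column vector, and the names W1, b1, W2, b2, h, o are just for illustration):

```matlab
% Manual forward pass with purelin (identity) activations everywhere.
W1 = net.IW{1,1};   b1 = net.b{1};     % input -> hidden weights/biases
W2 = net.LW{2,1};   b2 = net.b{2};     % hidden -> output weights/biases
h = W1*x + b1;                         % hidden-layer output
o = W2*h + b2;                         % network output
```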
Now the problem is that if I do
o = net(x)
this doesn't happen. Here is my code:
```matlab
net = feedforwardnet([layer1], 'traincgp');
net = configure(net, Dtrain, Dtrain);
net.trainParam.epochs = 0;
net.IW{1,1} = weights12;
net.LW{2,1} = my_weights;
net.b{1} = bias12;
for ii = 1:size(net.layers, 1)
    net.layers{ii}.transferFcn = 'purelin';
end
net = train(net, Dtrain, Dtrain);
```
As you can see, I am training for 0 epochs, since this is just a test, and I am using Dtrain both as input and as target since I am training an autoencoder. As I said, the problem is that if I calculate the output as I wrote above I get one result, while if I do
output = net(input)
I get another one. What should I do to have the same result?

•  Just modify the following:

```matlab
close all, clear all, clc, tic
[ x, t ] = simplefit_dataset;
[ I, N ] = size(x), [ O, N ] = size(t)
net = fitnet;
net.input.processFcns  = { 'removeconstantrows' };
net.output.processFcns = { 'removeconstantrows' };
rng('default')
net   = train(net, x, t);
NMSE1 = mse(t - net(x))/var(t)   %   1.7057e-05
IW = net.IW{1,1}                 % [ 10  1 ]
b1 = net.b{1}                    % [ 10  1 ]
b2 = net.b{2}                    % [  1  1 ]
LW = net.LW{2,1}                 % [  1 10 ]
B1 = repmat(b1, I, N)            % [ 10 94 ]
B2 = repmat(b2, O, N)            % [  1 94 ]
y = B2 + LW*tanh(B1 + IW*x);     % [  1 94 ]
NMSE2 = mse(t - y)/var(t)        %   1.7057e-05
```
I will let you figure out how to handle:

1. The default mapminmax normalization
2. Multiple inputs and outputs
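On point 1, here is a sketch of how the default mapminmax processing could be handled by hand. It assumes a freshly trained fitnet with its default processFcns, and that 'mapminmax' sits last in both processing lists (so its settings are at the end of processSettings) — check your own net's processFcns order before relying on the `{end}` index:

```matlab
% Sketch: reproduce net(x) by hand when mapminmax is left enabled.
net = fitnet;                   % default processFcns include 'mapminmax'
net = train(net, x, t);
inps  = net.inputs{1}.processSettings{end};    % input mapminmax settings
outps = net.outputs{end}.processSettings{end}; % output mapminmax settings
xn = mapminmax('apply', x, inps);              % normalize the input
yn = net.b{2} + net.LW{2,1}*tanh(net.IW{1,1}*xn + net.b{1}); % implicit expansion (R2016b+)
y  = mapminmax('reverse', yn, outps);          % denormalize the output
```

Alternatively, clearing the processing functions before training (e.g. net.inputs{1}.processFcns = {} and net.outputs{end}.processFcns = {}) removes the normalization entirely, so net(x) matches the plain affine algebra directly.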