MATLAB: The neural network is giving the same output for all inputs… do you have any idea why?

Deep Learning Toolbox, mlff, network

net = network(8,3,[1;1;1],[1 1 1 1 1 1 1 1;0 0 0 0 0 0 0 0;0 0 0 0 0 0 0 0],[0 0 0;1 0 0;0 1 0],[0 0 1]);
net.layers{1}.transferFcn = 'logsig';
net.layers{2}.transferFcn = 'logsig';
net.layers{3}.transferFcn = 'logsig';
net.layers{2}.dimensions  = 10;
net.trainFcn = 'traingd';
net.trainparam.min_grad = 0.00001;
net.trainparam.epochs   = 10000;
net.trainparam.lr       = 0.3;
net.trainparam.goal     = 0.0001;
net = init(net);
net.layers{1}.initFcn   = 'initwb';
net.layers{2}.initFcn   = 'initwb';
net.biases{1,1}.initFcn = 'rands';
net.biases{2,1}.initFcn = 'rands';

i = load('input.txt');
t = load('target.txt');
i = i';
t = t';
in = zeros(8,53);   % normalized input
tn = zeros(1,53);   % normalized target

for r = 1:8                        % normalization of input
    min = i(r,1);
    max = i(r,1);
    for c = 2:53
        if i(r,c) < min
            min = i(r,c);
        end
        if i(r,c) > max
            max = i(r,c);
        end
    end
    for c = 1:53
        in(r,c) = 0.1 + (0.8*(i(r,c)-min)/(max-min));
    end
end

min = t(1);                        % normalization of target
max = t(1);
for c = 2:53
    if t(1,c) < min
        min = t(1,c);
    end
    if t(1,c) > max
        max = t(1,c);
    end
end
for c = 1:53
    tn(1,c) = 0.1 + (0.8*(t(1,c)-min)/(max-min));
end

net.divideFcn = 'divideblock';
net.divideParam.trainRatio = 0.85;
net.divideParam.valRatio   = 0.05;
net.divideParam.testRatio  = 0.1;
net.performFcn = 'mse';
[net,tr] = train(net,in,tn);
y = sim(net,in);
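For reference, the same row-wise scaling into [0.1, 0.9] can be written with the toolbox function mapminmax instead of the manual loops; a minimal sketch, assuming i and t are the transposed matrices loaded above:

% Sketch only: mapminmax rescales each row of its input to [ymin, ymax].
% Assumes i is 8x53 and t is 1x53, as in the code above.
in = mapminmax(i, 0.1, 0.9);   % row-wise min-max scaling of the inputs
tn = mapminmax(t, 0.1, 0.9);   % same scaling applied to the target row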

Best Answer

    1. There is no reason to use more than one hidden layer.
    2. You have created a net with 8 one-dimensional inputs instead of one 8-dimensional input.
    3. After creating a net, view it using the command
       view(net)
    4. Why not just use fitnet? (A minimal sketch follows below.)
       help fitnet
    5. After you rewrite your code, you can test it on the 8-input/1-output chemical_dataset if you want to post further questions.

    Hope this helps.

    Thank you for formally accepting my answer.

    Greg
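Following points 4 and 5, here is a minimal sketch of the fitnet approach on chemical_dataset, using one hidden layer of 10 neurons and the same divideblock ratios as in the question; everything else is left at the toolbox defaults:

% Sketch only: fitnet with defaults, on the toolbox example data.
[x, t] = chemical_dataset;            % 8-input / 1-output example data
net = fitnet(10);                     % one hidden layer with 10 neurons
net.divideFcn = 'divideblock';        % same data division as the question
net.divideParam.trainRatio = 0.85;
net.divideParam.valRatio   = 0.05;
net.divideParam.testRatio  = 0.10;
[net, tr] = train(net, x, t);
y    = net(x);                        % network outputs
perf = mse(net, t, y);                % mean squared error
view(net)

By default, fitnet normalizes inputs and targets with mapminmax and uses Nguyen-Widrow initialization, so the manual normalization, init, and traingd settings in the question are not needed here.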