I have been using ga to minimize a fitness function, but I have recently run into an issue with the output. After running for several hours (the fitness function is computationally expensive), ga returns an x vector along with an fval, which should be the fitness function evaluated at that x. However, when I pass the x vector into the fitness function manually, it returns a value different from fval.
My ga setup is the following:
%Constraints and function
rng default
FitFcn = @FullDevice;
nvars = 3;
lb = [1,20,5];
ub = [5,100,10];
IntCon = [1,2];

%Optimization algorithm
[x,fval,exitflag] = ga(FitFcn,nvars,[],[],[],[],lb,ub,[],IntCon,[])
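For reference, here is a minimal self-contained sketch of the same call with a simple quadratic standing in for FullDevice (quadFcn is just a placeholder I made up, not my real fitness function); for a deterministic fitness like this, I would expect the returned fval to match a manual re-evaluation exactly:

rng default
quadFcn = @(v) (v(1)-2)^2 + (v(2)-50)^2 + (v(3)-7)^2;   % placeholder fitness, not FullDevice
[xq,fq] = ga(quadFcn,3,[],[],[],[],[1,20,5],[5,100,10],[],[1,2]);
quadFcn(xq) - fq   % should be exactly 0 for a deterministic fitness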
As an example, here are the x and fval outputs from my most recent optimization:
x =

    3.0000   20.0000    5.0333

fval =

 -199.1417
>> FullDevice([3 20 5.0333])

ans =

 -189.8933
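In case it matters, this is how I would compare against the exact returned vector rather than the four-decimal display (assuming FullDevice is deterministic):

format long
disp(x)                      % full precision of the point ga actually returned
disp(FullDevice(x) - fval)   % difference when re-evaluating at the exact x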
Any thoughts as to why this is happening? Thanks!