I am trying to implement Newton's method to find the minima of the Himmelblau function.
The code works most of the time, but in cases like this, where my initial guess is (0.5, 1), it returns a critical point of the function that is not a minimum. I understand this is because the gradient becomes 0 at any stationary point, so no new points are generated.
Now my question is: is this behaviour normal for this method? Is there a way to get around this problem?
Thanks for any help.
close all; clear; clc

% Initialisation of variables to use
x0 = [0.5; 1];
tol = 1e-4;
maxits = 50;

% Himmelblau function
him = @(x,y) (x.^2 + y - 11).^2 + (x + y.^2 - 7).^2;

% Gradient of the Himmelblau function
grad_him = @(x,y) [4*x.^3 + 4*x.*y - 42*x + 2*y.^2 - 14;
                   4*y.^3 + 4*x.*y - 26*y + 2*x.^2 - 22];

% Hessian matrix of the Himmelblau function
hessian_him = @(x,y) [12*x.^2 + 4*y - 42, 4*x + 4*y;
                      4*x + 4*y,          12*y.^2 + 4*x - 26];

% Call Newton's method and display the results accordingly
[r, iters, flag] = newton_min(grad_him, hessian_him, x0, tol, maxits);
fprintf("Newton's method\n\n");
switch (flag)
    case 0
        fprintf("There was a convergence on f\n\n");
        fprintf("The minimum found is:\n");
        disp(r);
        fprintf("It took %d iterations.\n\n", iters);
    case 1
        fprintf("There was a convergence on x\n\n");
        fprintf("The minimum found is:\n");
        disp(r);
        fprintf("It took %d iterations.\n\n", iters);
    otherwise
        fprintf("There was no convergence\n\n");
end

function [r, iters, flag] = newton_min(dg, ddg, x0, tol, maxits)
    x = x0(1); y = x0(2);
    r = NaN;
    flag = -1;
    for iters = 1:maxits
        x_old = [x; y];
        % Newton step: solve H * step = grad, then move against it
        x_new = x_old - (ddg(x,y) \ dg(x,y));
        if norm(dg(x,y)) < tol
            flag = 0;
            r = x_new;
            return;
        end
        if norm(x_new - x_old) <= (tol + eps*norm(x_new))
            flag = 1;
            r = x_new;
            return;
        end
        x = x_new(1);
        y = x_new(2);
    end
end
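To show what I mean, here is a standalone check of the same iteration, written as a Python/NumPy sketch (not my actual code, just the same gradient and Hessian formulas): starting from (0.5, 1), Newton's method applied to the gradient converges to a stationary point, and the eigenvalues of the Hessian there show it is not a minimum.

```python
import numpy as np

# Gradient of the Himmelblau function (same formulas as the MATLAB code above)
def grad(p):
    x, y = p
    return np.array([4*x**3 + 4*x*y - 42*x + 2*y**2 - 14,
                     4*y**3 + 4*x*y - 26*y + 2*x**2 - 22])

# Hessian of the Himmelblau function
def hess(p):
    x, y = p
    return np.array([[12*x**2 + 4*y - 42, 4*x + 4*y],
                     [4*x + 4*y,          12*y**2 + 4*x - 26]])

def newton(p, tol=1e-4, maxits=50):
    # Plain Newton on the gradient: it converges to *any* nearby
    # stationary point, with no preference for minima
    for _ in range(maxits):
        p = p - np.linalg.solve(hess(p), grad(p))
        if np.linalg.norm(grad(p)) < tol:
            break
    return p

p = newton(np.array([0.5, 1.0]))
eigs = np.linalg.eigvalsh(hess(p))
print("converged to:", p)
# A negative Hessian eigenvalue means this stationary point is not a minimum
print("Hessian eigenvalues:", eigs)
```

The gradient norm at the returned point is below the tolerance, yet the Hessian has a negative eigenvalue there, which is exactly the behaviour I am asking about.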