Abstract: | Many numerical methods have been developed and modified to solve nonlinear least squares (NLS) problems as unconstrained optimization problems. One challenge with the existing numerical methods for NLS problems is the expensive computation of the Hessian matrix at every iteration. Hence, most numerical methods employ a truncated Hessian, which may not be a good approximation of the Hessian matrix when the residuals are large. In this paper, the approximate greatest descent (AGD) method is proposed to solve NLS problems. The algorithm of the AGD method, which uses the full Hessian matrix, is constructed in a logical, rational, and geometrical way. Numerical differentiation is employed to compute the derivatives of the function. The convergence analysis of the AGD method follows the Lyapunov function theorem, whereby the monotonic decreasing property and properly nested level sets of the objective function ensure its convergence. Coincidentally, the iterative equation of the AGD method resembles that of the Levenberg-Marquardt (LM) method, even though the two methods are derived differently. Numerical experiments have shown that both the AGD and LM methods produce stable trajectories. The AGD method, on the other hand, is more efficient, reliable, and robust because convergence is achieved with fewer iterations in less time. © 2022 IEEE.
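For context, the abstract states that the AGD iteration resembles the Levenberg-Marquardt update; a minimal sketch of the classical LM step for an NLS problem is given below. This is not the AGD or LM formulation from the paper itself; the function names, the damping parameter `mu`, and the test problem are illustrative assumptions.

```python
import numpy as np

def lm_step(residual, jacobian, x, mu):
    """One Levenberg-Marquardt-style update for min 0.5 * ||r(x)||^2.

    Solves (J^T J + mu * I) dx = -J^T r and returns x + dx.
    (Illustrative sketch; the paper's AGD update is only said to
    resemble this form.)
    """
    r = residual(x)                      # residual vector r(x)
    J = jacobian(x)                      # Jacobian of r at x
    A = J.T @ J + mu * np.eye(x.size)    # damped Gauss-Newton matrix
    g = J.T @ r                          # gradient of 0.5 * ||r||^2
    dx = np.linalg.solve(A, -g)          # damped step
    return x + dx

# Hypothetical usage on a small least-squares fit of y ~ a * exp(b * t):
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

def residual(p):
    a, b = p
    return a * np.exp(b * t) - y

def jacobian(p):
    a, b = p
    e = np.exp(b * t)
    return np.column_stack([e, a * t * e])

x = np.array([1.0, -1.0])
for _ in range(20):
    x = lm_step(residual, jacobian, x, mu=1e-3)
print(x)  # should approach [2.0, -1.5]
```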