Nonlinear Least Squares Problems Using Approximate Greatest Descent Method

Many numerical methods have been developed and modified to solve nonlinear least squares (NLS) problems as unconstrained optimization problems. One of the challenges for existing numerical methods in solving NLS problems is the expensive computation of the Hessian matrix at every iteration. Hence, most numerical methods employ a truncated Hessian, which may not be a good approximation of the Hessian matrix when the residuals are large. In this paper, the approximate greatest descent (AGD) method is proposed to solve NLS problems. The algorithm of the AGD method, which uses the full Hessian matrix, is constructed in a logical, rational, and geometrical way. Numerical differentiation is employed to calculate the derivatives of the objective function. The convergence analysis of the AGD method follows the Lyapunov function theorem, whereby the monotonic decreasing property and properly nested level sets of the objective function ensure convergence. Coincidentally, the iterative equation of the AGD method resembles that of the Levenberg-Marquardt (LM) method, even though the two methods are derived differently. Numerical experiments have shown that both the AGD and LM methods produce stable trajectories. The AGD method, on the other hand, is more efficient, reliable, and robust because convergence can be achieved with fewer iterations in less time. © 2022 IEEE.
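For context, the abstract's terms can be made concrete with the standard NLS formulation (textbook material, not taken from the paper itself). With residual vector $r(x) \in \mathbb{R}^m$ and Jacobian $J(x)$, the objective, gradient, and Hessian are

\[ f(x) = \tfrac{1}{2}\, r(x)^{\top} r(x), \qquad \nabla f(x) = J(x)^{\top} r(x), \]
\[ \nabla^{2} f(x) = J(x)^{\top} J(x) + \sum_{i=1}^{m} r_i(x)\, \nabla^{2} r_i(x). \]

The truncated Hessian is the Gauss-Newton term $J^{\top} J$ alone; the dropped sum scales with the residuals, which is why the truncation degrades when residuals are large. The LM iteration the abstract compares against is

\[ x_{k+1} = x_k - \bigl(J_k^{\top} J_k + \mu_k I\bigr)^{-1} J_k^{\top} r_k, \]

so an AGD-style step with the full Hessian $H_k$ would plausibly read $x_{k+1} = x_k - (H_k + \mu_k I)^{-1} \nabla f(x_k)$; the paper's exact update and its rule for the relaxation parameter $\mu_k$ are not reproduced here.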

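Below is a minimal numerical sketch of such a damped full-Hessian iteration, with derivatives obtained by finite differences as the abstract describes. Everything in it (the Rosenbrock-style test residuals, the fixed damping value mu, the function names) is an illustrative assumption, not the paper's algorithm:

import numpy as np

def residuals(x):
    # Rosenbrock in least-squares form; a stand-in test problem,
    # not one of the paper's experiments.
    return np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])

def objective(x):
    r = residuals(x)
    return 0.5 * r @ r

def num_gradient(f, x, h=1e-6):
    # Central-difference gradient (the abstract states derivatives are
    # obtained by numerical differentiation).
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def num_hessian(f, x, h=1e-4):
    # Full Hessian via finite differences of the gradient; this is the
    # per-iteration cost the abstract calls expensive.
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        H[:, i] = (num_gradient(f, x + e) - num_gradient(f, x - e)) / (2.0 * h)
    return 0.5 * (H + H.T)  # symmetrize away finite-difference noise

def damped_newton_nls(x0, mu=1.0, tol=1e-6, max_iter=200):
    # LM-style damped iteration with the FULL numerical Hessian:
    # x_{k+1} = x_k - (H_k + mu*I)^{-1} g_k.  A generic sketch of the
    # structure described in the abstract; the AGD rule for choosing mu
    # (and its two-phase scheme) is not reproduced here.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = num_gradient(objective, x)
        if np.linalg.norm(g) < tol:
            break
        H = num_hessian(objective, x)
        x = x - np.linalg.solve(H + mu * np.eye(len(x)), g)
    return x

print(damped_newton_nls([-1.2, 1.0]))  # typically lands near the minimizer [1, 1]

With the Gauss-Newton matrix J^T J substituted for H, the same loop becomes a fixed-damping LM iteration, which is the structural resemblance the abstract points out.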

Bibliographic Details
Published in: Proceedings - 2022 International Conference on Computer and Drone Applications, IConDA 2022
Authors: Ling Eu C.N.; Harno H.G.; Lim K.H.
Format: Conference paper
Language: English
Publisher: Institute of Electrical and Electronics Engineers Inc., 2022
DOI: 10.1109/ICONDA56696.2022.10000382
Scopus ID: 2-s2.0-85146657825
Online access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85146657825&doi=10.1109%2fICONDA56696.2022.10000382&partnerID=40&md5=54db9497ee051201347ecb910204daab