Our study explores new variants of the structured quasi-Newton (SQN) method with a secant-like diagonal approximation (SLDA) of the second-order term of the Hessian for solving nonlinear least squares (NLS) problems. In addition, an accelerated version of SQN-SLDA, referred to as ASQN-SLDA, is also considered. In ASQN-SLDA, we rescale the search direction after the first backtracking linesearch procedure to produce a more aggressive step and thereby a larger reduction in the objective function value. The proposed methods are conceptually simple and easy to implement. We prove that the proposed methods are globally convergent under appropriate assumptions and report several numerical experiments on a suite of benchmark NLS problems. The numerical results show that ASQN-SLDA is more robust than some baseline methods, including SQN-SLDA, the generalized Gauss–Newton (GN) method, the Levenberg–Marquardt method, and the hybrid GN with Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, also called H-GN-BFGS. In terms of computing time, ASQN-SLDA outperforms H-GN-BFGS on most test problems, and the speedup becomes more significant as the problem size increases. Furthermore, owing to the trade-off between the number of iterations for convergence and the overhead of the acceleration step in ASQN-SLDA, the benefit of acceleration is most evident for the largest problems.
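To make the idea concrete, the sketch below illustrates one possible reading of the iteration described above; it is a minimal illustrative sketch, not the authors' exact algorithm. It assumes a model Hessian of the form J^T J + D with a diagonal D updated from a componentwise secant-like condition built from consecutive Jacobians, and it interprets the acceleration step as rescaling the step accepted by the first backtracking linesearch by a factor gamma whenever that further reduces the objective. The function names (sqn_slda, backtracking), the parameter gamma, the diagonal update formula, and the safeguards are all assumptions made for illustration only.

```python
# Illustrative sketch (not the paper's exact method) of a structured
# quasi-Newton iteration for nonlinear least squares f(x) = 0.5*||r(x)||^2,
# where a diagonal matrix D approximates the second-order part of the
# Hessian, H(x) ~= J^T J + D, and an optional "acceleration" rescales the
# accepted step after the first backtracking linesearch.
import numpy as np

def backtracking(f, x, p, g, alpha0=1.0, rho=0.5, c=1e-4, max_iter=30):
    """Standard Armijo backtracking linesearch along a descent direction p."""
    alpha, fx = alpha0, f(x)
    for _ in range(max_iter):
        if f(x + alpha * p) <= fx + c * alpha * g.dot(p):
            return alpha
        alpha *= rho
    return alpha

def sqn_slda(r, J, x0, tol=1e-8, max_iter=200, accelerate=False, gamma=2.0):
    """Structured quasi-Newton sketch with a secant-like diagonal term.

    Assumptions (illustrative, not from the paper): D is updated
    componentwise from z_k = (J_{k+1} - J_k)^T r_{k+1} and the step
    s_k = x_{k+1} - x_k, with safeguards; 'accelerate' multiplies the
    accepted step by gamma and keeps it only if the objective decreases.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    D = np.zeros(n)                       # diagonal approx. of 2nd-order term
    f = lambda z: 0.5 * np.dot(r(z), r(z))
    Jx, rx = J(x), r(x)
    for _ in range(max_iter):
        g = Jx.T @ rx                     # gradient of f
        if np.linalg.norm(g) < tol:
            break
        B = Jx.T @ Jx + np.diag(np.maximum(D, 0.0))     # keep model PSD
        p = np.linalg.solve(B + 1e-10 * np.eye(n), -g)  # search direction
        alpha = backtracking(f, x, p, g)
        step = alpha * p
        if accelerate and f(x + gamma * step) < f(x + step):
            step = gamma * step           # more aggressive rescaled step
        x_new = x + step
        J_new, r_new = J(x_new), r(x_new)
        # Componentwise secant-like diagonal update (illustrative choice).
        z = (J_new - Jx).T @ r_new
        s = x_new - x
        safe = np.abs(s) > 1e-12
        D = np.where(safe, z / np.where(safe, s, 1.0), D)
        D = np.clip(D, -1e6, 1e6)
        x, Jx, rx = x_new, J_new, r_new
    return x

if __name__ == "__main__":
    # Toy exponential-fit NLS problem: recover (2.0, -1.5).
    t = np.linspace(0.0, 1.0, 20)
    y = 2.0 * np.exp(-1.5 * t)
    r = lambda x: x[0] * np.exp(x[1] * t) - y
    J = lambda x: np.column_stack([np.exp(x[1] * t),
                                   x[0] * t * np.exp(x[1] * t)])
    print(sqn_slda(r, J, x0=[1.0, -1.0], accelerate=True))
```

The design choice illustrated here is that the acceleration adds only one extra objective evaluation per iteration, which matches the trade-off noted above: the overhead is small, so the gain from fewer iterations shows up most clearly on larger problems.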