Article - Journal
442 (2024) Pages: 115718
Journal: Journal of Computational and Applied Mathematics

Our study focuses on exploring new variants of the structured quasi-Newton (SQN) method with a secant-like diagonal approximation (SLDA) of the second-order term of the Hessian for solving nonlinear least squares (NLS) problems. In addition, an accelerated version of SQN-SLDA, referred to as ASQN-SLDA, is also considered. In ASQN-SLDA, we rescale the search direction after the first backtracking line-search procedure to produce a more aggressive step and thereby obtain a larger reduction in the objective function value. The concept of the proposed methods is simple and easy to implement. We prove that the proposed methods are globally convergent under some appropriate assumptions and report several numerical experiments based on a suite of benchmark problems for NLS. The numerical results show that ASQN-SLDA is more robust than some baseline methods, including SQN-SLDA, the generalized Gauss–Newton (GN) method, the Levenberg–Marquardt update, and the hybrid GN with Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, also called H-GN-BFGS. In computing time, ASQN-SLDA outperforms H-GN-BFGS on most test problems, and the speedup becomes more significant as the problem size increases. Furthermore, due to the trade-off between the number of iterations for convergence and the overhead incurred in ASQN-SLDA, the benefit of the acceleration step is most evident for the largest problems.
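To make the ingredients of the abstract concrete, below is a minimal Python sketch of a structured quasi-Newton iteration for min f(x) = 0.5*||r(x)||^2, in which the second-order term S(x) = sum_i r_i(x) * Hess r_i(x) is modeled by a diagonal matrix updated from a secant-like condition, and an optional acceleration branch rescales the direction found by the first backtracking line search. The function names, the componentwise secant ratio, the rescaling factor gamma, and the damping/clipping safeguards are illustrative assumptions; they do not reproduce the exact SQN-SLDA/ASQN-SLDA formulas of the paper.

```python
import numpy as np

def backtracking(f, x, d, g, alpha0=1.0, rho=0.5, c=1e-4, max_steps=50):
    """Standard Armijo backtracking line search."""
    alpha, fx = alpha0, f(x)
    for _ in range(max_steps):
        if f(x + alpha * d) <= fx + c * alpha * g.dot(d):
            break
        alpha *= rho
    return alpha

def sqn_slda_sketch(r, J, x0, tol=1e-8, max_iter=200, accelerate=False, gamma=2.0):
    """Hypothetical sketch of a structured quasi-Newton iteration for
    min f(x) = 0.5*||r(x)||^2. The Hessian J^T J + S(x) is modeled with a
    diagonal, secant-like approximation D of the second-order term S(x);
    the exact SLDA update of the paper is not reproduced here."""
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    D = np.zeros(n)                                  # diagonal model of S(x)
    f = lambda z: 0.5 * np.dot(r(z), r(z))
    for _ in range(max_iter):
        rx, Jx = r(x), J(x)
        g = Jx.T @ rx                                # gradient of f
        if np.linalg.norm(g) < tol:
            break
        # structured model: J^T J + diag(D), kept positive definite by clipping
        B = Jx.T @ Jx + np.diag(np.maximum(D, 0.0)) + 1e-10 * np.eye(n)
        d = np.linalg.solve(B, -g)
        alpha = backtracking(f, x, d, g)
        if accelerate:
            # "aggressive" variant: rescale the step produced by the first
            # line search, keeping it only if it reduces f further
            if f(x + gamma * alpha * d) < f(x + alpha * d):
                alpha *= gamma
        s = alpha * d
        x_new = x + s
        # secant-like information for the second-order term:
        # (J(x_new) - J(x))^T r(x_new) ~ S(x_new) s
        y_hat = (J(x_new) - Jx).T @ r(x_new)
        with np.errstate(divide="ignore", invalid="ignore"):
            D = np.where(np.abs(s) > 1e-12, y_hat / s, D)  # componentwise ratio
        x = x_new
    return x

# toy usage on a Rosenbrock-type residual (for illustration only)
r = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
J = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])
x_hat = sqn_slda_sketch(r, J, np.array([-1.2, 1.0]), accelerate=True)
```

In this sketch the rescaled step in the accelerate branch is accepted only when it decreases f beyond the line-search step, which is one simple way to keep the more aggressive step from undoing the line-search guarantee; the paper's actual acceleration rule may differ.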

Other articles
Vol. 54, No. 5 (2018) Pages: 72-76
1 (2020) Pages: 1-4
Journal: Journal of the Operations Research Society of China
 

