A Class of Second-Order Convergent Algorithms for the One-Dimensional Optimization Problem
Abstract
Newton's method plays an important role in optimization. It is an iterative method with quadratic convergence; however, it requires the second-order derivative of the objective function. Building on prior work, this paper derives a class of methods that exploits extra information from an additional evaluation point. These methods use only the first-order derivative, contain an adjustable parameter, and achieve the same convergence rate as Newton's method.
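The paper's exact scheme is not reproduced here; the following is a minimal illustrative sketch of the general idea, assuming a Steffensen-type iteration on f' in which the second derivative is replaced by a finite-difference surrogate built from one extra evaluation point, with an adjustable parameter t controlling the auxiliary step.

```python
# Illustrative sketch (not the paper's exact scheme): a Newton-like
# iteration that uses only first derivatives. The second derivative is
# approximated from one extra evaluation of f', and t is a free parameter.

def minimize_1d(fprime, x0, t=1.0, tol=1e-10, max_iter=50):
    """Find a stationary point of f using only f' (hypothetical helper)."""
    x = x0
    for _ in range(max_iter):
        g = fprime(x)
        if abs(g) < tol:
            break
        h = t * g                          # auxiliary step; t is adjustable
        second = (fprime(x + h) - g) / h   # surrogate for f''(x)
        x = x - g / second                 # Newton-like update
    return x

# Example: f(x) = (x - 3)^2 + 1, so f'(x) = 2(x - 3); minimizer at x = 3
x_star = minimize_1d(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Because the auxiliary step h shrinks with the gradient, the difference quotient approaches f''(x) near the solution, which is what yields the quadratic convergence rate despite never evaluating a second derivative directly.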