
Newton's method

In numerical analysis, Newton's method, also known as the Newton–Raphson method, named after Isaac Newton and Joseph Raphson, is a root-finding algorithm which produces successively better approximations to the roots (or zeroes) of a real-valued function. The most basic version starts with a single-variable function f defined for a real variable x, the function's derivative f′, and an initial guess x0 for a root of f. If the function satisfies sufficient assumptions and the initial guess is close, then

    x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}

is a better approximation of the root than x0. Geometrically, (x1, 0) is the intersection of the x-axis and the tangent to the graph of f at (x0, f(x0)): that is, the improved guess is the unique root of the linear approximation of f at the initial point. The process is repeated as

    x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}

until a sufficiently precise value is reached. This algorithm is first in the class of Householder's methods, and is succeeded by Halley's method. The method can also be extended to complex functions and to systems of equations.

The idea is to start with an initial guess which is reasonably close to the true root, then to approximate the function by its tangent line using calculus, and finally to compute the x-intercept of this tangent line by elementary algebra. This x-intercept will typically be a better approximation to the original function's root than the first guess, and the method can be iterated.
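To make the iteration concrete, here is a minimal Python sketch of the basic method (this code is illustrative and not from the original article; the names newton, f, and df and the tolerance are choices made here):

    def newton(f, df, x0, tol=1e-12, max_iter=50):
        """Approximate a root of f by Newton's method, given the
        derivative df and an initial guess x0."""
        x = x0
        for _ in range(max_iter):
            dfx = df(x)
            if dfx == 0:
                raise ZeroDivisionError("f'(x) = 0: Newton step undefined")
            step = f(x) / dfx
            x -= step                  # x_{n+1} = x_n - f(x_n)/f'(x_n)
            if abs(step) < tol:        # sufficiently precise value reached
                return x
        raise RuntimeError("no convergence within max_iter iterations")

    # Example: the positive root of f(x) = x^2 - 2, i.e. sqrt(2).
    print(newton(lambda x: x*x - 2, lambda x: 2*x, 1.0))  # 1.4142135623730951

The stopping test uses the size of the last Newton step as a proxy for the distance to the root, which is a common (though not the only) convergence criterion.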
More formally, suppose f : (a, b) → ℝ is a differentiable function defined on the interval (a, b) with values in the real numbers ℝ, and that we have some current approximation xn. Then we can derive the formula for a better approximation, xn+1, as follows. The equation of the tangent line to the curve y = f(x) at x = xn is

    y = f'(x_n)(x - x_n) + f(x_n),

where f′ denotes the derivative. The x-intercept of this line (the value of x which makes y = 0) is taken as the next approximation, xn+1, to the root, so that the equation of the tangent line is satisfied when (x, y) = (x_{n+1}, 0):

    0 = f'(x_n)(x_{n+1} - x_n) + f(x_n).

Solving for xn+1 gives

    x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.

We start the process with some arbitrary initial value x0. (The closer to the zero, the better. But, in the absence of any intuition about where the zero might lie, a "guess and check" method might narrow the possibilities to a reasonably small interval by appealing to the intermediate value theorem.) The method will usually converge, provided this initial guess is close enough to the unknown zero and f′(x0) ≠ 0. Furthermore, for a zero of multiplicity 1, the convergence is at least quadratic (see rate of convergence) in a neighbourhood of the zero, which intuitively means that the number of correct digits roughly doubles in every step. More details can be found in the analysis section below. Householder's methods are similar but have higher order for even faster convergence. However, the extra computations required for each step can slow down the overall performance relative to Newton's method, particularly if f or its derivatives are computationally expensive to evaluate.

Analysis: proof of quadratic convergence

Suppose f has a continuous second derivative and α is a root of f with f′(α) ≠ 0. Expanding f in a Taylor series about the current iterate xn and evaluating at α gives

    f(\alpha) = f(x_n) + f'(x_n)(\alpha - x_n) + R_1,    (1)

where R_1 = \frac{1}{2} f''(\xi_n)(\alpha - x_n)^2 is the Lagrange form of the remainder, for some ξn between xn and α. Since α is a root, f(α) = 0, and (1) becomes

    0 = f(\alpha) = f(x_n) + f'(x_n)(\alpha - x_n) + \frac{1}{2} f''(\xi_n)(\alpha - x_n)^2.    (2)

Dividing by f′(xn), which is nonzero in a neighbourhood of α, and rearranging gives

    \frac{f(x_n)}{f'(x_n)} + (\alpha - x_n) = \frac{-f''(\xi_n)}{2 f'(x_n)} (\alpha - x_n)^2.    (3)

By the definition of the iteration,

    x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)},    (4)

the left-hand side of (3) is exactly α − xn+1. Writing εn = α − xn for the error at step n, this yields

    \varepsilon_{n+1} = \frac{-f''(\xi_n)}{2 f'(x_n)} \cdot \varepsilon_n^2,    (5)

and, taking absolute values,

    |\varepsilon_{n+1}| = \frac{|f''(\xi_n)|}{2 |f'(x_n)|} \cdot \varepsilon_n^2.    (6)

Thus the error at each step is proportional to the square of the error at the previous step, which is the quadratic convergence stated above.
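The doubling of correct digits predicted by equations (5) and (6) is easy to observe numerically. The following Python sketch (again illustrative, using only the standard-library decimal module) iterates on f(x) = x² − 2 from x0 = 1 and prints the error εn = α − xn at each step:

    from decimal import Decimal, getcontext

    getcontext().prec = 60             # enough digits to watch the error square
    alpha = Decimal(2).sqrt()          # the true root, sqrt(2)
    x = Decimal(1)                     # initial guess x0
    for n in range(6):
        print(f"n={n}  error = {alpha - x:.3E}")
        x = x - (x*x - 2) / (2*x)      # Newton step for f(x) = x^2 - 2

Each printed error is roughly a fixed multiple of the square of the previous one, consistent with equation (5): for this f, the factor −f″(ξn)/(2f′(xn)) tends to −1/(2√2) near the root.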

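Finally, the extension to systems of equations mentioned above replaces the division by f′ with the solution of a linear system involving the Jacobian matrix. A minimal sketch, assuming NumPy is available (the names newton_system, F, and J are illustrative):

    import numpy as np

    def newton_system(F, J, x0, tol=1e-12, max_iter=50):
        """Newton's method for F(x) = 0 with F: R^n -> R^n and Jacobian J.
        Each step solves J(x_n) d = -F(x_n) and sets x_{n+1} = x_n + d,
        the multivariate analogue of x_{n+1} = x_n - f(x_n)/f'(x_n)."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            d = np.linalg.solve(J(x), -F(x))
            x = x + d
            if np.linalg.norm(d) < tol:
                return x
        raise RuntimeError("no convergence within max_iter iterations")

    # Example: intersect the circle x^2 + y^2 = 4 with the line y = x.
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 4, v[1] - v[0]])
    J = lambda v: np.array([[2*v[0], 2*v[1]], [-1.0, 1.0]])
    print(newton_system(F, J, [1.0, 2.0]))  # approx [1.41421356 1.41421356]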
[ "Convergence (routing)", "Nonlinear system", "newton direction", "third order method", "cubic convergence", "Kantorovich theorem", "Gauss–Newton algorithm" ]