Next: Levenberg-Marquardt Method Up: Nonlinear Chi Square Fits Previous: Steepest Descent

## Newton's Method

The minimization condition can be converted into the problem of solving a nonlinear system by requiring that the derivative of $\chi^2$ with respect to each of the parameters vanish. We have

$$\frac{\partial \chi^2(a)}{\partial a_k} = 0, \qquad k = 1, \ldots, M. \qquad (33)$$

We can try to solve this nonlinear system by using Newton's method. This method starts with a trial value $a$ for the parameter vector and then seeks the vector change $\delta a$ that improves the trial value. Expanding the condition (33) to first order in $\delta a$, we seek the solution to

$$\frac{\partial \chi^2(a)}{\partial a_k} + \sum_{\ell} \frac{\partial^2 \chi^2(a)}{\partial a_k \, \partial a_\ell} \, \delta a_\ell = 0. \qquad (34)$$

Thus the change $\delta a$ is found by solving the linear system

$$\sum_{\ell} \alpha_{k\ell} \, \delta a_\ell = \beta_k, \qquad (35)$$

where

$$\beta_k = -\frac{1}{2} \frac{\partial \chi^2(a)}{\partial a_k} \qquad (36)$$

and

$$\alpha_{k\ell} = \frac{1}{2} \frac{\partial^2 \chi^2(a)}{\partial a_k \, \partial a_\ell}. \qquad (37)$$

This linear system is actually the same as the one we had to solve for the linear least squares problem. The connection can be made more explicit by realizing that a Taylor expansion of $\chi^2$ in small shifts $\delta a$ about the vector $a$ is just

$$\chi^2(a + \delta a) = \chi^2(a) - 2 \sum_k \beta_k \, \delta a_k + \sum_{k\ell} \alpha_{k\ell} \, \delta a_k \, \delta a_\ell \qquad (38)$$

to second order in $\delta a$. Thus the Hessian matrix of $\chi^2$ is just twice $\alpha$, as before. However, in the case of a linear least squares problem, the Taylor series was exact, and solving the linear system once led directly to the optimum value of the parameter vector. In the present case, solving the linear system is only one step in the Newton iteration, which is supposed to lead us to the solution after a number of steps. To construct the components of the vector $\beta$ and the matrix $\alpha$, we must be able to evaluate the first and second partial derivatives of the fitting function with respect to the fitting parameters. Then, to proceed with the Newton iteration, we must solve the linear system each time we take a step.
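The iteration above can be sketched in code. The following is a minimal illustration, not the notes' own program: it assumes a hypothetical two-parameter model $f(x; a) = a_0 e^{a_1 x}$ with hand-coded first and second derivatives, builds $\beta$ and $\alpha$ from Eqs. (36)–(37), and repeatedly solves the linear system (35).

```python
import numpy as np

# Hypothetical model f(x; a) = a0 * exp(a1 * x); any model with analytic
# first and second parameter derivatives would work the same way.
def model(x, a):
    return a[0] * np.exp(a[1] * x)

def jac(x, a):
    # df/da_k, shape (npoints, 2)
    e = np.exp(a[1] * x)
    return np.stack([e, a[0] * x * e], axis=1)

def hess(x, a):
    # d2f/(da_k da_l), shape (npoints, 2, 2); d2f/da0^2 = 0 for this model
    e = np.exp(a[1] * x)
    h = np.zeros((x.size, 2, 2))
    h[:, 0, 1] = h[:, 1, 0] = x * e
    h[:, 1, 1] = a[0] * x**2 * e
    return h

def newton_fit(x, y, sigma, a, steps=20, tol=1e-10):
    """Newton iteration for the chi-square minimum, Eqs. (35)-(37)."""
    for _ in range(steps):
        r = (y - model(x, a)) / sigma**2          # weighted residuals
        J = jac(x, a)
        beta = J.T @ r                            # Eq. (36)
        # Eq. (37): sum_i [J_ik J_il - (y_i - f_i) d2f_i] / sigma_i^2
        alpha = (J.T / sigma**2) @ J - np.einsum('i,ikl->kl', r, hess(x, a))
        da = np.linalg.solve(alpha, beta)         # one Newton step, Eq. (35)
        a = a + da
        if np.max(np.abs(da)) < tol:
            break
    return a, alpha

# Demonstration on noiseless synthetic data with a nearby starting guess
x = np.linspace(0.0, 2.0, 20)
a_true = np.array([2.0, -1.0])
y = model(x, a_true)
a_fit, alpha_min = newton_fit(x, y, np.ones_like(x), np.array([1.8, -0.9]))
```

Because the data here lie exactly on the model, the residuals vanish at the minimum and the iteration converges rapidly to the true parameters.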

The error in the $j$th fitted parameter is found from the diagonal element of the matrix $C = \alpha^{-1}$ at the minimum of $\chi^2$:

$$\sigma_{a_j} = \sqrt{C_{jj}}. \qquad (39)$$
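As a small sketch of Eq. (39): given the matrix $\alpha$ evaluated at the minimum (the numbers below are illustrative, not from real data), the one-sigma parameter errors are the square roots of the diagonal of $C = \alpha^{-1}$.

```python
import numpy as np

# Illustrative curvature matrix alpha at the chi-square minimum
alpha = np.array([[40.0, 10.0],
                  [10.0, 30.0]])

C = np.linalg.inv(alpha)          # covariance matrix C = alpha^{-1}
errors = np.sqrt(np.diag(C))      # sigma_{a_j} = sqrt(C_jj), Eq. (39)
```

Note that the off-diagonal elements of $C$ carry the correlations between the fitted parameters, so quoting only the diagonal errors discards that information.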

Carleton DeTar 2009-11-23