Newton's Method for Functions of Several Variables By Nick Bulinski and Justin Gilmore



Page 1:

Newton's Method for Functions of Several Variables

By Nick Bulinski and Justin Gilmore

Page 2:

Systems of Equations

Solving a system of equations is not all that complicated when the equations are linear, but not all equations are linear. The combination of nonlinearity and more than one equation raises the difficulty significantly.

Multivariate Newton's Method

One way to solve systems of equations in several variables is the multivariate Newton's method. This method comes from Newton's original one-variable method,

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}.$$

Newton's Method for Functions of Several Variables finds the roots of a system of nonlinear equations.
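As a quick illustration of the one-variable iteration above, here is a minimal Python sketch (not part of the original slides); the function name newton_1d, its tolerances, and the sample problem x^2 - 2 = 0 are illustrative choices rather than anything the slides specify.

```python
def newton_1d(f, df, x0, tol=1e-10, max_iter=50):
    """One-variable Newton iteration: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:       # close enough to a root
            break
        x = x - fx / df(x)      # Newton update
    return x

# Example: approximate sqrt(2) as the positive root of f(x) = x^2 - 2.
print(newton_1d(lambda x: x**2 - 2, lambda x: 2*x, x0=1.0))  # ~1.4142135
```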

Page 3:

Multivariate Newton’s Method (cont)

Newton's one-variable method provides an outline for how the multivariable case will work. Both are derived from the linear approximation given by the Taylor expansion. Before we get to that, however, we need to define a few terms.

We will also need to take the derivative of the vector function F(v). For that we will need to compute what is known as the Jacobian matrix.
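For reference, this is the one-variable version of that linear approximation; the multivariable derivation on the later slides follows the same pattern:

$$f(x) \approx f(x_0) + f'(x_0)\,(x - x_0), \qquad 0 = f(x_0) + f'(x_0)\,(x_1 - x_0) \;\Longrightarrow\; x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}.$$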

Page 4:

Jacobian Matrix

The Jacobian matrix is an analog to the derivative of f in the one-variable case. It is defined as the matrix of all first partial derivatives of a vector function F(v) = (f_1(v), ..., f_m(v)), with v = (x_1, ..., x_n), s.t.

$$DF(v) = \begin{bmatrix} \dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n} \end{bmatrix}.$$

ex: for a system of two equations in the two variables x and y, this is the 2 × 2 matrix used in the example below, $DF(v) = \begin{bmatrix} 2x & 2y \\ -2 & 1 \end{bmatrix}$.
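A small Python sketch (not from the slides) of coding such an F and its Jacobian by hand. The specific system F(x, y) = (x^2 + y^2 - 1, y - 2x) is an assumption, chosen only so that its Jacobian matches the DF(v) shown on the Example slide.

```python
import numpy as np

def F(v):
    """Assumed example system: F(x, y) = (x^2 + y^2 - 1, y - 2x)."""
    x, y = v
    return np.array([x**2 + y**2 - 1.0,
                     y - 2.0 * x])

def DF(v):
    """Jacobian of F: the matrix of all first partial derivatives."""
    x, y = v
    return np.array([[2.0 * x, 2.0 * y],   # df1/dx, df1/dy
                     [-2.0,    1.0]])      # df2/dx, df2/dy
```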

Page 5:

Putting it all Together

Now we have all the pieces we need to make the Taylor expansion about the current guess v_k:

$$F(v) \approx F(v_k) + DF(v_k)\,(v - v_k),$$

or, evaluated at a root r where F(r) = 0,

$$0 \approx F(v_k) + DF(v_k)\,(r - v_k).$$

We then derive the algorithm by solving the second equation for r, which gives the iteration

$$v_{k+1} = v_k - DF(v_k)^{-1} F(v_k) \qquad \text{for } k = 0, 1, 2, \ldots$$
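Here is a minimal Python sketch of that iteration (not part of the original slides); the function name multivariate_newton, the tolerance, and the iteration cap are illustrative choices. In practice the linear system DF(v) s = F(v) is solved directly rather than forming the inverse.

```python
import numpy as np

def multivariate_newton(F, DF, v0, tol=1e-8, max_iter=50):
    """Iterate v_{k+1} = v_k - DF(v_k)^{-1} F(v_k) until F(v_k) is small."""
    v = np.asarray(v0, dtype=float)
    for _ in range(max_iter):
        Fv = F(v)
        if np.linalg.norm(Fv) < tol:        # converged
            break
        step = np.linalg.solve(DF(v), Fv)   # solve DF(v) * step = F(v)
        v = v - step                        # Newton update
    return v
```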

Page 6:

Example

$$DF(v) = \begin{bmatrix} 2x & 2y \\ -2 & 1 \end{bmatrix}, \qquad DF(v)^{-1} = \begin{bmatrix} \dfrac{1}{2x+4y} & \dfrac{-2y}{2x+4y} \\[6pt] \dfrac{2}{2x+4y} & \dfrac{2x}{2x+4y} \end{bmatrix}$$

$$v_1 = \begin{bmatrix} 0.785714 \\ 1.56143 \end{bmatrix}, \qquad v_2 = \begin{bmatrix} 0.52013 \\ 1.04026 \end{bmatrix}, \qquad v_3 = \begin{bmatrix} 0.4523013 \\ 0.904649 \end{bmatrix}$$
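Putting the earlier sketches together, the run below shows the flavor of this example; the system F and the starting guess are assumptions (the slide gives only DF, its inverse, and the iterates v_1, v_2, v_3), so the printed numbers are not claimed to reproduce the slide's values exactly.

```python
import numpy as np

# Assumed system, chosen to be consistent with the Jacobian DF(v) shown above.
F  = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[1] - 2.0 * v[0]])
DF = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [-2.0, 1.0]])

v = np.array([2.0, 2.0])   # assumed starting guess v_0
for k in range(1, 4):
    v = v - np.linalg.solve(DF(v), F(v))   # one Newton step
    print(f"v{k} =", v)
```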

Page 7:

Advantages & Disadvantages

Advantages
• Different solutions can be found with a different starting guess
• Fast convergence

Disadvantages
• Will only work if the Jacobian can be computed
• If the Jacobian is singular, the algorithm breaks
• The number of iterations cannot be determined before the algorithm begins
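As an illustration of the last two points, a solver can guard against a singular Jacobian and cap the iteration count; this is a minimal sketch, not something the slides describe.

```python
import numpy as np

def guarded_newton_step(F, DF, v):
    """Attempt one Newton step, reporting failure if the Jacobian is singular."""
    J = DF(v)
    try:
        step = np.linalg.solve(J, F(v))   # raises LinAlgError for a singular J
    except np.linalg.LinAlgError:
        raise RuntimeError(f"Jacobian is singular at v = {v}; Newton's method breaks down.")
    return v - step
```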