Finding Roots of Equations
“Numerical Methods with MATLAB”, Recktenwald, Chapter 6, and
“Numerical Methods for Engineers”, Chapra and Canale, 5th Ed., Part Two, Chapters 5, 6 and 7 and
“Applied Numerical Methods with MATLAB”, Chapra, 2nd Ed., Part Two, Chapters 5 and 6
PGE 310: Formulation and Solution in Geosystems Engineering
Dr. Balhoff
Martini and Wine Glass
Volume of a spherical cap:
V_cap = (πh/6)(3a² + h²)
Use the Pythagorean Theorem, (R − h)² + a² = R², so a² = 2Rh − h²:
V_cap = (πh²/3)(3R − h)
Martini glass filled to 6 cm depth contains ≈ 216π/3 cm³ of wine. How deep must you fill the wine glass to ensure that it contains the same amount of wine?
Substitute into the formula (R = 7):
f(h) = 7πh² − (π/3)h³ − 216π/3 = 0
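As a quick sanity check on f(h), a short sketch (in Python for illustration; the course codes are in MATLAB) evaluates the function and confirms a sign change, so the required depth lies between 3 and 4 cm:

```python
import math

def f(h):
    # f(h) = 7*pi*h**2 - (pi/3)*h**3 - 216*pi/3, i.e. V_cap(h; R = 7) minus 72*pi
    return 7 * math.pi * h**2 - (math.pi / 3) * h**3 - 216 * math.pi / 3

# f changes sign between h = 3 and h = 4, so the root (required depth) is in between
print(f(3), f(4))
```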
Peng-Robinson Equation of State
Ideal gas law: PV = RT (V = molar volume)
Many gases are not “ideal”: reservoirs have high temperatures
and high pressures
Peng-Robinson accounts for non-ideality
At P = 50 bar and T = 473 K…
P = RT/(V − b) − a/(V² + 2bV − b²)

a = 0.45724 R²Tc²/Pc = 2.3×10⁶
b = 0.0778 R·Tc/Pc = 24.7

f(V) = 39325/(V − 24.7) − 2.3×10⁶/(V² + 49.4V − 611) − 50 = 0
Methane
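A quick numeric check of f(V) (a Python sketch for illustration; units are assumed consistent, with RT = 39325 in bar·cm³/mol at 473 K) shows a sign change just below the ideal-gas estimate V = RT/P = 786.5, so a root exists there:

```python
def f(V):
    # f(V) = RT/(V - b) - a/(V**2 + 2*b*V - b**2) - P with the slide's numbers:
    # RT = 39325, a = 2.3e6, b = 24.7 (so 2b = 49.4, b**2 ≈ 611), P = 50
    return 39325 / (V - 24.7) - 2.3e6 / (V**2 + 49.4 * V - 611) - 50

# Ideal-gas estimate: V = RT/P = 39325/50 = 786.5; f changes sign nearby
print(f(700), f(790))
```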
Falling Parachutist
Newton’s Law:
F = ma = m(dv/dt)
Force Balance:
F_D + F_U = m(dv/dt)
Plugging in gravity and drag:
mg − cv = m(dv/dt)
Integrate:
v(t) = (gm/c)(1 − e^(−(c/m)t))
What is “c” if: g = 9.8 m/s², m = 68.1 kg, v = 40 m/s, t = 10 s?
f(c) = (667.38/c)(1 − e^(−0.146843c)) − 40 = 0
Roots of Nonlinear Equations: f(x) = 0
Find the root of the quadratic equation
Many problems aren’t so simple:
Martini and wine glass
Peng-Robinson Equation of State
Falling Parachutist
Use a “numerical” method to solve
Approximate technique
f(x) = ax² + bx + c = 0;  x = (−b ± √(b² − 4ac)) / (2a)
no “analytical” solution!!!
Root Finding Techniques
Bracketing Methods:
Graphical
Bisection
False Position
Open Methods:
Fixed Point Iteration
Newton-Raphson
Secant Method
Polynomials:
Muller’s Method
Bairstow’s Method
f(h) = 7πh² − (π/3)h³ − 216π/3 = 0
f(V) = 39325/(V − 24.7) − 2.3×10⁶/(V² + 49.4V − 611) − 50 = 0
f(c) = (667.38/c)(1 − e^(−0.146843c)) − 40 = 0
Graphical Methods
• Find “c”, but how?
• Maybe I could just plot points?
c = 4;  f(c) = 34.115
c = 8;  f(c) = 17.653
c = 12; f(c) = 6.067
c = 16; f(c) = −2.269
c = 20; f(c) = −8.401
f(c) = (667.38/c)(1 − e^(−0.146843c)) − 40 = 0
Nonlinear Equation for drag coefficient
[Plot of f(c) vs. c; the root is where the curve crosses zero]
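The tabulated points can be reproduced directly (a Python sketch for illustration; the course uses MATLAB):

```python
import math

def f(c):
    # Drag-coefficient equation: f(c) = (667.38/c)*(1 - exp(-0.146843*c)) - 40
    return (667.38 / c) * (1 - math.exp(-0.146843 * c)) - 40

# Evaluate at the plotted points; the sign change between 12 and 16 brackets the root
for c in (4, 8, 12, 16, 20):
    print(c, round(f(c), 3))
```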
Bisection (Interval Halving)
We saw from the graph that the root lies between 12 and 16 (12 < c < 16)
Why did f(c) go from positive to negative?
The value of “c” that gives f(c) = 0 had to be between those numbers
Theorem: If f(x) is real & continuous in the interval xL to xU and f(xL) and f(xU) have opposite signs, then there is at least one root between xL and xU
In English: If “f(x)” goes from positive to negative, it must have been zero somewhere in the middle
f(xL)·f(xU) < 0: means a positive number multiplied by a negative number
[Plot: f(c) > 0 on one side of the root, f(c) < 0 on the other]
1. Bracket the root between 12 and 16
2. Bisect in half: x = (12 + 16)/2 = 14
3. Is the root between 14 and 16? Yes.
4. Bisect in half: x = (14 + 16)/2 = 15
5. Is the root between 15 and 16? No.
Bisection Method
Bisection Method: basic algorithm
1. Choose xL and xU so that root is bracketed: f(xL)*f(xU) < 0
2. Divide “x” in half xr = (xL + xU)/2
3. Calculate f(xr): evaluate function at new point
4. Determine if f(xr)* f(xU) < 0 OR if f(xr)* f(xL) < 0
5. Replace either xU or xL with xr
6. Repeat iterative strategy
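The six steps above can be sketched directly (in Python for illustration; the drag-coefficient f(c) from the parachutist example is used as the test function):

```python
import math

def f(c):
    # Drag-coefficient equation from the parachutist example
    return (667.38 / c) * (1 - math.exp(-0.146843 * c)) - 40

def bisect(f, xl, xu, tol=1e-8, maxit=100):
    """Bisection: repeatedly halve the bracket [xl, xu] until it is smaller than tol."""
    assert f(xl) * f(xu) < 0, "step 1: root must be bracketed"
    for _ in range(maxit):
        xr = (xl + xu) / 2          # step 2: bisect
        if f(xl) * f(xr) < 0:       # steps 3-4: which half holds the root?
            xu = xr                 # step 5: replace the upper bound
        else:
            xl = xr                 # ... or replace the lower bound
        if xu - xl < tol:           # step 6: repeat until the bracket is tiny
            break
    return (xl + xu) / 2

print(bisect(f, 12, 16))  # ≈ 14.78, the drag coefficient
```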
False-Position Method
Bisection is “brute-force” and inefficient
No account is taken of the magnitudes of f(xU) and f(xL): if f(xU) is closer to zero than f(xL), xU is probably closer to the root
Replace the curve with a straight line to give a “false position”
• Line creates “similar triangles”
• Need a formula to find the x-intercept
• Sounds like a geometry problem
f(xL)/(xr − xL) = f(xU)/(xr − xU)
xr = xU − f(xU)(xL − xU) / (f(xL) − f(xU))
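The same problem can be solved with the false-position update (a Python sketch, again using the parachutist's f(c) as the test function):

```python
import math

def f(c):
    return (667.38 / c) * (1 - math.exp(-0.146843 * c)) - 40

def false_position(f, xl, xu, tol=1e-10, maxit=100):
    """False position: the x-intercept of the line through the bracket endpoints."""
    xr = xu
    for _ in range(maxit):
        xr_old = xr
        # "Similar triangles" formula for the straight-line x-intercept
        xr = xu - f(xu) * (xl - xu) / (f(xl) - f(xu))
        if f(xl) * f(xr) < 0:       # keep the half-bracket that contains the root
            xu = xr
        else:
            xl = xr
        if abs(xr - xr_old) < tol:
            break
    return xr

print(false_position(f, 12, 16))  # ≈ 14.78, same root as bisection but fewer iterations
```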
Convergence
Bisection guaranteed to converge
“Brute force”
May converge slowly/oscillate
False position also guaranteed to converge
Converges linearly and hopefully quicker
There are pitfalls
xr = xU − f(xU)(xL − xU) / (f(xL) − f(xU))   vs.   xr = (xL + xU)/2
Pitfalls of False Position
While false position WILL converge to the root eventually…
…here is a case where it does so very slowly
xr = xU − f(xU)(xL − xU) / (f(xL) − f(xU))
Summary of Bracketing Methods
Bracketing methods guarantee convergence!
Graphical can be difficult for accurate answers
Bisection is simple and straightforward
Bounds the root
Bisects and creates new bounds
False Position takes advantage of the magnitude of f(x)
Uses similar triangles to find new bounds
Converges linearly and usually more efficient
Next time… Open Methods
Only require 1 guess
Usually faster, but convergence not guaranteed
Fixed Point Iteration
Re-arrange equation to get x = g(x)
4x² − x + 3 = 0  →  x = 4x² + 3
sin(x) = 0  →  x = sin(x) + x
So, the solution will be the intersection point between these two curves: f1(x)=x and f2(x)=g(x)
Use an “iterative” method to solve for x:
1. Guess a value of xk
2. Calculate g(xk)
3. Update xk+1 = g(xk)
4. Iteratively update “x” until the solution converges
May be several ways to rearrange the equation
Some arrangements converge faster
Some arrangements may DIVERGE
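The four steps above can be sketched in a few lines (Python for illustration; the example function e^(−x) − x = 0, rearranged to x = e^(−x), is borrowed from the Newton-Raphson convergence slide rather than the ones above):

```python
import math

def g(x):
    # Rearranged from exp(-x) - x = 0 to the fixed-point form x = exp(-x)
    return math.exp(-x)

x = 0.0                    # step 1: initial guess
for _ in range(50):        # steps 2-4: iterate x_{k+1} = g(x_k)
    x_new = g(x)
    if abs(x_new - x) < 1e-10:
        break
    x = x_new
print(x_new)  # ≈ 0.5671, the root of exp(-x) - x
```

This arrangement converges because |g′(x)| = e^(−x) < 1 near the root, which is exactly the criterion on the next slide.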
Convergence of Fixed Point Iteration : x=g(x)
Fixed Point does not always converge
Converges between x = a and x = b ONLY IF
|g′(x)| < 1 (for all a < x < b)
It means that:
g′(x) > 1 or g′(x) < −1 → Diverge
0 < g′(x) < 1 or −1 < g′(x) < 0 → Converge
Why? Because of the slopes:
g′(x) is the slope of f2(x); 1 is the slope of f1(x)
Look at the next slides
Fixed Point Iteration Convergence for Positive Slopes
[Plot: f1(x) = x and f2(x) = g(x); the 1st and 2nd tries step toward the root]
0 < g′(x) < 1: CONVERGES! Monotonically
The lower slope of f2(x) leads to convergence of the iterations.

[Plot: the 1st and 2nd tries step away from the root]
g′(x) > 1: DIVERGES!
The higher slope of f2(x) leads to divergence of the iterations.

(1 is the slope of f1(x); g′(x) is the slope of f2(x))
Fixed Point Iteration Convergence for Negative Slopes
[Plot: f1(x) = x and f2(x) = g(x); the 1st and 2nd tries spiral in around the root]
−1 < g′(x) < 0: CONVERGES! Oscillatory
The lower slope of f2(x) leads to convergence of the iterations.

[Plot: the 1st and 2nd tries spiral outward]
g′(x) < −1: DIVERGES!
The higher slope of f2(x) leads to divergence of the iterations.

(1 is the slope of f1(x); g′(x) is the slope of f2(x))
Newton’s Method (or Newton-Raphson)
Most widely used method
Often converges much faster (quadratic convergence)
Idea: Use the line tangent to the curve to find the new root
Derive from a Taylor Series
xk+1 = xk − f(xk) / f′(xk)
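The tangent-line update can be sketched as follows (Python for illustration; f(x) = e^(−x) − x is the example solved on the next slide):

```python
import math

def f(x):
    return math.exp(-x) - x

def fprime(x):
    return -math.exp(-x) - 1

def newton(f, fprime, x, tol=1e-12, maxit=50):
    """Newton-Raphson: follow the tangent line at x_k to its x-intercept."""
    for _ in range(maxit):
        dx = f(x) / fprime(x)
        x = x - dx                 # x_{k+1} = x_k - f(x_k)/f'(x_k)
        if abs(dx) < tol:
            break
    return x

print(newton(f, fprime, 0.0))  # ≈ 0.567143290409784
```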
Convergence of Newton-Raphson Method
Usually converges quadratically
Example:
Solved with 2 methods:
Newton-Raphson with x0=0
False-Position Method with xl=0 and xu=20
f(x) = e^(−x) − x  (true solution = 0.567143290409784)
Newton‐Raphson:
Iterations                          true error
x0 = 0                              100.000000000%
x1 = 0.500000000000000              11.838858282%
x2 = 0.566311003197218              0.146750782%
x3 = 0.567143165034862              0.000022106%
x4 = 0.567143290409781              0.000000000%
False‐Position:
Iterations                          true error
x0 = 0.952380952                    67.925984240%
x1 = 0.607944265065116              7.194121018%
x2 = 0.571658116501746              0.796064446%
x3 = 0.567645088312370              0.088478152%
x4 = 0.567199089558233              0.009838633%
[Error plot: the Newton-Raphson error falls along a quadratic curve]
Newton-Raphson Pitfalls
(a) Inflection point near root: f″(x) = 0 causes divergence
(b) Oscillates around a local minimum or maximum
(c) Near-zero slopes can jump to a location several roots away
(d) Zero slope is a true disaster!
• No general convergence criterion!
• Depends on nature of the function
• Best remedy is to have guess “sufficiently” close to root
• Some functions – no guess works!
The Secant Method
Derivative can be difficult or inconvenient to calculate for some functions
Approximate the derivative Remember the derivative is “slope of the line tangent to the curve”
Calculate the slope of a line that approximates the tangent line
• 2 points make a line (slope = rise/run):
f′(xk) ≈ (f(xk−1) − f(xk)) / (xk−1 − xk)
• Substitute into Newton’s formula, xk+1 = xk − f(xk)/f′(xk):
xk+1 = xk − f(xk)(xk−1 − xk) / (f(xk−1) − f(xk))
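A minimal sketch of the secant update (Python for illustration; f(x) = e^(−x) − x and the starting guesses 0 and 1 are borrowed from the earlier convergence example):

```python
import math

def f(x):
    return math.exp(-x) - x

def secant(f, x0, x1, tol=1e-10, maxit=50):
    """Secant method: approximate f' from the slope through the two latest points."""
    for _ in range(maxit):
        x2 = x1 - f(x1) * (x0 - x1) / (f(x0) - f(x1))
        x0, x1 = x1, x2            # strict sequence: the new point replaces the oldest
        if abs(x1 - x0) < tol:
            break
    return x1

print(secant(f, 0.0, 1.0))  # ≈ 0.567143290409784
```

Note the contrast with false position: no sign test is performed, so the two points may end up on the same side of the root.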
False Position versus Secant Method
False Position and Secant Method look like the same formula
What is the difference?
False Position:
xr = xU − f(xU)(xL − xU) / (f(xL) − f(xU))
replaces new guess with one of the bounds
Always “bracketed”
Secant Method:
replaces guesses in strict sequence
So xk+1 replaces xk and xk replaces xk-1
No guarantee of bracketing root
xk+1 = xk − f(xk)(xk−1 − xk) / (f(xk−1) − f(xk))
Convergence Comparison
Newton’s method converges fastest
Quadratic convergence
sometimes (like many open methods) it may fail
Bracketing Methods slower
But convergence is guaranteed
Bisection is the slowest of all
Modified Secant Method
Newton’s method is fast (quadratic convergence) but derivative may not be available
Secant method uses two points to approximate the derivative, but the approximation may be poor if the points are far apart.
Modified Secant method gives a much better approximation because it uses one point, and the derivative is estimated using another point a small distance, δxk, away
xk+1 = xk − f(xk) / f′(xk)
f′(xk) ≈ (f(xk + δxk) − f(xk)) / (δxk)
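A sketch of the modified secant update (Python for illustration; the fractional perturbation δ = 10⁻⁶ and the starting guess 1.0 are assumptions, and f(x) = e^(−x) − x is again the earlier example):

```python
import math

def f(x):
    return math.exp(-x) - x

def modified_secant(f, x, delta=1e-6, tol=1e-10, maxit=50):
    """Modified secant: perturb the single current point by delta*x to estimate f'."""
    for _ in range(maxit):
        dfdx = (f(x + delta * x) - f(x)) / (delta * x)  # one-point derivative estimate
        x_new = x - f(x) / dfdx                          # Newton-style update
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(modified_secant(f, 1.0))  # ≈ 0.567143290409784
```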
“Multiple Roots”
Several equations may have more than one root
2nd-order polynomials have 2 roots, right?
Start with an initial guess close to the root you want
But “multiple roots” refers to equations with the same root multiple times:
Problems arise with multiple roots
Bracketing methods don’t work, b/c there is no sign change
f(x) and f′(x) are both zero exactly at the root
Newton’s method becomes linearly, not quadratically, convergent… but an alternative formula is quadratic
f(x) = (x − 1)(x − 1)(x − 3) = x³ − 5x² + 7x − 3
xi+1 = xi − f(xi)·f′(xi) / ( [f′(xi)]² − f(xi)·f″(xi) )
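The alternative formula can be sketched on the example polynomial with its double root at x = 1 (Python for illustration; the starting guess 0.0 and the zero-denominator guard are assumptions):

```python
def f(x):   return x**3 - 5*x**2 + 7*x - 3   # = (x - 1)**2 * (x - 3): double root at 1
def df(x):  return 3*x**2 - 10*x + 7
def d2f(x): return 6*x - 10

def modified_newton(x, tol=1e-6, maxit=100):
    """Modified Newton for multiple roots: x - f*f' / ((f')**2 - f*f'')."""
    for _ in range(maxit):
        den = df(x)**2 - f(x) * d2f(x)
        if den == 0:               # guard: already at the root
            return x
        x_new = x - f(x) * df(x) / den
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(modified_newton(0.0))  # converges to the double root at x = 1
```

Standard Newton from the same guess also reaches x = 1, but only linearly, since f′(1) = 0.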
Matlab Built-In Functions
fzero uses a hybrid of several methods:
r = fzero(fun,x0)
r = fzero(fun,x0,options)
roots function gives multiple roots of polynomial
Consider f(x) = x² − 3x + 2 = 0
>> roots ([1 -3 2])
ans =
2
1
(fun is the root function’s “.m” file; x0 is your initial guess)
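The roots([1 -3 2]) result can be cross-checked by hand with the quadratic formula (a Python sketch, since the course codes are in MATLAB):

```python
import math

def quadratic_roots(a, b, c):
    # x = (-b ± sqrt(b**2 - 4ac)) / (2a)
    disc = math.sqrt(b**2 - 4 * a * c)
    return ((-b + disc) / (2 * a), (-b - disc) / (2 * a))

print(quadratic_roots(1, -3, 2))  # (2.0, 1.0), matching MATLAB's roots([1 -3 2])
```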
Review
Need to find roots of nonlinear equations
Martini and wine glass
Peng-Robinson EOS
Parachute Problem
Learned a few different methods
Closed – Convergence guaranteed by bracketing the root
Bisection
False Position
Open – No guaranteed convergence, but much faster
Fixed Point Iteration
Newton’s
Secant and modified secant Method
f(h) = 7πh² − (π/3)h³ − 216π/3 = 0
f(V) = 39325/(V − 24.7) − 2.3×10⁶/(V² + 49.4V − 611) − 50 = 0
f(c) = (667.38/c)(1 − e^(−0.146843c)) − 40 = 0