Root Finding
OSU/CIS 541
Roger Crawfis
Ohio State University
Bisection Method
Newton's method
Uses of root finding for sqrt() and reciprocal sqrt()
Secant Method
Generalized Newton's method for systems of nonlinear equations
The Jacobian matrix
Motivation
Many problems can be re-written into a form such as:
$$f(x, y, z, \ldots) = 0$$
$$f(x, y, z, \ldots) = g(s, q, \ldots)$$
Motivation
A root, r, of function f occurs when f(r) = 0.
For example:
$$f(x) = x^2 - 2x - 3$$
has two roots, at r = -1 and r = 3:
$$f(-1) = 1 + 2 - 3 = 0$$
$$f(3) = 9 - 6 - 3 = 0$$
A root exists at x = 1 for
$$f(x) = (x - 1)\,\sin x$$
Or,
$$f(x) = \sin(\pi x),$$
which shares its roots with $x(x - 1)(x - 2)\cdots$
Examples
Find x such that
$$x^p = c \;\Leftrightarrow\; x^p - c = 0$$
Calculate sqrt(2):
$$x^2 - 2 = 0 \;\Rightarrow\; x = \pm\sqrt{2}$$
Ballistics:
Determine the horizontal distance at which the projectile will intersect the terrain function.
Open techniques
Newton fixed-point iteration
Secant method
Fixed-point iterations
Convergence and Fractal Basins of Attraction
Bisection Method
Based on the fact that the function changes sign as it passes through the root:
f(a)*f(b) < 0
Bisection Method
[Figure: the bracket [a, b] and its midpoint c = (a+b)/2, with f(a) > 0, f(c) > 0, f(b) < 0.]
Bisection Method
Guaranteed to converge to a root if one
exists within the bracket.
[Figure: the updated bracket after setting a = c, with f(a) > 0, f(c) < 0, f(b) < 0.]
Bisection Method
Converges slowly to the root.
[Figure: the bracket shrinks again with b = c, f(b) < 0.]
Bisection Method
Simple algorithm:
// Given: a and b such that f(a)*f(b) < 0
// Given: an error tolerance, err
// Requires <math.h> for fabs()
double bisect(double (*f)(double), double a, double b, double err) {
    double c = (a + b) / 2.0;        // find the midpoint
    while (fabs(f(c)) > err) {
        if (f(a) * f(c) < 0)         // root in the left half
            b = c;
        else                         // root in the right half
            a = c;
        c = (a + b) / 2.0;           // find the new midpoint
    }
    return c;
}
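As a quick check, a sketch of driving this routine to compute sqrt(2) (the helper name f2 is an illustrative assumption):

#include <math.h>
#include <stdio.h>

double f2(double x) { return x * x - 2.0; }   // sign change on [1, 2]

int main(void) {
    // bisect() is the routine above; converges to ~1.4142135624
    printf("%.10f\n", bisect(f2, 1.0, 2.0, 1e-10));
    return 0;
}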
Relative Error
We can develop an upper bound on the
relative error quite easily.
$$\left|\frac{x - c}{x}\right| \le \frac{b - a}{|a|}, \qquad a \le x \le b$$
Absolute Error
What does this mean in binary?
$$\text{err}_0 \le |b - a|, \qquad \text{err}_{i+1} \le \frac{\text{err}_i}{2} = \frac{|b - a|}{2^{i+1}}$$
$$\frac{b - a}{2^n} \le \text{err}_{tol} \;\Rightarrow\; n \ge \log_2\!\left(\frac{b - a}{\text{err}_{tol}}\right)$$
Absolute Error
The bisection method converges linearly (first-order) to the root.
If we need an accuracy of 0.0001 and our initial interval has b - a = 1, then:
$$2^{-n} < 0.0001 \;\Rightarrow\; n = 14 \text{ iterations}$$
A Note on Functions
Functions can be simple, but we may need to evaluate them many, many times.
Or, a function can be extremely complicated. Consider:
We are interested in the configuration of air vents (position, orientation, direction of flow) that makes the temperature in the room at a particular position (the teacher's desk) equal to 72 degrees.
Is this a function?
A Note on Functions
This function may require a complex three-dimensional heat-transfer simulation, coupled with a fluid-flow simulation, to evaluate: hours of computational time on a supercomputer!
The function may not necessarily even be computational.
Root-finding techniques existed before the Babylonians.
Open techniques
Newton fixed-point iteration
Secant method
Fixed-point iterations
Convergence and Fractal Basins of Attraction
Regula Falsi
In the book, under computer problem 16 of Section 3.3.
Assume the function is linear within the bracket.
Find the intersection of this line with the x-axis.
Regula Falsi
The line through (a, f(a)) and (b, f(b)) is:
$$y(x) = f(b) + \frac{f(a) - f(b)}{a - b}\,(x - b)$$
Setting y(c) = 0:
$$0 = f(b) + \frac{f(a) - f(b)}{a - b}\,(c - b)$$
$$c = b - f(b)\,\frac{a - b}{f(a) - f(b)}$$
[Figure: the secant's x-intercept c, with f(c) < 0.]
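A minimal C sketch of this update, patterned after the bisection loop above; the slides give only the formula for c, so the loop structure is an assumption:

#include <math.h>

// Regula falsi: like bisection, but c is the secant's x-intercept,
// c = b - f(b)(a - b) / (f(a) - f(b)).
double regula_falsi(double (*f)(double), double a, double b, double err) {
    double fa = f(a), fb = f(b);               // requires fa * fb < 0
    double c  = b - fb * (a - b) / (fa - fb);
    double fc = f(c);
    while (fabs(fc) > err) {
        if (fa * fc < 0) { b = c; fb = fc; }   // root in [a, c]
        else             { a = c; fa = fc; }   // root in [c, b]
        c  = b - fb * (a - b) / (fa - fb);
        fc = f(c);
    }
    return c;
}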
Regula Falsi
Large benefit when the root is much closer to one side of the bracket.
Do I have to worry about division by zero? No: f(a) and f(b) have opposite signs, so the denominator f(a) - f(b) is never zero.
Regula Falsi
More generally, we can state this method as:
$$c = w\,a + (1 - w)\,b, \qquad w = \frac{f(b)}{f(b) - f(a)}$$
Bracketing Methods
Guaranteed to converge to a root, provided the initial bracket contains one.
Open techniques
Newton fixed-point iteration
Secant method
Fixed-point iterations
Convergence and Fractal Basins of Attraction
Newton's Method
An open solution: requires only one current guess, and the root does not need to be bracketed.
Consider some point x0.
If we approximate f(x) as a line about x0, then we can again solve for the root of the line:
$$l(x) = f'(x_0)(x - x_0) + f(x_0)$$
Newton's Method
Solving l(x) = 0 leads to the following iteration:
$$x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}$$
$$x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}$$
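A minimal C sketch of this iteration (the function-pointer interface and the max_iter safeguard are illustrative assumptions, not from the slides):

#include <math.h>

// Newton's method: x_{i+1} = x_i - f(x_i) / f'(x_i).
double newton(double (*f)(double), double (*fprime)(double),
              double x0, double tol, int max_iter) {
    double x = x0;
    for (int i = 0; i < max_iter; i++) {
        double fx = f(x);
        if (fabs(fx) < tol) break;   // close enough to a root
        x -= fx / fprime(x);         // follow the tangent to the x-axis
    }
    return x;
}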
Newton's Method
This can also be seen from the Taylor series.
Assume we have a guess, x0, close to the actual root, and expand f(x) about this point with $x = x_i + \Delta x$:
$$f(x_i + \Delta x) = f(x_i) + \Delta x\, f'(x_i) + \frac{\Delta x^2}{2!}\, f''(x_i) + \cdots = 0$$
Truncating after the linear term:
$$\Delta x \approx -\frac{f(x_i)}{f'(x_i)}$$
Newton's Method
Graphically, follow the tangent line down to its x-axis intersection.
[Figure: the tangent at x_i crossing the x-axis at x_{i+1}.]
Newton's Method
Problems:
[Figure: an initial guess x_0 from which the iteration diverges.]
Newton's Method
We need the initial guess to be close, or the function to behave nearly linearly, within the range.
Finding a square-root
Ever wonder why they call this a square root?
Consider the roots of the equation:
$$f(x) = x^2 - a$$
More generally, $\sqrt[p]{a}$ is a root of $x^p - a = 0$, $p \in \mathbb{R}$.
Finding a square-root
Example: $\sqrt{2} = 1.4142135623730950488016887242097\ldots$
Let x0 be one and apply Newton's method. With $f'(x) = 2x$:
$$x_{i+1} = x_i - \frac{x_i^2 - 2}{2 x_i} = \frac{1}{2}\left(x_i + \frac{2}{x_i}\right)$$
$$x_0 = 1$$
$$x_1 = \frac{1}{2}\left(1 + \frac{2}{1}\right) = \frac{3}{2} = 1.5000000000$$
$$x_2 = \frac{1}{2}\left(\frac{3}{2} + \frac{4}{3}\right) = \frac{17}{12} \approx 1.4166666667$$
Finding a square-root
Note the rapid convergence:
$$x_3 = \frac{1}{2}\left(\frac{17}{12} + \frac{24}{17}\right) = \frac{577}{408} \approx 1.414215686$$
$$x_4 \approx 1.4142135623746$$
$$x_5 \approx 1.4142135623730950488016896$$
$$x_6 \approx 1.4142135623730950488016887242097$$
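These iterates are easy to reproduce in C (a sketch; note that a double carries only about 16 significant digits, so the slide's longer expansions require extended precision):

#include <stdio.h>

int main(void) {
    double a = 2.0, x = 1.0;            // x0 = 1
    for (int i = 1; i <= 6; i++) {
        x = 0.5 * (x + a / x);          // Newton step for f(x) = x^2 - a
        printf("x%d = %.17g\n", i, x);
    }
    return 0;
}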
Finding a square-root
Can we come up with a better initial guess?
Sure: just divide the exponent by 2.
Remember the bias offset.
Use bit-masks to extract the exponent into an integer, modify it, and set the initial guess, as sketched below.
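A hedged sketch of that bit manipulation for an IEEE-754 single-precision float (bias 127); the helper name sqrt_seed and the masking details are illustrative assumptions:

#include <stdint.h>
#include <string.h>

// Build an initial guess for sqrt(a) by halving the unbiased exponent.
float sqrt_seed(float a) {
    uint32_t bits;
    memcpy(&bits, &a, sizeof bits);                      // reinterpret the bits
    int32_t e = (int32_t)((bits >> 23) & 0xFF) - 127;    // remove the bias
    bits = (bits & 0x807FFFFFu)
         | ((uint32_t)(e / 2 + 127) << 23);              // halve, then re-bias
    float guess;
    memcpy(&guess, &bits, sizeof guess);
    return guess;    // exponent is right; the mantissa is only a rough start
}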
Convergence of Newton's Method
Now,
$$e_{n+1} = x - x_{n+1} = x - x_n + \frac{f(x_n)}{f'(x_n)} = e_n + \frac{f(x_n)}{f'(x_n)} = \frac{e_n f'(x_n) + f(x_n)}{f'(x_n)}$$
Expanding f(x) = 0 in a Taylor series about $x_n$ and substituting:
$$e_{n+1} = -\frac{1}{2}\,\frac{f''(\xi_n)}{f'(x_n)}\, e_n^2$$
If $e_n \approx 10^{-k}$, then
$$e_{n+1} \approx c \cdot 10^{-2k}$$
so the number of correct digits roughly doubles with each iteration: quadratic convergence.
Newton's Algorithm
Requires the derivative to be evaluated, hence more function evaluations per iteration.
A robust solution would check whether the iteration is stepping too far and limit the step.
Most uses of Newton's method assume the approximation is already pretty close and apply one to three iterations blindly.
Division by Multiplication
Newton's method has many uses in computing basic numbers.
For example, consider the equation:
$$\frac{1}{x} - a = 0$$
whose root is x = 1/a. With $f'(x) = -1/x^2$, Newton's method gives a division-free iteration:
$$x_{i+1} = x_i\,(2 - a\,x_i)$$
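A minimal sketch of the resulting division-free loop (the interface and fixed iteration count are illustrative assumptions):

// Reciprocal by multiplication: x_{i+1} = x_i * (2 - a * x_i).
// Converges when the seed satisfies 0 < a * x0 < 2.
double reciprocal(double a, double x0, int iters) {
    double x = x0;                   // seed, e.g. from an exponent trick
    for (int i = 0; i < iters; i++)
        x = x * (2.0 - a * x);       // only multiplies and subtracts
    return x;
}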
Reciprocal Square Root
The same idea applies to
$$f(x) = \frac{1}{x^2} - a$$
for which Newton's method gives:
$$x_{i+1} = \frac{x_i}{2}\left(3 - a\,x_i^2\right)$$
1/Sqrt(2)
Let's look at the convergence for the reciprocal square root of 2, starting from $x_0 = 1$:
$$x_1 = 0.5 \cdot 1 \cdot (3 - 2 \cdot 1^2) = 0.5$$
$$x_2 = 0.5 \cdot 0.5 \cdot (3 - 2 \cdot 0.5^2) = 0.625$$
$$x_3 = 0.693359375$$
$$x_4 = 0.706708468496799468994140625$$
$$x_5 = 0.707106444695907075511730676593228$$
$$x_6 = 0.707106781186307335925435931237738$$
$$x_7 = 0.70710678118654752440084423972481$$
(If we could only start closer!)
1/Sqrt(x)
What is a good choice for the initial seed point?
Optimal would be the root itself, but it is unknown.
Consider the normalized format of the number:
$$x = 2^{e-127}\,(1.m)_2$$
1/Sqrt(x)
Theoretically,
$$\frac{1}{\sqrt{x}} = \left(2^{e-127}(1.m)_2\right)^{-\frac{1}{2}} = \frac{1}{\sqrt{(1.m)_2}}\;2^{-\frac{e-127}{2}} = \frac{1}{\sqrt{(1.m)_2}}\;2^{\left(\frac{3 \cdot 127 - e}{2}\right)-127}$$
so the new bit-pattern for the exponent field is $(3 \cdot 127 - e)/2$.
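This exponent manipulation is the heart of the widely known "fast inverse square root" trick; a hedged sketch follows, in which the shift-and-subtract seed (using the classic published constant 0x5f3759df) approximates the exponent arithmetic above and two Newton steps refine it. None of this code is from the slides:

#include <stdint.h>
#include <string.h>

float rsqrt(float a) {
    uint32_t bits;
    memcpy(&bits, &a, sizeof bits);
    bits = 0x5f3759df - (bits >> 1);     // approximates the (3*127 - e)/2 exponent
    float x;
    memcpy(&x, &bits, sizeof x);
    x = 0.5f * x * (3.0f - a * x * x);   // Newton step 1
    x = 0.5f * x * (3.0f - a * x * x);   // Newton step 2
    return x;
}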
1/Sqrt(x)
GPUs such as nVidia's FX cards provide a 23-bit-accurate reciprocal square root in two clock cycles, by doing only 2 iterations of Newton's method.
Needing 24 bits of precision means the previous iteration had 12 bits of precision, and the seed started with 6 bits of precision: quadratic convergence doubles the bits each step.
1/Sqrt(x)
Examine the mantissa term again, (1.m).
Possible patterns are:
1.000..., 1.100..., 1.010..., 1.110..., ...
1/Sqrt(x)
Slight problem:
Open techniques
Newton fixed-point iteration
Secant method
Fixed-point iterations
Convergence and Fractal Basins of Attraction
Secant Method
What if we do not know the derivative of f(x)?
[Figure: the secant line through x_{i-1} and x_i, compared with the tangent at x_i.]
Secant Method
As we converge on the root, the secant line
approaches the tangent.
Hence, we can use the secant line as an
estimate and look at where it intersects the
x-axis (its root).
Secant Method
This also works by looking at the definition of the derivative:
$$f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h} \approx \frac{f(x_k) - f(x_{k-1})}{x_k - x_{k-1}}$$
Substituting this into Newton's iteration:
$$x_{k+1} = x_k - f(x_k)\,\frac{x_k - x_{k-1}}{f(x_k) - f(x_{k-1})}$$
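A minimal C sketch of this iteration (interface and safeguards assumed, as in the Newton sketch earlier):

#include <math.h>

// Secant method: x_{k+1} = x_k - f(x_k)(x_k - x_{k-1}) / (f(x_k) - f(x_{k-1})).
double secant(double (*f)(double), double x0, double x1,
              double tol, int max_iter) {
    double f0 = f(x0), f1 = f(x1);
    for (int k = 0; k < max_iter && fabs(f1) > tol; k++) {
        double x2 = x1 - f1 * (x1 - x0) / (f1 - f0);   // secant step
        x0 = x1;  f0 = f1;                             // shift the history
        x1 = x2;  f1 = f(x1);
    }
    return x1;
}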
Secant Method
The error satisfies
$$e_{k+1} \approx C\,e_k^{\alpha}$$
where $\alpha = \frac{1 + \sqrt{5}}{2} \approx 1.62$.
We call this super-linear convergence: faster than linear, slower than quadratic.
Open techniques
Newton fixed-point iteration
Secant method
Fixed-point iterations
Convergence and Fractal Basins of Attraction
Higher-dimensional Problems
Consider the class of functions $f(x_1, x_2, x_3, \ldots, x_n) = 0$, where we have a mapping from $\mathbb{R}^n$ to $\mathbb{R}$.
We can apply Newton's method separately for each variable, $x_i$, holding the other variables fixed at the current guess.
Higher-dimensional Problems
This leads to the iteration:
$$x_i \leftarrow x_i - \frac{f(x_1, x_2, \ldots, x_n)}{\dfrac{\partial f}{\partial x_i}(x_1, x_2, \ldots, x_n)}$$
Higher-dimensional Problems
Example:
$$x + y + z = 3$$
Solutions (two of infinitely many):
x = 3, y = 0, z = 0
x = 0, y = 3, z = 0
Systems of Nonlinear Equations
More generally, we have n equations in n unknowns:
$$f_1(x_1, x_2, \ldots, x_n) = 0$$
$$f_2(x_1, x_2, \ldots, x_n) = 0$$
$$\vdots$$
$$f_n(x_1, x_2, \ldots, x_n) = 0$$
For example, one such equation might be $e^x + xy + xz = 1$.
Vector Notation
We can rewrite this using vector notation:
$$\vec{f}(\vec{x}) = \vec{0}$$
$$\vec{f} = (f_1, f_2, \ldots, f_n), \qquad \vec{x} = (x_1, x_2, \ldots, x_n)$$
The Newton iteration becomes:
$$x^{(k+1)} = x^{(k)} - \left[f'\!\left(x^{(k)}\right)\right]^{-1} f\!\left(x^{(k)}\right)$$
where the derivative $f'$ is now a matrix: the Jacobian.
The Jacobian Matrix
$$J = \begin{bmatrix}
\dfrac{\partial f_1}{\partial x_1} & \dfrac{\partial f_1}{\partial x_2} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\
\dfrac{\partial f_2}{\partial x_1} & \dfrac{\partial f_2}{\partial x_2} & \cdots & \dfrac{\partial f_2}{\partial x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\dfrac{\partial f_n}{\partial x_1} & \dfrac{\partial f_n}{\partial x_2} & \cdots & \dfrac{\partial f_n}{\partial x_n}
\end{bmatrix}$$
Evaluated at the current guess:
$$J\!\left(x^{(i)}\right) = \begin{bmatrix}
\dfrac{\partial f_1}{\partial x_1}\!\left(x^{(i)}\right) & \cdots & \dfrac{\partial f_1}{\partial x_n}\!\left(x^{(i)}\right) \\
\dfrac{\partial f_2}{\partial x_1}\!\left(x^{(i)}\right) & \cdots & \dfrac{\partial f_2}{\partial x_n}\!\left(x^{(i)}\right) \\
\vdots & \ddots & \vdots \\
\dfrac{\partial f_n}{\partial x_1}\!\left(x^{(i)}\right) & \cdots & \dfrac{\partial f_n}{\partial x_n}\!\left(x^{(i)}\right)
\end{bmatrix}$$
Matrix Inverse
We define the inverse of a matrix analogously to the reciprocal of a scalar, $a\,a^{-1} = 1$:
$$A A^{-1} = I = \begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & 1 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & 1
\end{bmatrix}$$
Newton's Method
If the Jacobian is non-singular, so that its inverse exists, then we can apply this to Newton's method:
$$x^{(i+1)} = x^{(i)} - \left[J\!\left(x^{(i)}\right)\right]^{-1} f\!\left(x^{(i)}\right) = x^{(i)} + h^{(i)}$$
We rarely want to compute the inverse, so instead we restate the problem.
Newton's Method
Now we have a linear system, and we solve for h:
$$J\!\left(x^{(k)}\right) h^{(k)} = -f\!\left(x^{(k)}\right)$$
$$x^{(k+1)} = x^{(k)} + h^{(k)}$$
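A hedged 2-by-2 sketch of the full loop, solving J h = -f directly by Cramer's rule; the example system (x^2 + y^2 = 4, xy = 1) and all names are illustrative assumptions, not from the slides:

#include <stdio.h>

void eval_f(double x, double y, double out[2]) {
    out[0] = x * x + y * y - 4.0;    // f1(x, y)
    out[1] = x * y - 1.0;            // f2(x, y)
}

void eval_J(double x, double y, double J[2][2]) {
    J[0][0] = 2 * x;  J[0][1] = 2 * y;   // df1/dx, df1/dy
    J[1][0] = y;      J[1][1] = x;       // df2/dx, df2/dy
}

int main(void) {
    double x = 2.0, y = 1.0;                       // initial guess
    for (int k = 0; k < 8; k++) {
        double fv[2], J[2][2];
        eval_f(x, y, fv);
        eval_J(x, y, J);
        double det = J[0][0] * J[1][1] - J[0][1] * J[1][0];
        double h0 = (-fv[0] * J[1][1] + fv[1] * J[0][1]) / det;  // Cramer's rule
        double h1 = (-fv[1] * J[0][0] + fv[0] * J[1][0]) / det;  // for J h = -f
        x += h0;  y += h1;                         // x <- x + h
    }
    printf("x = %.12f, y = %.12f\n", x, y);        // ~ (1.93185, 0.51764)
    return 0;
}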
Initial Guess
How do we get an initial guess for the root
vector in higher-dimensions?
In 2D, I need to find a region that contains
the root.
Steepest Descent is a more advanced topic not covered in this course. It is more stable and can be used to determine an approximate root.
Open techniques
Newton fixed-point iteration
Secant method
Fixed-point iterations
Convergence and Fractal Basins of Attraction
Fixed-Point Iteration
Many problems also take on the specialized form g(x) = x, where we seek the x that satisfies this equation.
[Figure: the fixed point where the curve y = g(x) crosses the line y = x.]
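A minimal C sketch of the generic loop (the interface is an illustrative assumption):

#include <math.h>

// Fixed-point iteration: repeat x = g(x) until successive iterates agree.
double fixed_point(double (*g)(double), double x0, double tol, int max_iter) {
    double x = x0;
    for (int i = 0; i < max_iter; i++) {
        double xn = g(x);
        if (fabs(xn - x) < tol) return xn;   // converged to g(x) = x
        x = xn;
    }
    return x;                                // may not have converged
}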
Fixed-Point Iteration
Newton's iteration and the Secant method are of course in this form; for Newton, $g(x) = x - \frac{f(x)}{f'(x)}$.
In the limit, f(x_k) = 0, hence x_{k+1} = x_k.
Fixed-Point Iteration
The only problem is that this assumes the iteration converges.
The pretty fractal images you see basically encode how many iterations it took to either converge (to some accuracy) or diverge, using each point as the initial seed of an iterative equation.
The book also has an example where the iterates converge to one of a finite set of roots. By assigning a different color to each root, we can see to which root each initial seed point converged.
Fractals
Images result when we work in two dimensions, such as with complex numbers.
Color indicates how quickly each seed point converges or diverges.
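As a hedged sketch of how such an image is produced, the loop below applies Newton's method to z^3 - 1 = 0 at a grid of complex seed points and prints a character keyed to the iteration count; the grid bounds, tolerance, and character ramp are all illustrative choices:

#include <complex.h>
#include <stdio.h>

int main(void) {
    for (int row = 0; row < 20; row++) {
        for (int col = 0; col < 60; col++) {
            // Seed point in the square [-2, 2] x [-2, 2]
            double complex z = (-2.0 + 4.0 * col / 59.0)
                             + (2.0 - 4.0 * row / 19.0) * I;
            int n = 0;
            while (n < 30 && cabs(z * z * z - 1.0) > 1e-6) {
                z -= (z * z * z - 1.0) / (3.0 * z * z);   // Newton step
                n++;
            }
            putchar(" .:-=+*#%@"[n < 30 ? n % 10 : 9]);   // shade by count
        }
        putchar('\n');
    }
    return 0;
}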
Fixed-Point Iteration
More on this when we look at iterative
solutions for linear systems (matrices).