
Zeros of nonlinear equations II

Contents

We import the modules matplotlib.pyplot and numpy.

In [1]:
import numpy as np
import matplotlib.pyplot as plt

The Secant method

The secant method is the most important variant of the Newton-Raphson method: the tangent line is replaced by a secant line.

The main drawback of the Newton-Raphson method is that we need to know the derivative $f'(x)$ in advance.

In the Secant method we approximate $f'(x)$ by finite differences: $$f'(x_{k-1})\approx\frac{f(x_{k-1})-f(x_{k-2})}{x_{k-1}-x_{k-2}}.$$

In exchange, we need two initial guesses, $x_0$ and $x_1$, for the first iteration: $$ x_k= x_{k-1} -f( x_{k-1})\frac{x_{k-1}-x_{k-2}}{f(x_{k-1})-f(x_{k-2})}.$$

In the plot below, we see the sequence of secants (in red) used to compute the first zero of $f$, taking initial guesses $x_0=0$ and $x_1=-1$.

In [2]:
secant_a
Out[2]:
[Plot: secant lines (in red) converging to the first zero of $f$; image not included in this export]
Algorithm

  • Let $x_0$ and $x_1$ be the initial guesses.
  • For $k=1,2,\ldots,\mathrm{MaxIter}$:
    • Compute $$ x_k= x_{k-1} -f( x_{k-1})\frac{x_{k-1}-x_{k-2}}{f(x_{k-1})-f(x_{k-2})}.$$
    • If $x_k$ satisfies the stopping criterion, stop.
    • Otherwise, perform a new iteration.
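
A minimal sketch of this algorithm (for illustration only, not the solution requested in Exercise 1 below; it assumes the stopping criterion $|x_k-x_{k-1}|<\mathrm{tol}$):

def secant_sketch(f, x0, x1, tol=1e-6, maxiter=100):
    # Sketch of the secant iteration; the stopping criterion
    # |x_k - x_{k-1}| < tol is an assumption of this sketch.
    for n in range(1, maxiter + 1):
        x2 = x1 - f(x1)*(x1 - x0)/(f(x1) - f(x0))  # secant update
        if abs(x2 - x1) < tol:                     # stopping criterion
            return x2, n
        x0, x1 = x1, x2                            # keep the last two iterates
    return x1, maxiter

For instance, secant_sketch(lambda x: x**5 - 3*x**2 + 1.6, -1., 0.) should approximate the zero near $-0.69$.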

Exercise 1

Write a function secant(f,x0,x1,tol=1e-6,maxiter=100) whose input arguments are the function f, the two initial guesses x0 and x1, the absolute error bound tol and the maximum number of iterations maxiter, and whose output arguments are the approximate solution sol and the number of iterations performed n.

Use it to approximate the three zeros of $f(x)= x^5 - 3x^2 + 1.6$, with tol = 1e-6 and maxiter = 100.

Use the intervals [a,b] obtained in Exercise 1 (incremental search) of the previous lab, taking x0 = a and x1 = b.

Print the three zeros and their corresponding number of iterations.

Plot the curve and the zeros as red dots.

Hints:

Use a numpy array to store the zeros. It may be initialized with

  • r = np.zeros(3)

The zeros can be plotted with

  • plt.plot(r,r*0,'ro')
In [8]:
%run Exercise1
-0.6928908017616315 4
0.8027967742658026 4
1.2573997082378758 5

Back to Contents


Python functions for computing zeros

SciPy is an open-source library that contains mathematical tools and algorithms for Python. SciPy has modules for optimization, linear algebra, integration, interpolation, special functions, FFT, signal and image processing, ODE solvers and more, for scientific and engineering tasks.

Functions that solve equations are included in the optimization module (scipy.optimize).

If we only want to use this module, we can import it as follows.

In [9]:
import scipy.optimize as op

The function newton solves the equation with the secant or the Newton-Raphson method. The function bisect computes the zeros using the bisection method.

Let us see an example: computing the positive solutions of $\sin x-0.1x-0.1=0$.

We need the derivative for Newton's method.

In [10]:
f  = lambda x: np.sin(x) - 0.1*x - 0.1
df = lambda x: np.cos(x) - 0.1

We plot the function

In [11]:
x = np.linspace(-1,20,1000)

plt.figure(figsize=(15,4))
plt.plot(x,f(x))
plt.plot(x,0*x,'k-')
plt.show()

We obtain the zeros by estimating initial guesses from this plot.

In [12]:
x0 = np.array([0., 2., 7., 8.])
raiz = op.newton(f,x0,fprime=df,tol=1.e-6,maxiter=100)       
print(raiz)
[0.11136674 2.75649514 7.25411627 8.2450445 ]

If we do not pass the derivative as an input argument, the function uses the secant method.

In [13]:
x0 = np.array([0., 2., 7., 8.])
raiz = op.newton(f,x0,tol=1.e-6,maxiter=100)       
print(raiz)
[0.11136674 2.75649514 7.25411627 8.2450445 ]

For bisection, we need to estimate an initial interval that brackets each zero.

In [14]:
x0 = np.array([0., 2., 7., 8.])
x1 = x0 + 1
raiz = np.zeros_like(x0)
for i in range(len(raiz)):
    raiz[i]= op.bisect(f,x0[i],x1[i],xtol=1.e-6,maxiter=100)       
    print(raiz[i])
0.11136722564697266
2.756495475769043
7.254117012023926
8.245043754577637

Let us check the solutions graphically

In [15]:
x = np.linspace(-1,9)

plt.figure()
plt.plot(x,f(x),x,x*0,'k',raiz,raiz*0,'ro')
plt.show()

Extrema and inflection points of a function

If $f$ is differentiable at a point $x_0$ of its domain and $x_0$ is a relative extremum, then

$$ f'(x_0)=0. $$

That is, extrema are zeros of $f'$.

If $f$ is twice differentiable at a point $x_0$ of its domain and $x_0$ is an inflection point, then

$$ f''(x_0)=0. $$

That is, inflection points are zeros of $f''$.
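
For instance, a minimal sketch with a hypothetical test function (not one of the exercise functions), using the scipy.optimize module imported above as op: the extrema of $h(x)=x^3-3x$ are the zeros of $h'(x)=3x^2-3$, that is, $x=\pm 1$.

h   = lambda x: x**3 - 3*x               # hypothetical test function
dh  = lambda x: 3*x**2 - 3               # first derivative
d2h = lambda x: 6*x                      # second derivative
# extrema of h = zeros of h'; Newton's method on h' uses h'' as its derivative
ext = op.newton(dh, np.array([-2., 2.]), fprime=d2h)
print(ext)                               # approximately [-1.  1.]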


Exercise 2

Given the function: $$f(x)=x^2+\ln (2x+7) \cos (3 x)+0.1$$

Calculate the relative extrema (maxima and minima) of $f$. That is, the zeros of $f'$:

  • Define the lambda functions $f$, $f'$ and $f''$ using the sympy module (see the Note at the bottom).
  • Plot $f'$ together with the OX axis and try different intervals until you find one that contains all its zeros.
  • With an error bound of tol = 1e-6, calculate the zeros of the function $f'$ using newton from scipy.optimize.
  • Plot $f$ with all its maxima and minima and, looking at the graph, decide which of the previously calculated points are maxima and which are minima. Draw the maxima over the function with red circles and the minima with green circles.
  • Print the extrema.

Calculate the inflection points of $f$ in $[-1,4]$. That is, the zeros of $f''$:

  • Plot them over the function graph with blue circles.
  • Print the inflection points.

Hint:

NumPy Elementary Mathematical Functions.

SymPy Elementary Mathematical Functions.

Usually the functions have the same name in sympy but, for instance, instead of np.cos(x) we write sym.cos(x) (taking into account how we have imported numpy and sympy in this lab). This is not a general rule: for instance, NumPy's np.arctan(x) corresponds to SymPy's sym.atan(x).
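
A minimal illustration of this naming correspondence (the expression is hypothetical, chosen only to show the arctan/atan case):

import sympy as sym                           # as in the Note at the bottom
xs = sym.Symbol('x', real=True)
expr = sym.atan(xs) + sym.cos(xs)             # SymPy names: atan, cos
fn = sym.lambdify([xs], expr, 'numpy')        # numerically this uses np.arctan, np.cos
print(fn(1.0), np.arctan(1.0) + np.cos(1.0))  # same value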

In [16]:
%run Exercise2
EXTREMA
[-0.89794785  0.01824977  0.95972949  2.33033731  2.86540018]
INFLECTION POINTS IN [-1,4]
[-0.44343281  0.51463064  1.62117405  2.60293871  3.70449784]

Back to Contents

Fixed point method

In the fixed point method, we replace the equation $f(x)=0$ with an equivalent equation $g(x)=x$. If $\alpha$ is a fixed point of $g$, that is, $g(\alpha)=\alpha$, then $\alpha$ is also a zero of $f$, that is, $f(\alpha)=0$.

A function $g:[a,b]\to\mathbb{R}$ has a fixed point $\alpha\in [a,b]$ if $g(\alpha)=\alpha$. The fixed point method is based on the sequence defined by

$$ x_{k+1}=g(x_k),\quad k\geq 0, $$

where $x_0$ is the initial guess.

The choice of $g$ is not unique. Let us see an example.

If

$$f(x)=x^3 - 2 x^2 + 1$$

and we want to find the zeros of $f$, that is, the values of $x$ that satisfy $x^3 - 2 x^2 + 1 = 0$, we can rearrange the equation as $x^3 + 1 = 2 x^2$ and solve for $x$ (taking the negative square root, since we are looking for the negative zero):

$$x^3 - 2 x^2 + 1 = 0\quad\implies \quad x^3 + 1 = 2 x^2\quad\implies \qquad x = -\sqrt{\frac{x^3+1}{2}}$$

The solution of this second equation is called a fixed point of the function

$$ g(x) = -\sqrt{\frac{x^3+1}{2}}$$

and it will also be a zero of $f$.

Let us check it graphically.

In [17]:
f = lambda x: x**3 - 2*x**2 +1
g = lambda x: - np.sqrt((x**3+1)/2)
In [18]:
a = -1.; b = 0;
x = np.linspace(a, b, 200)       # vector with 200 equally spaced points

plt.figure()
plt.plot(x, f(x),'g-',label='f')              
plt.plot(x, g(x),'r-',label='g')
plt.plot(x, 0*x,'k-')            # OX axis  
plt.plot(x, x,'b-',label='y = x')

r = -0.61803
plt.plot(r,0,'ro',label='zero')     
plt.plot(r,r,'bo',label='fixed point')

plt.legend(loc='best')
plt.show()

The $x$ where $f(x)=0$ is also the $x$ where $x=g(x)$.

Thus, the fixed point of

$$ g(x) = -\sqrt{\frac{x^3+1}{2}}\quad (1)$$

is a zero of $f$.

Contraction mapping theorem: let $g$ be differentiable and defined on an interval $[a,b] \subset \mathbb{R}$ and let $x_0 \in [a,b]$ be a point of the interval. If

  • $x \in [a,b] \quad \Rightarrow \quad g(x)\in [a,b]$
  • $\left|g'(x)\right|\le k\lt 1$ for all $x\in [a,b]$

then $g$ has a unique fixed point $\alpha \in [a,b]$, and the sequence defined by $x_{k+1}=g(x_k)$ with initial guess $x_0$ converges to $\alpha$ with order at least linear.

In [19]:
a = -0.8; b = -0.3;
x = np.linspace(a, b) 

plt.figure()
plt.plot(x, g(x),'r-', label='g')
plt.plot(x, x, 'b-',label='y = x')
plt.plot([a,b],[b,a],'k-') # Draw the other diagonal

pf = -0.61803
plt.plot(pf,pf,'bo',label='fixed point') 

plt.axis([a, b, a, b])     # Graph in [a,b]x[a,b]
plt.legend(loc='best')
plt.show()

In this graph we can see that the conditions of the contraction mapping theorem are satisfied by the function $g$ on the interval $[-0.8, -0.3]$. If we take as initial guess a point inside this interval, the sequence generated by the fixed point algorithm converges.
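
We can also check the second hypothesis numerically by sampling $|g'|$ on a grid (a quick sketch rather than an analytic bound; here $g'(x)=-\dfrac{3x^2}{4\sqrt{(x^3+1)/2}}$):

dg = lambda x: -3*x**2/(4*np.sqrt((x**3 + 1)/2))  # derivative of g
xs = np.linspace(-0.8, -0.3, 1000)
print(np.abs(dg(xs)).max())      # about 0.97 < 1 on [-0.8, -0.3]
print(g(xs).min(), g(xs).max())  # both in [-0.8, -0.3]: g maps the interval into itself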

In [7]–[10]:
fpa, fpb, fpc, fpd
[Plots fpa–fpd: fixed point iteration diagrams; images not included in this export]
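
As a quick sanity check (a minimal sketch; Exercise 3 below asks for a complete function with a tolerance-based stopping criterion), a few fixed point iterations starting inside the interval approach $\alpha\approx -0.61803$:

xk = -0.5                # initial guess inside [-0.8, -0.3]
for k in range(20):
    xk = g(xk)           # fixed point iteration x_{k+1} = g(x_k)
print(xk)                # about -0.618034, the fixed point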

Exercise 3

Write a script that contains the function fixedPoint(g,x0,tol=1e-6,maxiter=200) whose input arguments are the iteration function g, the initial guess x0, the absolute error bound tol and the maximum number of iterations maxiter, and whose output arguments are the approximate solution sol and the number of iterations performed n. Use the default tol and maxiter.

Use it to approximate the zero of

$$f(x)= e^{-x}-x$$

with iteration function

$$g(x)=e^{-x}$$

with $\mathrm{tol}=10^{-6}$ and $\mathrm{maxiter}=200$.

Follow the steps:

  • Using the function incrementalSearch on the interval $[0,1]$, find an interval of length 0.1 that contains a zero of the function f.
  • Taking as initial guess $x_0$ the left endpoint of the previous interval, compute the fixed point of g. Remember that the fixed point method sequence is defined by $x_{k+1} = g(x_k).$
  • Print the fixed point approximation and the number of iterations.
  • Plot the iteration function g in red, the straight line y = x in blue and the fixed point as a blue dot.

Repeat the process to find the zero of

$$f(x)= x - \cos(x)$$

using as iteration functions

$$g_1(x)=\cos(x)\quad g_2(x)=2x-\cos(x) \quad g_3(x)=x-\dfrac{x-\cos(x)}{1+\sin(x)}\quad g_4(x)=\dfrac{9x+\cos(x)}{10}$$

Decide which sequences converge. Plot the iteration functions, the line y = x in blue and the fixed point in blue.

Hint:

NumPy Elementary Mathematical Functions.

In [24]:
%run Exercise3
There is a zero in  [[0.5 0.6]] 

0.5671430289524634 22
There is a zero in  [[0.7 0.8]] 

g1  0.7390848589216623 30
g2  -9.452341808684707e+58 200
g3  0.7390851332151608 3
g4  0.7390809703425896 50

More exercises


Exercise 4

Using the secant method via the function newton from the module scipy.optimize, calculate the solution of the equation $\cosh(x)\cos(x)-1=0$ nearest to zero (excluding the trivial solution $x=0$). Begin by drawing the function to estimate the initial guesses.

Use as input parameters tol=1e-7 and maxiter=100. Check that it is a solution by drawing it as a red circle on the graph. Print the solution using print('%.6f'%r), where r is the zero.

In [27]:
%run Exercise4
4.730041

Exercise 5

Using the function bisect from the module scipy.optimize, compute $\sqrt[3]{75}$. Start by drawing the function to estimate the initial interval.

Use as input parameters xtol=1e-6 and maxiter=100. Check that it is a solution by drawing it as a red circle on the graph. Print the solution using print('%.6f'%r), where r is the zero.

In [28]:
%run Exercise5
4.217163

Exercise 6

Given the function: $$f(x)= e^{-x^2}\ln (x^2+1)$$

Compute the extrema and inflection points of $f$ and plot them over the graph of $f$, as in Exercise 2. Use as input parameters tol=1e-6 and maxiter=100.


In [29]:
%run Exercise6
[-0.87362626  0.          0.87362626]
[-1.34109575 -0.39300931  0.39300931  1.34109575]

NOTE: Calculate a function derivative

We can calculate the exact derivatives of a function using the sympy module (symbolic Python).

In [32]:
import sympy as sym

We create a symbolic variable

In [33]:
x = sym.Symbol('x', real=True)

Then we define a symbolic function and we can calculate its derivatives

In [34]:
f_sim   = sym.cos(x)*(x**3+1)/(x**2+1)
df_sim  = sym.diff(f_sim,x)
d2f_sim = sym.diff(df_sim,x)

We can print these derivatives

In [35]:
print(df_sim)
3*x**2*cos(x)/(x**2 + 1) - 2*x*(x**3 + 1)*cos(x)/(x**2 + 1)**2 - (x**3 + 1)*sin(x)/(x**2 + 1)

We may lambdify them to use their corresponding numerical functions.

In [36]:
f   = sym.lambdify([x], f_sim,'numpy') 
df  = sym.lambdify([x], df_sim,'numpy') 
d2f = sym.lambdify([x], d2f_sim,'numpy') 

For instance, let us plot them with the numerical command plt.plot.

In [37]:
x = np.linspace(-3,3,100)
plt.plot(x,f(x),x,df(x),x,d2f(x),x,0*x,'k')
plt.legend(['f','df','d2f'],loc='best')
plt.show()