The International Journal Of Engineering And Science (IJES)
||Volume|| 2 ||Issue|| 11 ||Pages|| 05-12 ||2013||
ISSN(e): 2319 – 1813 ISSN(p): 2319 – 1805

On the Rate of Convergence of Newton-Raphson Method
Ranbir Soram (1), Sudipta Roy (2), Soram Rakesh Singh (3), Memeta Khomdram (4),
Swavana Yaikhom (4), Sonamani Takhellambam (1)

(1) Manipur Institute of Technology, Takyelpat, Imphal-795004
(2) Department of Information Technology, Assam University, Silchar-788011
(3) Rajiv Gandhi Institute of Technology, Bangalore-560032
(4) National Institute of Electronics and Information Technology, Akampat, Imphal-795001

------------------------------------------------------------ABSTRACT---------------------------------------------------------
A computer program in Java has been coded to evaluate the cube roots of the numbers from 1 to 25 using the Newton-Raphson method in the interval [1, 3]. The relative rate of convergence has been computed for each calculation. The lowest rate of convergence has been observed in the evaluation of the cube root of 16 and the highest in the evaluation of the cube root of 3. The average rate of convergence of the Newton-Raphson method has been found to be 0.217920.

KEYWORDS: Computer Program, Cube Root, Java, Newton-Raphson Method, Rate of Convergence.
--------------------------------------------------------------------------------------------------------------------------------------
Date of Submission: 21 October 2013                                   Date of Acceptance: 10 November 2013
--------------------------------------------------------------------------------------------------------------------------------------

I. INTRODUCTION
Numerical analysis is an important branch of computer science that deals with the study of
algorithms that use numerical approximation for the problems of mathematical analysis. It involves the study of
methods of computing numerical data and has applications in all fields of engineering and the physical sciences,
including the life sciences: stochastic differential equations and Markov chains, for example, are used in
simulating living cells for medicine and biology. Numerical analysis solves problems of continuous mathematics
numerically, which implies producing a sequence of approximate values; the questions of interest are therefore
the rate of convergence, the accuracy of the answer, and the time taken to report the answer. Many problems in
mathematics can be reduced in polynomial time to linear algebra, and this too can be studied numerically.
Root-finding algorithms are used to solve nonlinear equations. If the function is differentiable and its derivative
is known and nonzero, then the Newton-Raphson method is a popular choice. Linearization is another technique
for solving nonlinear equations. The formal academic area of numerical analysis ranges from quite foundational
mathematical studies to the computer science issues involved in the design and implementation of algorithms.

NEWTON-RAPHSON METHOD
In numerical analysis, the Newton-Raphson method is a very popular numerical method for finding
successively better approximations to the zeroes of a real-valued function, i.e., values x such that f(x) = 0. Even
though the method can also be extended to complex functions, we restrict ourselves to real-valued functions
here. The Newton-Raphson method is used by a large class of users because it works very well for a large
variety of equations: polynomial, rational, transcendental, trigonometric, and so on. It is also well suited to
computers because it is iterative in nature. This feature has attracted many scientists, and many scientific
application programs use the Newton-Raphson method as one of their root-finding tools. The Newton-Raphson
method in one variable is implemented as follows. Given a function f(x) defined over the reals and its
derivative f'(x), we begin with a first guess x_0 for a root of the function f. Provided the function satisfies all the
assumptions made in the derivation of the formula, a better approximation x_1 is

x_1 = x_0 - f(x_0) / f'(x_0)                (1)

Geometrically, (x_1, 0) is the intersection with the x-axis of the line tangent to f at (x_0, f(x_0)).
The process is repeated as

x_{n+1} = x_n - f(x_n) / f'(x_n)                (2)

until a sufficiently accurate value is obtained.
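
As a concrete illustration of iteration (2), here is a minimal double-precision sketch in Java for the cube-root problem f(x) = x^3 - n used later in the paper. It is an illustrative sketch, not the authors' program; the class and method names are ours.

public class NewtonStep {
    // One Newton-Raphson update for f(x) = x^3 - n, per equation (2).
    static double step(double x, double n) {
        double fx = x * x * x - n;   // f(x_n)
        double dfx = 3.0 * x * x;    // f'(x_n)
        return x - fx / dfx;         // x_{n+1}
    }

    public static void main(String[] args) {
        double n = 2.0;              // target: the cube root of 2
        double x = 1.0;              // initial guess in [1, 3]
        for (int i = 1; i <= 10; i++) {
            x = step(x, n);
            System.out.printf("iteration %d: x = %.15f%n", i, x);
        }
    }
}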
The idea of the method is as follows: one starts with an initial guess which is reasonably close to the true root,
then the function is approximated by its tangent line (which can be computed using the tools of calculus), and
one computes the x-intercept of this tangent line (which is easily done with elementary algebra). This x-intercept
will typically be a better approximation to the function's root than the original guess, and the method can be
iterated. Suppose f : [a, b] → R is a differentiable function defined on the interval [a, b] with values in the real
numbers R. The formula for converging on the root can be easily derived. Suppose we have some current
approximation x_n. Then we can derive the formula for a better approximation, x_{n+1}, by referring to Fig. 1
below. We know from the definition of the derivative at a given point that it is the slope of the tangent at that
point.

Fig. 1: Geometrical Interpretation of Newton-Raphson method
That is,

f'(x_n) = Δy / Δx = (f(x_n) - 0) / (x_n - x_{n+1})                (3)

Here, f' denotes the derivative of the function f. Then by simple algebra we can derive

x_{n+1} = x_n - f(x_n) / f'(x_n)                (4)
We start the iteration with some arbitrary initial value x_0. The method will usually converge, provided
this initial guess is close enough to the unknown root and f'(x) ≠ 0. We reproduce here the algorithm for the
Newton-Raphson method from [2].

Algorithm 1. Newton-Raphson Method
1  Read x0, epsilon, delta, n
   Remarks: Here x0 is the initial guess, epsilon is the prescribed relative error, delta is the prescribed lower bound
   for f', and n is the maximum number of iterations to be allowed. Statements 3 to 8 are repeated until the
   procedure converges to a root or the number of iterations equals n.
2  for i ← 1 to n in steps of 1 do
3      f0 ← f(x0)
4      f0' ← f'(x0)
5      if |f0'| ≤ delta then goto 11
6      x1 ← x0 - (f0 / f0')
7      if |(x1 - x0) / x1| ≤ epsilon then goto 13
8      x0 ← x1
   endfor
9  Write "does not converge in n iterations", f0, f0', x0, x1
10 Stop
11 Write "slope too small", x0, f0, f0', i
12 Stop
13 Write "convergent solution", x1, f(x1), i
14 Stop
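
For readers who want to run Algorithm 1 directly, a Java rendering is sketched below. It is a sketch only, assuming double-precision arithmetic and java.util.function.DoubleUnaryOperator for the function and its derivative; the class and method names are illustrative and not taken from the paper. With double precision the tolerance obviously cannot approach the 78-decimal-place limit quoted in Section IV; that would require arbitrary-precision arithmetic.

import java.util.function.DoubleUnaryOperator;

public class NewtonRaphsonAlgorithm1 {

    // Returns the converged root, or Double.NaN when the slope is too small
    // or the method does not converge within n iterations (Algorithm 1).
    static double solve(DoubleUnaryOperator f, DoubleUnaryOperator fPrime,
                        double x0, double epsilon, double delta, int n) {
        for (int i = 1; i <= n; i++) {
            double f0 = f.applyAsDouble(x0);            // step 3: f0 <- f(x0)
            double f0Prime = fPrime.applyAsDouble(x0);  // step 4: f0' <- f'(x0)
            if (Math.abs(f0Prime) <= delta) {           // step 5: slope too small
                System.out.println("slope too small after " + i + " iterations");
                return Double.NaN;
            }
            double x1 = x0 - f0 / f0Prime;              // step 6
            if (Math.abs((x1 - x0) / x1) <= epsilon) {  // step 7: relative error test
                System.out.println("convergent solution in " + i + " iterations");
                return x1;
            }
            x0 = x1;                                    // step 8
        }
        System.out.println("does not converge in " + n + " iterations");
        return Double.NaN;
    }

    public static void main(String[] args) {
        // Example: the cube root of 21, starting from x0 = 1 as in the paper.
        double root = solve(x -> x * x * x - 21.0, x -> 3.0 * x * x,
                            1.0, 1e-12, 1e-12, 5000);
        System.out.println("root = " + root);
    }
}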

II. ESTIMATING THE RATE OF CONVERGENCE
Let us take a sequence x_1, x_2, ..., x_n. If the sequence converges to a value r and if there exist real
numbers λ > 0 and α ≥ 1 such that

lim_{n→∞} |x_{n+1} - r| / |x_n - r|^α = λ                (5)

then we say that α is the rate of convergence of the sequence. Many root-finding algorithms are in fact
fixed-point iterations. These iterations are called fixed-point iterations because the desired root r is a fixed point
of a function g(x), i.e., g(r) = r. For fixed-point iterations, if a value close to r is substituted into g(x) then the
result is even closer to r, so we are tempted to write x_{n+1} = g(x_n).
It is convenient to define the error after n steps of an iterative root-finding algorithm as
e_n = |x_n - r|. As n → ∞ we see from equation (5) that

e_{n+1} ≈ λ e_n^α                (6)

e_n ≈ λ e_{n-1}^α                (7)

Dividing equation (6) by equation (7), we get the formula given in equation (8):

e_{n+1} / e_n = (e_n / e_{n-1})^α                (8)

Now we solve for α, which gives

α = log(e_{n+1} / e_n) / log(e_n / e_{n-1})                (9)

Using the formula given in equation (9), we can determine the convergence rate α from two consecutive
error ratios. But there is still a difficulty with this approach: to compute e_n we would need to know the root r,
which in fact we do not know. Nevertheless, we can use the ideas above to develop a formula to estimate the rate of
convergence of any method that behaves like a fixed-point method. If the method converges, then the sequence
x_n it generates must satisfy the relation |x_{n+1} - r| < |x_n - r| for n > N, where N is some positive integer. As
mentioned above, to compute e_n = |x_n - r| we need to know the value of the root r, which we still do not know.
We overcome this by assuming that the ratio of consecutive errors can be approximated by the ratio of
consecutive differences of the estimates of the root:

e_{n+1} / e_n = |x_{n+1} - r| / |x_n - r| ≈ |x_{n+1} - x_n| / |x_n - x_{n-1}|                (10)

This allows us to approximate α with

α ≈ log(|x_{n+1} - x_n| / |x_n - x_{n-1}|) / log(|x_n - x_{n-1}| / |x_{n-1} - x_{n-2}|)                (11)

Even though this only gives an estimate of α, in practice it agrees well with the theoretical convergence rates of
Newton's method and gives us a good measure of the efficiency of various forms of fixed-point algorithms.
Another formula with which we can approximate the convergence rate α of the Newton-Raphson method is

α = scaling factor × 1 / (No. of iterations × f(r))                (12)

where r is the estimated root; in case f(r) = 0, the formula fails.
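
A small sketch of how estimates (11) and (12) might be computed from the iterates is given below. It assumes double precision and illustrative helper names; it is not the paper's code, and with double precision the residual f(r) is far larger than the paper's tolerance, so the equation-(12) value will not match Table 1.

public class ConvergenceRateEstimate {

    // Equation (11): estimate alpha from four consecutive iterates
    // x[k-3], x[k-2], x[k-1], x[k].
    static double alphaEq11(double[] x, int k) {
        return Math.log(Math.abs((x[k] - x[k - 1]) / (x[k - 1] - x[k - 2])))
             / Math.log(Math.abs((x[k - 1] - x[k - 2]) / (x[k - 2] - x[k - 3])));
    }

    // Equation (12): alpha = scaling factor / (no. of iterations * f(r)).
    // The formula fails when f(r) = 0, which is why the paper excludes n = 1 and n = 8.
    static double alphaEq12(double scale, int iterations, double fAtRoot) {
        return scale / (iterations * fAtRoot);
    }

    public static void main(String[] args) {
        double n = 21.0;                 // f(x) = x^3 - 21, the paper's worked example
        double[] x = new double[8];
        x[0] = 1.0;                      // initial guess of 1
        for (int i = 1; i < x.length; i++) {
            x[i] = x[i - 1] - (x[i - 1] * x[i - 1] * x[i - 1] - n)
                            / (3.0 * x[i - 1] * x[i - 1]);
        }
        int k = x.length - 1;
        System.out.println("alpha (eq. 11) = " + alphaEq11(x, k));
        System.out.println("alpha (eq. 12) = "
                + alphaEq12(1e8, k, x[k] * x[k] * x[k] - n));
    }
}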

III. WHY DO WE USE NEWTON-RAPHSON METHOD
The beauty of the Newton-Raphson method lies in its efficiency and accuracy. The number of correct
digits in each intermediate solution roughly doubles in every iteration, so very high accuracy is obtained very
quickly, as can be judged from Fig. 2. Thus, the Newton-Raphson method is in high demand. In a problem where
the Newton-Raphson method converges in one iteration, it amounts to using and evaluating one single formula,

x_root = x_0 - f(x_0) / f'(x_0),

in order to obtain the desired root. Rather than solving any equations, the users are just evaluating functions. In
practice a single evaluation of the formula may fail to give the root, but two successive evaluations may give the
solution; this amounts to using the formula twice and is still reasonable and attractive. Although the functions
involved may not converge so easily, or the values of their derivatives at some points may be too small to be used
in the formula, a sufficiently small neighborhood can be found in which they are almost convergent and in which
the values of their derivatives fit the formula. By focusing our attention on such a neighborhood, the
Newton-Raphson method would yield the desired root in two iterations, if not in one. The starting point plays a
very important role in the working of the Newton-Raphson method. In general, a starting value far away from the
actual solution may take more iterations to reach the desired root. Sometimes it may also happen that the iterations
oscillate around the solution, refusing to converge to it. There are many ways to avoid these conditions.
Generally, choosing the initial guess sufficiently close to the solution is attractive: the closer to the zero of the
function, the better. But, in the absence of any information about where the root might lie, the intermediate value
theorem can be used to narrow the possibilities to a reasonably small interval.
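
For example, a simple sign-change scan over [1, 3] based on the intermediate value theorem can supply such an initial guess before the Newton-Raphson iteration is started. The helper below is an illustrative sketch, not part of the paper's program.

import java.util.function.DoubleUnaryOperator;

public class BracketGuess {

    // Scan [a, b] in equal steps and return the left end of the first subinterval
    // on which f changes sign (intermediate value theorem); NaN if none is found.
    static double bracket(DoubleUnaryOperator f, double a, double b, int steps) {
        double h = (b - a) / steps;
        for (int i = 0; i < steps; i++) {
            double left = a + i * h;
            double right = left + h;
            if (f.applyAsDouble(left) * f.applyAsDouble(right) <= 0.0) {
                return left;   // a root lies in [left, right]; use left as the guess
            }
        }
        return Double.NaN;     // no sign change detected on [a, b]
    }

    public static void main(String[] args) {
        double guess = bracket(x -> x * x * x - 21.0, 1.0, 3.0, 20);
        System.out.println("initial guess near a sign change: " + guess);
    }
}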
Table 1: Rate of convergence of the Newton-Raphson method in the calculation of roots of f(x) = x^3 - n = 0.

S.No | Function        | No. of Iterations | Value of the Root                                                       | Rate of Convergence | Time Taken (ms)
1    | f(x)=x^3-1=0    | 1    | 1.000000000000000000000000000000000000000000000000000000000000000000000 | 5.000000 | 0
2    | f(x)=x^3-2=0    | 7    | 1.259921049894873164767210607278228350570251464701507980081975112155300 | 1.058113 | 14
3    | f(x)=x^3-3=0    | 8    | 1.442249570307408382321638310780109588391869253499350577546416194541688 | 2.089114 | 16
4    | f(x)=x^3-4=0    | 5000 | 1.587401051968199474751705639272308260391493327899853009808285761825216 | 0.003202 | 1310
5    | f(x)=x^3-5=0    | 9    | 1.709975946676696989353108872543860109868055110543054924382861707444296 | 0.210722 | 31
6    | f(x)=x^3-6=0    | 9    | 1.817120592832139658891211756327260502428210463141219671481334297931310 | 0.185394 | 16
7    | f(x)=x^3-7=0    | 9    | 1.912931182772389101199116839548760282862439050345875766210647640447234 | 0.167897 | 16
8    | f(x)=x^3-8=0    | 9    | 2.000000000000000000000000000000000000000000000000000000000000000000000 | 5.000000 | 15
9    | f(x)=x^3-9=0    | 9    | 2.080083823051904114530056824357885386337805340373262109697591080200106 | 0.100250 | 16
10   | f(x)=x^3-10=0   | 9    | 2.154434690031883721759293566519350495259344942192108582489235506346411 | 0.158022 | 15
11   | f(x)=x^3-11=0   | 10   | 2.223980090569315521165363376722157196518699128096923055699345808660401 | 0.117827 | 15
12   | f(x)=x^3-12=0   | 5000 | 2.289428485106663735616084423879354017831813841575862144198104348131349 | 0.000235 | 1389
13   | f(x)=x^3-13=0   | 5000 | 2.351334687720757489500016339956914526915841983462175105040254311588343 | 0.000235 | 1357
14   | f(x)=x^3-14=0   | 10   | 2.410142264175229986128369667603272895354581289980867654164139710413292 | 0.078721 | 15
15   | f(x)=x^3-15=0   | 10   | 2.466212074330470101491611323154589042735484486628053760178787410293377 | 0.061548 | 16
16   | f(x)=x^3-16=0   | 5000 | 2.519842099789746329534421214556456701140502929403015960163950224310599 | 0.000185 | 1357
17   | f(x)=x^3-17=0   | 10   | 2.571281590658235355453187208739726116427901632454696259848022376219940 | 0.193048 | 31
18   | f(x)=x^3-18=0   | 10   | 2.620741394208896607141661280441996270239427645723631725102773805728700 | 0.050412 | 31
19   | f(x)=x^3-19=0   | 10   | 2.668401648721944867339627371970830335095878569183101865664213586945794 | 0.154323 | 32
20   | f(x)=x^3-20=0   | 10   | 2.714417616594906571518089469679489204805107769489096957284365442803308 | 0.050715 | 16
21   | f(x)=x^3-21=0   | 11   | 2.758924176381120669465791108358521582252712086038936032806592502162799 | 0.032744 | 31
22   | f(x)=x^3-22=0   | 11   | 2.802039330655387120665677385665894017585798218769268316979783733783750 | 0.041051 | 16
23   | f(x)=x^3-23=0   | 11   | 2.843866979851565477695439400958651852764165172737048104653425230235441 | 0.072259 | 32
24   | f(x)=x^3-24=0   | 11   | 2.884499140614816764643276621560219176783738506998701155092832389083375 | 0.030560 | 31
25   | f(x)=x^3-25=0   | 11   | 2.924017738212866065506787360137922778530498635101030041422573510072560 | 0.155596 | 31
Average rate of convergence: 0.217920

IV. RESULT AND DISCUSSION
We have calculated the cube roots of the integers from 1 to 25 using the Newton-Raphson method in the
interval [1, 3] with an initial guess of 1 in each case. We have used equation (12) with a scaling factor of 1 x 10^8.
"0.000000000000000000000000000000000000000000000000000000000000000000000001" is the tolerance
limit used in our program. The rate of convergence, the value of the root of each function, and the time taken in
milliseconds to obtain the root are shown in Table 1. In the calculation of the cube roots of 1 and 8, we have taken
α as 5.0, as f(r) equals 0 in each case. As this would affect the average rate of convergence, we have excluded
these two values from the calculation of the average rate of convergence.
As an example, we choose to calculate a root of the equation f(x) = x^3 - 21 = 0. The graph showing
each intermediate value at each iteration is given in Fig. 2. The graph showing the time taken in milliseconds to
calculate a zero of f(x) = x^3 - n = 0 for n = 1 to 25 is shown in Fig. 3. The speed at which the program reports a
root of the function is highly attractive: in most cases, the program has taken less than 33 milliseconds to
calculate a root, which is surprising given so small a tolerance limit. We could improve the program considerably
by choosing a different initial guess for each function. The program executed very quickly in the calculation of
the cube root of 2, taking 14 milliseconds to report a root. The most time-consuming execution was the
calculation of the cube root of 12, which took 1389 milliseconds to report the root. The graph in Fig. 4 shows the
relative rate of convergence of the Newton-Raphson method in the calculation of the cube roots of the integers
from 1 to 25 in the interval [1, 3].

Fig. 2: Value of the function f(x) = x^3 - 21 from iteration 1 to 11
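
The paper does not reproduce its Java source. The roughly 70-digit roots in Table 1 and the quoted tolerance suggest arbitrary-precision arithmetic; the sketch below shows how such a measurement loop might look, assuming java.math.BigDecimal for the arithmetic and System.currentTimeMillis() for the timing. These choices, and all names here, are assumptions rather than the authors' implementation.

import java.math.BigDecimal;
import java.math.MathContext;

public class CubeRootTiming {

    public static void main(String[] args) {
        MathContext mc = new MathContext(100);      // assumed working precision (100 significant digits)
        BigDecimal tol = new BigDecimal("1e-78");   // of the order of the tolerance quoted above
        for (int n = 1; n <= 25; n++) {
            long start = System.currentTimeMillis();
            BigDecimal x = BigDecimal.ONE;          // initial guess of 1, as in the paper
            BigDecimal target = BigDecimal.valueOf(n);
            BigDecimal three = BigDecimal.valueOf(3);
            int iterations = 0;
            BigDecimal step;
            do {                                    // Newton step for f(x) = x^3 - n
                BigDecimal fx = x.pow(3).subtract(target);
                BigDecimal dfx = x.pow(2).multiply(three);
                step = fx.divide(dfx, mc);
                x = x.subtract(step);
                iterations++;
                // Note: the absolute step size is tested here; Algorithm 1 tests the
                // relative error. The cap of 5000 mirrors the entries in Table 1.
            } while (step.abs().compareTo(tol) > 0 && iterations < 5000);
            long elapsed = System.currentTimeMillis() - start;
            System.out.printf("n=%d  iterations=%d  time=%d ms  root=%s%n",
                              n, iterations, elapsed, x);
        }
    }
}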


Fig. 3: Time taken in milliseconds to calculate a root of f(x) = x^3 - n = 0

V. CONCLUSION
The highest rate of convergence of the Newton-Raphson method has been observed in the calculation of
the cube root of 3 and is equal to 2.089114. The lowest rate of convergence has been observed in the calculation
of the cube root of 16 and is equal to 0.000185. The average rate of convergence of the Newton-Raphson method
is calculated to be 0.217920.

Fig. 4: Rate of convergence in reporting a root of f(x) = x^3 - n = 0

ACKNOWLEDGEMENTS
The first author thanks his daughter Java and son Chandrayan for not using his laptop during the
preparation of the manuscript. The fourth author thanks her husband for his understanding and support in the
preparation of the manuscript. The fifth author thanks her family and teachers for their support in the writing of
this manuscript.

REFERENCES
[1]  Zoltan Kovacs, Understanding convergence and stability of the Newton-Raphson method (Department of Analysis, University of Szeged, 2011).
[2]  V. Rajaraman, Computer Oriented Numerical Methods (Prentice-Hall of India, 2005).
[3]  E. Balagurusamy, Computer Oriented Statistical and Numerical Methods (Macmillan India Limited, 1988).
[4]  C. Woodford, C. Phillips, Numerical Methods with Worked Examples: Matlab Edition (Springer London, 1997).
[5]  N. Daili, Numerical Approach to Fixed Point Theorems, Int. J. Contemp. Math. Sciences, Vol. 3, 2008, no. 14, 675-682.

Ranbir Soram works at Manipur Institute of Technology, Takyelpat, Imphal, India. His fields of interest include
Network Security, NLP, Neural Networks, Genetic Algorithms, and Fuzzy Logic.

Sudipta Roy works in the Department of Information Technology, Assam University, Silchar.

Soram Rakesh Singh is a PG student in the Department of Computer
Science & Engineering at Rajiv Gandhi Institute of Technology, Bangalore.

Memeta Khomdram works at National Institute of Electronics and
Information Technology (Formerly DOEACC Centre), Akampat, Imphal.

Swavana Yaikhom works at National Institute of Electronics and
Information Technology, Akampat, Imphal.

Sonamani Takhellambam works at Manipur Institute of Technology, Takyelpat, Imphal, India.

Contenu connexe

Tendances

Estimation theory 1
Estimation theory 1Estimation theory 1
Estimation theory 1Gopi Saiteja
 
01 Notes Introduction Analysis of Algorithms Notes
01 Notes Introduction Analysis of Algorithms Notes01 Notes Introduction Analysis of Algorithms Notes
01 Notes Introduction Analysis of Algorithms NotesAndres Mendez-Vazquez
 
Introduction to Maximum Likelihood Estimator
Introduction to Maximum Likelihood EstimatorIntroduction to Maximum Likelihood Estimator
Introduction to Maximum Likelihood EstimatorAmir Al-Ansary
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)IJERD Editor
 
A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...
A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...
A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...ijfls
 
Roots of equations
Roots of equationsRoots of equations
Roots of equationsMileacre
 
lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...
lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...
lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...mathsjournal
 
Optimisation in engineering design
Optimisation in engineering designOptimisation in engineering design
Optimisation in engineering designramnaresh yadav
 
The Complexity Of Primality Testing
The Complexity Of Primality TestingThe Complexity Of Primality Testing
The Complexity Of Primality TestingMohammad Elsheikh
 
Maximum likelihood estimation
Maximum likelihood estimationMaximum likelihood estimation
Maximum likelihood estimationzihad164
 
Introduction to Bootstrap and elements of Markov Chains
Introduction to Bootstrap and elements of Markov ChainsIntroduction to Bootstrap and elements of Markov Chains
Introduction to Bootstrap and elements of Markov ChainsUniversity of Salerno
 

Tendances (20)

Estimation theory 1
Estimation theory 1Estimation theory 1
Estimation theory 1
 
PhysicsSIG2008-01-Seneviratne
PhysicsSIG2008-01-SeneviratnePhysicsSIG2008-01-Seneviratne
PhysicsSIG2008-01-Seneviratne
 
Chapter 3 roots of equations
Chapter 3 roots of equationsChapter 3 roots of equations
Chapter 3 roots of equations
 
03 notes
03 notes03 notes
03 notes
 
01 Notes Introduction Analysis of Algorithms Notes
01 Notes Introduction Analysis of Algorithms Notes01 Notes Introduction Analysis of Algorithms Notes
01 Notes Introduction Analysis of Algorithms Notes
 
02 Notes Divide and Conquer
02 Notes Divide and Conquer02 Notes Divide and Conquer
02 Notes Divide and Conquer
 
Roots equation
Roots equationRoots equation
Roots equation
 
Introduction to Maximum Likelihood Estimator
Introduction to Maximum Likelihood EstimatorIntroduction to Maximum Likelihood Estimator
Introduction to Maximum Likelihood Estimator
 
Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)Welcome to International Journal of Engineering Research and Development (IJERD)
Welcome to International Journal of Engineering Research and Development (IJERD)
 
A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...
A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...
A LEAST ABSOLUTE APPROACH TO MULTIPLE FUZZY REGRESSION USING Tw- NORM BASED O...
 
Roots of equations
Roots of equationsRoots of equations
Roots of equations
 
Signal Processing Assignment Help
Signal Processing Assignment HelpSignal Processing Assignment Help
Signal Processing Assignment Help
 
Estimationtheory2
Estimationtheory2Estimationtheory2
Estimationtheory2
 
C025020029
C025020029C025020029
C025020029
 
lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...
lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...
lassomodel, sparsity, multivariate modeling, NIR spectroscopy, biodiesel from...
 
Optimisation in engineering design
Optimisation in engineering designOptimisation in engineering design
Optimisation in engineering design
 
The Complexity Of Primality Testing
The Complexity Of Primality TestingThe Complexity Of Primality Testing
The Complexity Of Primality Testing
 
Data Analysis Homework Help
Data Analysis Homework HelpData Analysis Homework Help
Data Analysis Homework Help
 
Maximum likelihood estimation
Maximum likelihood estimationMaximum likelihood estimation
Maximum likelihood estimation
 
Introduction to Bootstrap and elements of Markov Chains
Introduction to Bootstrap and elements of Markov ChainsIntroduction to Bootstrap and elements of Markov Chains
Introduction to Bootstrap and elements of Markov Chains
 

En vedette

The International Journal of Engineering and Science (The IJES)
 The International Journal of Engineering and Science (The IJES) The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
A03406001012
A03406001012A03406001012
A03406001012theijes
 
I0331043049
I0331043049I0331043049
I0331043049theijes
 
L021201085091
L021201085091L021201085091
L021201085091theijes
 
B0331009012
B0331009012B0331009012
B0331009012theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
F021202058066
F021202058066F021202058066
F021202058066theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
K034050900107
K034050900107K034050900107
K034050900107theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)theijes
 
F03405046048
F03405046048F03405046048
F03405046048theijes
 

En vedette (17)

The International Journal of Engineering and Science (The IJES)
 The International Journal of Engineering and Science (The IJES) The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
A03406001012
A03406001012A03406001012
A03406001012
 
I0331043049
I0331043049I0331043049
I0331043049
 
L021201085091
L021201085091L021201085091
L021201085091
 
B0331009012
B0331009012B0331009012
B0331009012
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
F021202058066
F021202058066F021202058066
F021202058066
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
K034050900107
K034050900107K034050900107
K034050900107
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)The International Journal of Engineering and Science (The IJES)
The International Journal of Engineering and Science (The IJES)
 
F03405046048
F03405046048F03405046048
F03405046048
 

Similaire à The International Journal of Engineering and Science (The IJES)

Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...mathsjournal
 
Newton raphson method
Newton raphson methodNewton raphson method
Newton raphson methodBijay Mishra
 
Newton paper.docx
Newton  paper.docxNewton  paper.docx
Newton paper.docxnitmor1
 
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...mathsjournal
 
A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...
A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...
A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...Stephen Faucher
 
Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...
Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...
Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...RaihanHossain49
 
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...mathsjournal
 
MolinaLeydi_FinalProject
MolinaLeydi_FinalProjectMolinaLeydi_FinalProject
MolinaLeydi_FinalProjectLeydi Molina
 
Non linearequationsmatlab
Non linearequationsmatlabNon linearequationsmatlab
Non linearequationsmatlabsheetslibrary
 
Solution of non-linear equations
Solution of non-linear equationsSolution of non-linear equations
Solution of non-linear equationsZunAib Ali
 
Non linearequationsmatlab
Non linearequationsmatlabNon linearequationsmatlab
Non linearequationsmatlabZunAib Ali
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docxmccormicknadine86
 
Numerical Study of Some Iterative Methods for Solving Nonlinear Equations
Numerical Study of Some Iterative Methods for Solving Nonlinear EquationsNumerical Study of Some Iterative Methods for Solving Nonlinear Equations
Numerical Study of Some Iterative Methods for Solving Nonlinear Equationsinventionjournals
 
Newton's Raphson method
Newton's Raphson methodNewton's Raphson method
Newton's Raphson methodSaloni Singhal
 
Newton raphsonmethod presentation
Newton raphsonmethod presentationNewton raphsonmethod presentation
Newton raphsonmethod presentationAbdullah Moin
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docxbissacr
 

Similaire à The International Journal of Engineering and Science (The IJES) (20)

Newton
NewtonNewton
Newton
 
Secant method
Secant methodSecant method
Secant method
 
Newton Raphson
Newton RaphsonNewton Raphson
Newton Raphson
 
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
 
Newton raphson method
Newton raphson methodNewton raphson method
Newton raphson method
 
Newton paper.docx
Newton  paper.docxNewton  paper.docx
Newton paper.docx
 
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
 
A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...
A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...
A Comparison Of Iterative Methods For The Solution Of Non-Linear Systems Of E...
 
Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...
Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...
Exploring_the_Fundamentals_of_Numerical_Analysis_An_Overview_of_Newton_Raphso...
 
The newton raphson method
The newton raphson methodThe newton raphson method
The newton raphson method
 
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
Fractional Newton-Raphson Method and Some Variants for the Solution of Nonlin...
 
MolinaLeydi_FinalProject
MolinaLeydi_FinalProjectMolinaLeydi_FinalProject
MolinaLeydi_FinalProject
 
Non linearequationsmatlab
Non linearequationsmatlabNon linearequationsmatlab
Non linearequationsmatlab
 
Solution of non-linear equations
Solution of non-linear equationsSolution of non-linear equations
Solution of non-linear equations
 
Non linearequationsmatlab
Non linearequationsmatlabNon linearequationsmatlab
Non linearequationsmatlab
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docx
 
Numerical Study of Some Iterative Methods for Solving Nonlinear Equations
Numerical Study of Some Iterative Methods for Solving Nonlinear EquationsNumerical Study of Some Iterative Methods for Solving Nonlinear Equations
Numerical Study of Some Iterative Methods for Solving Nonlinear Equations
 
Newton's Raphson method
Newton's Raphson methodNewton's Raphson method
Newton's Raphson method
 
Newton raphsonmethod presentation
Newton raphsonmethod presentationNewton raphsonmethod presentation
Newton raphsonmethod presentation
 
Chi-squared Goodness of Fit Test Project Overview and.docx
Chi-squared Goodness of Fit Test Project  Overview and.docxChi-squared Goodness of Fit Test Project  Overview and.docx
Chi-squared Goodness of Fit Test Project Overview and.docx
 

Dernier

04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptxHampshireHUG
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitecturePixlogix Infotech
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsEnterprise Knowledge
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024Results
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024Rafal Los
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024The Digital Insurer
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...gurkirankumar98700
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxMalak Abu Hammad
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsMaria Levchenko
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 3652toLead Limited
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)Gabriella Davis
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slidevu2urc
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking MenDelhi Call girls
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Google AI Hackathon: LLM based Evaluator for RAG
Google AI Hackathon: LLM based Evaluator for RAGGoogle AI Hackathon: LLM based Evaluator for RAG
Google AI Hackathon: LLM based Evaluator for RAGSujit Pal
 

Dernier (20)

04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
04-2024-HHUG-Sales-and-Marketing-Alignment.pptx
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
Understanding the Laravel MVC Architecture
Understanding the Laravel MVC ArchitectureUnderstanding the Laravel MVC Architecture
Understanding the Laravel MVC Architecture
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
IAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI SolutionsIAC 2024 - IA Fast Track to Search Focused AI Solutions
IAC 2024 - IA Fast Track to Search Focused AI Solutions
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 
The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024The 7 Things I Know About Cyber Security After 25 Years | April 2024
The 7 Things I Know About Cyber Security After 25 Years | April 2024
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024Finology Group – Insurtech Innovation Award 2024
Finology Group – Insurtech Innovation Award 2024
 
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
Kalyanpur ) Call Girls in Lucknow Finest Escorts Service 🍸 8923113531 🎰 Avail...
 
The Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptxThe Codex of Business Writing Software for Real-World Solutions 2.pptx
The Codex of Business Writing Software for Real-World Solutions 2.pptx
 
Handwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed textsHandwritten Text Recognition for manuscripts and early printed texts
Handwritten Text Recognition for manuscripts and early printed texts
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)A Domino Admins Adventures (Engage 2024)
A Domino Admins Adventures (Engage 2024)
 
Histor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slideHistor y of HAM Radio presentation slide
Histor y of HAM Radio presentation slide
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men08448380779 Call Girls In Civil Lines Women Seeking Men
08448380779 Call Girls In Civil Lines Women Seeking Men
 
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
Transcript: #StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Google AI Hackathon: LLM based Evaluator for RAG
Google AI Hackathon: LLM based Evaluator for RAGGoogle AI Hackathon: LLM based Evaluator for RAG
Google AI Hackathon: LLM based Evaluator for RAG
 

The International Journal of Engineering and Science (The IJES)

  • 1. The International Journal Of Engineering And Science (IJES) ||Volume||2 ||Issue|| 11||Pages|| 05-12||2013|| ISSN(e): 2319 – 1813 ISSN(p): 2319 – 1805 On the Rate of Convergence of Newton-Raphson Method Ranbir Soram1, Sudipta Roy2, Soram Rakesh Singh3, Memeta Khomdram4, Swavana Yaikhom4 , Sonamani Takhellambam1 1 Manipur Institute of Technology, Takyelpat, Imphal-795004 Department of Information Technology, Assam University, Silchar-788011 3 Rajiv Gandhi Institute of Technology, Bangalore-560032 4 National Institute of Electronics and Information Technology, Akampat, Imphal-795001 2 ------------------------------------------------------------ABSTRACT--------------------------------------------------------A computer program in Java has been coded to evaluate the cube roots of numbers from 1 to 25 using NewtonRaphson method in the interval [1, 3]. The relative rate of convergence has been found out in each calculation. The lowest rate of convergence has been observed in the evaluation of cube root of 16 and highest in the evaluation of cube root of 3. The average rate of convergence of Newton-Raphson method has been found to be 0.217920. KEYWORDS : Computer Program, Cube Root, Java, Newton-Raphson Method, Rate of Convergence. -------------------------------------------------------------------------------------------------------------------------------------Date of Submission: 21, October, 2013 Date of Acceptance: 10, November, 2013 -------------------------------------------------------------------------------------------------------------------------------------- I. INTRODUCTION Numerical analysis is a very important branch of computer science that deals with the study of algorithms that use numerical approximation in mathematical analysis. It involves the study of methods of computing numerical data. It has its applications in all fields of engineering and the physical sciences including the life sciences. The stochastic differential equations and Markov chains have been used in simulating living cells for medicine and biology. Numerical analysis computes numerical data for solving numerically the problems of continuous mathematics. So, it implies producing a sequence of approximate values; thus the questions involve the rate of convergence, the accuracy of the answer, and even time consumed to report the answer. Many problems in mathematics can be reduced in polynomial time to linear algebra, this too can be studied numerically. Root-finding algorithms are also used to solve nonlinear equations. If the function is differentiable and the derivative is known and not equal to zero, then Newton-Raphson method is a popular choice. Linearization is another technique for solving nonlinear equations. The formal academic area of numerical analysis varies from quite foundational mathematical studies to the computer science issues involved in the creation and implementation of several algorithms. NEWTON-RAPHSON METHOD In numerical analysis, Newton-Raphson method is a very popular numerical method used for finding successively better approximations to the zeroes of a real-valued function x : f ( x)  0. Even though the method can also be extended to complex functions, we shall restrict ourselves to real-valued functions only. NewtonRaphson method has been used by a large class of users as it works very well for a large variety of equations like polynomial, rational, transcendental, trigonometric, and so on. It is also well suited on computers as it is iterative in nature. 
This feature of Newton-Raphson method has attracted many scientists and many scientific application programs use Newton-Raphson method as one of the root finding tools.The Newton-Raphson method in one variable is implemented as follows: Given a function f ( x) defined over the real x, and its derivative f '( x) , we begin with a first guess x0 for a root of the function f. Provided the function satisfies all the assumptions made in the derivation of the formula, a better approximation x1 is f ( x0 ) x1  x0  (1) f '( x0 ) Geometrically, (x1, 0) is the intersection with the x-axis of a line tangent to f at (x0, f (x0)). The process is repeated as www.theijes.com The IJES Page 5
  • 2. On the Rate of Convergence of… xn 1  xn  f ( xn ) (2) f '( xn ) until a sufficiently desired accurate value is obtained. The idea of the method is as follows: one starts with an initial guess which is reasonably close to the true root, then the function is approximated by its tangent line (which can be computed using the tools of calculus), and one computes the x-intercept of this tangent line (which is easily done with elementary algebra). This x-intercept will typically be a better approximation to the function's root than the original guess, and the method can be iterated. Suppose f : [a, b]  R is a differentiable function defined on the interval [a, b] with values in the real numbers R. The formula for converging on the root can be easily derived. Suppose we have some current approximation xn. Then we can derive the formula for a better approximation, xn+1 by referring to the fig. 1 given below. We know from the definition of the derivative at a given point that it is the slope of a tangent at that point. Fig. 1: Geometrical Interpretation of Newton-Raphson method That is f '( xn )  y x  f ( xn )  0 (3) xn  xn 1 Here, f ' denotes the derivative of the function f. Then by simple algebra we can derive xn 1  xn  f ( xn ) (4) f '( xn ) We start the iteration with some arbitrary initial value x0. The method will usually converge, provided this initial guess is close enough to the unknown root, and that f '( x)  0. We reproduce here the algorithm for Newton-Raphson method and it is from [2]. Algorithm 1. Newton-Raphson Method 1 Read x0, epsilon, delta, n Remarks: Here x0 is the initial guess, epsilon is the prescribed relative error, delta is the prescribed lower bound for f ' and n is the maximum number of iterations to be allowed. Statements 3 to 8 are repeated until the procedure converges to a root or iterations equal n. www.theijes.com The IJES Page 6
  • 3. On the Rate of Convergence of… 2 for i  1 to n in steps of 1 do 3 f 0  f ( x0 ) 4 f 0  f 0 ( x0 ) 5 if f 0  delta then goto 11 6 x1  x0   7 if ' ' '  f0  '   f0  x1  x0  epsilon then goto 13 x1 8 x0  x1 endfor ' 9 Write “ does not converse in n iterations”, f 0 , f 0 , x0 , x1 10 Stop ' 11 Write “ slope too small ”, x0 , f0 , f0 , i 12 Stop 13 Write “ convergent solution”, x1 , f ( x1 ), i 14 stop II. ESTIMATING THE RATE OF CONVERGENCE Let us take a sequence x1 , x2 ,..., xn . If the sequence converges to a value r and if there exist real numbers λ > 0 and α ≥ 1 such that lim n  xn 1  r xn  r   (5) then we say that α is the rate of convergence of the sequence. Many root-finding algorithms are in fact fixedpoint iterations. These iterations have the name fixed-point because the desired root r is a fixed point of a function g(x), i.e., g (r )  r. For fixed-point iterations, if a value close to r is substituted into g(x) then the result is even closer to r so we are tempted to write xn 1  g ( xn ). It might be very convenient to define the error after n steps of an iterative root-finding algorithm as en  xn  r . As n → ∞ we see from equation (5) that en 1   en  en   en 1  (6) (7) Dividing equation (6) by equation (7) we get the following formula given in equation (8). en 1 en   en   en1  e  n en 1  (8) Now we solve for α and it gives log en 1 en  log (9) en en 1 Using the formula given in equation (9), we can determine the convergence rate α from two consecutive error ratios. But there is still a difficulty with this approach as to compute en we would need to know the root r which in fact we don’t know. Anyway we can use the ideas above to develop a formula to estimate the rate of www.theijes.com The IJES Page 7
  • 4. On the Rate of Convergence of… convergence of any method that behaves like a fixed-point method. If the method converges, then the sequence xn it generates must satisfy the relation xn1  r  xn  r , for n > N where N is some positive integer. As mentioned above, to compute en  xn  r we need to know the value of the root r which we still do not know. We will overcome this by assuming that the ratio of consecutive errors can be approximated by the ratio of consecutive differences of the estimates of the root: en1 xn1  r xn1  xn   en xn  r xn  xn 1 (10) This allows us to approximate α with log  xn 1  xn xn  xn 1 (11) x  xn 1 log n xn 1  xn  2 Even though this only gives an estimate of α, in practice it agrees well with the theoretical convergence rates of Newton’s method and gives us a good measure of the efficiency of various forms of fixed point algorithms. Another formula that we can approximate the convergence rate α of Newton-Raphson method is 1   scaling factor X (12) No. of Iterations X f ( r ) where r is the estimated root and in case f (r )  0, the formula fails. III. WHY DO WE USE NEWTON-RAPHSON METHOD The beauty of Newton-Raphson method lies in its efficiency and accuracy. The number of correct digits in each intermediate solution roughly at least doubles in every iteration and thus, a very high accuracy is obtained very quickly and it can be judged from the fig. 2. Thus, the use of Newton-Raphson method is highly demanding. In a problem where the Newton-Raphson method converges in one iteration, it amounts to f ( x0 ) using and evaluating one single formula: xroot  x0  in order to obtain the desired root. Rather f '( x0 ) than solving any equations the users are just evaluating functions. In practice a single evaluation of the formula may fail to give the root, but two successive evaluations may give the solution. This again amounts to using the formula twice and is still reasonable and attractive. Although the functions involved may not converge so easily or the values of their derivatives at some points are too small to be used in the formula, a sufficiently small neighborhood can be found out in which they are almost convergent and the values of their derivatives at that neighborhoods can fit in the formula. By focusing our attention to such a neighborhood, Newton-Raphson method would yield the desired root in two iterations, if not in one. The starting point plays a very important role in the working of Newton-Raphson method. In general, a starting value far away from the actual solution may take more iteration to get the desired root. Sometime it may also happen that the iterations oscillate around the solution refusing to converge to it. There are many ways to avoid these conditions. Generally, choosing the initial guess sufficiently close to the solution is attractive. The closer to the zero of the function, the better it is. But, in the absence of any information about where the root might lie, a guess method might narrow the possibilities to a reasonably small interval by using the intermediate value theorem. Table 1: Rate of convergence of Newton-Raphson method in the calculation of roots of f(x)=x3-n=0. S.No Function No. 
III. WHY DO WE USE NEWTON-RAPHSON METHOD

The beauty of the Newton-Raphson method lies in its efficiency and accuracy. The number of correct digits in each intermediate solution roughly doubles in every iteration, so very high accuracy is reached very quickly, as can be judged from fig. 2. This makes the Newton-Raphson method very attractive in practice. In a problem where the method converges in one iteration, it amounts to evaluating one single formula, x_root = x0 - f(x0)/f'(x0), in order to obtain the desired root. Rather than solving any equations, the user is just evaluating functions. In practice a single evaluation of the formula may fail to give the root, but two successive evaluations may give the solution; this again amounts to using the formula twice and is still reasonable and attractive. Although the iteration may not converge so easily for some functions, or the values of their derivatives at some points may be too small to be used in the formula, a sufficiently small neighborhood can usually be found in which the iteration converges and the derivative values can be used in the formula. By focusing attention on such a neighborhood, the Newton-Raphson method yields the desired root in two iterations, if not in one.

The starting point plays a very important role in the working of the Newton-Raphson method. In general, a starting value far away from the actual solution may take more iterations to reach the desired root. Sometimes the iterations oscillate around the solution, refusing to converge to it. There are many ways to avoid these conditions. Generally, choosing the initial guess sufficiently close to the solution is attractive: the closer to the zero of the function, the better. In the absence of any information about where the root might lie, the intermediate value theorem can be used to narrow the possibilities to a reasonably small interval.

Table 1: Rate of convergence of the Newton-Raphson method in the calculation of roots of f(x) = x^3 - n = 0.

S.No | Function | No. of Iterations | Value of the Root | Rate of Convergence | Time Taken (ms)
1 | f(x) = x^3 - 1 = 0 | 1 | 1.000000000000000000000000000000000000000000000000000000000000000000000 | 5.000000 | 0
2 | f(x) = x^3 - 2 = 0 | 7 | 1.259921049894873164767210607278228350570251464701507980081975112155300 | 1.058113 | 14
3 | f(x) = x^3 - 3 = 0 | 8 | 1.442249570307408382321638310780109588391869253499350577546416194541688 | 2.089114 | 16
4 | f(x) = x^3 - 4 = 0 | 5000 | 1.587401051968199474751705639272308260391493327899853009808285761825216 | 0.003202 | 1310
5 | f(x) = x^3 - 5 = 0 | 9 | 1.709975946676696989353108872543860109868055110543054924382861707444296 | 0.210722 | 31
6 | f(x) = x^3 - 6 = 0 | 9 | 1.817120592832139658891211756327260502428210463141219671481334297931310 | 0.185394 | 16
7 | f(x) = x^3 - 7 = 0 | 9 | 1.912931182772389101199116839548760282862439050345875766210647640447234 | 0.167897 | 16
8 | f(x) = x^3 - 8 = 0 | 9 | 2.000000000000000000000000000000000000000000000000000000000000000000000 | 5.000000 | 15
9 | f(x) = x^3 - 9 = 0 | 9 | 2.080083823051904114530056824357885386337805340373262109697591080200106 | 0.100250 | 16
10 | f(x) = x^3 - 10 = 0 | 9 | 2.154434690031883721759293566519350495259344942192108582489235506346411 | 0.158022 | 15
11 | f(x) = x^3 - 11 = 0 | 10 | 2.223980090569315521165363376722157196518699128096923055699345808660401 | 0.117827 | 15
12 | f(x) = x^3 - 12 = 0 | 5000 | 2.289428485106663735616084423879354017831813841575862144198104348131349 | 0.000235 | 1389
13 | f(x) = x^3 - 13 = 0 | 5000 | 2.351334687720757489500016339956914526915841983462175105040254311588343 | 0.000235 | 1357
14 | f(x) = x^3 - 14 = 0 | 10 | 2.410142264175229986128369667603272895354581289980867654164139710413292 | 0.078721 | 15
15 | f(x) = x^3 - 15 = 0 | 10 | 2.466212074330470101491611323154589042735484486628053760178787410293377 | 0.061548 | 16
16 | f(x) = x^3 - 16 = 0 | 5000 | 2.519842099789746329534421214556456701140502929403015960163950224310599 | 0.000185 | 1357
17 | f(x) = x^3 - 17 = 0 | 10 | 2.571281590658235355453187208739726116427901632454696259848022376219940 | 0.193048 | 31
18 | f(x) = x^3 - 18 = 0 | 10 | 2.620741394208896607141661280441996270239427645723631725102773805728700 | 0.050412 | 31
19 | f(x) = x^3 - 19 = 0 | 10 | 2.668401648721944867339627371970830335095878569183101865664213586945794 | 0.154323 | 32
20 | f(x) = x^3 - 20 = 0 | 10 | 2.714417616594906571518089469679489204805107769489096957284365442803308 | 0.050715 | 16
21 | f(x) = x^3 - 21 = 0 | 11 | 2.758924176381120669465791108358521582252712086038936032806592502162799 | 0.032744 | 31
22 | f(x) = x^3 - 22 = 0 | 11 | 2.802039330655387120665677385665894017585798218769268316979783733783750 | 0.041051 | 16
23 | f(x) = x^3 - 23 = 0 | 11 | 2.843866979851565477695439400958651852764165172737048104653425230235441 | 0.072259 | 32
24 | f(x) = x^3 - 24 = 0 | 11 | 2.884499140614816764643276621560219176783738506998701155092832389083375 | 0.030560 | 31
25 | f(x) = x^3 - 25 = 0 | 11 | 2.924017738212866065506787360137922778530498635101030041422573510072560 | 0.155596 | 31

Average rate of convergence: 0.217920

IV. RESULTS AND DISCUSSION

We have calculated the cube roots of the integers from 1 to 25 using the Newton-Raphson method in the interval [1, 3] with an initial guess of 1 in each case. We have used equation (12) with a scaling factor of 1x10^8. The tolerance limit used in our program is 0.000000000000000000000000000000000000000000000000000000000000000000000001. The rate of convergence, the value of the root of each function, and the time taken in milliseconds to obtain the root are shown in Table 1. In the calculation of the cube roots of 1 and 8 we have taken α as 5.0, since f(r) equals 0 in each case. As this would affect the average rate of convergence, these two values have been excluded from the calculation of the average.

As an example, we choose to calculate a root of the equation f(x) = x^3 - 21 = 0. The graph showing the intermediate value at each iteration is given in fig. 2. The graph showing the time taken in milliseconds to calculate a zero of f(x) = x^3 - n = 0 for n = 1 to 25 is shown in fig. 3. The speed at which the program reports a root of the function is highly attractive: in most cases, the program took less than 33 milliseconds to calculate a root. This is surprising given such a small tolerance limit. The program could be improved considerably by choosing a different initial guess for each function. The program executed very quickly in the calculation of the cube root of 2, taking 14 milliseconds to report a root. The most time-consuming execution was the calculation of the cube root of 12, which took 1389 milliseconds to report the root. The graph in fig. 4 shows the relative rate of convergence of the Newton-Raphson method in the calculation of the cube roots of the integers from 1 to 25 in the interval [1, 3].
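The roots in Table 1 are reported to roughly 70 decimal places, far beyond what double precision can represent, so a computation like this has to be carried out in extended precision. The sketch below shows one way such a computation could be organised in Java with BigDecimal; the precision setting, class and method names, and stopping rule are assumptions for illustration and are not the paper's actual implementation. The iteration cap of 5000 mirrors the largest iteration count reported in Table 1.

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

public class HighPrecisionCubeRoot {

    // Working precision comfortably above the very small tolerance quoted in the paper (illustrative)
    static final MathContext MC = new MathContext(80, RoundingMode.HALF_EVEN);

    /**
     * Newton-Raphson iteration x1 = x0 - (x0^3 - n) / (3 * x0^2) in BigDecimal
     * arithmetic, stopping when successive iterates agree to within 'tolerance'.
     */
    static BigDecimal cubeRoot(BigDecimal n, BigDecimal x0, BigDecimal tolerance, int maxIter) {
        BigDecimal three = BigDecimal.valueOf(3);
        for (int i = 1; i <= maxIter; i++) {
            BigDecimal fx  = x0.pow(3, MC).subtract(n, MC);          // f(x0)  = x0^3 - n
            BigDecimal fpx = three.multiply(x0.pow(2, MC), MC);      // f'(x0) = 3 * x0^2
            BigDecimal x1  = x0.subtract(fx.divide(fpx, MC), MC);
            if (x1.subtract(x0, MC).abs().compareTo(tolerance) <= 0) {
                return x1;
            }
            x0 = x1;
        }
        return x0;  // best estimate if the tolerance was not met within maxIter iterations
    }

    public static void main(String[] args) {
        BigDecimal tol = new BigDecimal("1e-72");
        // Cube root of 21 with the initial guess 1 used in the paper
        System.out.println(cubeRoot(BigDecimal.valueOf(21), BigDecimal.ONE, tol, 5000));
    }
}
```

Here the stopping test compares successive iterates in absolute terms; the relative test of step 7 of the algorithm given earlier could be substituted without changing the structure.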
Fig. 2: Value of the function f(x) = x^3 - 21 from iteration 1 to 11

Fig. 3: Time taken in milliseconds to calculate a root of f(x) = x^3 - n = 0

V. CONCLUSION

The highest rate of convergence of the Newton-Raphson method has been observed in the calculation of the cube root of 3 and is equal to 2.089114. The lowest rate of convergence has been observed in the calculation of the cube root of 16 and is equal to 0.000185. The average rate of convergence of the Newton-Raphson method is calculated to be 0.217920.

Fig. 4: Rate of convergence in reporting a root of f(x) = x^3 - n = 0

ACKNOWLEDGEMENTS

The first author thanks his daughter Java and son Chandrayan for not using his laptop during the preparation of the manuscript. The fourth author thanks her husband for his understanding and support in the preparation of the manuscript. The fifth author thanks her family and teachers for their support in the writing of this manuscript.
Ranbir Soram works at Manipur Institute of Technology, Takyelpat, Imphal, India. His fields of interest include Network Security, NLP, Neural Networks, Genetic Algorithms, and Fuzzy Logic.

Sudipta Roy works in the Department of Information Technology, Assam University, Silchar.

Soram Rakesh Singh is a PG student in the Department of Computer Science & Engineering at Rajiv Gandhi Institute of Technology, Bangalore.

Memeta Khomdram works at National Institute of Electronics and Information Technology (formerly DOEACC Centre), Akampat, Imphal.

Swavana Yaikhom works at National Institute of Electronics and Information Technology, Akampat, Imphal.

Sonamani Takhellambam works at Manipur Institute of Technology, Takyelpat, Imphal, India.