The famous 'root(2)' tablet from the Yale Babylonian Collection.

This tablet is a round school tablet of unknown provenance from the Old Babylonian period. It shows a square with both diagonals drawn in. Along one side of the square is written the number 30 (=1/2), along one of the diagonals the sexagesimal number 1,24,51,10 (=$\sqrt{2}$), and below it 42,25,35 (=$\tfrac{1}{2}\sqrt{2}$). Note that the diagonal of a square with side length $n$ is always $n\sqrt{2}$: $\sqrt{2}$ is the coefficient that stretches a side to the diagonal connecting opposite vertices of the square.
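The sexagesimal digits on the tablet can be checked directly. A small sketch (the name `fromSexagesimal` is ours, not part of any original listing):

```haskell
-- Interpret a list of sexagesimal (base-60) digits d0;d1,d2,... as
-- the value d0 + d1/60 + d2/60^2 + ...
fromSexagesimal :: [Double] -> Double
fromSexagesimal ds = sum (zipWith (/) ds (iterate (* 60) 1))

-- The diagonal value written on the tablet, 1;24,51,10:
tabletRoot2 :: Double
tabletRoot2 = fromSexagesimal [1, 24, 51, 10]

-- The value below the diagonal, read as 0;42,25,35:
tabletHalfRoot2 :: Double
tabletHalfRoot2 = fromSexagesimal [0, 42, 25, 35]
```

Both values agree with $\sqrt{2}$ and $\tfrac{1}{2}\sqrt{2}$ to about six decimal places, remarkable accuracy for the period.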

The Babylonian method for square roots

Based on an ancient Babylonian method (developed around 1800 B.C.) we describe a method to calculate the square root $x=\sqrt{a}$ satisfying $x\cdot x=a$. Suppose $x_0>0$ is an approximation of $\sqrt{a}$, $a>0$. Suppose further that $x_0 < \sqrt{a}$; then we obtain another estimate of $\sqrt{a}$: \begin{align*} x_0 &< \sqrt{a} \to \\ 1 &< \frac{\sqrt{a}}{x_0} \to \\ \sqrt{a} &< \frac{\sqrt{a}}{x_0} \sqrt{a} = \frac{a}{x_0} \end{align*} Notice that the two estimates, $x_0$ and $\frac{a}{x_0}$, lie on opposite sides of $\sqrt{a}$. We may therefore hope for an even better approximation by taking their arithmetic average $\tfrac{1}{2}\left(x_0+\tfrac{a}{x_0}\right)$.

This brings us to the following recurrence relation for finding the square root of $a$: \[ x_{n+1}=\tfrac{1}{2}\left(x_n+\tfrac{a}{x_n}\right)=\tfrac{1}{2}\left(x_n+g(x_n)\right), \quad n=0,1,2,\ldots \] In the graph below we see that the sequence $x_0,x_1,x_2,\ldots$ converges quickly to $\sqrt{a}$. Here $\sqrt{a}$ is a fixed point (a solution $x$ of $f(x)=x$ is called a fixed point of $f$). The point $(x_1,x_1)$ is the intersection of the line $y=x$ with the line $L_0$ through the point $(x_0,g(x_0))$ with slope $-1$: $L_0:\, y-g(x_0)=-(x-x_0)$. We can see this by substituting $x$ for $y$ in the equation and solving for $x$: \begin{align*} x-g(x_0)&=-(x-x_0) \to \\ 2x&=g(x_0)+x_0 \to \\ x&=\tfrac{1}{2}(x_0+g(x_0)) \end{align*} Notice further that the slope of $L_n$ equals the slope of $g$ at $(\sqrt{a},\sqrt{a})$, namely $-1$. Therefore $x_{n+1}$ always lies to the right of $\sqrt{a}$, i.e. $\sqrt{a} \leq x_{n+1}$: starting from $(x_n,g(x_n))$, the increase towards $y=x$ along $L_n$ is faster than along $g(x)$, so $L_n$ hits $y=x$ earlier than $g(x)$ does, and hence $x_{n+1}$ must be at least $\sqrt{a}$. Of course this is an informal argument and cannot replace a rigorous proof. Moreover the sequence is decreasing: \begin{align*} x_n-x_{n+1}&=x_n-\tfrac{1}{2}\left(x_n+ax_n^{-1}\right) \\ &=\frac{1}{2x_n}\left(x_n^2-a\right)\geq 0, \quad n=1,2,3,\ldots \end{align*} So finally we see that the sequence is monotonically decreasing and bounded below by $\sqrt{a}$: \[ \sqrt{a}\leq x_{n+1} \leq x_n \leq \ldots \leq x_2 \leq x_1. \]
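The recurrence can be written down almost verbatim in Haskell. A minimal sketch (the name `sqrtIterates` is our choice):

```haskell
-- Iterates x0, x1, x2, ... of the Babylonian square-root recurrence
-- x_{n+1} = (x_n + a/x_n) / 2, starting from a given guess x0 > 0.
sqrtIterates :: Double -> Double -> [Double]
sqrtIterates a = iterate (\x -> (x + a / x) / 2)
```

For example, `take 5 (sqrtIterates 2 1)` already shows the iterates settling on $\sqrt{2}$ to many decimal places, illustrating the quadratic convergence.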

We formalize this with the following proposition.

Proposition
Let $a>0$ and $x_1>0$, and define the sequence $\langle x_n\rangle$ for $n=1,2,\ldots$ by: \[ x_{n+1}=\tfrac{1}{2}\left(x_n+ax_n^{-1}\right) \] then \[ \lim_{n\to\infty} x_n=\sqrt{a} \]
Proof
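One standard way to fill in the argument (a sketch, not the author's original proof): by the AM-GM inequality, \[ x_{n+1}=\tfrac{1}{2}\left(x_n+\frac{a}{x_n}\right)\geq\sqrt{x_n\cdot\frac{a}{x_n}}=\sqrt{a}, \] so the sequence is bounded below by $\sqrt{a}$ from $n=2$ onwards, and by the computation above it is monotonically decreasing. By the monotone convergence theorem it has a limit $L\geq\sqrt{a}>0$. Taking limits on both sides of the recurrence gives \[ L=\tfrac{1}{2}\left(L+\frac{a}{L}\right) \to 2L^2=L^2+a \to L^2=a, \] hence $L=\sqrt{a}$. $\square$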

Having established an algorithm for square roots, we now take a look at $n$th roots. Based on the ideas for the square root we can follow the same approach. Suppose $x_0>0$ is an approximation of $\sqrt[n]{a}$, $a>0$, and further $x_0<\sqrt[n]{a}$. Then a new approximation follows from: \begin{align*} x_0 &< \sqrt[n]{a} \to \\ 1 &< \frac{\sqrt[n]{a}}{x_0} \to \\ \sqrt[n]{a} &< \frac{\sqrt[n]{a}}{x_0} \sqrt[n]{a} \to \\ \sqrt[n]{a} &< \left(\frac{\sqrt[n]{a}}{x_0}\right)^{n-1} \sqrt[n]{a} = \frac{a}{x_0^{n-1}} \end{align*} So we now have two approximations: \[ x_0, \quad g(x_0)=\frac{a}{x_0^{n-1}}. \] We could again take their arithmetic average, or any other weighted average. It turns out that the sequence converges quadratically if we choose the weights for $x_k$ and $g(x_k)$ in the ratio $n-1:1$. We define the following recurrence relation for finding the $n$th root of $a$: \[ x_{k+1}=\left(\frac{n-1}{n}\right)x_k+\left(\frac{1}{n}\right)\frac{a}{x_k^{n-1}}=\left(\frac{n-1}{n}\right)x_k+\left(\frac{1}{n}\right)g(x_k), \quad k=0,1,2,\ldots \]

The argument for convergence to $\sqrt[n]{a}$ runs as before, except that the line $L_k$ now has slope $1-n$. Its intersection with $y=x$ gives us: \begin{align*} x-g(x_k)&=(1-n)(x-x_k) \to \\ nx&=(n-1)x_k+g(x_k) \to \\ x_{k+1}&=\left(\frac{n-1}{n}\right)x_k+\left(\frac{1}{n}\right)g(x_k) \end{align*} The slope of $g$ at $(\sqrt[n]{a},\sqrt[n]{a})$ is also $1-n$, and we apply the same logic as for the square root to show that $\sqrt[n]{a} \leq x_{k+1}$. Furthermore, in the same way as for the square root, we can show that $x_k-x_{k+1}\geq 0$. Finally a sequence results with the following properties: \[ \sqrt[n]{a}\leq x_{k+1} \leq x_k \leq \ldots \leq x_2 \leq x_1. \]

Proposition
Let $a>0$, let $n\geq 2$ be an integer, and let $x_1>0$. Define the sequence $\langle x_k\rangle$ for $k=1,2,\ldots$ by: \[ x_{k+1}=\left(\frac{n-1}{n}\right)x_k+\left(\frac{1}{n}\right)\frac{a}{x_k^{n-1}} \] then \[ \lim_{k\to\infty} x_k=\sqrt[n]{a} \]
Proof
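A sketch along the same lines as for the square root (again a standard argument, not the author's original text): by the weighted AM-GM inequality applied to $n-1$ copies of $x_k$ and one copy of $a/x_k^{n-1}$, \[ x_{k+1}=\frac{(n-1)x_k+\frac{a}{x_k^{n-1}}}{n}\geq\left(x_k^{n-1}\cdot\frac{a}{x_k^{n-1}}\right)^{1/n}=\sqrt[n]{a}, \] so the sequence is bounded below by $\sqrt[n]{a}$, and \[ x_k-x_{k+1}=\frac{1}{n}\left(x_k-\frac{a}{x_k^{n-1}}\right)=\frac{x_k^n-a}{n\,x_k^{n-1}}\geq 0 \] shows it is monotonically decreasing. By the monotone convergence theorem it has a limit $L\geq\sqrt[n]{a}>0$, which satisfies $nL=(n-1)L+a/L^{n-1}$, i.e. $L^n=a$, hence $L=\sqrt[n]{a}$. $\square$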

Here follows a Haskell implementation of the Babylonian algorithm for finding the $n$th root. Notice how closely the Haskell implementation resembles the mathematical definition. The function starts with an initial guess $x_0=a/n$ and then iterates until the next computed value no longer changes from the previous one.
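The original listing does not survive in this copy; the following sketch matches the description above (the name `nthRoot` and the relative stopping tolerance, used in place of an exact equality test to avoid floating-point oscillation, are choices of this sketch):

```haskell
-- Babylonian (Newton) iteration for the n-th root of a > 0:
--   x_{k+1} = ((n-1)/n) * x_k + (1/n) * a / x_k^(n-1)
nthRoot :: Int -> Double -> Double
nthRoot n a = go (a / n')                      -- initial guess x0 = a/n
  where
    n'     = fromIntegral n
    step x = ((n' - 1) / n') * x + (1 / n') * (a / x ^ (n - 1))
    go x
      | abs (x' - x) <= 1e-12 * x = x'         -- value no longer changes
      | otherwise                 = go x'
      where x' = step x

-- The square root is the special case n = 2:
babylonianSqrt :: Double -> Double
babylonianSqrt = nthRoot 2
```

For example, `nthRoot 3 27` converges to 3 in a handful of iterations, reflecting the quadratic convergence of the weighted recurrence.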


Copyright 2012 Jacq Krol. All rights reserved. Created July 2012; last updated July 2012.