Chapter 3, Part 2
Saint-Vincent was born at Bruges in 1584 and became a Jesuit teacher, working in Rome, Prague, and later in Spain. Europe was in turmoil at this time, and in one of his uprootings Saint-Vincent became separated from his papers. In them he had the keys to solving the quadrature of the hyperbola and, he believed, the squaring of the circle as well. His method was correct in the former case, not so in the latter.
Since the hyperbola is asymptotic and thus open-ended, we need to define other boundaries in order to have a finite area to measure. In addition to y = 1/x, we arbitrarily make those boundaries the x-axis and the lines x = 1 and x = b, where b, now known as the upper limit of integration, can be any positive number. In his Geometrical Work on the Squaring of the Circle and of Conic Sections, published in 1647, some 25 years after his discovery, Saint-Vincent advanced the notion that if the upper limits were increased repeatedly by a constant factor, that is, if they grew geometrically, the associated areas would grow arithmetically. See Figure 1.
One of Saint-Vincent's students, Alphonse Antonio de Sarasa, wrote that this made the relationship between the upper bound and the area logarithmic, and thus Mr. Napier's numbers became a function with an application, and a useful one at that. In the figure, if r were 10, for example, the sequence of upper limits would be 10, 100, 1000, ..., and the corresponding areas would be proportional to the base-ten logarithms 1, 2, 3, .... This "integration" problem was ripe for solution, and hindsight tells us that in another half-century the calculus would have led to the same conclusion. Shenk has an elegant development in his text.
Specifically, the area under 1/x from x = 1 to x = b is the natural logarithm (in base e) of the number b. The appendix also shows how this base number can be established.
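Saint-Vincent's relation is easy to check numerically. The sketch below (a modern illustration, not anything from his Opus geometricum) approximates the area under 1/x with midpoint rectangles and shows that geometrically growing upper limits produce arithmetically growing areas, each matching the natural logarithm:

```python
import math

def area_under_reciprocal(b, n=100_000):
    """Approximate the area under y = 1/x from x = 1 to x = b
    using n midpoint rectangles (a simple Riemann sum)."""
    dx = (b - 1) / n
    return sum(dx / (1 + (i + 0.5) * dx) for i in range(n))

# Upper limits 10, 100, 1000 grow geometrically; the areas grow
# arithmetically (roughly 2.30, 4.61, 6.91) and each matches ln(b).
for b in (10, 100, 1000):
    print(b, area_under_reciprocal(b), math.log(b))
```

The choice of 100,000 rectangles is arbitrary; any sufficiently fine subdivision exhibits the same pattern.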
A great hole in early integration theory was patched, as x^n became integrable for all integers n, including the troublesome case n = -1. Saint-Vincent's reputation, however, was muddied by his errors in circle-squaring and because others had made the hyperbolic discovery in the years he delayed before publishing.
The question of the area of a hyperbolic segment is rich in mathematical connections. For this author, it is a confluence of mathematical insights from antiquity, from Newtonian analysis, and from the statistical approximation techniques developed since. We will briefly describe three other approaches to this problem.
Method 1. Recall that "circle-squaring" was, until 1882, an open question and a lively, if futile, pursuit. In giving Archimedes credit for the parabolic quadrature, we have bent the original rules somewhat: each stage of the construction is possible, but no one could execute the infinite number of stages required for the limiting area. By the time of Viscount William Brouncker, co-founder of the Royal Society in England and its first president in 1662, a quadrature was permitted to use the method of exhaustion.
Brouncker, using the ideas of Saint-Vincent, Fermat, and Gregory, devised a completely geometric scheme for a progression of areas converging on the hyperbolic segment under y = 1/(1 + x) in the interval from x = 0 to x = 1. Readers should recognize this graph as a translation of y = 1/x, with the interval [0, 1] corresponding to [1, 2]. Brouncker "boxed in" the area by using a series of smaller and smaller rectangles which gave upper and lower limits to the measure sought. See Figure 2.
His method foreshadowed the Riemann integral, but it was more cumbersome. It gave rise to an infinite series of fractions whose partial sums gradually grew closer to the area of the segment. He later improved on the rate of convergence by using chords instead of rectangles; using just five terms of the improved series, we get the natural logarithm of two, accurate to three places. See Figure 3.
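Brouncker's boxing-in idea can be sketched in modern terms: because 1/(1 + x) is decreasing, rectangles built on left endpoints overshoot the area and rectangles on right endpoints undershoot it, squeezing ln 2 between two sums. The sketch below is a simplified illustration of that squeeze, not Brouncker's exact construction, together with the slowly converging series of fractions commonly attributed to him:

```python
import math

def squeeze_ln2(n):
    """Box in the area under y = 1/(1+x) on [0, 1] with n rectangles.
    The curve is decreasing, so left endpoints give an upper sum and
    right endpoints a lower sum; the true area lies between them."""
    dx = 1.0 / n
    upper = sum(dx / (1 + i * dx) for i in range(n))
    lower = sum(dx / (1 + (i + 1) * dx) for i in range(n))
    return lower, upper

def brouncker_series(terms):
    """Partial sum of ln 2 = 1/(1*2) + 1/(3*4) + 1/(5*6) + ..."""
    return sum(1.0 / ((2 * k - 1) * (2 * k)) for k in range(1, terms + 1))
```

With n = 1000 rectangles the two sums already pin down ln 2 to three decimal places, while the series needs thousands of terms for the same accuracy, which is why Brouncker's chord refinement mattered.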
Perhaps the reader would agree that it is time to lay the issue of quadrature to rest. Given that the Riemann integral embodies a method of exhaustion using rectangles, the quadrature of any region bounded by known functions can be supplanted by the Calculus; moreover, after the Renaissance, the goals of science became more practical and less idealistic. Mathematicians such as Newton maintained high standards of rigor, but they also had mechanical applications to solve. In manufacturing, for example, the perfect measurement is unattainable, and a good approximation must suffice. Newton contributed much to series approximations by broadening the familiar Binomial Theorem to include expansions involving fractional and negative exponents.
Method 2. To achieve a series approximation of 1/(1 + x) for example, Newton "blew the lid" off the Pascal triangle [see Figure 4]. Note that it is logical to end each row of the customary triangle with an infinite sequence of zeros. It is akin to saying that there are no possible committees of nine people which can be formed from a pool of six. In poker, it is not possible to get a royal flush if you are dealt only four cards. We make use of these zeros in our next step.
Each successive row of the triangle is made from sums of preceding rows, so we employ an inverse process, subtraction, to create predecessors to Row Zero, the usual starting point of the triangle. For example, to create Row 3 from Row 4, we would do subtractions such as (4 - 1) = 3, the second entry. After the initial one (1) in Row -1, we take that one away from 0 to get negative one in the second position. Shifting to the right, we subtract the negative one from zero to get positive one in the third position, and so on. Amazingly, these extended rows give the correct coefficients for the infinite series expansions for negative powers of binomials. The reader, using long division, may wish to verify some of them. Our expansion of interest is:

1/(1 + x) = 1 - x + x^2 - x^3 + x^4 - ...

This series converges for -1 < x < 1. So the area under the curve in the interval from, say, x = 0 to x = 1/2 can be approximated as nearly as we wish by term-by-term integration. The areas in other intervals can then be found using Saint-Vincent's relation.
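The subtraction rule for climbing above Row Zero, and the term-by-term integration that follows, can be sketched as below (the helper name row_above is mine, not the text's):

```python
import math

def row_above(row, length):
    """Build the predecessor of a Pascal-triangle row by the inverse
    rule: each new entry is the entry above minus the entry just
    placed to its left (absent entries count as zero)."""
    prev, left = [], 0
    for i in range(length):
        above = row[i] if i < len(row) else 0
        left = above - left
        prev.append(left)
    return prev

# Sanity check from the text: Row 4 (1 4 6 4 1) recreates Row 3 (1 3 3 1).
# Row 0 is 1, 0, 0, ...; one step up gives Row -1, whose entries
# 1, -1, 1, -1, ... are the series coefficients for 1/(1 + x).
coeffs = row_above([1], 8)

# Integrating term by term approximates the area under 1/(1 + x)
# from x = 0 to x = 1/2, which equals ln(3/2).
x = 0.5
area = sum(c * x ** (k + 1) / (k + 1) for k, c in enumerate(coeffs))
```

Because the integrated series alternates, the error after eight terms is smaller than the ninth term, (1/2)^9 / 9, so even this short sum lands within a few ten-thousandths of ln(3/2).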
Method 3. If the reader is willing to take a risk, to roll the dice so to speak, we can use a more modern approach to approximating this area. Enclose the area in a rectangle as in Figure 5. Now employ a random number generator to produce a random pair of coordinates for a point in the box. Record whether the point fell under the curve. To decide, use the random x-coordinate and calculate a y-value from the formula. Compare it with the random y-coordinate. Repeat. The more random points, the better the approximation will be.
Better still, have a computer simulate the entire process and do the comparing and record-keeping to boot. Use the percentage of points under the curve as the percentage estimate of area under the curve as compared to the area of the rectangle. This automated, probabilistic method is brought to you by the modern discipline of Geometric Probability, and is sometimes referred to as a Monte Carlo method. Can you guess why? We will revisit this method in our final chapter entitled "Area in the Age of Uncertainty."
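A minimal simulation of this dart-throwing scheme, aimed at the curve y = 1/(1 + x) over [0, 1] so that the estimate should approach ln 2, might look like this (the fixed seed is only an assumption to make the run repeatable):

```python
import math
import random

def monte_carlo_area(trials, seed=1):
    """Scatter random points in the unit square enclosing the region
    under y = 1/(1+x) on [0, 1]. The fraction of points that land
    under the curve estimates the area (the box itself has area 1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x, y = rng.random(), rng.random()
        if y <= 1.0 / (1.0 + x):
            hits += 1
    return hits / trials

print(monte_carlo_area(200_000), math.log(2))
```

As the text says, more points give a better estimate: the typical error shrinks only like one over the square root of the number of trials, which is the price of the method's simplicity.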
The Greeks discovered conic sections and studied them. Archimedes discovered the relation between the circumference and area of a circle, and he made a good approximation of pi. He also achieved a quadrature, using the method of exhaustion, of the parabolic segment. Although the conics figured in the advancement of the theory of equations over the next fifteen hundred years, the areas associated with them were largely ignored. At the beginning of the seventeenth century, Kepler made a breakthrough discovery about the solar system, and the areas within ellipses became suddenly significant.
The area contained by a hyperbola and its asymptotes proved most difficult to find. Its solution made functional use of Napier's new logarithms and filled a gap in the new theory of integration. Logarithms proved beneficial to astronomers and other scientists. Newton would show that conic curves were the only possible orbits.
Mathematics underwent drastic changes in appearance. Scientists began looking for facts in their data, rather than in logic alone. The probabilities, rather than certainties, of occurrences were being calculated. Newton set graphs in motion on Descartes' grid, and he studied their rates of change. The calculus quickly gave rise to the method of series approximation of areas and other measures associated with irrational functions.
There are stormy times ahead for mathematics. Nineteenth-century mathematicians will put calculus through severe testing in order to gauge its applicability and logical rigor. Then, in the twentieth century, all of mathematics will be called into question, as Newton is proved fallible, and Einstein, Heisenberg, and Gödel show us that the Queen of Sciences can never be as certain, nor as complete, as we thought.
1. Dijksterhuis, E.J., Archimedes. Munksgaard, Copenhagen, 1956. pp. 336-345.
2. Chandrasekhar, S., Newton's Principia for the Common Reader. Clarendon Press, Oxford, 1995. p. 57.
3. Maor, Eli, e: The Story of a Number. Princeton University Press, Princeton, NJ, 1994. pp. 5-13.
4. Dunham, William, The Mathematical Universe. Wiley, New York, 1994. pp. 274-275.
5. Maor, pp. 61-66.
6. Boyer, Carl B. and Merzbach, Uta, A History of Mathematics, 2nd ed. Wiley, New York, 1989. pp. 292-293.
7. Maor, pp. 66-68.
8. Coolidge, Julian Lowell, The Mathematics of Great Amateurs. Dover, New York, 1963. pp. 141-146.
9. Maor, pp. 71-74.
10. Shenk, Al, Calculus and Analytic Geometry. Scott, Foresman, Glenview, IL, 1984. pp. 416-429.