INTRODUCTION

The creation of Analytic Geometry by Descartes in the early part of the seventeenth century was a milestone of tremendous significance. Indeed, it was the beginning of modern mathematics in the broad perspective of history. (“Modern” mathematics in the sense of contemporary mathematics did not commence until about 1900, with the advent of functional analysis, abstract spaces, set theory, and symbolic logic.) Once this great step of recognizing the relation between the numbers of algebra and the entities of geometry had been taken, it is perhaps not surprising that further advances were soon to follow, culminating in the invention of the Calculus by ISAAC NEWTON and by GOTTFRIED LEIBNIZ. This was a classic illustration of nearly simultaneous, but presumably independent, creation. In this instance the invention was attended with unfortunate and, at times, bitter dispute. Newton had been using his new method of fluxions since 1674 or even earlier, but he did not publish a direct statement of his method until 1687. Leibniz, using a different approach and a different symbolism, published his first statement in 1684. Each accused the other of plagiarism, yet from the vantage point of historical perspective there can be little doubt that each must be credited with independent creation.

What is perhaps more important is the fact that the ground had been prepared for this new advance by many predecessors over a long period of time. There are traces of the methods of the Calculus discernible even among the Greeks. Thus DEMOCRITUS (c.400 B.C.) suspected the relation that exists between the volumes of cones and cylinders, although he was unable to prove it. But the influence of ZENO’S famous paradoxes about infinitely small quantities discouraged the further use of infinitesimals in the study of geometry. Shortly thereafter another forerunner of the Calculus came into prominence, the method of exhaustions. This was first used in connection with the circle, and consisted essentially of doubling and redoubling the number of sides of a regular inscribed polygon on the assumption that as the process of doubling was continued indefinitely, the difference in area between the circle and the polygon would finally be “exhausted.” The method was later extended and refined by EUDOXUS (c.360 B.C.). The nearest approach to the process of summation, however, was made by ARCHIMEDES (c.225 B.C.) in connection with his study of the area under a parabola and certain other curves, as well as the volumes of certain well-known solids.

No further step toward the Calculus was taken until the Middle Ages and after. The story is a long and interesting one, and only a few of the high spots can be touched upon here. The astronomer KEPLER (c.1610) developed certain crude methods of integration, in connection with problems of gauging, by which the volume of a solid was regarded as composed of many infinitely small cones or infinitely thin disks. His contemporary, the Jesuit priest BONAVENTURA CAVALIERI, made considerable use of the so-called method of indivisibles, in accordance with which he regarded a surface as the smallest element of a solid, a line as the smallest element of a surface, and a point as the smallest element of a line. Although the method lacked logical rigor and was essentially intuitive in nature, the results obtained by its application were valid. Another contemporary, ROBERVAL, using a similar method, but employing the device of an infinite number of infinitely narrow rectangular strips, succeeded in solving many problems of finding lengths of curves as well as areas and volumes.

About 1636 the French mathematician FERMAT extended the methods of his predecessors, attacking also the problem of finding maximum and minimum values of curves and focusing attention on the determination of tangents to a curve. In the course of his work, Fermat enunciated a general principle for determining maxima and minima, substantially equivalent to the modern method of setting the derivative of a function equal to zero. Thus Fermat may also, in a very real sense, be regarded as a co-inventor of the Calculus.
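Stated in modern notation, which of course Fermat himself did not possess, the principle may be illustrated by a problem of the very kind he treated: to divide a quantity a into two parts whose product is as large as possible, write

\[
f(x) = x(a - x), \qquad f'(x) = a - 2x = 0, \qquad x = \frac{a}{2},
\]

so that the product is greatest when the two parts are equal. This is offered only as a modern restatement; Fermat’s own procedure arrived at the same condition without an explicit notion of the derivative.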

Two other fertile minds contributed to the pregnant atmosphere of this eventful period—JOHN WALLIS and ISAAC BARROW, both British mathematicians. Wallis’ Arithmetica Infinitorum, which appeared in 1655, showed how to apply methods of summation to the determination of the areas of triangles, the lengths of spirals, and the volumes of paraboloids and other solids. Wallis freely acknowledged his indebtedness to Torricelli, Cavalieri, Christopher Wren, and others. While Wallis was devoting his efforts to problems of integration, Barrow (1663) drew attention to the problem of tangents and methods of differentiation. He recognized the fact that integration is the inverse of differentiation.
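Expressed in modern symbols, again anachronistic for Barrow’s time, this inverse relationship is what we now call the fundamental theorem of the Calculus: for a continuous function f,

\[
\frac{d}{dx}\int_{a}^{x} f(t)\,dt = f(x),
\]

so that differentiating the integral restores the original function.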

Newton at various times employed three different methods of procedure. At first he made some use of infinitely small quantities, but soon recognized that this was not a sound basis on which to operate. His second method, the method of fluxions, was a unique contribution. According to the concept of fluxions, a curve was considered as being generated by a moving point. The change in the position of a point in an “infinitely short” time was called its “momentum”; this momentum divided by the infinitely short time was the “fluxion.” If the “flowing quantity” was x, its fluxion was denoted by $\dot{x}$. In our notation, if x is the function f(t) of the time t, Newton’s $\dot{x}$ is our $dx/dt$; his $\ddot{x}$ is our $d^{2}x/dt^{2}$; etc. Nevertheless, Newton eventually was dissatisfied with his own device of fluxions, and toward the latter part of his career he attempted to refine the method by a theory of limits and the notion of continued motion, or continuity.
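A small example, stated in modern terms rather than Newton’s own, may make the idea concrete: if the flowing quantity is $x = t^{2}$, then its fluxion is

\[
\dot{x} = \frac{dx}{dt} = 2t,
\]

that is, what we now call the rate of change of x with respect to the time.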

Whereas Newton was impelled toward the Calculus by his burning interest in experimental and applied science, Leibniz was drawn to the Calculus via a purely intellectual and philosophical route. In brief, Leibniz was seeking a universal system of symbolic reasoning. Just as the philosopher-mathematician Descartes had reduced all geometry to a universal method or system, so the philosopher-mathematician Leibniz hoped to reduce all reasoning of whatever sort to a “universal characteristic,” or as we might say today, to a symbolic logic.

Unfortunately for the development of mathematics in England for the next hundred years, the notation introduced by Newton was not as convenient or as significant as the symbolism used by Leibniz; stubborn adherence to the geometric methods and fluxional notation of Newton retarded further advances by British mathematicians for nearly two or three generations. Eventually, however, the method and notation of Leibniz became universal, but not until they had been popularized on the continent by such writers as L’HOSPITAL, D’ALEMBERT, and the BERNOULLIS. Thereafter continued advances were made with breath-taking strides, heralded by the extension and ever-wider applications made by such brilliant minds as those of EULER, LAMBERT, LAGRANGE, LAPLACE, and LEGENDRE.

In retrospect, then, modern mathematics may be said to have begun at the start of the seventeenth century and advanced in five major directions: (1) the Analytic Geometry of Descartes and Fermat; (2) the Calculus of Newton and Leibniz; (3) the probability theory of Fermat and Pascal; (4) the higher arithmetic of Fermat; and (5) the mechanics of Galileo and Newton. In a very real sense we may say that modern physical science and technology are the direct product of the experimental method espoused by Galileo combined with the Calculus created by Newton and Leibniz.

The beginning of the nineteenth century saw more refinement and elaboration of the methods of the Calculus, as well as the infusion of greater rigor into the basis of analysis. Outstanding contributions in this connection include the work of Cauchy, Gauss, Jacobi, Abel, and Dirichlet. By the end of the century, modern analysis, a mighty superstructure resting securely upon solid foundations, had been essentially completed—a triumph of modern thought. However, the triumph was short-lived. The dawn of the twentieth century witnessed the recurring struggle to understand the infinite, together with a renewed desire to create an even more rigorous and unassailable foundation for all mathematics, and for analysis in particular. The magnificent efforts of Cantor, Kronecker, Dedekind, Weierstrass, Brouwer, Hilbert, Frege, Bertrand Russell, A. N. Whitehead, and others constitute an epic saga in the history of mathematics. The final chapter has not yet been written. In the deceptive nearness of history in the making, Gödel’s theorem of 1931 is disconcerting, to say the least. The theorem shows that the consistency of a system S embracing what is essentially modern mathematics cannot be proved by the methods of proof used within the system S itself.

The Calculus is a powerful but subtle mathematical tool. It is not child’s play. There is an honest difference of opinion as to whether the Calculus should be preceded by a more or less thorough course of study of analytic geometry. It is generally conceded that anything like a complete systematic course in analytics is not a necessary prerequisite. Yet a certain knowledge of fundamentals is indispensable.

In exploring the Calculus, the beginner will face a rather novel experience—that of dealing with limiting processes. He should not be discouraged if at first (or even for some time thereafter) the processes concerning limits should seem somewhat vague and elusive. To be sure, the terms “infinity” and “limit” may be troublesome for a while, and can easily be misunderstood. These two concepts—that of the infinite and that of a limit—are subtle ideas which may well be consistently and meaningfully used before (or instead of) being formally defined. He should not allow himself to think about “infinitely small quantities,” but should think rather of definite intervals, however large or small, and should regard a limit as the value approached by successive approximations, a property of the neighborhood in which the limit lies. These ideas cannot be grasped quickly; one must work with them for some time. The reward of such patience will be most gratifying.
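A simple illustration of such a limiting process, stated here in the usual modern notation, is the slope of the curve $y = x^{2}$ at a point x:

\[
\lim_{h \to 0} \frac{(x+h)^{2} - x^{2}}{h} = \lim_{h \to 0}\,(2x + h) = 2x .
\]

For no fixed interval h, however small, is the quotient exactly 2x; yet by taking h small enough the quotient can be made to differ from 2x by as little as we please. That, and nothing more mysterious, is what is meant by the limit.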