
Jack Dunn

Newton and Limits

Isaac Newton was born on Christmas Day in 1642, the year of Galileo's death. When
he entered Cambridge University in 1661, Newton didn't know much mathematics,
but he learned quickly by reading Euclid and Descartes and by attending the
lectures of Isaac Barrow. Cambridge was closed because of the plague in 1665 and
1666, and Newton returned home to reflect on what he had learned. Those two
years were amazingly productive, for at that time he made four of his major
discoveries: (1) his representation of functions as sums of infinite series, including
the binomial theorem; (2) his work on differential and integral calculus; (3) his laws
of motion and law of universal gravitation; and (4) his prism experiments on the
nature of light and color. Because of a fear of controversy and criticism, he was
reluctant to publish his discoveries, and it wasn't until 1687, at the urging of the
astronomer Halley, that Newton published Principia Mathematica. In this work, the
greatest scientific treatise ever written, Newton set forth his version of calculus and
used it to investigate mechanics, fluid dynamics, and wave motion, and to explain
the motion of planets and comets.
The beginnings of calculus are found in the calculations of areas and volumes
by ancient Greek scholars such as Eudoxus and Archimedes. Although aspects of
the idea of a limit are implicit in their method of exhaustion, Eudoxus and
Archimedes never explicitly formulated the concept of a limit. Likewise,
mathematicians such as Cavalieri, Fermat, and Barrow, the immediate precursors
of Newton in the development of calculus, did not actually use limits. It was Isaac
Newton who was the first to talk explicitly about limits. But Newton acknowledged
that "If I have seen further than other men, it is because I have stood on the
shoulders of giants." Two of these giants were Pierre Fermat (1601–1665) and
Newton's mentor at Cambridge, Isaac Barrow (1630–1677). Newton was familiar
with the methods that these men used to find tangent lines, and their methods
played a role in Newtons eventual formulation of differential calculus. He
explained that the main idea behind limits is that quantities "approach nearer than
by any given difference." Newton stated that the limit was the basic concept in
calculus, but it was left to later mathematicians like Cauchy to clarify his ideas
about limits.
Jack Dunn

Cauchy and Limits

After the invention of calculus in the 17th century, there followed a period of free
development of the subject in the 18th century. Mathematicians like the Bernoulli
brothers and Euler were eager to exploit the power of calculus and boldly explore
the consequences of this new and wonderful mathematical theory without
worrying too much about whether their proofs were completely correct.
The 19th century, by contrast, was the Age of Rigor in mathematics. There
was a movement to go back to the foundations of the subject to provide careful
definitions and rigorous proofs. At the forefront of this movement was the French
mathematician Augustin-Louis Cauchy (1789–1857), who started out as a military
engineer before becoming a mathematics professor in Paris. Cauchy took Newton's
idea of a limit, which was kept alive in the 18th century by the French mathematician
Jean d'Alembert, and made it more precise. His definition of a limit reads as follows:
"When the successive values attributed to a variable approach indefinitely a fixed
value so as to end by differing from it by as little as one wishes, this last is called
the limit of all the others." But when Cauchy used this definition in examples and
proofs, he often employed delta-epsilon inequalities similar to the ones in chapter
6. A typical Cauchy proof starts with: "Designate by δ and ε two very small numbers;"
He used ε because of the correspondence between epsilon and the French word
erreur, and δ because delta corresponds to différence. Later, the German
mathematician Karl Weierstrass (1815–1897) stated the definition of a limit
exactly as in our definition.
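In modern notation, the ε–δ definition that Weierstrass made precise can be written as follows (this is the standard statement, not a quotation from the essay's sources):

```latex
\lim_{x \to a} f(x) = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists \delta > 0 :
\; 0 < |x - a| < \delta \;\Longrightarrow\; |f(x) - L| < \varepsilon
```

Here ε bounds the "error" |f(x) − L| and δ bounds the "difference" |x − a|, echoing Cauchy's choice of letters from erreur and différence.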