April 10, 2023

The History of Zero

By Dr. Kim Johnson, The Lukeion Project (Check out Dr. Johnson's class Counting to Computers: Math from the Dawn of Time to the Digital Age)

It turns out that one of the greatest accomplishments in the history of mathematics is---nothing.

From where we sit today, the tools of mathematics seem so obvious that they hardly bear thinking about. It might surprise you, then, to learn that one of these everyday tools took thousands of years, deep philosophy, and even controversy to earn its place. What is this development that we use every day without a second thought, yet was so difficult for the most brilliant and advanced mathematicians of the ancient world to grasp? The number zero.

Before Zero

In the mists of time, mathematics seems to have been used primarily for counting or measuring things. Whether you are counting the number of animals in your flock or the distance to the next town, the concept of assigning a number to “no animals” or “no distance” seemed unnecessary and overly complicated. After all, “zero cows” is the same thing as “zero apples,” while “one cow” and “one apple” are decidedly different from each other.

This is not to say that ancient civilizations never used any concept related to zero. The Egyptians had massive building projects which required sophisticated engineering: they used a number to refer to the ground level. They also had a concept of a “zero balance” when no money was owed. The symbol for both was nefer (nfr), which also meant “perfect” or “complete.”

The Greeks had an even harder time with the concept of zero than the Egyptians. Pythagoras declared, “All is number,” but what he really meant was “All is ratio.” Numbers to him represented the ratio between two quantities or lengths. This is useful in geometry (recall the concept of similar triangles) and in music (the ratio between lengths of string that produce an octave is 1:2) but terribly difficult for the concept of zero---a ratio of zero is almost nonsensical. Aristotle, whose philosophy reigned over the western world through the Middle Ages, stated that “Nature abhors a vacuum.” Not only was zero not a number; philosophically speaking, it was impossible. The paradoxes of Zeno of Elea (490-430 BC) point out the difficulties that the existence of zero would imply.

The absence of the number zero became a problem for Babylonian computation. Unlike the Egyptians, Romans, and Greeks, the Babylonians had a positional place-value system in which each digit meant something different depending on where it occurred. In our system, where the positions represent powers of 10, the digit 2 in 2,528 means either 2,000 (the first 2) or 20 (the second 2). In the Babylonian system the positions represent powers of 60: in a four-place numeral, the leftmost 2 would mean 2x(60x60x60) while the second 2 would mean 2x60. This is fine unless you need a number which contains a zero. For the Babylonians, the equivalent of 11 could mean 101, 1001, 110, and so on, depending on context. They eventually developed a placeholder to mark an empty position, but it only worked in the middle of a number, not at the end. They could distinguish 202 from 22, but not 22 from 220 or 2,200.
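The place-value reading described above is easy to try out for yourself. Here is a minimal Python sketch (the function name `from_base60` is my own label, not historical notation) showing why a zero placeholder matters in a base-60 system:

```python
# Interpret a list of base-60 "digits" as a number, the way a
# Babylonian scribe would read a positional numeral.
def from_base60(digits):
    value = 0
    for d in digits:
        value = value * 60 + d
    return value

# With a zero placeholder in the middle, [2, 0, 2] and [2, 2]
# are clearly different numbers:
print(from_base60([2, 0, 2]))  # 2*(60*60) + 0*60 + 2 = 7202
print(from_base60([2, 2]))     # 2*60 + 2 = 122

# But without a way to write a trailing zero, a scribe's "2 2"
# could just as well mean [2, 2, 0], a number 60 times larger:
print(from_base60([2, 2, 0]))  # 7320
```

Only context told a Babylonian reader which of those values was intended.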


The Discovery of Zero

Fortunately for mathematics, something changed in about 300 AD. Indian mathematicians began using the word sunya to refer to the number zero. Rather than just a placeholder, sunya (written first as a dot, then as a ring) was treated as a number in its own right. It makes sense that zero would be developed in the context of eastern rather than western philosophy: the concept of nothingness or emptiness, anathema to the Greeks, plays a much larger part in Indian religious philosophy.


In the 7th century, the great mathematician Brahmagupta (598-668 AD) created a whole system explaining how zero worked with other numbers. Some of his rules included the fact that a fortune, take away zero, is a fortune; or a fortune multiplied by zero is zero. He did commit the mathematical sin of dividing by zero, but that concept remained difficult for mathematicians well into the modern era. Brahmagupta also talked about negative numbers with more comfort than western mathematicians would into the 18th century but that’s a blog post for another day. 
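Brahmagupta's rules translate directly into the arithmetic we use today. Here is a small illustrative Python sketch (the variable name `fortune` is my own label for his term for a positive quantity):

```python
fortune = 100  # Brahmagupta's "fortune": any positive number

# "A fortune, take away zero, is a fortune."
assert fortune - 0 == fortune

# "A fortune multiplied by zero is zero."
assert fortune * 0 == 0

# Division by zero, the one rule he got wrong, remains
# undefined: Python raises an error rather than give a value.
try:
    fortune / 0
except ZeroDivisionError:
    print("division by zero is undefined")
```

Notice that the last case still trips up modern machines, some thirteen centuries later.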


After Zero

It seems obvious to us now, but zero took a while to catch on in the rest of the world. The first people to adopt zero—and see how useful it was—were the Arabs. Most notably, al-Khwarizmi (c. 780-850 AD) used Hindu-Arabic numerals as he developed algebra at the House of Wisdom in Baghdad. He described zero, which he called sifr, in his book Algoritmi de Numero Indorum as “the tenth figure in the shape of a circle.”

Several individuals were instrumental in introducing zero to Europeans. One of the earliest, Pope Sylvester II (c. 946-1003), had studied the Hindu-Arabic numerals when he went to Muslim-controlled Spain to study mathematics and science. Because of his facility with these numbers (and probably for other reasons) he was accused of consorting with the devil.

The next great advance in the use of the Hindu-Arabic numerals came when Leonardo di Pisa—better known as Fibonacci (c. 1170-1240)—wrote his great work Liber Abaci. In addition to problems about rabbits, he introduced the numerals he had learned about when he studied in North Africa among Muslim mathematicians. He also introduced algorithms for doing basic arithmetic, including 28 pages on how to do long division. We still use his method today. Liber Abaci opens with the words:

“These are the nine figures of the Indians 9, 8, 7, 6, 5, 4, 3, 2, 1. And so, with these nine figures, and with the symbol 0, which is called zephyr in Arabic, whatever number you please can be written, as is demonstrated below.”

Even after zero came to be used in the west, it wasn’t immediately accepted by the public. Financial institutions in Florence banned the use of zero in 1299, feeling that it created too many opportunities for fraud: someone could exploit place value to alter the amounts written on checks. Even today we write “seven hundred fifty and no/100 dollars” on our checks---perhaps as a deterrent to someone adding zeros and increasing the check’s value.

The medieval philosophy textbook Margarita Philosophica (Gregor Reisch, 1503) shows a picture of a contest between someone computing with the old counting board on the right, and the new-fangled Hindu-Arabic numerals on the left. You can see who is happier. The Hindu-Arabic number system eventually won out, including its use of zero.

Zero has proven useful beyond allowing a more efficient number system. It eventually became the star of the number line and of the Cartesian plane, developed independently by Pierre de Fermat and Rene Descartes and used to visualize algebraic functions and equations. Zero is essential to the invention of calculus: Newton and Leibniz used infinitesimals, quantities that are almost zero but not quite, to find the tangent line to a curve. And don’t forget that zero is one of only two digits in the binary system at the heart of every modern computer.
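You can see zero's starring role in binary for yourself. A quick Python sketch using the built-in `bin()` function shows how heavily base 2 leans on the digit 0:

```python
# Print a few familiar numbers in binary: every one of them
# needs zeros to mark the "empty" powers of two.
for n in [0, 2, 5, 10, 100]:
    print(n, "->", bin(n))
# 0 -> 0b0, 2 -> 0b10, 5 -> 0b101, 10 -> 0b1010, 100 -> 0b1100100
```

Without a symbol for "nothing in this position," positional notation in any base, 60, 10, or 2, falls apart.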

The history of zero spans thousands of years, multiple civilizations and continents---and yet it is something so simple that we use it every day.

If this story of the connection between math and culture piques your interest, you might want to sign up for the course Counting to Computers: Math from the Dawn of Time to the Digital Age. This course is appropriate for strong pre-algebra students through high school students taking higher math.
