
Zero


At my Sixth Form, there was a discussion group called the Stride–Darnley Society, named after a former English and Drama teacher who was a major advocate of public speaking and debating. The group was set up in her name for Sixth Formers to present and discuss essays on topics that they enjoyed. This is an essay I wrote for the society, on the number zero.

I presented two essays: one on the number zero, and one on electoral politics (specifically, the mathematics of voting systems). Hardly surprising, since I’d eventually go on to study maths at university, and I try to attend regular History of Maths lectures, which touch on similar topics to my essay on zero. I enjoyed writing both of the essays, and since I now have a writing outlet on the web, I thought I’d post one of them. It’s mostly unedited, except to check that it’s made the conversion to HTML okay.

I last opened this in March 2010, so I hope it’s still interesting, several years later.

The importance of measuring nothing

What is 7 minus 7? The answer seems obvious to us, living in the twenty-first century. But odd as it sounds, ‘zero’ had to be invented, and it took millennia to catch on. What I want to explain today is why zero took so long to be accepted, and why it’s so fundamental to modern mathematics.

Historians believe the idea of a number to represent nothing arose with the Indian mathematician Brahmagupta in 620 AD, who also developed the Hindu-Arabic numerals 0 to 9 we use today. The system reached Europe in the eleventh century, carried by Muslims travelling from the Iberian peninsula. The Italian mathematician Leonardo of Pisa, better known as Fibonacci, saw the Arabic numerals while working with his father in the customs house of the North African town of Bugia. He introduced them to European mathematicians, and they were quickly adopted as the standard for mathematics, including the digit 0. Before then, Europe had used an amalgamation of number systems – including Greek, Egyptian, Babylonian and Roman.

I want to highlight here how important 0 is to a number system, using the example of the Romans. Their system was in use for centuries, but it had no proper concept of nothing as a number; the closest they came was a name for it, nulla, meaning nothing. And while other cultures at the same time used various methods to indicate the absence of a value, not one treated it as a bona fide number in itself.

Now let’s take a relatively simple sum, say 43 + 24. With our decimal system, we can easily line the numbers up, do the addition, and get the answer 67. But the Romans had to carry out the sum XLIII + XXIV. No amount of lining up will ever give the answer LXVII. Even representing large numbers is daunting. Take a year like 1999: in Roman numerals it becomes MCMXCIX.

Working with numbers like these is hard enough; imagine trying to subtract, multiply or divide with this counting system. Abaci and tally systems helped alleviate the problem, but otherwise Roman mathematicians faced some pretty demanding sums.
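As a modern aside, here is a rough Python sketch of the usual greedy rule for writing a number in Roman numerals – the numeral values are the standard ones, and the code is purely my own illustration. It makes the point nicely: the only practical way to add XLIII and XXIV is to fall back on the underlying values and convert the answer back, because the numerals themselves carry no place value.

```python
# A rough sketch (an illustration, not a historical method): write a number
# in Roman numerals by greedily spending the largest numeral that still fits.
PAIRS = [
    (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
    (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
    (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I"),
]

def to_roman(n):
    out = []
    for value, numeral in PAIRS:
        while n >= value:
            out.append(numeral)
            n -= value
    return "".join(out)

# Adding Roman numerals in practice means going via the values and back:
print(to_roman(43), "+", to_roman(24), "=", to_roman(43 + 24))  # XLIII + XXIV = LXVII
print(to_roman(1999))                                           # MCMXCIX
```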

And to see how much this held the Romans back, let’s count the number of influential Roman mathematicians. There were none. Despite close contact with the Greeks, a highly mathematical society, the Romans made very little contribution to mathematics. Perhaps the mathematical historian Morris Kline put it best when he said:

The first disaster in mathematical history was the advent of the Romans, whose entire role in the history of mathematics was that of an agent of destruction.

The Romans borrowed a great deal of mathematics from ancient Greek society – this is how they built the roads, bridges, temples and houses for which they are famed – all based upon the foundations of engineering laid down by the Greeks. But in terms of innovation, they contributed nothing. The use of such cumbersome numerals stifled the dawn of new mathematics in what was otherwise a very technically advanced culture.

So clearly a decimal system like that of the Arabs is more practical than that of the Romans. But Arabic numerals still took many years to catch on. While mathematicians had used them predominantly since Fibonacci’s time, merchants still preferred to use Roman numerals. It wasn’t until the sixteenth century that Arabic numbers, and hence the number zero, became commonly used in Europe.

But perhaps it’s understandable that the Romans and merchants weren’t keen to embrace the idea of a number for nothing. For them, maths was a very practical subject – architecture, engineering and accounting. In these areas, actual, physical things are being measured and counted, so they had no need to ever quantify nothingness. After all, if an engineer is designing a building and ends up with a length of nothing, chances are that somewhere he’s made a dodgy calculation.

To these earlier people, zero didn’t have any practical use. In fact, the idea of zero being considered a number sent the ancient Greeks into a fit of philosophical, and even religious, disputes over what exactly constitutes a number, and whether numerical status can really be applied to a representation of nothingness.

But like it or not, zero eventually made its way into European number systems. I’ve already shown how Roman numerals made heavy calculation cumbersome. But beyond this, is it really necessary to use zero? I mean, can’t we just make do with the digits 1 to 9?

Absolutely not. Zero is fundamental to a number system, because it acts as a placeholder, marking the positions that hold no value. For example, 100 in decimal notation tells us that there is nothing in the ones or tens, but there is a 1 in the hundreds. So we read it as one hundred – no one ever says “one hundred, no tens and no ones”. If we cross over to the other side of the decimal point, .001 becomes one thousandth: there are no tenths and no hundredths. It’s a very useful digit.
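Written out in powers of ten – just the standard place-value expansion – the job those zeros are doing becomes explicit:

\[
100 = 1 \times 10^2 + 0 \times 10^1 + 0 \times 10^0,
\qquad
0.001 = 0 \times 10^{-1} + 0 \times 10^{-2} + 1 \times 10^{-3}.
\]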

But once we put zero in our number system, we have to face how it interacts with other numbers. With addition, subtraction and multiplication, it plays quite nicely. But there was one operation that played havoc with zero: division.

Division by zero is one of the Gordian knots of mathematics. Ancient mathematicians like Brahmagupta grappled with the problem, but got nowhere. Today, if you ask a mathematician “What happens if I divide by zero?” you’ll simply get the answer, “You can’t. It’s against the rules.” Division by zero is simply never allowed.

Some people think dividing by zero gives infinity. But this isn’t true. Division is defined as the inverse of multiplication: if you divide by zero, and then multiply by zero, you should get back to the number you started with. However, multiplying infinity by zero produces only zero, not any other number. There isn’t anything which can be multiplied by zero to give a nonzero result, so the result of division by zero is, literally, “undefined”.
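Spelled out symbolically, with a standing for any nonzero number, the argument is just this:

\[
\frac{a}{0} = x \quad\text{would have to mean}\quad x \times 0 = a,
\qquad\text{but}\qquad x \times 0 = 0 \ \text{for every } x,
\]

so when a is not zero there is simply no candidate value of x at all.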

But it took many years for this result to be accepted. And this isn’t the whole story. What, for example, might you get if you divide zero by zero? Or raise zero to the power zero? There are a lot of problems with this “unnumber” – a number invented to represent something that doesn’t exist – and you can see why it took so long to catch on.

Once accepted, though, from this Pandora’s Box of a number sprang a whole new world of mathematics. Let’s look at a few examples.

Perhaps the simplest benefit can be seen by drawing a number line. Zero as a number has no polarity: that is to say, it is neither positive nor negative. Before zero, we only knew about the numbers on the right-hand side of the number line: the positive numbers. Until mathematicians accepted zero as a valid number, we couldn’t unlock the negative numbers on the left, or the vast quantity of new mathematics that comes from them.

Because it divides the number line in two like this, zero is incredibly important for scales. It’s the pivotal point for thermometers, and the Kelvin temperature scale starts from a point called absolute zero: the coldest temperature an object can reach. And when dealing with limits, velocity and acceleration, the direction of movement in relation to zero dictates whether a quantity is positive or negative.

Another mathematical concept that arises from negative numbers is that of the imaginary number. The square root of a number is the number which, when multiplied by itself, produces a specified quantity: for example, 7 is the square root of 49. If you try to take the square root of a negative number, though, you hit a problem. A positive number squared is positive; a negative number squared is also positive – so is there any number whose square is negative? Well, yes: the imaginary numbers, built around i, the square root of minus one. Imaginary numbers have very important real-world applications, and perhaps one of the most practical lives in our televisions. We’re used to getting dozens of channels from a single input signal. Each channel has its own unique frequency in the signal, and the TV has to filter out all the other channels and display only the frequency you want.

The design of this incredibly complex mechanism is done by computer, but at the heart of these computations are imaginary numbers. If you have anything more than a few channels, the only humanly possible way to manage the output is with imaginary numbers. Otherwise, you’re stuck trying to solve something called a tenth-order differential equation, and just trust me here, it’s as difficult as it sounds. So when you go home this evening, put the telly on, watch Sky or whatever, you can thank imaginary numbers – and so you can thank zero.
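The actual filter design is far beyond this essay, but just to show that these are perfectly concrete objects, here is a tiny Python sketch – the 50 Hz tone is an arbitrary made-up example, nothing to do with real television signals.

```python
# A minimal sketch: Python's built-in complex numbers are the imaginary
# numbers described above. This is not real signal-processing code.
import cmath

i = complex(0, 1)               # the imaginary unit
print(i * i)                    # (-1+0j): i squared really is minus one

# A pure tone of frequency f can be written as a rotating complex number
# (a "phasor"); f = 50 Hz is just an arbitrary example value.
f, t = 50.0, 0.005              # frequency in hertz, time in seconds
phasor = cmath.exp(2j * cmath.pi * f * t)
print(abs(phasor), cmath.phase(phasor))   # amplitude 1, phase in radians
```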

And if we treat 0 as a value, a quantity of nothing, we open up an entirely different world of applications. Have you ever wondered how computer-generated imagery, or CGI, works? A computer has to produce and solve countless mathematical equations. For example, a doughnut shape (in mathematical terms, a torus) is described by something called a quartic equation. These equations rely on an expression being set equal to zero; the computer can then crank out the algebra and draw the shape. But without a zero, solving this equation becomes impossible, and so does drawing these complex shapes.
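For the record, the standard implicit equation of a torus – with R the distance from the centre of the hole to the centre of the tube, and r the radius of the tube itself – is exactly such a quartic set equal to zero:

\[
\left(x^2 + y^2 + z^2 + R^2 - r^2\right)^2 - 4R^2\left(x^2 + y^2\right) = 0,
\]

and a point (x, y, z) lies on the doughnut’s surface precisely when the left-hand side works out to zero.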

So, to wrap up. Zero has caused mathematicians no end of grief and discomfort, but it’s a key part of mathematical history. It took centuries to reach universal acceptance, but today zero would be impossible to live without. Modern computers might still exist without zero, but they would be far less efficient. Large numbers can exist without zero, but they would be more cumbersome to manage. Higher mathematics could still exist without zero, but the proofs would be far less elegant and need much more work.

One of the most important values in number theory, with its myriad uses, zero is – it’s safe to say – worth far more than the sum of its parts.