Sunday, April 05, 2009

The value of zero

Considering the reverence we give to the adoption of 0 as a placeholder in our system of numeric notation, it is intriguing to think about how much effort we put into getting rid of it

Especially in the (early) computer age, with storage space at a premium, we had to get rid of the things, squash them down or squeeze them out

More intriguing, in its way, is how we cope with the trailing zeroes in VERY BIG numbers: we give them a name. Million, googol, gazillion

But milliard or billion? English or American? Instead of resolving this by taking refuge in scientific notation – 10 to the 9 – or a longer phrase – thousand million – or an alpha/numeric abbreviation – ‘000 million – we just slug it out, helped not a little by inflation. When UK GDP (in thousands of £) reached 7 figures, then government expenditure, then departmental budgets – we just gave in & went for the billion. And got cavalier – £1m registered only in the final digit, so who is going to worry about that, between friends

But the problem did not come only with the computer age or 20th century inflation. The ordinary human brain cannot cope with long strings of digits in the way that it can with antidisestablishmentarianism – most of us stop at a 4-digit PIN

I remember reading somewhere (can it really have been in one of the editions of Yule & Kendall?) of long-ago experiments which showed that the average human can count, at a glance, only up to 5 objects, though a few may be able to count up to 8. Above that number the normal person has to (mentally) point & enumerate – unless of course the objects are arranged in a recognisable pattern, such as on playing cards

So it is not just the zero digits we want to compress or suppress – we squash the whole number into a form easier to handle, by giving it a name

But there is another strange thing. Zero itself does not have a single, unique name by which it is invariably called – nobody uses cipher now, except figuratively

Zero
Nought
Nil
Oh

In that, it is like 1, 2, 3 & maybe 4 or 5 (rarely, & only in specialist fields, for digits bigger than that). Two may be referred to as pair, twin or double, for example

It does not even have an unambiguous symbol of its own

Early typewriters did not give it a key – why bother, when you can use the O? (Some did not have a 1 either, since I or l would do)

A fact which became noticeable for a while when newspapers were first computerised, & journalists & subs proved not to know their O from their 0

In the days when programmers had to write out their code by hand to be transformed into machine-readable code by key-punch operatives, there were attempts to distinguish the two by using Ø. The problem was remembering which was which

It was also important to make clear when a real space was intended, rather than just variable, wonky spacing in your handwritten code; Δ was adopted for this

So Zero does not necessarily mean nothingness, in the way that twoness belongs to 2. It may stand for not known, for missingness, or for simple absence
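That ambiguity is still with us: modern data handling keeps these senses of "nothing" deliberately apart. The sketch below is my own illustration (not from the essay), using Python's ordinary conventions – 0 for a genuine quantity of zero, None for a value never recorded, and NaN for an undefined result; the `describe` function and the sample readings are invented for the example

```python
# Zero-as-quantity versus zero-as-marker: an illustrative sketch.
# 0 is a real measurement; None marks "not known / never recorded";
# float('nan') is the "undefined" result that propagates through arithmetic.

readings = [3, 0, None, float('nan')]

def describe(x):
    if x is None:
        return "missing (never recorded)"
    if isinstance(x, float) and x != x:  # NaN is the only value unequal to itself
        return "not a number (undefined)"
    if x == 0:
        return "a real measurement of zero"
    return f"a measurement of {x}"

for r in readings:
    print(describe(r))
```

Conflating the three – coding "missing" as 0, say – silently turns absence into a quantity, which is exactly the confusion the old Ø convention was trying to head off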

And that is before you get into the philosophical problems of how "no number" can be a number, & the problems of division, limits & infinity