Although we humans are able to speak and write glibly of large numbers, numbers that greatly exceed the capacity of other organisms to comprehend, the truth is that we still have difficulty understanding how much a large number like a million or a billion actually represents. In many cases, we use these numbers primarily as relative, rather than as absolute, indicators: 1,000,000 is larger than 100,000, 523,948 is larger than 85,295, 10^45 is larger than 10^42, and 5,374,261 is larger than 5,374,259.

Numerically speaking, round numbers like 100, 500, 2000, 1,000,000, and so on are not any more significant than, say, 96, 503, 2013, or 1,428,171. And yet, many people attribute a greater significance to these round numbers that they do not attribute to other, seemingly random numbers.[1] Part of the reason for this is that, beyond approximately five or ten objects, most people cannot know exactly how many objects there are in a group without actually counting them one by one. And when the objects in a large group, such as one of 53 or 218, are arranged haphazardly, most people could not tell merely at a glance that its count differs from that of another group containing a slightly different number of objects, such as 49 or 227.

In recent times, for many humans, the year 2000 had a significance that the year 1997 or 2005 did not have. There were even some credulous people who expected something special or miraculous to occur at the beginning of this year, or at some point during its duration. Of course, these silly humans failed to realize that, outside of the human world, these numerical years have not the slightest significance. There are probably no other organisms, not even chimpanzees, monkeys, dolphins, elephants, or whales – those organisms that, after our exalted species, are considered to be the most intelligent on Earth – that number years successively as we humans do, and far beyond the length of one individual’s lifespan. Even outside the Christian parts of the human world, there are many people who adhere to a different calendar, so that, for example, in the Muslim calendar, the year 2000 fell between 1420 and 1421, while in the Chinese calendar, it coincided with the year 4698.

The same is true of the seven-billionth person to be born. First, no one knows exactly when this dubious milestone was passed, since countries vary in the accuracy of their population estimates. Second, these population figures, although relatively stable in some countries, are continually changing, even on a daily or hourly basis, as some people die and others are born. Hence, to seek to attribute this round number to a specific individual is an example of the common human tendency to impose a precision that is often lacking in our knowledge of the world we live in. If the human population at the moment the seven-billionth human being was declared to have been born had been presented in scientific form, it would have been presented with its margin of error, which would likely have measured in the tens of millions of humans.
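To make this concrete, here is a minimal sketch of such a presentation, using hypothetical figures: a nominal population of 7.00 × 10⁹ with an assumed uncertainty of ±5 × 10⁷ (the "tens of millions" suggested above). Neither number is an official estimate; both are assumptions for illustration.

```python
# Hypothetical figures for illustration only: a nominal population of
# 7.00e9 people with an assumed uncertainty of +/- 5.0e7 (tens of millions).
population = 7.00e9
uncertainty = 5.0e7

# Expressed with its margin of error, the figure reads 7.00e9 +/- 5.0e7:
# a relative error of under one percent, yet tens of millions of people.
relative_error = uncertainty / population
print(f"7.00e9 +/- 5.0e7 (relative error: {relative_error:.2%})")  # 0.71%
```

The point of writing the figure this way is that the uncertainty, however small in relative terms, still dwarfs any single individual.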

In truth, the year 2000 was merely a point in the endless continuum of time, while the seven-billionth person was merely a minor point on the continuum of events, in this case the recent, alarming, and highly environmentally destructive phenomenon of the constantly expanding human population.

If one were to ask a group of people to pick a random number between, for instance, 1 and 1,000,000, most people, I think, would pick numbers like 854,273, 69,251, 2,409, 347, 295,118, and so forth. However, I doubt that many people would choose round numbers like 500,000, 80,000, 200,000, 100, or 10,000. And yet, these latter numbers are just as random as the former numbers. Hence, what I have called the fallacy of round numbers is related, in an inverse sense, to our conception of randomness. Even non-round numbers such as 1,111,111, 369,369,369, or 987,654,321 would not be considered by some people to be random because there is an evident pattern in their sequence. This gives rise to the following (not always valid) definition of randomness: a phenomenon that does not exhibit any observable pattern in its distribution, appearance, structure, or occurrence.
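The claim that round numbers are just as random as any others can be checked with a short sketch. The definition of "round" used below, multiples of 10,000, is my own illustrative assumption:

```python
# In a uniform draw from 1..1,000,000, every individual number, round or
# not, has exactly the same probability: 1 in 1,000,000.
N = 1_000_000

# Count the "round" numbers, here taken (as an illustrative assumption)
# to be the multiples of 10,000.
round_numbers = [n for n in range(1, N + 1) if n % 10_000 == 0]

# The round numbers collectively make up only 0.01% of the range, which is
# why a genuinely random pick so rarely "looks" round.
print(len(round_numbers), len(round_numbers) / N)  # 100 0.0001
```

In other words, each round number is exactly as likely as each non-round one; it is only the round numbers as a class that are rare, which may be why a round result strikes us as non-random.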

People also have a tendency to think and plan in terms of round numbers and figures. For example, who schedules an appointment, meeting, lecture, or conference at a seemingly random time such as 2:43, 5:19, or 7:02? At most, at least where I live, the tendency is to schedule things at intervals of ten or fifteen minutes, such as at 5:00, 5:15, 5:30, or 5:45. Of course, these dominant models of behaviour could change, but the fact that they don't is related to our tendency to think in terms of round numbers and regular intervals of time. In financial matters, such as salaries and prices, one also sees this tendency to think in round numbers.

The common practice of scaling large numbers down into percentages, that is, into figures between 0 and 100 that most people can readily understand, can sometimes lead to absurd or grotesque results. For example, as the amounts of money managed by large financial funds, such as hedge funds, have increased prodigiously in recent decades, in some cases from millions to many billions or trillions of currency units, the habit of thinking in terms of percentages can lead to greatly exaggerated remunerations. In other words, our inability to understand very large numbers like billions of currency units, and hence our preference for thinking in terms of percentages, is one reason for the pestilence of preposterous pay. Although a fraction of one percent, such as 0.25%, may not sound like a lot, when one is dealing with very large sums, such as tens of billions of currency units, this seemingly tiny percentage is magnified into remunerations that measure in the tens of millions of currency units.[2]
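The arithmetic behind this magnification is easy to sketch. The fund size and fee rate below are assumed, illustrative figures, not data about any actual fund:

```python
# Illustrative, assumed figures: a fund managing 20 billion currency units
# and charging a management fee of 0.25%, "a fraction of one percent".
assets_under_management = 20_000_000_000
fee_rate = 0.0025

# The seemingly tiny percentage is magnified by the enormous base into a
# remuneration measured in tens of millions of currency units.
fee = assets_under_management * fee_rate
print(f"{fee:,.0f}")  # 50,000,000
```

The percentage stays comfortably small to the imagination while the absolute sum does not, which is precisely the disconnect described above.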

Of course, these kinds of simplifications of the world are helpful, and even necessary, if finite creatures like us are to make sense of the world and live our lives accordingly. This is an unavoidable feature of all living creatures. But unlike other organisms, we humans are aware, if only dimly, that the particular way we perceive the world is not exhaustive or complete. Both philosophy and its later, more successful intellectual offspring, science, represent humanity's collective attempts to surpass these inherent limitations by expanding our horizons and worlds,[3] such as with powerful microscopes and telescopes that enable us to see objects that are invisible to the unaided eye, or that cannot be seen by it in the same detail.

And yet, despite our species' impressive accomplishments in these regards, each and every human being still retains the inherent tendency to oversimplify things, whether in one's judgments, perceptions, evaluations, categorizations, or comparisons. What I have called the fallacy of round numbers – the common tendency to attribute a greater significance to round numbers than to other numbers – is closely related to this tendency.

A superstition is generally defined as a common belief that does not correspond to reality, or has no basis in fact. These false beliefs spread through a population because of the strong innate human tendencies to imitate and to conform to other people's behaviours, including the things they believe. Although the term is no longer widely used, these kinds of false beliefs still exist and can attain considerable credulous circulation among the members of a society. For instance, in economics at the present time, many superstitions are taught in schools and universities about how the economy works or how it ought to operate: that markets are efficient, that human beings are rational creatures, that privatization always increases efficiency and reduces prices, that the market price of a thing, or the market wage, is an indication of its “true” or optimum price or wage, and that human behaviour in the economic realm does not need to be regulated by the government. Of course, these fallacious beliefs are not labelled as such by economics professors, largely because they fail to recognize their fallaciousness.

In philosophy, the Rationalist belief that logic and mathematics are reliable, or even infallible, guides to discovering the truth about the world we live in is another example of a common superstition that does not always accord with reality. And yet, despite the legion of examples to the contrary, there are many Rationalists and others who adamantly hold to this superstition, just as religious people tenaciously hold to their religious superstitions, or dogmas.

Although science has rid the world of many ancient superstitions, its practitioners are also not immune to believing in superstitions, which in turn may become the source of other common superstitions. The fundamental scientific superstition is the belief that both Life and the Universe developed entirely by chance, in accordance with the physical, chemical, and other laws of nature. This central tenet of scientific atheism has begotten a plethora of other superstitions in the modern age, such as that, since it is believed to have originated and developed by chance, there is a high probability that Life exists on many other planets in the Universe, that we humans will one day colonize other planets, and that we will eventually be able to create living organisms from non-living materials, which will be able to replicate, regenerate, repair themselves, evolve, and exist indefinitely, even after our species’ extinction.

But it is my opinion that, just like many other common superstitions of the past, these presently widespread pseudo-scientific beliefs are all wrong. Although we humans are capable of conceiving a great many things that, prior to their conception, didn't exist, both this ability, which often leads us astray, and our inability to grasp very large numbers in the way that we can grasp small numbers, are manifestations of the particular way that we categorize, interpret, and seek to understand the world. While some of these tendencies limit our understanding, others mislead it, and still others, which are due to our imitative nature, can do both. The perennial challenge in our species' pursuit of greater knowledge and understanding of the world we live in is to eliminate, or at least minimize, these common sources of error, so that we are less likely to be led into error, hubris, false expectations, and temptation.


[1] Other examples of this tendency are the facts that breaking the ten-second barrier in the 100m dash, and the four-minute barrier in the mile, were considered somehow more significant than other records at these distances.

[2] Since percentages represent the scaling of a range of numbers to a range from 0 to 100, this is similar to the practice of using maps to represent a geographical area. A person who thinks that a fraction of one percent is insignificant because it is small is like a person who assumes that, because the distances on a map are small, the distances they represent in reality must also be small.

[3] The recent success of Western science has marginalized other, more traditional ways of perceiving or categorizing the world, such as mysticism, shamanism, and religions in general.