At the heart of the theory of evolution is the untested assumption that random changes, called mutations, can beget beneficial modifications or improvements in an organism’s genome, modifications that are then propagated by natural selection so that they become more common in a species’ gene pool. Although biologists routinely declare that most mutations are harmful rather than beneficial, they do not seem fully to realize what this statement means. For in the next breath or sentence, they declare that, at least occasionally, these random and generally harmful genetic changes are capable, under the guiding hand of natural selection, of producing the many different beneficial changes and increasingly complex organisms that are visible in the fossil record. It is truly astonishing that the validity of this assumption has never been questioned by the many Darwinian Dummies who have accepted it uncritically, simply because it accords with their dogmatic belief that Life on Earth arose and developed by chance, in accordance with the primordial law of natural selection. If one considers the matter objectively, one will see that evolutionists simply assume, without any evidence for this belief, that the many highly ordered changes that are visible in the fossil record, such as the transformation from single-celled to multicellular organisms, from gill-breathing aquatic organisms to lung-breathing terrestrial ones, from terrestrial organisms to flying ones, from theropod dinosaurs to birds, from non-flying mammals to bats, from terrestrial mammals to cetaceans, and the development of marsupials and their later radiation into many different species, were all begotten by chance, that is, by random changes in an organism’s genome that were then selected and propagated by natural selection because they were beneficial.
By the Randomness Principle, I mean the hypothesis that random changes made to a complex system will tend to decrease its order and, hence, increase its entropy. This principle plainly negates the assumption that lies at the heart of the theory of evolution: if the Randomness Principle is true, then the theory of evolution is largely false; or, put another way, for the theory of evolution to be true, the Randomness Principle must be wrong.
Considered in the context of the Second Law of Thermodynamics, the Randomness Principle can be seen as a corollary of that fundamental law of physics. Since randomness is a form of entropy, and since random changes made to a complex system will tend to increase its entropy or disorder, it is absurd to suppose that random changes can preserve a complex system’s order while transforming it into a different form, let alone increase its order, as is believed to have occurred in the case of living organisms. Random changes made to a complex system will gradually increase the disorder and degradation, or entropy, of the system; and the more random changes that are made to it, the greater the resulting disorder and degradation will be. In architecture, for example, random changes made to an existing structure certainly cannot transform it into a completely different kind of structure, such as from a medieval cathedral into a modern building or skyscraper; and far from preserving its present degree of order, these random changes will inevitably tend to diminish it.
To illustrate what I mean by a complex system, let us consider the total number of possible combinations of letters in a “word” of a given length in the English language. For a “word” of a single letter, there are 26 possibilities, one for each letter of the alphabet. For a “word” of two letters, there are 26 x 26 = 676 possible combinations. To obtain each successive total, one simply multiplies the previous total by 26. Performing these simple calculations yields the following table of the total number of possible combinations of letters of a given length, from one to ten letters:
1 – 26
2 – 676
3 – 17,576
4 – 456,976
5 – 11,881,376
6 – 308,915,776
7 – 8,031,810,176
8 – 208,827,064,576
9 – 5,429,503,678,976
10 – 141,167,095,653,376
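The totals in this table can be generated mechanically. The following is a minimal Python sketch (the function name is mine, chosen for illustration), using the 26 letters of the English alphabet:

```python
# Number of possible letter strings of each length over the 26-letter
# English alphabet: each added letter multiplies the total by 26.
ALPHABET_SIZE = 26

def combinations_table(max_length=10):
    """Return (length, ALPHABET_SIZE ** length) pairs for lengths 1..max_length."""
    return [(n, ALPHABET_SIZE ** n) for n in range(1, max_length + 1)]

for length, total in combinations_table():
    print(f"{length:2d} - {total:,}")
```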
In this illustration, order is represented by the combinations of letters that actually form a word in the English language. The salient point of this exercise is that, as the number of letters increases, the percentage of combinations that form a real word diminishes very quickly, so that, even by five letters, this percentage is almost certainly below one percent. Increasing the length of the combinations by a few more letters diminishes this percentage even further, until it becomes a minuscule fraction of one percent. Eventually, beyond a certain length, there are no English words of that length or longer, meaning that the probability of randomly generating a word of that length or longer is zero. Since this result is not very interesting in the case of longer combinations of thousands or millions of letters, we can make it more relevant to the discussion of living creatures by adding the space character as a twenty-seventh “letter,” used to separate combinations of letters into shorter units, which are more likely to form words. Then we can ask what the probability is of such long combinations of thousands or millions of randomly-generated letters forming a coherent text that has meaning.
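The below-one-percent claim for five-letter strings can be checked with simple arithmetic over the 26-letter alphabet. The figure of roughly 10,000 five-letter English words is an assumption for illustration only; a real check would count the entries in a full word list.

```python
# Rough estimate: fraction of all five-letter strings that are real words.
# The word count of ~10,000 is an assumed, illustrative figure, not a
# measured one.
ASSUMED_FIVE_LETTER_WORDS = 10_000
total_five_letter_strings = 26 ** 5  # 11,881,376

fraction = ASSUMED_FIVE_LETTER_WORDS / total_five_letter_strings
print(f"{fraction:.4%}")
```

Even if the assumed word count were off by a factor of several, the fraction would remain well under one percent.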
A combination of one, two, three, or even four letters is not a complex system. Hence, there is a relatively high probability that a real word of these lengths could be generated randomly. In the case of words of one letter, this probability is 3/26, or about 11.5%, since there are three English words consisting of just one letter: “a,” “I,” and the exclamatory “O.” Similarly, by starting with a real word of such a length and then making random changes to one of its letters, there is a fairly high likelihood that the result will be a real word. For example, starting with “cot,” randomly changing the first letter could yield the words “dot,” “got,” “hot,” “jot,” “lot,” “not,” “pot,” “rot,” “sot,” or “tot;” randomly changing the second letter could yield “cat” or “cut;” and random changes to the last letter could yield “cob,” “cod,” “cog,” “col,” “con,” “cop,” “cor,” “cow,” or “cox.” Continuing this process with one of the resulting, or second-generation, words would also yield a different word with a fairly high probability. For example, “cat” could be transformed into “bat,” “fat,” “gat,” “hat,” “mat,” “pat,” “rat,” “sat,” “tat,” “vat,” “cot,” “cut,” “cab,” “cad,” “cam,” “can,” “cap,” “car,” or “caw.” And the process could be continued with other resulting words.
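This single-letter-change exercise is easy to automate. The sketch below uses a tiny dictionary built from the words listed above; a real experiment would load a full English word list (the function names are mine).

```python
import string

# A tiny sample dictionary for illustration only; a real experiment would
# load a full English word list (e.g. from a system dictionary file).
WORDS = {
    "cot", "dot", "got", "hot", "jot", "lot", "not", "pot", "rot", "sot", "tot",
    "cat", "cut", "cob", "cod", "cog", "col", "con", "cop", "cor", "cow", "cox",
}

def one_letter_variants(word):
    """All strings differing from `word` in exactly one position."""
    variants = set()
    for i, original in enumerate(word):
        for c in string.ascii_lowercase:
            if c != original:
                variants.add(word[:i] + c + word[i + 1:])
    return variants

def word_neighbours(word, dictionary=WORDS):
    """Variants of `word` that are themselves words in the dictionary."""
    return one_letter_variants(word) & dictionary

# A 3-letter word has 3 * 25 = 75 single-letter variants; for "cot",
# 21 of them appear in this sample dictionary.
print(sorted(word_neighbours("cot")))
```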
Obviously, the longer the combination of letters, the more complex this system becomes. If one were to perform this exercise with combinations of nine letters, the likelihood that a randomly-generated combination would be a real word is already vanishingly small, since, as we can see from the table, there are more than five trillion such possible combinations; and the likelihood that randomly changing one letter of an English word of nine letters, such as “realistic,” “minuscule,” “existence,” “structure,” “reproduce,” “presently,” or “organisms,” would beget a different English word is likewise very small. And these probabilities continue to diminish and approach zero the longer the sequence of letters becomes.
In considering this example and how quickly the total number of possible combinations increases exponentially with each additional letter, so that, inversely, the percentage of ordered results diminishes very quickly and approaches zero, let us not forget that even the simplest single-celled organism is much, much more complex than a combination of even a few hundred or thousand letters. In fact, the complexity of even the simplest single-celled organism – which has several hundred genes, each of which contains hundreds to thousands of nucleotide base pairs, each base pair being one of four possible types – is comparable to the complexity of a book containing many hundreds of thousands or millions of characters.
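The size of the corresponding combinatorial space grows just as the letter example does, but over a four-letter alphabet (A, C, G, T). A short sketch (the function name is mine) computes how many decimal digits the count of possible sequences has:

```python
import math

# Number of possible DNA sequences of length n over the 4-base alphabet
# (A, C, G, T) is 4**n; we report its size as a count of decimal digits,
# i.e. floor(n * log10(4)) + 1.
def sequence_space_digits(n_bases):
    """Decimal digits in 4**n_bases."""
    return math.floor(n_bases * math.log10(4)) + 1

# Even one short gene of 1,000 base pairs admits a number of possible
# sequences with hundreds of digits.
print(sequence_space_digits(1000))
```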
Let us consider a computer monitor whose screen measures 1000 x 1000 pixels, so that there are 1,000,000 pixels on the screen. Any regular computer user knows that many different combinations of these pixels can form coherent images, whether photographs, videos, text, graphics, animations, drawings, and so on. These many different coherent images are comparable to the many different species of organisms that exist on the Earth. But far larger is the total number of possible combinations of the one million pixels, the vast majority of which are incoherent and meaningless. This total is the number of different possible colours for each pixel raised to the power of 1,000,000.
Each pixel consists of three phosphors whose colours are blue, green, and red. There are eight possible on-off combinations of these phosphors: all on, which yields white; all off, which yields black; one of the three on, yielding one of the three primary colours; and two of the three on, which yield other colours. Besides these, when one or more of the phosphors is only partly lit, still other colours, such as grey, orange, and pink, can appear on the screen. Hence, there are more than ten possible combinations of the three phosphors in each pixel. To simplify the calculations, let us suppose that the number of different colours for each pixel is ten. This would mean that the total number of possible combinations of the one million pixels on this computer screen is 10^1,000,000, or 1 followed by one million zeros – a very, very large number.
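The arithmetic behind this count is straightforward exponentiation, and for toy screens the states can even be enumerated outright. A minimal sketch (the names are mine, and the ten-colour figure is the simplification adopted above):

```python
from itertools import product

# With c possible colours per pixel and n pixels, there are c**n possible
# screen states. The ten-colour default is the text's simplification.
def screen_states(n_pixels, n_colours=10):
    return n_colours ** n_pixels

# A toy 2-pixel, 3-colour "screen" is small enough to enumerate directly:
# 3**2 = 9 states.
toy_states = list(product(range(3), repeat=2))
print(len(toy_states), screen_states(2, 3))

# Growth is rapid: just four ten-colour pixels already admit 10,000 states;
# the 1000 x 1000 screen admits 10**1_000_000.
print(screen_states(4))
```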
As in the case of the combinations of “words” of different lengths, there are two questions we can ask. First, what is the likelihood that a randomly generated combination of the one million pixels on the computer screen will form a coherent image? Second, beginning with a coherent image, whatever it may be, if we make random changes to the particular combination of pixels that compose it, will these changes over time beget a different coherent image, or will they tend to beget disorder, so that, as more and more random changes are made, the image increasingly degenerates into a random combination of pixels?
Given present computer technology, it is quite easy to attempt to answer both of these questions, the first of which relates to the likelihood that Life on Earth arose by chance, and the second to whether the many transformative changes that organisms have undergone repeatedly throughout the long history of Life on Earth could have resulted from a series of random changes.
The consideration of the second question will show us what would actually happen to a living organism if the changes made to its genome were truly random, as is posited by the theory of evolution. Starting with a coherent image, each successive random change to one of the pixels will tend to diminish the image’s visual order. These random changes will appear as individual blemishes in parts of the image. For a long time, measured by the total number of changes, the image will still be recognizable, but there will come a point when it no longer is, especially to a person who hasn’t seen the original. Thereafter, with further random changes, rather than forming a different image, the image will almost certainly continue to deteriorate into a visually meaningless combination of randomly-coloured pixels. In the case of a living organism, a few random changes in its genome will probably not result in a radical change in its morphology, its behaviour, or the way it develops. But as these random changes accumulate, there will increasingly be visible deleterious changes in its morphology, behaviour, and development. In other words, there will be an increase in its genetic entropy, which will be manifested as an unfitness to survive and procreate in its environment, as we would expect from the accumulation of all these random changes.
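The pixel-degradation scenario can be simulated directly. The sketch below (names and parameters are mine) represents an image as a list of colour indices, applies uniformly random single-pixel changes, and tracks the fraction of pixels still matching the original:

```python
import random

# Apply n_changes random single-pixel replacements to an image represented
# as a flat list of colour indices (one of n_colours per pixel).
def degrade(image, n_changes, n_colours=10, seed=0):
    rng = random.Random(seed)  # fixed seed for a reproducible run
    current = list(image)
    for _ in range(n_changes):
        i = rng.randrange(len(current))
        current[i] = rng.randrange(n_colours)
    return current

def similarity(a, b):
    """Fraction of positions where the two images agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

# A trivially "ordered" all-one-colour image of 10,000 pixels; similarity
# to the original falls steadily as random changes accumulate.
original = [0] * 10_000
for n in (100, 1_000, 10_000, 100_000):
    print(n, round(similarity(original, degrade(original, n)), 3))
```

With enough changes, the similarity settles near the level expected of a purely random image (here, about one in ten pixels matching by chance), and no new coherent image ever appears.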
A real-world illustration of this phenomenon is the fact that the genome of a person who has been exposed to a high dose of radiation does not improve. In this case, rather than being made one by one, these random changes are made all at once, along with damage to parts of the genome. If people so exposed have children, those children often exhibit all manner of strange, and in some cases hideous, birth defects. And yet, many people continue to repeat the unproven assumption that random mutations can improve an organism’s genome, and ultimately the gene pool of the species to which it belongs, via the filtering process of natural selection.
The same process can be illustrated in other fields of human endeavour, such as music. If we took a long symphonic work and started making random changes to its individual notes, the result would be neither a preservation of the work’s musical order nor an improvement of it, nor a transformation of it into a different kind of work. Instead, these random changes would tend to increase the entropy, or noise and cacophony, of the work, so that eventually it would sound like an incoherent, disorderly, and unpleasant combination of random notes played by the different instruments.
The same thing would happen to any literary text whose individual letters are changed randomly. With just a few changes, the legibility of the text will not diminish significantly, since most people can guess what a misspelled or missing word is supposed to be. Indeed, most published works of any considerable length contain misspelled or missing words, and I suspect that other experienced readers besides myself can guess the correct or missing word from the context. But as this process continues, parts of the text will become more and more difficult to understand, to the point that some parts become unintelligible, meaning that the author’s intended meaning has been lost in those parts. Even if one introduces a selection process somewhat akin to natural selection, accepting a change only if the random alteration of one of a word’s letters results in another English word, while rejecting all other changes, the result will still be a loss of meaning, or literary order, since one cannot simply replace one word with another word of the same length and expect to preserve the meaning of the sentence in which it occurs.
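This textual version of the experiment can also be sketched in a few lines. The dictionary below is a tiny, illustrative one (a real experiment would use a full word list), and the function names are mine:

```python
import random
import string

# Tiny, illustrative dictionary; a real experiment would load a full
# English word list.
DICTIONARY = {"the", "cat", "sat", "on", "mat"}

def mutate(text, n_changes, seed=1):
    """Replace n_changes randomly chosen letters with random letters."""
    rng = random.Random(seed)  # fixed seed for a reproducible run
    chars = list(text)
    letter_positions = [i for i, c in enumerate(chars) if c.isalpha()]
    for _ in range(n_changes):
        i = rng.choice(letter_positions)
        chars[i] = rng.choice(string.ascii_lowercase)
    return "".join(chars)

def legibility(text):
    """Fraction of the text's words that are still dictionary words."""
    words = text.split()
    return sum(w in DICTIONARY for w in words) / len(words)

sentence = "the cat sat on the mat"
for n in (0, 2, 20):
    print(n, legibility(mutate(sentence, n)))
```

As the number of mutated letters grows, the fraction of surviving dictionary words drops toward zero.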
If the Randomness Principle is true, then the range of validity of the vaunted theory of evolution is very greatly reduced: it applies only to those sexually reproducing organisms whose members possess a range of genetic variability that allows certain offspring, because of the particular combinations of genes which they happen to possess, to adapt to changing conditions in their environment. It may also be possible that simple, meaning single, mutations can beget beneficial changes in an organism’s genome, enabling it to beget more viable offspring possessing the new gene than other organisms of the same or other species; however, even in these cases, one can never rule out the possibility that these relatively simple beneficial changes were made by God, who has overseen and largely determined the development of Life on Earth ever since its inception. But the widespread belief that a multitude of random changes made to an organism’s genome can likewise beget beneficial, in this case meaning adaptive, outcomes is pure Darwinian nonsense. In all of these cases, whether literary, visual, musical, or genetic, the greater the number of random changes that are made to a complex system, the less will be its order, and hence, the greater will be its entropy or disorder.
There is also the question of why and how species of organisms have come to possess precisely the range of genetic variability that they do, no more and no less. This range of variability is crucial for all species of sexually-reproducing organisms. Obviously, too little variability leaves individuals ill-prepared for significant changes in their environment, to which they may have difficulty adapting. But too much variability is not desirable either, since it increases the likelihood of producing offspring that cannot function as normal members of the species.
The complexity of even single-celled organisms, measured in terms of the number of different possible combinations of the nucleotide base pairs that form the organism’s genes, is of an order that most human beings never have to consider in the course of their lives, no matter what kind of work they do or what interests they have. It is this backdrop of ignorance and misconception that has allowed so many gullible human beings to assume that random changes made to these incredibly complex living systems could beget improvements or increases in the organism’s genetic complexity and order, when this is manifestly impossible in the finite world in which we live.