
## Mathematics and the Real World: The Remarkable Role of Evolution in the Making of Mathematics (2014)

### 5. MATHEMATICS WITH NO EVOLUTIONARY ADVANTAGE

In this section we will examine a number of aspects of mathematics that, apparently, are not carried by our genes because they did not provide an evolutionary advantage during the formation of the human species (other nonnatural aspects of mathematics will be discussed later on). The current discussion is speculative, but further on we will present evidence corroborating the observations made here. We emphasize once again that the lack of an evolutionary advantage we are referring to relates to the period in which the genes determining the human species were developing. That is why mathematics of the type we will discuss here is not natural to intuitive thinking. This does not mean that this aspect of mathematics is not important or useful. Just the opposite. This type of mathematical ability provides a great advantage in the later evolution of human societies, but the time that has elapsed since human societies developed is not long enough for these abilities to have been etched into our genes.

The language of mathematics makes much use of quantifiers, expressions such as “for every” or “there exists” that appear in mathematical propositions. For example, Pythagoras's famous theorem, proved as early as two thousand five hundred years ago, states that for every right-angled triangle, the sum of the squares on the two sides equals the square on the hypotenuse. The emphasis is on the quantifier “for every.” Another useful claim states that every integer greater than 1 is a product of prime numbers. A recent famous example is Fermat's last theorem. The conjecture was formulated as early as the seventeenth century but remained unproven until the proof by mathematician Andrew Wiles of Princeton University, which was not published until 1995. The theorem states that for every four natural numbers (i.e., positive integers) X, Y, Z, and n, if n is greater than or equal to 3, the sum X^n + Y^n cannot equal Z^n. Throughout the thousands of years of the development of mathematics, the proof that a particular property always holds has been considered an achievement.
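Claims of this kind can be probed empirically for small cases, although no amount of checking substitutes for a proof of the “for every” part. As a minimal sketch (the function name is my own), the factorization claim can be tested by trial division:

```python
def prime_factors(n):
    """Return the prime factors of n (with multiplicity) by trial division."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:  # whatever remains is itself prime
        factors.append(n)
    return factors

# Spot-check the universal claim for every integer from 2 to 999:
for n in range(2, 1000):
    fs = prime_factors(n)
    product = 1
    for p in fs:
        product *= p
    assert product == n                               # the factors multiply back to n
    assert all(prime_factors(p) == [p] for p in fs)   # and each factor is prime
```

The loop verifies finitely many cases; only a proof covers them all, which is precisely the distinction the text draws.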

However, is it natural to examine whether a particular property always holds? When something occurs repeatedly under certain conditions, does it naturally give rise to the question of whether it occurs every time those conditions hold? Not so. If experience shows that a tiger is a dangerous predator, the conclusion drawn is that if one meets a tiger one should flee or hide. Wasting energy or time on abstract thought about whether that particular tiger always devours its prey, or whether every tiger is a dangerous predator, would not afford an evolutionary advantage.

Another concept often referred to in mathematics is the concept of infinity. The Greeks proved that there is an infinite number of prime numbers. Is the urge to prove this statement a natural one? On observing many elements, is it reasonable to ask whether there is an infinite number of them? Again, I think it is not. Imagine ancient man discovering that a certain region is teeming with tigers. Is it worthwhile for him to consider whether there is an infinite number of them, or would it be preferable for him to get as far away as possible from that area as quickly as possible? The question “Is there an infinite number of tigers?” and even the question “Are there many more tigers than the large and dangerous number that I have already seen?” are academic questions, which will only harm those who devote time and energy to them and hence will impair their chances of surviving in the evolutionary struggle.
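The Greeks' proof that the primes never run out is constructive in spirit: multiply any finite list of primes together and add one, and the result must have a prime factor missing from the list. A small Python sketch of Euclid's argument (the function name is my own):

```python
def euclid_new_prime(primes):
    """Given a finite list of primes, return a prime not in the list,
    via Euclid's construction: multiply them all together and add one."""
    n = 1
    for p in primes:
        n *= p
    n += 1  # n leaves remainder 1 when divided by any prime in the list
    d = 2
    while n % d != 0:  # the smallest divisor greater than 1 is prime
        d += 1
    return d

print(euclid_new_prime([2, 3, 5]))  # 31 -- a prime outside the list
```

No finite list can be complete, so the primes are infinite; the argument never exhibits “all” the primes, only a way to extend any list.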

Another type of claim developed in mathematics refers to facts that cannot exist. A statement such as “If A does not occur, then B will occur” is commonplace among teachers, students, and researchers of mathematics. We will come across many such examples further on. This way of thinking is also not natural. The activity of the human brain is based on association, on the recollection of things that happened. Basing oneself on an event that did not take place may be possible and useful, but it does not come easily or intuitively. When you enter a room, you look at what is in it and devote less thought to what is not there. We should repeat that we are not claiming that searching for an infinite number of mathematical elements, or proving that a certain property always holds, or relating to the negation of a possibility is an unworthy, unimportant, or uninteresting activity. What we are claiming is that those activities are not natural and that without a mathematical framework that suggests these possibilities, a reasonable person or an untrained student would not intuitively ask those questions.

Another attribute that is not innate in human nature is the need for rigor and precision. Mathematics is proud that a mathematical proof, provided it does not contain an error, is like an absolute truth. Mathematics therefore developed techniques of rigorous tests intended to lead to that absolute truth. Such an approach cannot have been derived from evolution. Genes do not direct humans to act rigorously to remove any possible doubt. The following anecdote illustrates this convincingly.

A mathematician, a physicist, and a biologist are sitting on a hill in Ireland, looking at the view. Two black sheep wander past. The biologist says: “Look, the sheep in Ireland are black.” The physicist corrects him: “There are black sheep in Ireland.” “Absolutely not,” says the mathematician. “In Ireland there are sheep that are black on at least one side.”

Is the mathematician's claim, however rigorous and correct it may be, reasonable and useful in daily life? Of course not. In that sense, life is not mathematics. In life, now as in ancient times, it is worthwhile, and even desirable, to tolerate a lack of rigor, and even errors, in order to act effectively. If a tiger's head can be seen above a bush, a man should not insist on precision and observe that it has not been proven that this particular tiger has legs; he had best distance himself from there as fast as he can.

We have claimed that the use of quantifiers and the interest in negations, or the reference to facts that cannot exist, were not absorbed into the human brain during the evolutionary process and are not intuitive. Indirect evidence supporting this claim may be derived from studies that examined how many mathematical operations the human brain can perform consecutively. Calculations such as addition and subtraction can be performed one after the other almost without limit. A person can be asked to perform a long series of multiplications, additions, divisions, and so on, and if he manages to remember the order, for instance by discovering a pattern in it, he can internalize the instructions and develop intuition regarding the next operation. This does not apply to quantifiers and negations. “Every dog has a collar that is not green.” That statement uses three concepts of logic: “every,” “has,” and “is not.” Studies have shown that even if someone can remember the order of the operations, the largest number of quantifiers that the brain can absorb is seven. Beyond that, even the most capable person cannot assess the outcome of the operation. It is interesting that the limit to the number of logical operations the human brain can absorb is seven, the same as the maximum number of elements that animals can identify (see section 2 above). Other indirect evidence is provided by the existence of certain individuals, some of them autistic and some with Asperger's syndrome, who can perform complex arithmetic calculations with amazing speed and accuracy. However, no individuals have been found who can perform complex logical operations with similar facility. The reason is apparently that the ability to perform arithmetic calculations exists naturally in the brain and is strengthened disproportionately in people whose limitations do not allow them to develop other abilities. Logic is not one of those abilities.
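The dog-and-collar sentence nests a universal quantifier, an existential one, and a negation. The nesting is easy to state formally but hard to evaluate mentally; as an illustration (the data and names are my own invention):

```python
# "Every dog has a collar that is not green":
# for every dog, there exists a collar of that dog whose color is not green.
dogs = {
    "Rex": ["red", "blue"],
    "Fido": ["brown"],
}

claim = all(                                        # universal quantifier: every dog
    any(color != "green" for color in collars)      # existential + negation: some non-green collar
    for collars in dogs.values()
)
print(claim)  # True for this data
```

Adding a dog whose only collar is green flips `claim` to `False`, since a universal claim fails on a single counterexample.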

Why is it important to identify mathematical abilities that are innate by virtue of evolution and to identify other attributes that are not innate? Humans think intuitively, associatively, and it is possible and easy to develop intuition based on natural abilities. Abilities contained in the genes are easier to develop, nurture, and use. It is harder to do that with abilities that are not natural to the human species. The recognition that there is a distinction between those two types of mathematical operations and understanding the source of that distinction are important to the understanding and utilization of human thought. In the sections that follow, we will see how these differences are significant to the development of mathematics, and in the last chapter of the book, we will discuss the implications of recognizing these differences for teaching mathematics.
