The International History Project
In the 4th century BC, the Greek philosopher Plato somewhat flippantly defined "man" as an erect and featherless biped. Subsequently Diogenes the Cynic, in an equally flippant fashion, displayed a plucked chicken and declared, "Here is Plato's man." Plato's student, Aristotle, was also concerned with verbal definitions and distinctions, but he went on to describe the natural world in a matter-of-fact fashion that has earned him recognition as the founder of the biological sciences. In his work on biology, he avoided the effort to treat biological entities by means of rigid formal logic, and, though he made some inevitable errors of fact, his pragmatic approach has served as a model for biological observation ever since.
From long before the time of the ancient Greeks, human beings were generally recognized as members of the animal world. Much later, in the middle of the 19th century, Charles Darwin, in his brilliant book 'On the Origin of Species by Means of Natural Selection' (1859), forced the world to face the fact that all the living creatures of the world had almost certainly descended from a common ancestor. He further developed that view in 'The Descent of Man' (1871), in which he specifically stated that humankind ultimately shared a common origin with the rest of animate nature. At the time when Darwin was writing, there was only the most rudimentary sort of fossil record to support his view, and he was further hindered by the use of the term man to stand for the human species as a whole. As that word suggests, there was a tendency to conceive of males as typifying the human condition. Obviously, females are of equal importance to the survival of the human species, and, somewhat belatedly, the field of biological anthropology has come to realize that males and females require equal attention if the phenomenon of humankind and its emergence from nonhuman predecessors is ever to be understood.
It came as something of a surprise when scientists determined that human beings share almost 99 percent of their genetic material with chimpanzees. This led one scientific journalist to refer to humans as "the third chimpanzee." Despite all that is held in common, however, the differences are crucial and allow humans to be allotted their own genus and species, Homo sapiens. Human feet have lost their grasping capabilities and clearly reflect the fact that humans are characteristically bipedal, while chimpanzees and all other close relatives of humans are characteristically quadrupedal and more clearly adapted to tree-climbing as part of their normal way of life. Humans also lack the fur coat that all other primates possess, and, relative to body size, human brains are nearly three times as large as those of the apes.
Finally, all human groups are completely dependent on the use of language, without which they could not survive, and there is nothing comparable among their nearest nonhuman kin. The learning of previous generations is passed on by linguistic means, and new insights and experiences by individuals can become the property of the group as a whole when these are verbally transmitted. This clearly is a key to human survival, and it is a uniquely human attribute. That body of verbally transmitted learning and traditions is referred to as culture.
Humans live in a culturally conditioned world to such an extent that it can be referred to as a cultural ecological niche. This is the arena in which the survival of the human species is played out. The occupants of the cultural ecological niche impose a series of selective pressures on each other as they use language and other aspects of culture to their advantage. In general, those who have trouble learning the rudiments of language will have less chance for survival. The cultural ecological niche puts a premium on those portions of the brain associated with linguistic capability. One would expect, then, that the evidence for the increasing complexity of the prehistoric cultural record would be linked to an increase in brain size of the associated prehistoric hominids. This is indeed the case.
Researchers can assess brain size and the form of limbs and feet of prehistoric specimens to see what they can tell us about the course of human development. It is much less easy, however, to tell such things as whether or not the prehistoric creatures in question had lost their fur coatings yet or whether they had developed the capacity for articulate speech. The soft tissues of the body do not generally survive the process of decomposition. Although the archaeological and anatomical record does provide us with some indirect clues by which the answers to such questions can be suggested, those answers are only tentative.
Evidence from East African fossils indicates that erect-walking bipedalism began at least 3.5 million years ago, substantially before there was any significant expansion of brain size over anthropoid ape levels. This was also a million years before the earliest known use of stone tools. At the same time, however, the canine teeth had ceased to project above the level of the other members of the tooth row the way they do in virtually all nonhuman relatives. This may indicate that the canine teeth were no longer being used for defensive or aggressive purposes and that they were being replaced in this capacity by handheld implements made of perishable materials. Even though actual tools have not been discovered, it seems unlikely that a relatively slow-moving terrestrial biped lacking defensively enlarged canine teeth could have survived without them.
THE STUDY OF ANCIENT HUMANS
Prehuman bipeds predated stone tools, which appeared approximately 2.5 million years ago. Their distribution and ways of life are known only from rare discoveries of usually incomplete skeletal remains. In most instances, their discovery was the work of paleontologists as a somewhat incidental by-product of their efforts at recovering the much more abundant skeletal remains of various kinds of prehistoric mammals (see Skeleton). Stone tools, however, do not dissolve and disappear the way bones often do. After their makers had discarded them, they continued to exist as witnesses to the activities of the early hominids--a term used for creatures that are more than just apes and that includes everything from prehuman bipeds up to contemporary human beings. Actual hominid skeletal remains are quite rare, but once stone tools begin to appear in the archaeological record, the areas occupied by toolmakers can be traced up to the time when writing begins and prehistory proper comes to an end.
The task of discovering the extent and form of prehistoric tools is the focus of archaeology. Since the records for prehistoric human existence are found underground, the most fundamental archaeological procedure is excavation--the essence of the archaeological dig. This is not a random procedure. Local residents, farmers, or workers who happen upon prehistoric tools will often alert archaeologists to an area of promise for systematic investigation. Overlying soil is removed, with care being taken to note its natural layers and the exact position of each object and artifact discovered in the process. The coordinates of each item are recorded so that the archaeologist later can try to determine how the various pieces came to be located where they were found, which can lead to understanding of the activities of the original makers. An essential part of the procedure is to collect information that can be used to establish a time level in the past when the hominid activities took place.
Estimating the Age of Finds
In the 19th century, attempts to establish the age of archaeological items that predate the beginning of written records were frequently inaccurate. Geologists provided estimates of how long it would probably take for layers of silt of a given thickness to accumulate in lake basins and river beds by the processes of wind- and rain-driven erosion of adjacent highlands. There was a lot of guesswork involved, and, though the geologists successfully established rough orders of magnitude, anything more precise required the use of techniques that simply were not available until well into the 20th century.
Right after the end of World War II, in a spin-off of the nuclear technology that had been part of the effort to create the atomic bomb, Willard F. Libby developed the radiocarbon, or carbon-14, dating technique. Carbon is one of the essential elements in all living matter, and a set proportion of the carbon incorporated in living tissue is radioactive and decays at a known rate. When an organism dies, no new carbon is taken up by any of its tissues. Not only do the soft parts decay and disappear, but the carbon-14 decays at a regular rate. The proportion of radioactive carbon-14 atoms to nonradioactive carbon-12 atoms can tell the analyst how long it has been since the tissue being analyzed belonged to a living creature. As radioactive elements go, carbon-14 decays relatively rapidly, and the technique ceases to give reliable indications of age for anything much older than 40,000 or 50,000 years.
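The decay arithmetic behind the radiocarbon technique can be sketched as follows. The half-life figure (5,730 years) is the accepted modern value; the sample ratios are hypothetical illustrations, not measurements from any actual specimen:

```python
import math

C14_HALF_LIFE_YEARS = 5730  # accepted half-life of carbon-14

def radiocarbon_age(ratio_remaining):
    """Estimate the years elapsed since death from the fraction of the
    original carbon-14 still present in a sample.

    ratio_remaining: the measured C-14/C-12 ratio of the sample divided
    by the ratio expected in living tissue (a value between 0 and 1).
    """
    # Exponential decay: N/N0 = (1/2) ** (t / half_life), solved for t.
    return -C14_HALF_LIFE_YEARS * math.log2(ratio_remaining)

# A sample retaining half its original carbon-14 is one half-life old.
print(radiocarbon_age(0.5))    # 5730.0
# Beyond roughly 40,000-50,000 years so little carbon-14 survives that
# the measurement becomes unreliable -- e.g. under 0.3 percent remaining:
print(radiocarbon_age(0.003))  # roughly 48,000 years
```

The rapid decay is why, as noted above, the method fails for anything much older than 40,000 or 50,000 years: the remaining fraction becomes too small to distinguish from background contamination.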
During the 1950s, geophysicists explored the use of other radioactive elements for similar purposes. The first method to yield successful results was the potassium-argon (K/Ar) technique, which can accurately date materials ranging from hundreds of thousands to millions of years old. There was still a gap, however, between the oldest radiocarbon dates and the youngest potassium-argon dates, and it was during just that period of time that some of the more crucial events in human evolution took place. Subsequently, the investigation of the proportions between the radioactive isotopes of elements such as protactinium, rubidium, strontium, thorium, and uranium and their stable end products helped provide a check for the K/Ar estimates and also helped fill in some of the remaining blanks.
The assessment of the magnetic polarity of various prehistoric strata has also been useful in dating material. When the products of volcanic eruptions such as lava cool and crystallize, the crystals behave like tiny compass needles, aligning themselves with the Earth's magnetic field along its north-south axis. Periodically in the past, however, the North and South magnetic poles have reversed themselves. This does not seem to have affected conditions for the living world, but it does mean that materials from volcanoes that erupted during such periods will have a reversed polarity. Careful work, especially on ocean-floor drilling cores, has enabled geologists to build up a picture of polarity reversals throughout the geological past. There are long periods of normal and equally long periods of reversed polarity, with many irregularly spaced normal and reversed intervals occurring within them. While a reversed or normal period cannot be dated on its own without a reliable radiometric assessment of the layer in question, the pattern does allow given sequences to be placed in relation to each other. This has helped establish relative dates for strata that cannot themselves be dated directly.
More recently, the trapped electron charge in certain kinds of crystals has been measured to give an estimate of the length of time since the crystal was deposited at that particular location. This is the basis for the techniques called electron-spin, or paramagnetic, resonance (ESR, or EPR) and thermoluminescence (TL) dating. Low-level radiation from surrounding sediments causes electrons to become trapped in crystalline material such as tooth enamel or the carbonate deposits that accumulate in cave sediments. By measuring the trapped signal and checking the amount of naturally occurring radiation in the surrounding sediments, geophysicists can produce direct indications of how long the crystal in question has been in that particular deposit. These techniques work well for ages that range from a few thousand up to more than a million years--the time period during which many of the crucial events of human evolution were taking place.
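In simplified form, trapped-charge dating reduces to a ratio: the total radiation dose recorded by the crystal divided by the annual dose delivered by the surrounding sediments. The sketch below uses that simplification; the dose figures are hypothetical illustrations, not data from any particular site:

```python
def trapped_charge_age(accumulated_dose_gy, annual_dose_rate_gy):
    """Estimate age in years for ESR/TL dating (simplified model).

    accumulated_dose_gy: total radiation dose recorded by the crystal,
    inferred from the trapped-electron signal (in grays).
    annual_dose_rate_gy: yearly dose from the surrounding sediments
    (in grays per year), measured at the find spot.
    """
    return accumulated_dose_gy / annual_dose_rate_gy

# Hypothetical tooth-enamel sample: a total absorbed dose of 250 grays
# in sediments delivering 0.001 gray per year.
print(trapped_charge_age(250.0, 0.001))  # 250000.0 years
```

Real laboratory practice involves further corrections (for water content, uranium uptake, and cosmic radiation, among others), but the age estimate ultimately rests on this dose-over-dose-rate relationship.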
THE TIME SPAN OF HUMAN EVOLUTION
The earliest hominids identified so far were found in Africa and date from the Pliocene epoch, which began about 5.3 million years ago and ended just less than 2 million years ago. The succeeding period, the Pleistocene epoch, began just under 2 million years ago and ended about 10,000 years ago, at which time it gave way to the Holocene, or Recent, epoch. The Pleistocene has sometimes been referred to as the Ice Age, but it was not just a single period of unrelieved glaciation (see Ice Age). Older scientific studies describe four glacial onsets, but it is now widely accepted that glacial intensification has occurred at intervals of about 100,000 years for the past million years and more. Approximately a dozen periods of global cooling can be identified during which the areas that are now the north temperate zone experienced varying degrees of glaciation.
Human beings are derived from tropical primates, and to this day humans retain the physiological characteristics of tropical mammals. Survival, even in the temperate zone, would not be possible without the aid of such cultural elements as artificially constructed shelter, clothing, and heat. Until culture had developed to the point where it could provide such assistance, the earliest hominids were restricted to residence in the tropics. Although the anatomical evidence--bipedalism and reduced canine tooth size--suggests that tools of a perishable nature must have been used by the first known hominids 4 million years ago, the first recognizable stone tools appear in African deposits late in the Pliocene epoch about 2.5 million years ago.
Distribution of Early Hominids
The locus, or site, of human emergence was clearly in Africa. Humankind's closest relatives, the chimpanzee and the gorilla, are African, and it is accepted by many that humans and chimpanzees shared a common African ancestor perhaps 5 million years ago, before the Pliocene epoch began. Our own bipedal ancestor is known from skeletal fragments at least 4 million years old found in East Africa in northern Kenya and Tanzania. Skeletal fragments of those early bipeds dating from 3 to 4 million years ago have been found from central Ethiopia all the way down to South Africa.
It was announced in 1996 that French paleontologists digging in a dried lake bed in the Central African nation of Chad had discovered remains of a hominid they believed represented a new species. The scientists said that the hominid, Australopithecus bahrelghazali, lived between 3 million and 3.5 million years ago. The discovery promised to challenge the conventional thinking about the location of the origin of human ancestors. The new find in the Chad desert was made 1,500 miles (2,400 kilometers) to the west of contemporaneous hominid fossils unearthed in Ethiopia.
The distribution of those who used stone tools beginning 2.5 million years ago can be traced from the tools they left behind, and these are found throughout the plains country of Africa, from its southern tip up through the grasslands of East Africa and across the northern edge of the continent to the Atlantic shores of northwest Africa. Evidently, human ancestors were adapted to living in relatively open country. Neither their bones nor their tools have been found in areas that were covered with tropical forests.
At some time during the Lower Pleistocene, between 2 and 1 million years ago, the distribution of stone tools spilled out of Africa, and it is clear that their makers had moved out into the tropics of the rest of the Old World. The only possible land route out of Africa is via the connection between Egypt and the Middle East, and it is not surprising that the earliest dated evidence for tools outside of Africa should be from Israel. These tools, from the site of 'Ubeidiya, date from about 1.4 million years ago, and their form is almost exactly the same as that of tools found in Olduvai Gorge in Tanzania that date from the same period. Tools dating from later periods are found in some quantity throughout the Indian subcontinent and over into Southeast Asia. Although there are dating problems, slightly more recent tools have been found in Europe and in China, indicating that their makers had extended their range up into the temperate zone both eastward and westward. There is reason to suspect, however, that human occupation in the temperate zone was only intermittent and that each time there was an intensification of glacial conditions the temperate-zone toolmakers failed to survive. The ensuing postglaciation return of the hominids to the temperate areas then represented renewed colonization from the population that continued to maintain itself throughout the tropics of the Old World.
There are two schools of thought concerning the locus of the origin of modern human form. One holds that modern human form was a unique African contribution, and that after it arose in Africa it then spread throughout the world and extinguished the earlier hominids that had been living there ever since their initial spread from Africa early in the Pleistocene. Within this school of thought there are differences of opinion on just when that spread of modern human form took place. Some place it several hundred thousand years ago, while others see it as having taken place between 100,000 and 50,000 years ago.
The second school of thought notes that the archaeological evidence supports only one major movement out of Africa in the early Pleistocene. In this view, modern human form emerged more or less simultaneously throughout the whole area occupied across the Old World as a consequence of the effects of cultural innovations that were spread from one group to another and which effectively changed the nature of impinging selective forces throughout the range of human occupation. In this view, the modern African form developed in Africa from non-modern African antecedents; the modern Asian form emerged from preceding non-modern Asians; and the modern European form came from the transformation of non-modern European ancestors.
EMERGENCE OF HUMAN FORM
The Problem of Names
The name hominid is used for everything that is not properly an anthropoid ape right up to and including modern human beings. The term is useful since it is doubtful that most people would think of the earliest representatives as fully human if living examples were encountered in the world of today. Although they walked on two feet, there is reason to suspect that they still maintained a chimpanzee-like fur coat and that they retained climbing capabilities that have long since been lost in the human line; it is also most doubtful that they possessed anything that would be recognized as a language. On the other hand, they had features that pushed them in a human direction to an extent not found in any living apes. Not only were they bipedal, but they almost certainly depended on the use of tools for their survival, which means that they had the rudiments of culture to an extent not true for any living nonhumans in the world today.
The first representative of these early hominids to have been discovered was examined by the anthropologist Raymond A. Dart in South Africa in 1924, and he named it Australopithecus (meaning "southern ape") africanus. The specimen was an immature skull, face, and jaw, and, though there is still a question as to what it would have looked like had it grown to maturity, this name has been used for many of its contemporaries of more than 2.5 million years ago. Some of its contemporaries display aspects of difference--relatively larger jaws and teeth, for example--that have led to their recognition as separate species. Many professionals, however, feel that they still belong to the same genus--Australopithecus. For this reason, the whole range of early hominids, from the time of their first appearance well back in the Pliocene up to the onset of the Pleistocene nearly 2 million years ago, have been referred to as australopithecines.
Late in the stretch of time encompassing the australopithecines there are clear signs that differentiation was becoming more apparent, and an adaptation was gradually emerging that would come to be recognized as a true human being deserving of recognition as a member of the genus Homo. Various fossil fragments from the time between the first appearance of stone tools and the appearance of the first unequivocal member of that genus, H. erectus, somewhat earlier than 1.5 million years ago have been classified as, or referred to, the genus Homo. Many tentative specific, or species, names have been offered, and the one that is most widely referred to is H. "habilis," though there is no agreement on just what specimens actually belong to that species. (The use of quotation marks around terms of nomenclature indicates that there is some doubt about whether the name was properly proposed or whether it can be properly applied to the specimens in question.)
By 1.6 million years ago, it is clear that one hominid line had achieved a brain size double that of australopithecines, or somewhat more than two thirds of the modern average. Tooth size had dropped markedly from the australopithecine range down to the top of, and somewhat above, the modern range; and the hints of an ancillary climbing adaptation had disappeared. Clearly this is a true if primitive human being, and this is what is recognized by the species designation H. erectus. Although there is evidence to show that cerebral reorganization had taken place in the speech-control center of the brain, there are reasons to doubt that erectus possessed what we would recognize as language. This is the reason erectus is recognized as specifically distinct from H. sapiens.
By the time modern levels of brain size had been reached about 200,000 years ago, there are aspects of the archaeological record that lead scientists to suspect that the rudiments of speech had also been developed. This is the point where it is legitimate to use the specific designation H. sapiens. However, these archaic representatives of H. sapiens differ from the preceding H. erectus only in the possession of modern levels of brain size. All the other features of the dentition and aspects of skeletal and muscular robustness can hardly be distinguished from the comparable features of H. erectus.
The emergence of fully modern form was the result of a series of reductions in the face, jaws, and teeth, and in separate reductions in the postcranial levels of muscularity and skeletal reinforcement. Fully modern human form is classified as H. sapiens sapiens. The separate trajectories, or lines of development, of dentofacial and postcranial reductions were responses to cultural adaptations that arose in different parts of the world at different times. Eventually these reductions spread to all human populations.
Creatures that are recognized as australopithecines are known from specimens that date from as early as 4 million to perhaps as recently as 1 million years ago. From 4 million to 2.5 million years ago, there is no evidence for stone tools, though there is reason to believe that australopithecines relied on implements of a perishable nature for their survival. All of the evidence known so far comes from Africa, and the earliest specimens have been found in East Africa from central Ethiopia to Tanzania.
From the anatomy of the pelvis, leg, and foot, it is evident that the australopithecines were erect-walking bipeds and not quadrupeds. The hands, then, were freed from regular duties in locomotion. However, the anatomy of the hand, wrist, shoulder, and chest suggests that australopithecines retained a climbing capability that was later lost. It is possible that they continued to use trees as places of refuge at night.
Although the canine teeth had ceased to project beyond the level of the others in the tooth row, australopithecine molars were actually larger than those of anthropoid apes of comparable body size. The lack of projection of the canine teeth has been taken as evidence that defense was relegated to tools held in hands that were no longer used for locomotor purposes. The large size of the molars suggests that the diet included items that had a higher grit content than the diet of apes like chimpanzees. It is possible that australopithecines were competing with baboons for the available resources of the African grasslands, since baboons also have relatively large teeth in proportion to their body size. Access to edible roots and tubers could have been aided by the use of hand-wielded digging sticks, and it has been suggested that such implements might well have doubled as defensive implements.
All the available indications suggest that the australopithecines started as grassland gatherers. Male body size was twice that of females, a degree of sexual dimorphism that is more comparable to that found in the gorilla than in chimpanzees or humans. This suggests a social system in which one or a few dominant males monopolized sexual access to females within the group.
By 2.5 million years ago, the addition of stone tools to the cultural repertoire signals the beginning of a major shift in adaptation. For a long time, archaeologists focused on what they called the pebble tools themselves, but there is now reason to suspect that those were principally blanks from which the real tools were struck. The actual tools used were the flakes that were removed from those pebble cores, and the flakes were used until the edges became dull, at which time a fresh flake was detached for use. From the quantities of flakes found in association with processed animal skeletons, it appears that the flakes were used as butchering implements. And from the circumstances surrounding the location and positioning of the skeletons in question, it seems likely that the australopithecines were scavenging the carcasses of large animals--such as hippopotamuses and elephants--that had died of natural causes or been killed by predators.
There were at least two major categories of australopithecine: one with very large teeth and heavy jaw muscles, referred to as "robust" australopithecines, and another called "gracile" australopithecines. The main difference is in the size of the jaws and teeth; beyond that, there is no appreciable difference in body size between the robusts and the graciles. Sexual dimorphism was pronounced in both groups. The evidence suggests that the large-toothed robust australopithecines concentrated strictly on plant foods with high abrasive content, while the gracile ones selected from a wider range of edible resources. The number and arrangement of small openings for blood vessels in the skulls of gracile australopithecines indicate the development of a surface circulation mechanism involving the skin on the outside of the skull that would keep the brain from overheating during strenuous activity. This mechanism is even better developed in the ensuing H. erectus and H. sapiens, and it suggests that the gracile australopithecines were beginning to adjust to the pressures of hunting activities pursued in the heat of the day. The increased capacity for dissipating heat, as indicated by the development of this kind of circulation to vessels in the skin, suggests that selective forces may have been at work to begin to reduce the fur coat normally present on terrestrial mammals.
The various specimens referred to as H. "habilis" are largely known from East African collections belonging to the period between the last known gracile australopithecines and the first clear-cut representatives of H. erectus approximately 2 million years ago. Quite obviously a major change was taking place in hominid adaptation. Robust australopithecines continued without change and eventually became extinct after H. erectus was firmly established, though it is not known whether erectus played any role in the disappearance of the robust australopithecines.
The transformation of a gracile australopithecine into a recognizable representative of H. erectus required a doubling of brain size and a major reduction in the size of the molars. During the period that contains those specimens sometimes referred to as H. "habilis," all sorts of combinations occur. There are specimens that have enlarged if not fully erectus-size brains but that retain fully australopithecine-size teeth, and there are specimens where there has been no brain expansion at all and yet tooth reduction is down to erectus levels. There are also many intermediate specimens. Adding to the confusion is a proliferation of names applied to the same specimens. One of the most convincing specimens to bear the name H. "habilis" has also been called A. "habilis" by one distinguished authority and Pithecanthropus rudolfensis by another.
There is no doubt that there was a transition going on between 2 and 1.6 million years ago. The problem is to identify which specimens should be classified together and what they should be called. The confusion has left many specialists quite uncomfortable, which is why the term "habilis" is often left in quotation marks.
By 1.6 million years ago, the confusion is resolved, and two obvious hominids can be identified: one is a surviving strain of robust australopithecine, and the other is H. erectus. For that time period, demonstrable H. erectus specimens are known only from Africa. The tools with which they are associated are distributed throughout the grasslands of the continent from the southern tip up through the plains of East Africa, north to the Mediterranean, and west to the Atlantic coast. Actual specimens are known from Olduvai Gorge in Tanzania, East Turkana and Nariokotome in Kenya, and Swartkrans in South Africa.
By 1 million years ago, there was only one hominid left--H. erectus--and it was distributed throughout the tropics of the Old World from Africa through the Indian subcontinent and on into Southeast Asia. The fossil remains are rare and fragmentary, but the area of occupation is indicated by the distribution of the tools that were the products of erectus' manufacturing activities. The stone tools were apparently used for butchering hunted animals, and the indications are that H. erectus was engaged in the systematic hunting of game in addition to continuing the collecting activities that had characterized the subsistence of the earlier hominids. H. erectus had effectively become one of what has been called the "large-carnivore guild," and it was the successful exploitation of the hunting way of life that allowed them to spread out of Africa and throughout the Old World tropics. During interglacial episodes throughout the latter half of the Lower Pleistocene and much of the Middle Pleistocene (750,000 to about 120,000 years ago), H. erectus made temporary incursions into temperate latitudes at both the eastern and western extremities of the Old World.
The tools that are most commonly associated with the manufacturing activities of H. erectus are the ovate, bifacial Acheulean hand axes, named after Saint-Acheul, France, the site where they were first identified in the 1840s. However, other evidence suggests that those bifacial tools had not been developed before the erectus hunters spread out of Africa. If, as suspected, their function was in severing the major joints of the animals hunted, the hand axes were more an adjunct than a basic component of hunting technology. In any case, hunters at the erectus stage had spread all the way to the Far East before bifacial tools were invented. These were slowly adopted and spread among the inhabitants from Europe through India, but they never reached the eastern edge of erectus habitation.
The physical differences between H. erectus and modern humans are most evident in the head. Brain size ranges from just over half the modern human average up to the lower reaches of the modern human range; tooth size is about 50 percent larger than that of present-day people living in the north temperate zone; and cranial reinforcements such as browridges include the most strongly developed of any in the course of hominid evolution. From the neck on down, however, differences are more those of degree than of kind. Long-bone shaft thickness is greater than that found in living humans, and there is pronounced evidence for muscularity and joint reinforcement. Sexual dimorphism is considerably less than in australopithecines but more marked than in present-day humans. Otherwise, such things as stature and proportions of limb length are essentially the same as they are today.
The one thing necessary to convert H. erectus into H. sapiens was the expansion of the brain to modern levels--that is, the increase from an average of about 1,000 cubic centimeters (61 cubic inches) to somewhere in the neighborhood of 1,350 cubic centimeters (82.4 cubic inches). Complicating this assessment is the fact that brain size is proportional to body size in any given group, and that erectus brain size increased from less than 900 cubic centimeters (55 cubic inches) in the late Lower Pleistocene to 1,200 cubic centimeters (73 cubic inches) and more in the Middle Pleistocene.
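The cranial capacities in this section are quoted in both cubic centimeters and cubic inches. The conversion is simple arithmetic (1 inch = 2.54 cm, so 1 cubic inch = 2.54³ ≈ 16.387 cm³); the short sketch below, with a function name of my own invention, checks the quoted equivalents:

```python
# Sanity-check the brain-size conversions quoted in the text.
# 1 inch = 2.54 cm, so 1 cubic inch = 2.54**3 = 16.387064 cm^3.
CM3_PER_CUBIC_INCH = 2.54 ** 3

def cm3_to_cubic_inches(volume_cm3: float) -> float:
    """Convert a volume from cubic centimeters to cubic inches."""
    return volume_cm3 / CM3_PER_CUBIC_INCH

# Figures quoted in the text: 900, 1,000, 1,200, 1,350, and 1,500 cm^3.
for cm3 in (900, 1000, 1200, 1350, 1500):
    print(f"{cm3:>5} cm^3 ~= {cm3_to_cubic_inches(cm3):.1f} cubic inches")
# Prints 54.9, 61.0, 73.2, 82.4, and 91.5 cubic inches, which closely
# match the rounded cubic-inch figures given in the text.
```

The same one-line division verifies any of the capacities discussed in the section, for example the Neanderthal figure of about 1,500 cubic centimeters.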
When all is taken into account, it appears that modern levels of relative brain size were reached sometime between 300,000 and 200,000 years ago, in the late Middle Pleistocene. The people in which this can first be observed were no taller, on the average, than humans living today, but they were distinctly more robust. These people have been called Neanderthals, or H. sapiens neanderthalensis. The first such specimen was discovered in the German valley called the Neander Tal in 1856. Neanderthal brain size was actually near 1,500 cubic centimeters (91 cubic inches), but, because Neanderthals were bulkier than modern humans, the proportion of the brain to the body was essentially the same as it is today.
For many years, scientists believed that Neanderthals were the direct ancestors of modern humans. However, in 1997, a landmark study of DNA extracted from the 1856 fossil produced strong evidence to suggest that this was not the case. The scientists compared the DNA sequence of a region of the fossil mitochondrial genome with the sequence of a comparable region in the mitochondrial DNA of modern humans. The Neanderthal sequence differed from modern human sequences at roughly three times as many positions as modern human sequences differ from one another. Furthermore, the type of differences, as well as their locations in the sequences, suggested that it is highly unlikely that Neanderthals contributed to the modern human mitochondrial gene pool. The scientists further calculated that, while Neanderthals and modern humans did indeed share a common ancestor, the two lineages diverged between 550,000 and 690,000 years ago; the first known Neanderthal is placed at approximately 300,000 years ago, while the first modern humans are believed to have evolved around 200,000 years ago.
After the brain-body ratio reached its modern proportions, relative brain size stabilized in all human populations and has remained the same ever since. Major behavioral changes, however, are indicated by the archaeological record. For the first time, there is evidence for regional differentiation in the style of making stone tools. At first, the areas of shared stylistic similarities are relatively large: all of Western Europe, for example. Subsequently, these regionally restricted areas of shared style elements become smaller and more clearly defined. Although the implications cannot be proven, it may be significant that the regions within which shared styles were maintained are strikingly similar in extent to the separate language areas of human populations today.
This is probably as close as researchers will ever get to obtaining evidence for the origin of language as we know it. With language, the innovations of the brightest members of a group can be transmitted to all other members.
Technology and Modern Form
After the attainment of modern levels of intellectual and linguistic capacity, the remaining changes that produced modern physical form were all in the nature of reductions from Middle Pleistocene levels of robustness. Unlike the achievement of modern mental capabilities, however, the various aspects of robustness reduction pursued different trajectories in different parts of the world. These trajectories were related to the development of particular aspects of technology in particular places. These technologies in turn altered the selective forces acting on their inventors and on those who subsequently received the inventions.
Two examples of how the development of technology affects the imposition of selective forces have been particularly important to the shaping of human form. First is the control of fire and its use for food preparation. While there are claims for the hominid control of fire going back into the Pliocene, the undisputed evidence for the continuous and necessary use of fire goes back only as far as the late Middle Pleistocene, just about the time at which H. sapiens can first be recognized. It was the control of fire that allowed people to remain in the western part of the north temperate zone--ranging from the Middle East to the Atlantic shore of Europe--during the next-to-last glaciation somewhat more than 200,000 years ago.
Fire not only provided necessary warmth for physiologically tropical beings in a glacial climate, but it was also essential to thaw food that had become frozen between the time that the game was killed and the time that it could be consumed. For the first time, cooking was obligatory. Cooking not only made frozen food edible; it also meant that the total amount of chewing over the course of an individual's life was reduced from previous levels. The relaxation of selective forces allowed mutations to accumulate, and, because most mutations lead to a reduction of the structure that the mutated genes modify, the consequence was the reduction of jaw and tooth size for the descendants of those who first cooked their food simply in order to make it ingestible. As a result, modern human dentofacial reduction has proceeded furthest among the inhabitants of the north temperate zone, and particularly among those who continue to live in the stretch from the Middle East to Western Europe where obligatory cooking can first be identified.
Cooking and facial reduction probably first developed among the northern Neanderthals, and their African contemporaries produced an innovation--the spear--that led to the modification of the body from the neck down. The stone tools used by the Neanderthals in Europe, the Middle East, and Africa have been referred to as Mousterian tools after the site of Le Moustier in France where they were first identified in the 1860s. Hand axes continued to be made, often of a much more refined type than those of the earlier Pleistocene, but the Mousterian industry is characterized by a proliferation of points and scrapers made from flakes. The scrapers in the north were clearly for the preparation of animal skins for clothing.
In Africa, long, elegantly prepared flakes were part of the assemblage of Mousterian tools. These are called Levalloisian flakes, after the Paris suburb of Levallois-Perret, where they were first identified at the beginning of the 1930s. Levalloisian flakes were not important in European Mousterian assemblages, but they were quite common in the African equivalent of the Mousterian, the Middle Stone Age, which goes back to about 200,000 years ago. (The African Middle Stone Age should not be confused with the far later European Mesolithic.) Experimental testing has shown that the fracture patterns observed on many of the African Levalloisian flakes could only have been produced by their use as projectile points.
It would appear that the thrown spear as an adjunct to hunting activities was initially an African innovation. For the first time, the hunter did not literally have to come to grips with his prey and could impale it from a distance. The consequence was the relaxation of the selective forces maintaining Middle Pleistocene levels of muscularity and robustness in the body below the neck. Reduction in postcranial robustness then appeared for the first time among Africans toward the end of the Middle Pleistocene. Representatives of these more gracile African populations actually got as far north as Israel, where they appear at the site of Qafzeh early in the Late Pleistocene, about 100,000 years ago. While the Qafzeh people had a relatively modern appearance, the regular use of fire for cooking had not yet penetrated into sub-Saharan Africa, so the African representatives at Qafzeh had unreduced Middle Pleistocene levels of jaw and tooth size.
Eventually, cooking technology spread southward, and jaw and tooth reduction proceeded in a fashion parallel to what had been going on in the north for the previous 100,000 years or more. Projectile technology was adopted by the people of the north, and, in comparable fashion, the result was the reduction in the previous levels of arm and shoulder robustness over the course of time. These aspects of technology spread independently into other parts of the world, and the consequences for their adopters proceeded in a predictable manner. As a result, no population in the world today retains a fully Middle Pleistocene level of robustness in either its dentition or its chest and shoulder morphology; that is, there is no living human population that would qualify for the designation Neanderthal.
There is one other technology that led to a reduction of robustness and muscularity even more effectively than the use of projectiles: the invention and application of string. Nooses, snares, and nets not only allowed the capture of prey without the previously necessary levels of exertion, but they also gave people access to a huge biomass (the quantity of living matter available within a habitat) that had formerly been unavailable. Flocks of birds, schools of fish, and rabbit-size mammals represent an enormous food resource. After the end of the Mousterian--the technological complex usually associated with the Neanderthals--people of the ensuing Upper Paleolithic did not at first see much of a change in their way of life. In the latter part of the Upper Paleolithic, starting somewhat more than 20,000 years ago, the use of a string-based technology changed the nature of expectable dietary resources, and one of the immediate consequences was a dramatic increase in human population size and density.
The earliest evidence of art created by humans also dates from this period. The dramatic polychrome paintings of game animals on the walls of caves in southern France and Spain provide evidence that the realm of human visual imagery was every bit as well developed 15,000 to 20,000 years ago as it is today. It was just good fortune that led to the preservation and discovery of these galleries of prehistoric art, and there is no reason to think that contemporaries of those remarkable artists elsewhere throughout the Old World were not every bit as talented. Circumstances just did not favor the survival of their creations.
The last glaciation of the Pleistocene came to an end between 10,000 and 12,000 years ago, and the disappearance of the ice sheets coincided with the first entry of humans into the Western Hemisphere--the New World--by way of what was then dry land connecting Siberia and Alaska. Already at that time, people in various parts of the world were experimenting with new methods of subsistence. Technology was used to prepare and process certain plants that previously would not have been of much value for human consumption. Grains and legumes--such as beans and peas--require modification by heat before they can be digested by human beings, and, even before the invention of heat-proof clay containers, people had discovered that milling grains into meal allowed the grains to be cooked in the form of wafers or loaves. This opened up a vast realm of dietary resources and prepared the way for the enormous expansion of human populations that began after the Pleistocene had ended.
The focus on the collection of plant foods led naturally to efforts to assist the propagation and growth of those plants, and the shift was under way toward full-scale farming--agriculture. The settled farming communities that developed as a result signaled an end to the nomadism of the Pleistocene hunting-and-gathering way of life.
It should be noted that, with only a tiny and dwindling number of exceptions, the vast majority of the peoples of the world today are sustained by the products of agriculture. Even the few who can still be called hunter-gatherers practice a sophisticated post-Pleistocene kind of foraging. Only a very few small contemporary human populations are pursuing the kind of subsistence strategy that human ancestors followed from the time that H. erectus spread out of Africa more than a million years ago to the end of the Pleistocene just over 10,000 years ago, and only those few populations are subject to the same kind of selective forces that affected H. erectus. In this sense, virtually no modern human group is biologically adapted to its current way of life.
The increase in population density and the proximity to domesticated animals that were the consequences of a farming way of life greatly increased the spread of communicable diseases. Peoples with several thousand years of agricultural background--that is, nearly all people alive today--have developed a degree of immunity or resistance to many diseases. However, there have been few visible changes in human form that have arisen as a consequence of the development of a settled, agriculturally supported way of living.
There is one change, however, that does merit mention. The application of heat to frozen foods back in the Pleistocene had reduced the selective forces maintaining prehistoric levels of jaw and tooth size, but the invention of pottery eliminated those selective forces entirely. The result was that the descendants of the first humans to invent and use pottery have undergone more dental reduction than any of the other people in the world. The modern form of the human chewing machinery, then, is the result of the only really significant visible biological change to have occurred since the appearance of agriculture about 10,000 years ago. Scientists may predict that the technological aids that are now virtually universal will lead to other types of morphological reduction. It takes at least 10,000 years, however, for the consequences of new technology to become evident in such physical changes. The prediction of specific changes, therefore, cannot properly be regarded as science, because such predictions are not in the realm of the verifiable.
Human Origins Terms
anthropoid. Relating to the lesser apes (the gibbons) or the great apes (the orangutans, chimpanzees, and gorillas). The anthropoid apes are the nearest living relatives to humans.
bifacial. Shaped on both faces; for example, a stone tool such as a hand ax of the Lower Paleolithic.
biomass. The amount of living matter in a given area or habitat, especially that available as a source of energy to organisms in the habitat.
bipedal. Habitually walking on two legs. Many animals walk on two legs--for example, birds--but only hominids have a pelvis and legs adapted for the support and propulsion of the body in the erect position. Chimpanzees, gorillas, and other primates besides humans are occasionally bipedal. Humans, however, are the only ones that are habitually bipedal--that is, while humans can climb trees and employ other forms of locomotion, the human body is specifically adapted for upright locomotion on the ground. The adoption of bipedalism was an important development in human evolution.
browridge. Bony arch above the eye socket. Heavier and joined together to form a single arch above the nose in primitive humans; much reduced and separated in modern humans. Heavy single arch present in gorillas, Homo erectus, and, to a lesser extent, Neanderthals.
canine tooth. The conical, pointed tooth located between the lateral incisor and the first premolar.
core. Stone from which flakes are struck during the process of making stone tools.
dentofacial. Term used in describing the characteristics of the teeth, jaws, and face.
erect posture. Erection of the trunk. All primates have the ability to walk in the upright position--with the trunk held erect--though many are habitually quadrupedal when walking or in the resting position, and the structure of their bodies is adapted accordingly. Humans and their ancestors are the only primates whose body structure is adapted for maintaining the habitually erect posture while both standing and walking.
flake tool. A stone tool made from a piece of stone detached from the core by striking it off. Two examples are hand axes and scrapers.
gracile. Small, lightly built.
hominid. Member of the family Hominidae, the zoological classification in which humans and their ancestors belong. Included are prehumans such as australopithecines. Not included are the living anthropoid apes--the great apes of the family Pongidae (orangutan, chimpanzee, and gorilla) and the lesser apes of the family Hylobatidae (gibbons).
hominoid. Member of the primate superfamily Hominoidea, which includes humans, the apes, and their extinct relatives.
Pliocene. The youngest epoch of the Tertiary period, from about 7 million to about 2 million years ago, immediately preceding the Pleistocene epoch. Name means 'most recent.' Human and ape evolutionary branches were separate at this time.
Pleistocene. The earlier epoch of the Quaternary period, beginning about 2 million years ago. Sometimes considered to date up to the present, though the period termed the Holocene is usually considered to have begun about 10,000 years ago and to continue up to the present. Characterized by unstable global climate (this is the period of the Ice Ages). Modern humans appeared in the Late Pleistocene.
point. Tool thought to have been used for a spear or projectile. Would be fastened to a shaft for use as an arrowhead, for example. Points of both stone and bone have been found.
postcranial. Term for the bones of the body other than the cranium, or skull, and jaw.
primate. Member of the zoological order that includes lemurs, bushbabies, Old and New World monkeys, apes, and humans. Primate order is recognizable as far back as 70 million years ago.
quadrupedal. Habitually walking on four limbs. Compare with bipedal.
robust. Large, heavily built.
scraper. A flake tool thought to have been used for either cleaning skins (side scraper) or woodworking (end scraper or steep scraper).
sexual dimorphism. Physical differences--such as those of shape, size, or color--between the males and females of a species.
strata (singular: stratum). Layers of geological or archaeological material deposited over time. Strata, and the materials within them, can often be classified and dated when the sequence of geological or archaeological events can be reconstructed. The sequence of deposits is called its stratigraphy.
temperate zone. Geographic regions between the Tropic of Cancer and the Arctic Circle and between the Tropic of Capricorn and the Antarctic Circle. Temperate zone climate is characterized by cool or cold winters and warm or hot summers.
tropic zone. Geographic region between the Tropic of Cancer and the Tropic of Capricorn; the equatorial zone.
type site. Usually, the site at which a stone-tool industry was first discovered. Sometimes means the site containing artifacts that best represent the style.