


João Zilhão

The last fifty years of scientific research established beyond reasonable doubt that the earliest human ancestors appeared in Africa some time around two million years ago. Soon after, these Homo erectus people expanded into Eurasia. By one and a half million years ago, they had already reached the Indonesian island of Java, but it would take a bit longer for Europe to be stably settled.

The earliest evidence comes from Iberia, where the so-called Homo antecessor fossils from Atapuerca date to about one million years ago. Coeval African fossils are scarce, but, altogether, the evidence suggests that a trend towards increased brain size and correlated changes in the shape of the skull was under way at this time throughout the entire Old World. Geneticists have related these changes to a second Out-of-Africa expansion, represented, archeologically, by the spread of the Acheulian technocomplex, whose iconic stone tool is the handaxe or biface.

Keywords: religion; language; writing; origins of art; limit; landscape





Subsequent geographic isolation led to the differentiation of these Acheulian populations into two lineages. In Europe and western Asia, Homo erectus became Neandertal man some time around 500,000 years ago. At the same time, in Africa, Homo erectus became Homo sapiens (or “modern humans”), and, some 50,000 years ago, in the framework of a third Out-of-Africa event, spread into Eurasia, Australia, and, eventually, the Americas.

That early African sapiens are ancestral to all living humans is now widely accepted, but the role played by Eurasian Neandertals in the evolution of humankind remains controversial. The level of their separation in taxonomy, the extent of their differences in biology, behaviour and culture, and their ultimate fate, are among the hottest topics in Paleoanthropology. Neandertals are no more, but when, why and how were they replaced? And did they go extinct without descent, or were they assimilated into the expanding populations of modern humans and, therefore, are they also to some extent ancestral to living humans?

These questions have fundamental implications for the understanding of the emergence of art, language and symbolic thinking in the human lineage. The long-lasting geographical segregation of the two paleontological taxa, Homo neanderthalensis and Homo sapiens, and the ultimate replacement of the former by the latter, are widely assumed to imply that they were truly different biological species. And, as textbook definitions require species to differ in behaviour as much as in morphology, the corollary expectation is that significant behavioural differences, with attendant cognitive implications, must have separated anatomically “modern” people from coeval “archaic” humans, namely the Neandertals.

The notion that such a separation existed at biospecies level dovetails with speculations that certain features of complex human culture which are undocumented in the archaeological record of Homo erectus and other early humans — such as art, or ritual burial — must have emerged as a by-product of the processes involved in the speciation of the African sapiens. Under this “Human Revolution” hypothesis, the absence of those features reflects the lack of the required cognitive capabilities. In this view, it is only after the acquisition of such capabilities by the first “modern humans” that the corresponding behavioural correlates could be and indeed were externalized in archaeologically visible ways.

Material evidence consistent with this paradigm — marine shell beads, interpreted as personal ornaments and, in some instances, associated with skeletal remains of early modern humans — has been obtained over the last decade at archaeological sites dated to between 75,000 and 100,000 years ago in Palestine, Morocco and South Africa. In the ethnographic present, personal ornaments play the role of conveyors of the social identity of persons — group membership, gender, and individual life-history characteristics (age, marital status, etc.). Working with such symbolic systems of personal presentation and re-presentation implies language and requires cognitive capabilities unknown among our closest living relatives, the chimpanzees. Although we have to bear in mind that, prior to the invention of writing systems, the evidence for language can only be indirect, all of this still is rather uncontroversial. But was the emergence of such capabilities in the human lineage as recent a phenomenon as postulated by the “Human Revolution”?

A major empirical hurdle faced by this paradigm is that the putative speciation event leading to Homo sapiens occurred between 150,000 and 200,000 years ago, which raises an obvious question: If symbolic thinking and modern cognition are a simple by-product of the biological processes involved in that speciation event, why did it take at least 50,000 years for manifestations of those capabilities (such as the African shell beads) to appear in the archaeological record? And why was it that yet another 50,000 years were necessary for the emergence of figurative art (the earliest examples of which are the cave paintings of Chauvet, in France, and the animalistic, anthropomorphic and therianthropic ivory figurines from different caves of the Swabian Jura, in Germany, dated to about 35-40,000 years ago)?

From the ethnographic record, we also know that the visual display of objects conveying information on the personal and social identity of the individuals carrying such objects is targeted at encounters with strangers or people infrequently met. There are two rather good reasons for this: firstly, without some prior experience of interaction, the meaning of the visual symbols would be opaque to the viewer; secondly, identifying one’s affiliation or identity to family and immediate acquaintances does not require material symbols. The possibility exists, therefore, that the appearance of ornaments in the archaeological record reflects the crossing of demographical thresholds above which long-distance interaction networks involving alliance, exchange or mating were necessary.

If so, then the absence of evidence for “modern” cognition prior to about 100,000 years ago would not be evidence for its absence among anatomically “modern” humans prior to that time; it would simply mean that, in those days, the social life of humans had not yet effected the “release from proximity” that eventually generated the need to have symbolic identities and ways of displaying them. But once we admit that the emergence of the earliest evidence of “modern” cognition can relate to social, not biological processes, we have no choice but to ask ourselves whether the same does not apply as well to earlier humans, namely Homo erectus. Given that brains do not fossilise and that the evidence for language in Palaeolithic times is indirect, could it be that, cognitively, these earlier ancestors were also fully human, i.e., gifted with such behavioural features as language and all its paleontologically invisible neurological correlates? Put another way, could it be that language and symbolic thinking appeared in the earliest, not the latest stages of human evolution, but did not externalize in ways amenable to preservation in the material record of the prehistoric past until much more recently?

Given the genetic and paleontological evidence that the European lineage leading to Neandertals had already branched off from the African stem by half a million years ago, the archaeology of the Neandertals provides the ideal testing ground for the different views of the emergence of language and “modern” cognition. If the “Human Revolution” is right, then neither personal ornaments nor art should be found among the Neandertals. If either is, then the “Human Revolution” is refuted and we must look for alternative ways to explain the emergence of those behaviours in the archaeological record.



Neandertals are named after a skeleton found in 1856 at the Kleine Feldhoffer cave, in the Neander valley, near Düsseldorf. Today’s scientists, however, are not the only group of people for whom “Neandertal” has a well-defined meaning. The word is also used in common language to disqualify dislikeable individuals, including political opponents. Opening any dictionary immediately brings up these alternative meanings. The Cambridge On-line, for instance, gives the following: 1) “relating to a type of primitive people who lived in Europe and Asia from about 150,000 to 30,000 years ago”; 2) said “of people or beliefs very old-fashioned and not willing to change”; 3) said “(of people) rude or offensive.”

In order to understand the Neandertals’ enduring bad reputation, we have to bear in mind that, in the mid-nineteenth century, Evolution was conceived in the framework of a progressivist mindset — the directional development of ever more complex and sophisticated forms of life from a simple, primitive common ancestor, with humans sitting at the top of the ladder. Evolution also implied, as Darwin eventually made explicit, that humankind had ape-like ancestors. In this context, two things were, in retrospect, entirely predictable: firstly, a predisposition to interpret any intermediate fossil forms as “part-ape/part-man” in both morphology and behaviour; secondly, because things are what they are only through their opposition with what they are not, a predisposition to imagine the “animality” of those ancestors as consisting of features representing the exact opposite of “humanity” as Victorians perceived it.

To make things worse, progressivist preconceptions were compounded by scientific error. One of the first Neandertal articulated skeletons to be found, 100 years ago now, was that from the cave site of La Chapelle-aux-Saints. The famous French physical anthropologist Marcellin Boule made a classical and in many respects paradigmatic description of this fossil. Unfortunately, he also mistook for the normal Neandertal condition numerous and major pathological, arthritic malformations developed late in the ontogeny of the elderly subject of his study. As a result, both popular and scientific opinion converged in considering Neandertals as a side branch, a dead-end of Evolution, both distinct from and inferior to true humans. As late as 1953, Neandertals were still portrayed as the archetypal “half-man/half-beast” of a famous Hollywood film. But, in the 1960s, this prevailing view was challenged. Boule’s error was exposed, and greater emphasis was placed on the significance of skeletons found in Palestine in the 1930s.


These Palestinian fossils, recovered at two nearby cave sites in Mt. Carmel, seemingly displayed an intermediate anatomy, prompting suggestions that the Near East had functioned as a zone of admixture between European Neandertals and early African sapiens. Moreover, there was growing recognition at this time that, archaeologically, the two groups had been doing pretty much the same thing throughout the period between 100,000 and 50,000 years ago. Their stone tools were indistinguishable, and they had buried their dead, a practice that implies world views and religious beliefs. In sum, the two lineages behaved in ways whose level of complexity required the use of language and symbols, as should be expected from cranial capacity — Neandertal brains were in fact larger than ours.

These developments led many scholars to wonder whether Neandertals, instead of an unrelated side-branch, could have been a regional variant of a single evolving human species and, as such, the direct ancestors of today’s Europeans. In this view, called the Multiregional Hypothesis, present racial diversity would be the outcome of a deep-rooted continuity between today’s populations and those of the remotest past. There would have been one and only one Out-of-Africa event, modern Asians and Europeans would be the descendants, through a series of convergent changes in morphology, of the first Homo erectus settlers, and features such as the big noses of Europeans would be an example of the persistence of Neandertal “blood” in living humans.


The 1980s saw the birth of an entirely new line of inquiry, human genetics, which eventually questioned this 1960s view of the Neandertals as fundamentally human. The study of variation in the mitochondrial DNA of extant people led to the conclusion that we are all very closely related, implying a very recent last common ancestor, one who would have lived in Africa some 150,000 years ago. Because mtDNA is inherited through the maternal line, this genetic ancestor was called Eve. In the Eve scenario, her children would subsequently replace the Neandertals and other coeval archaic forms of Eurasian humans, which would all become extinct without descent. This view was strengthened by the successful extraction of fossil mtDNA from the original Neandertal specimen, in 1997, followed, in the last couple of years, by preliminary sequencing results for the entire nuclear genome of another Neandertal individual from the Croatian cave site of Vindija.

The weight of scientific opinion saw in these results support for the notion that the Neandertals were phylogenetically distant and belonged in an altogether different species. Unaffected by the “Human Revolution”, they must have lacked language, or have only had an exceedingly primitive version of it. Moreover, no division of labour and no form of social organization beyond that required by the group’s need to reproduce would have existed, and the so-called Neandertal burials probably were nothing more than simple body disposal without religious meaning. In these circumstances, the outcome of contact situations could only have been total replacement with no admixture. “Humans” would have seen “Neandertals” as unsuitable non-human mates, and the cognitive superiority of our ancestors meant that they would inevitably have prevailed in the competition for territory and resources.

Paradigm Lost


The empirical evidence generated by the last decade of research has falsified the behavioural tenets of the “Human Revolution”. Ironically, at the same time as archaeologists working in Africa were uncovering evidence supporting “modern” cognition tens of millennia prior to the Out-of-Africa of Eve’s children (and conceivably explaining it), archaeologists working in Europe were also uncovering evidence that, contrary to the postulates of the “Human Revolution”, complex and sophisticated cognitive and intellectual capabilities were also apparent in the material culture of Eurasian Neandertals.

For instance, excavations carried out between 1994 and 1997 in a German brown coal mine near Schöningen yielded three 400,000-year-old wooden artefacts of great significance. Long and pointed, they were made from the base of individual spruce trees, with the maximum thickness and weight at the front and long tails that taper towards the proximal end. In all these respects, they resemble the javelins of track-and-field competitions, suggesting use as projectile weapons rather than thrusting spears. They further imply that the laws of ballistics underlying the shape of modern javelins had already been mastered by the founding fathers of the Neandertal lineage.

Further evidence for sophisticated craftsmanship comes from Neumark-Nord, another German brown coal mining site. Chemical analysis of organic residue adhering to a flint flake recovered in levels dated to more than 100,000 years ago showed it to be an extract of oak bark macerated in water, of a kind used until the ethnographic present in the tanning of hides for the manufacture of water-proof clothing and footwear. In the 1930s, a nearby site, the Ilsenhöle rock shelter, had already yielded a few bone awls from the time of the latest Neandertals, around 40,000 years ago. Combined, this evidence suggested a long tradition of hide working for the manufacture of clothes and other equipment.

This should come as no surprise. Good-quality artificial insulation was a pre-requisite for survival in Ice Age central Europe, where, considering the wind-chill effect, average winter temperatures ranged between -20 and -30ºC. Thermoregulatory models show that the minimum external temperature Neandertals would have been able to withstand if dressed in a modern business suit was -24ºC. In the absence of even such a basic level of clothing, only a thickness of body fat below the skin in excess of 3 cm could have provided equivalent protection. The weight of such fat, however, would be some 50 kg, an amount that would leave an 80 kg Neandertal very little for muscle, bone and other tissue; or, if added to the 80 kg of a lean, muscular body, would transform the average 1.65 m tall male Neandertal into the archetypal obese, unable to procure his own subsistence in a society that lacked cash, automobiles, and shopping malls. The implication is clear: like present-day subarctic peoples, Neandertals must have had good quality clothing as well as all the other gear without which survival in such environments is impossible.
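The ~50 kg figure can be checked with a back-of-envelope calculation. The sketch below is only illustrative: the body surface area and fat density values are typical assumed figures, not taken from the text.

```python
# Rough check of the fat-insulation figure quoted above.
# Assumed values (not from the text): adult male body surface area ~1.8 m2
# and adipose tissue density ~900 kg/m3.
body_surface_m2 = 1.8      # approximate adult male body surface area
fat_thickness_m = 0.03     # the >3 cm subcutaneous fat layer cited in the text
fat_density_kg_m3 = 900    # typical density of adipose tissue

# Mass of a uniform fat layer covering the whole body surface
fat_mass_kg = body_surface_m2 * fat_thickness_m * fat_density_kg_m3
print(round(fat_mass_kg, 1))  # 48.6, close to the "some 50 kg" in the text
```

Under these assumptions the layer alone weighs close to 50 kg, which is why the text concludes that fat insulation was not a viable alternative to clothing.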

Further and more telling evidence that Neandertals were quite good at Chemistry came from yet another German brown coal mining site, Königsaue. In 1963-64, salvage excavations yielded two fragments of birch bark pitch used for stone tool hafting, one of which still bore a fingerprint of the Neandertal who manipulated it. These items have since been directly dated by radiocarbon to more than 45,000 years ago, and a study of their composition showed that they are not unmodified natural products, such as the bitumen used in the Near East for the same hafting purposes since at least 200,000 years ago. In fact, they are a synthetic raw material, the first ever in human history. They were produced through a several hour-long smouldering process requiring a strict manufacture protocol: under exclusion of oxygen, and at tightly controlled temperatures, between 340 and 400ºC. Experimental archaeologists have so far been capable of replicating this manufacture procedure only by using fire technology from the much later Neolithic period.


This evidence suggests that, in fact, Neandertals were cognitively as well endowed as we are. But what about their biology? Were they in fact a separate species? And was the reason for their eventual extinction somehow related to their biological separateness?

Careful consideration of the mtDNA evidence shows that such notions have little basis. Given that contemporary populations of chimpanzees are more diverse than all living humans and Neandertals put together, the parsimonious interpretation of the genetic evidence is that, by Primate standards, present-day humans are abnormally homogeneous, not that Neandertals were a different species. Such homogeneity is consistent with a recent origin for modern humans, but it does not rule out the possibility that Neandertals and other archaic groups contributed to the gene pool of subsequent populations.

In fact, the most recent synthesis of the history of modern human dispersals based on mtDNA places the immigration of the bearers of the oldest extant European variants (haplotypes H, I and U) at some 30,000 years ago. Since anatomically modern people are documented in Europe since at least 40,000 years ago, it follows that the mtDNA variants characteristic of those original settlers must be as extinct as those of the Neandertals. Obviously, this does not mean that such early European moderns belonged to a species different from our own, and one that went extinct without descent. By the same token, no such inferences can be made for the Neandertals on the basis of the absence of their mtDNA among living humans. The take-home message is that the mtDNA of present-day Europeans reflects recent demographic history, not the remote interactions (or lack thereof) between African sapiens and non-African archaics.

Moreover, recent developments contradict the notion that Neandertals were a different species and show that, even if they had been, they were so close that admixture at the time of contact was inevitable, and did happen. That much is suggested, for instance, by the microcephalin gene, involved in the control of brain size during development, whose adaptive allele, which occurs in 70% of today’s humans, seems to have introgressed from an archaic lineage, most probably the Neandertals. And, despite unresolved contamination issues, results from the Neandertal genome project also showed that, in 30% of the single nucleotide polymorphisms for which the comparison was possible, the analyzed Neandertal individual had the derived (that is, human), not the ancestral (that is, chimpanzee) allele. Given the time of divergence between the two lineages, such a high percentage implies gene flow and is incompatible with simple population split models.


More importantly, a recent study of species intersterility versus time of divergence suggests that the whole debate concerning the taxonomic status of Neandertals is a lot like the proverbial Byzantine argument about the sex of the angels. The study shows that, among the many lineages of mammals for which fossil or molecular data are available, 1.4 million years is the minimum amount of time for reproductive separation to emerge between two branches splitting from a recent common ancestor. This minimum was observed among gazelles. Among hominids, however, the interval between generations is at least four times longer. The implication is one of no reproductive isolation between contemporary lineages of hominids separated by less than five to six million years of divergence. Such a length of time corresponds to the entire evolutionary life span of the hominid family, and is at least ten times the duration of the interval separating the Neandertal/modern splitting event from the period of contact in Europe. By Mammal standards, therefore, Neandertals were not, and could not have been, a different biological species.

Paradigm Found


Until about ten years ago, the presence of ornaments at late Neandertal sites was acknowledged by supporters of the “Human Revolution” but disregarded as a by-product of “imitation without understanding” of modern human behaviours observed in contact situations. The following analogy was famously proposed by a distinguished British archaeologist to make the point: “if a child puts on a string of pearls, she is probably doing this to imitate her mother, not to symbolize her wealth, emphasize her social status, or attract the opposite sex.”

Research carried out since 1998 on the Châtelperronian culture of France and northern Spain has dramatically changed the picture. At the Grotte du Renne, in France, the Châtelperronian levels yielded bone awls identical to those from the Ilsenhöle, but with three differences: they came in larger numbers; some featured regular decorative patterns; and they were associated with body ornaments. These finds were published in the early 1960s, but their significance was impaired by doubts about the authorship of the Châtelperronian. In 1979, however, a Neandertal skeleton was found in a Châtelperronian context at the French site of St.-Césaire, and, in 1996, the association was confirmed for the fragmentary remains recovered at the Grotte du Renne itself. Eventually, it became clear that the Châtelperronian, with its suite of personal ornaments, was an independent Neandertal development predating modern human immigration by several millennia. The conclusion that Neandertal society was symbolically organized is further strengthened by results from use-wear analyses of hundreds of chunks of black pigment from another and even earlier French cave site, the Pech de l’Azé. These analyses concluded that they were pencils used for body painting.

In short, late Neandertals had attained a level of cultural achievement identical to that documented among their African contemporaries. What happened when these two fully symbolic cultural traditions eventually met should be treated, therefore, without preconception. Did they exchange genes and memes? Or was mutual avoidance the rule, resulting in the extinction of one of the sides?

The answers must be sought in the biology and culture of the post-contact populations, those of the earliest modern humans of Europe. If we find no Neandertal contributions in those post-contact populations, then we must conclude that interaction and admixture were trivial or non-existent. But, if Neandertal contributions are indeed apparent, then we must conclude that significant interaction and admixture occurred, regardless of whether such contributions were or were not subsequently lost.

In 1998, the discovery of the 30,000-year-old burial of a five-year-old child in the Lagar Velho rock shelter, Portugal, provided hard evidence for a hypothesis that a group of scholars had been entertaining for many years: that populations of the African lineage spreading into Europe would have interbred with the local Neandertals, whose disappearance would have been largely a process of assimilation, not extinction without descent. In fact, this child featured an anatomical mosaic mixing characteristics for the most part modern, such as a clear, prominent chin and a high cranial vault, with characteristics reminiscent or even distinctive of the Neandertals and other archaic Eurasian populations, such as the robusticity of the leg bones, the arctic, cold-adapted body proportions, and several minor features in the skull, the mandible and the dentition. These features are known to be genetically inherited, so their presence indicates a part-Neandertal ancestry for the child. Soon after the Lagar Velho discovery, in 2003-2005, the Romanian cave of Oase was to provide additional evidence — the mandible of a young adult and the near complete cranium of an adolescent, dated to 40,000 years ago and at present Europe’s earliest modern human fossils — in support of this notion.

The archaeological evidence supports this scenario. The Protoaurignacian culture of western and central Europe is contemporary with the Oase fossils and, as such, the first cultural entity that can reliably be assigned to European early moderns. The personal ornaments of the Protoaurignacian are consistent with this notion: For the most part, they are made of the same small marine shell beads of diverse taxonomy but identical basket-shaped morphology found among modern human cultures of the Near East and Africa, where they go back to some 100,000 years ago. By comparison with these cultures, however, the Protoaurignacian also features some novelties, such as pierced animal teeth, namely of red deer and fox. These kinds of pendants are completely unknown in Africa and the Near East prior to the time of contact. But they are precisely the types of ornaments that characterize such pre-contact Neandertal-associated cultures as the Châtelperronian. The parsimonious explanation for these elements of discontinuity with the African/Levantine tradition of personal ornamentation can only be that modern humans acquired them from the indigenous, Neandertal societies where such novel elements originally emerged.


The implication of these finds is that, in their strict, original formulations, Multiregional Evolution and Mitochondrial Eve are now both obsolete views of the tempo and mode of human evolution. The paleontological and archaeological evidence favours Assimilation models and refutes the notion that Neandertals were a different species. And, for hard-line supporters of the Neandertal’s fundamental separateness, the evidence still carries the twin implications that 1) archaeologically visible manifestations of fully symbolic sapiens behaviour emerged independently among different human species and, 2), that the biological/genetic foundations for that behaviour must therefore have existed in the human genus prior to the split between the African and European lineages.

So, even in the framework of multi- rather than single-species views of human evolution, the corollary of the last decade of empirical discoveries is that explanations of the emergence of “behavioural modernity” as a simple by-product of a putative speciation event in the late Middle Pleistocene of Africa are refuted — the “hardware” requirements for symbolic thinking must have been in place before half a million years ago, when the Neandertal lineage began to diverge. This conclusion has two additional corollaries: firstly, that the search for the genetic and cognitive processes underlying the emergence of language and symbolism in the human lineage needs to be refocused on aspects of the differentiation and evolution of Homo erectus people between two and one million years ago; secondly, that the much later appearance of personal ornaments represents a qualitative leap in culture, reflecting the operation of demographic and social factors.

The commonplace notion that the first modern humans in Europe were “astonishingly precocious artists” whose superior cognition sufficed to explain the demise of the Neandertals is also in contradiction with the facts. The documented artistic skills of the earliest European moderns are identical to those documented in late Neandertal cultural contexts, and consist simply of patterned markings applied to bone tools with decorative or functional purposes. The earliest figurative art (the cave paintings and ivory figurines from France and southern Germany), in fact, post-dates by five millennia the first archaeological or paleontological indicators of modern human immigration. Much as with personal ornaments, the explanation for these novel developments must therefore be sought in transformations occurring at that time in European society, not in human brains.

The ethnographic record abundantly documents that rock art primarily functions as a way of endowing places with economic, ideological or social significance, and the thousands of open air petroglyphs of the Côa Valley, in Portugal, show that the same holds true for the Palaeolithic period. Thus, the parsimonious explanation for why art only appears in the archaeological record around 35,000 years ago is that only then did the need arise for systems of social identification/differentiation extending beyond the individual to include the landscapes and resources claimed as territory by the different groups to whom people advertised their allegiance through the use of body ornaments. Sculpted figurines, in turn, are likely to have represented manifestations of the same phenomenon in the personal and domestic arenas of behaviour.

The need for such systems can easily be explained as a consequence of adaptive success, with technological innovation leading to demographic growth and implying both increased inter-group competition and increased regulation of that competition. In such a context, it is easy to understand the adaptive value of the emergence of ceremonial behaviours addressing issues of property and rights over resources, and of the development of myths and religious beliefs relating such rights to real or ideal ancestors. Therein lie the origins of art, not in an evolutionarily late mutation endowing modern humans with the capacity for symbolic thinking. The corresponding “hardware”, in fact, must have been in place as soon as the size and shape of the brain case entered modern ranges of variation and the cultural record documents behaviours that require language, i.e., symbolic thinking by definition. The paleontological record concurs in suggesting that such a Rubicon had already been crossed by half a million years ago. The rest is History.


Illustration captions


Figure 1 – Nassarius kraussianus shell beads (~1 cm across, on average) from Blombos Cave, South Africa.

Figure 2 – Horse figurine (~5 cm long), ivory, from the Vogelherd cave, southern Germany.

Figure 3 – 1909, Boule-inspired reconstruction of the La Chapelle-aux-Saints Neandertal by the Czech artist František Kupka.

Figure 4 – The spread into Europe of the continent’s mtDNA founder lineages H, I and U is supposed to have taken place only 30,000 years ago, implying that the mtDNA of the modern humans who first settled Europe some 40,000 years ago is as extinct as the Neandertals’.

Figure 5 – Pierced and grooved pendants made of animal bone and teeth, the most common personal ornaments of Europe’s late Neandertals, all from basal Châtelperronian level X of the Grotte du Renne (Arcy-sur-Cure, France): a-b. fox canines; c. bison incisor; d. lateral phalange of reindeer.

Figure 6 – The cranium of the Oase 2 individual (Peştera cu Oase, Banat, Romania).

Figure 7 – Different models for the explanation of the replacement of Neandertals by modern humans in Europe.

Figure 8 – Rock no. 1 of Canada do Inferno (Côa Valley, Portugal).





© CÔA. Todos os direitos reservados / All rights reserved.