
Monday, 11 April 2016

How human speech contributes to Darwinism's crisis.

Human Language: Noam Chomsky, Universal Grammar, and Natural Selection


Michael Denton

Editor's note: In his new book Evolution: Still a Theory in Crisis, Michael Denton not only updates the argument from his groundbreaking Evolution: A Theory in Crisis (1985) but also presents a powerful new critique of Darwinian evolution. This article is one in a series in which Dr. Denton summarizes some of the most important points of the new book. For the full story, get your copy of Evolution: Still a Theory in Crisis. For a limited time, you'll enjoy a 30 percent discount at CreateSpace by using the discount code QBDHMYJH.

In the early 1960s, in one of the landmark advances in 20th-century science, Noam Chomsky showed that all human languages share a deep invariant structure. Despite their very different "surface" grammars, they all share a deep set of syntactic rules and organizing principles. All have rules limiting sentence length and structure and all exhibit the phenomenon of recursion -- the embedding of one sentence in another. Chomsky has postulated that this deep "universal grammar" is innate and is embedded somewhere in the neuronal circuitry of the human brain in a language organ. Children learn [human] languages so easily, despite a "poverty of stimulus,"1 because they possess innate knowledge of the deep rules and principles of human language and can select, from all the sentences that come to their minds, only those that conform to a "deep structure" encoded in the brain's circuits.2
The challenge this poses to Darwinian evolution is apparent. Take the above-mentioned characteristic that all human languages exhibit: recursion. In the sentence, "The man who was wearing a blue hat which he bought from the girl who sat on the wall was six feet tall," the relative clauses "who was wearing a blue hat," "which he bought from the girl," and "who sat on the wall" are embedded sentences, nested one inside the next. Special rules allow human speakers to handle and understand such sentences, and these rules, which govern the nature of recursion, are specific and complex. So how did the computational machinery to handle recursion evolve? David Premack is skeptical:
I challenge the reader to reconstruct the scenario that would confer selective fitness on recursiveness. Language evolved, it is conjectured, at a time when humans or protohumans were hunting mastodons... Would it be a great advantage for one of our ancestors squatting alongside the embers to be able to remark, "Beware of the short beast whose front hoof Bob cracked when, having forgotten his own spear back at camp, he got in a glancing blow with the dull spear he borrowed from Jack"? Human language is an embarrassment for evolutionary theory because it is vastly more powerful than one can account for in terms of selective fitness. A semantic language with simple mapping rules, of a kind one might suppose that the chimpanzee would have, appears to confer all the advantages one normally associates with discussions of mastodon hunting or the like. For discussions of that kind, syntactical classes, structure-dependent rules, recursion and the rest, are overly powerful devices, absurdly so.3
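To make the recursive embedding in the example sentence above concrete, here is a minimal sketch in Python of a toy context-free grammar. The vocabulary, rules, and depth cap are invented for illustration only; this is not Chomsky's formalism. A noun phrase (NP) may expand into a relative clause (RC) that itself contains another NP, so sentences can nest inside sentences without any fixed limit:

```python
import random

# Toy grammar illustrating recursion: an NP may contain a relative clause (RC)
# that itself contains another NP, so sentences embed within sentences.
# Vocabulary and rules are invented for illustration only.
GRAMMAR = {
    "S":   [["NP", "VP"]],
    "NP":  [["the", "N"], ["the", "N", "RC"]],
    "RC":  [["who", "VP"], ["that", "NP", "V"]],
    "VP":  [["V", "NP"], ["was", "ADJ"]],
    "N":   [["man"], ["girl"], ["hat"], ["wall"]],
    "V":   [["bought"], ["saw"], ["wore"]],
    "ADJ": [["tall"], ["blue"]],
}

def generate(symbol="S", depth=0, max_depth=4):
    """Expand a symbol by recursively expanding one of its rules."""
    if symbol not in GRAMMAR:  # terminal word: return it as-is
        return [symbol]
    rules = GRAMMAR[symbol]
    # Past max_depth, pick the shortest rule so the sentence stays finite.
    rule = min(rules, key=len) if depth >= max_depth else random.choice(rules)
    words = []
    for sym in rule:
        words.extend(generate(sym, depth + 1, max_depth))
    return words

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate()))
```

Each run prints a few randomly generated sentences, for example "the man who wore the hat that the girl bought was tall." The point is only that a handful of recursive rules licenses unboundedly deep embedding; the max_depth cap stands in for the practical limits on what a speaker can process, not for anything in the grammar itself.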
There is considerable controversy over what structures in the brain restrict all human languages to the same deep structure. Some linguists reject the idea of an innate neurological organ devoted specifically to language. Holding that only the brain's general abilities are "pre-organized," they envisage language as a learned skill, shaped by design constraints and based on a "functional language system" distributed across numerous cortical and subcortical structures.
Yet however it is derived during development, there is no doubt that a unique deep structure underlies the languages of all members of our species. It is because of this shared deep structure that we can learn the language of a San Bushman or an Australian Aborigine, and they in turn can learn English. The fact that all modern humans, despite their long "evolutionary separation" -- some modern races, such as the San of the Kalahari and the Australian Aborigines, have been separated by perhaps 400,000 years of independent evolution -- can learn each other's languages implies that this deep grammar must have remained unchanged since all modern humans (African and non-African) diverged from their last common African ancestor, at least 200,000 years ago. As Chomsky puts it:
What we call "primitive people"... to all intents and purposes are identical to us. There's no cognitively significant genetic difference anyone can tell. If they happened to be here, they would be one of us, and they would speak English... If we were there, we would speak their languages. So far as anyone knows, there is virtually no detectable genetic difference across the species that is language-related.4
As I mentioned in the last article in this series, it is not only the deep structure of language that has remained invariant across all human races. All races share in equal measure all the other higher intellectual abilities: musical, artistic, and mathematical ability, and the capacity for abstract thought. These too, therefore, must have been present in our common African ancestors 200,000 or more years ago, and must have remained unchanged, and for some reason latent, since our divergence. To suggest instead that language and our higher mental faculties evolved in parallel, reaching the same remarkable ends independently in all the diverse lineages of modern humans after they diverged more than 200,000 years ago, would be to propose the most striking instance of parallel evolution in the entire history of life, and one inexplicable in Darwinian terms.
References:
(1) Chomsky, The Science of Language, p. 5.
(2) Ibid., Part 1.
(3) David Premack, "Gavagai! Or the future of the animal language controversy," Cognition 19: 207-296, see pp. 281-282.
(4) Chomsky, The Science of Language, p. 13.
Image: Noam Chomsky, by jeanbaptisteparis via Flickr.