Haider starts by asking three questions:
Among the many unanswered questions in grammar theory, the following figure prominently.
- First, what is it that enables children to successfully cope with the structural complexities of their mother tongue while professional grammarians tend to fail when modelling them?
- Second, what determines the narrow system corridor for human grammars?
- Third, are the grammars of human languages the offspring of a single proto-grammar instantiating a "Universal Grammar" (monogenic) or are the shared traits of human grammars the result of convergent changes in the grammars of human languages of diverse ancestry (polygenic)?
Natural languages have the properties they have because they reflect the properties which our language-learning and language-using human brain capacities can cope with. Systems beyond this capacity are thereby excluded. A grammar that is not effectively learnable will never become a grammar in use. On the other hand, grammars may develop into luxurious systems when adapting to the system potential of human brains within those boundaries. The human language-processing resources are recruited resources with a non-verbal evolutionary history. Language processing is parasitic on available resources. Therefore, with respect to language processing, their properties are contingent on a non-linguistic selection background.
Grammars that children could not fully grasp would not come into existence, simply because no human brain would acquire and then use them. Why are grammars learnable? ... Languages are learnable since the grammars of human languages have been selected for learnability by the numerous generations of grammar-acquiring brains. Such an answer must be framed in terms of insights we owe to Charles Darwin. Complex systems do not suddenly appear from nowhere. They are the result of evolutionary processes that apply to human 'software' packages or 'apps' for languages, a.k.a. grammars of natural languages.

He then discusses the evolution of grammars:
Languages share a habitat, namely the same neuro-cognitive environment for acquisition and use. A language with grammar G can survive only if G happens to enter enough brains. Darwinian evolution is not substance-bound, that is, it is not restricted to the genome of biological systems. Evolution inevitably takes place whenever a self-reproducing system is open for variation and is embedded in a context that constantly sieves out variants. ... In the long run, only those variants will survive that are not sieved out. In other words, variants win that happen to turn out as 'fitter' within their selecting environment.
Evolution inexorably results in adaptation to the selecting environment. The selecting environment for grammars is the ensemble of cognitive capacities of our brains that has been recruited for language processing. The evolutionary history of these capacities is independent of language. In the evolution of humans, complex grammars of languages are too recent an achievement to be a result of biological selection on its own. Grammars adapt to the brain.
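The variation-and-selection dynamic Haider describes, variants "entering brains" in proportion to how easily they are learned, can be illustrated with a toy simulation. Everything here (the reduction of a grammar variant to a single learnability score, the population size, the mutation rate) is an illustrative assumption of this sketch, not part of Haider's account:

```python
import random

random.seed(42)

def evolve(population, generations=50, mutation_sd=0.02):
    """Toy model: each grammar variant is reduced to a learnability score
    in [0, 1]. Each generation, variants are copied into 'brains' with
    probability proportional to learnability (selection), and each
    transmitted copy drifts slightly (variation)."""
    for _ in range(generations):
        # Selection: easier-to-learn variants are copied into more brains.
        population = random.choices(population, weights=population,
                                    k=len(population))
        # Variation: each transmitted copy may change a little, clipped to [0, 1].
        population = [min(1.0, max(0.0, g + random.gauss(0, mutation_sd)))
                      for g in population]
    return population

start = [random.uniform(0.1, 0.5) for _ in range(200)]  # initial variant pool
end = evolve(start)
print(f"mean learnability: {sum(start)/len(start):.2f} "
      f"-> {sum(end)/len(end):.2f}")
```

No variant is designed to be learnable; the population's mean learnability rises simply because the "sieve" removes harder-to-learn variants each generation, which is the point of the passage above.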
When discussing grammar in more detail, Haider states:
The acquisition paths in first language acquisition recapitulate steps in the evolution of grammars in the history of mankind. ... It is an educated guess that our human ancestors, just like today's children, started with two- and three-word utterances, with little to no restrictions that would deserve the denomination 'grammar'. From then on, cognitive selection has been working steadily and unavoidably, and it rewarded and conserved variants that turned out to ease processing or learning.
The specific ensemble of brain resources recruited for language processing may appear to be domain-specific. However, its components are not domain-specific at all. They have been recruited from the already existing and therefore available cognitive processing resources of the primate brain. ... Grammars are cognitive apps for language usage. These apps have been shaped by cognitive evolution. Biological evolution has shaped brains that happen to provide the computational capacities for 'running' such cognitive apps.
Human acoustic decoding, for instance, capitalizes on categorical perception. This capacity of our brain is not species-specific. ... However, as it is an available and useful resource of human brains, too, it got recruited for language processing. The whole ensemble of human computational resources is the selecting background environment for the evolution of grammars. A grammar variant has a chance to occupy more brains if it is better adapted, that is, if it is rewarded by brains that favour structures that can be processed more easily and effectively. Because most languages lack a written tradition, the historical depth of documented grammar change is shallow. Nevertheless, what we know is sufficient to recognize clear effects of ongoing evolution by variation & selection.
- Grammars are learnable because learnability is the prime factor of the selecting cognitive environment. More easily learnable variants will occupy more brains and spread.
- Human languages stay within a corridor delimited by originally non-verbal brain resources recruited for actual grammar usage.
- There is no need to insist on a monogenetic origin of human grammars. Cognitive evolution fully accounts for the cross-linguistically convergent as well as divergent traits of human grammars.