Monday 15 October 2018

Natural Language: The Evolution of Grammar

A reference in the blog Babel's Dawn alerted me to an interesting paper, "An anthropic principle in lieu of a 'Universal Grammar'", by Hubert Haider, Department of Linguistics & Centre of Neuroscience at the University of Salzburg.

Haider starts by asking three questions:


Among the many unanswered questions in grammar theory, the following figure prominently.
  1. First, what is it that enables children to successfully cope with the structural complexities of their mother tongue while professional grammarians tend to fail when modelling them?
  2. Second, what determines the narrow system corridor for human grammars?
  3. Third, are the grammars of human languages the offspring of a single proto-grammar instantiating a "Universal Grammar" (monogenic) or are the shared traits of human grammars the result of convergent changes in the grammars of human languages of diverse ancestry (polygenic)?
The last of these questions is mainly concerned with debunking the ideas of Chomsky, and I will pass over it in this blog post. More importantly, he argues strongly that natural languages and the grammars associated with them are the product of normal evolutionary forces and are limited by the resources present in the human brain. He does not detail the nature of the resources provided by the brain, which is very relevant to my research into the evolution of human intelligence. For this reason I quote extracts from his text below, and then give a brief example of how my model demonstrates the way in which the neural network in the brain supports the resources his approach needs to support language.

Natural languages have the properties they have because they reflect the properties which our language-learning and language-using human brain capacities can cope with. Systems beyond this capacity are thereby excluded. A grammar that is not effectively learnable will never become a grammar in use. On the other hand, grammars may develop into luxurious systems when adapting to the system potential of human brains within its boundaries. The human language processing resources are recruited resources with a non-verbal evolutionary history. Language processing is parasitic on available resources. Therefore, with respect to language processing, their properties are contingent on a non-linguistic selection background.
Discussing the role of children learning language, he says:
Grammars that children could not fully grasp would not come into existence, simply because no human brain would acquire and then use them. Why are grammars learnable? ... Languages are learnable since the grammars of human languages have been selected for learnability by the numerous generations of grammar acquiring brains. Such an answer must be framed in terms of insights we owe to Charles Darwin. Complex systems do not suddenly appear from nowhere. They are the result of evolutionary processes that apply to human 'software' packages or 'apps' for languages, a.k.a. grammars of natural languages.
He then discusses the evolution of grammars:
Languages share a habitat, namely the same neuro-cognitive environment for acquisition and use. A language with grammar G can survive only if G happens to enter enough brains. Darwinian evolution is not substance-bound, that is, it is not restricted to the genome of biological systems. Evolution inevitably takes place whenever a self-reproducing system is open for variation and is embedded in a context that constantly sieves out variants. ... In the long run, only those variants will survive that are not sieved out. In other words, variants win that happen to turn out as 'fitter' within their selecting environment.
Evolution inexorably results in adaptation to the selecting environment. The selecting environment for grammars is the ensemble of cognitive capacities of our brains that has gotten recruited for language processing. Their history of evolution is independent of language. In the evolution of humans, complex grammars of languages are too recent an achievement to be a result of biological selection on its own. Grammars adapt to the brain.
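
As an aside, the "variation & selection" dynamic he describes is easy to make concrete. The following toy Python sketch is my own illustration, not something from Haider's paper: each hypothetical grammar variant is given an assumed learnability score, and in each generation learners acquire a grammar from a randomly chosen model speaker, succeeding in proportion to that score. Nothing else pushes the outcome, yet the most learnable variant steadily comes to occupy more brains.

  import random

  random.seed(42)  # reproducible run

  # Hypothetical grammar variants with assumed learnability scores.
  variants = {"G1": 0.9, "G2": 0.5, "G3": 0.2}

  # 300 'brains', initially an even mix of the three variants.
  population = ["G1", "G2", "G3"] * 100

  for generation in range(20):
      new_population = []
      for _ in range(len(population)):
          # A learner keeps sampling model speakers until acquisition
          # succeeds, so hard-to-learn grammars lose transmissions.
          while True:
              g = random.choice(population)
              if random.random() < variants[g]:
                  new_population.append(g)
                  break
      population = new_population

  for g in variants:
      print(g, population.count(g))
  # After 20 generations the most learnable variant, G1, dominates.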

When discussing grammar in more detail Haider states:
The acquisition paths in first language acquisition recapitulate steps in the evolution of grammars in the history of mankind.  ... It is an educated guess that our human ancestors, just like today's children, have started with two and three word utterances, with little to no restrictions that would deserve the denomination 'grammar'. From then on, cognitive selection has been working steadily and unavoidably and it rewarded and conserved variants that turned out to ease processing or learning.

He continues:
The specific ensemble of brain resources recruited for language processing may appear to be domain specific. However, its components are not domain-specific at all. They have been recruited from the already existing and therefore available cognitive processing resources of the primate brain. ... Grammars are cognitive apps for language usage. These apps have been shaped by cognitive evolution. Biological evolution has shaped brains that happen to provide the computational capacities for 'running' such cognitive apps.
Human acoustic decoding, for instance, capitalizes on categorical perception. This capacity of our brain is not species-specific. ... However, as it is an available and useful resource of human brains, too, it got recruited for language processing. The whole ensemble of human computation resources is the selecting background environment for the evolution of grammars. A grammar variant has a chance to occupy more brains if it is better adapted, that is, if it is rewarded by brains that reward structures that can be processed more easily and effectively. Due to the lack of script in most languages, the historical depth of documented grammar changes is shallow. Nevertheless, what we know is sufficient for realizing clear effects of ongoing evolution by variation & selection.

He concludes with summary answers to the three questions he asked at the start:
  1. Grammars are learnable since learnability is the prime factor of the selecting cognitive environment. Better learnable variants will occupy more brains and spread.
  2. Human languages stay within a corridor delimited by originally non-verbal brain resources recruited for actual grammar usage. 
  3. There is no need for insisting on a monogenetic origin of human grammars. Cognitive evolution fully accounts for the cross-linguistically convergent as well as divergent traits of human grammars. 
The point he is making is that language and grammar have evolved by using pre-existing facilities in the brain - and this fits in very well with my evolutionary model, which says that animal and human brains work in fundamentally the same way, although there are clearly some differences in scale. However, he does not describe how the brain "happens to provide the computational capacities for 'running'" language - while the CODIL-based model of how the brain works provides a mechanism which links the activity of the neurons to a system which may well model a primitive early form of natural language.

I am currently drafting a paper which will go into further detail - and may well give some detailed examples relating to language in further blog posts - but in the meantime, here is a simple example.

MURDERER = MACBETH; VICTIM = DUNCAN; WEAPON = DAGGER.

When viewed as a neural net, the above CODIL statement defines the links between the neural net nodes MURDERER, MACBETH, VICTIM, DUNCAN, WEAPON and DAGGER. These node names are all nouns, and almost certainly the very first step towards evolving natural language would be to allocate names to objects. In addition, the CODIL statement is likely to be easily understood by anyone familiar with the words. But CODIL also supported rule-based processing, so there would be no difficulty in programming the neural net to produce sentences such as "MACBETH used a DAGGER to murder DUNCAN."
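
To show what that might look like in conventional code, here is a minimal Python sketch. It is my own illustration rather than the original CODIL implementation: the parsing step and the sentence-producing rule are assumptions chosen to mirror the statement above.

  # Parse the CODIL statement into (role, value) links between named nodes.
  statement = "MURDERER = MACBETH; VICTIM = DUNCAN; WEAPON = DAGGER."

  links = []
  for item in statement.rstrip(".").split(";"):
      role, value = (part.strip() for part in item.split("="))
      links.append((role, value))

  print(links)
  # [('MURDERER', 'MACBETH'), ('VICTIM', 'DUNCAN'), ('WEAPON', 'DAGGER')]

  # A simple hypothetical rule: when all three roles are bound,
  # emit an English sentence from the linked node names.
  bindings = dict(links)
  if {"MURDERER", "VICTIM", "WEAPON"} <= bindings.keys():
      print(bindings["MURDERER"], "used a", bindings["WEAPON"],
            "to murder", bindings["VICTIM"], end=".\n")
  # MACBETH used a DAGGER to murder DUNCAN.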
