There was a point at which the AI community adopted Chomsky grammars. Grammars are Cartesian universals.
The other approach (N. Wiener's) would have been to use feedback control systems (later: stochastic control) to interpret learning. It did not happen then. It is happening now: big data, correlation-based strategies, deep learning.
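A minimal sketch of the contrast, purely illustrative and not taken from Wiener, Chomsky, or any of the sources cited below: a hand-written rule that accepts a string by explicit structure, next to a perceptron that reaches the same behaviour only through an error-feedback signal. The toy language ("at least three 1s"), the count features, and the perceptron itself are my assumptions here.

```python
# Toy contrast: explicit rule vs. feedback-driven learner (illustrative only).
import random

def rule_based_accept(s: str) -> bool:
    """Explicit, hand-written rule (the ORDER / STRUCTURE / SIGNAL side):
    accept binary strings containing at least three 1s."""
    return s.count("1") >= 3

def features(s: str) -> list[float]:
    """Crude numeric summary of a string for the learner: counts plus bias."""
    return [float(s.count("1")), float(s.count("0")), 1.0]

def train_perceptron(samples, epochs: int = 50, lr: float = 0.1) -> list[float]:
    """Error-driven weight updates (the PLASTICITY / NOISE side):
    every mistake is fed back into the weights as a correction signal."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for s, label in samples:
            x = features(s)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            err = label - pred                    # feedback: -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

def learned_accept(s: str, w: list[float]) -> bool:
    return sum(wi * xi for wi, xi in zip(w, features(s))) > 0

if __name__ == "__main__":
    random.seed(0)
    train = ["".join(random.choice("01") for _ in range(8)) for _ in range(200)]
    labelled = [(s, int(rule_based_accept(s))) for s in train]
    w = train_perceptron(labelled)
    for s in ["11100000", "10000000", "01010101", "00000000"]:
        print(s, "rule:", rule_based_accept(s), "learned:", learned_accept(s, w))
```

The first recognizer is all imposed structure; the second converges to the same behaviour with no rules given in advance, only correction signals, which is roughly the feedback/stochastic-control reading of learning gestured at above.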
This very interesting article by David Auerbach on S. Lem's Summa Technologiae captures exactly this transition: the interaction, in living systems, of ORDER, STRUCTURE, SIGNAL on the one hand, and morphism (in the sense of J. Lanier), PLASTICITY, NOISE on the other.
This dyadic structure is everywhere:
- in educational systems (rigidity vs. creativity): Freud’s “Unbehagen in der Kultur” (Civilization and Its Discontents) vs. Malinowski, Engels, and Wilhelm Reich;
- in the definition of formal languages (grammar-based à la Chomsky vs. feedback-based à la Turing, Piaget, and mostly Wiener);
- in the ideas underlying political structures of the modern world: “for Lem, communism and capitalism are delusional twin faiths: communism, that we can collectively and centrally control chance and causality; capitalism, that chance and causality will intrinsically prove benevolent and productive for us.” (D. Auerbach)
See “Un paradiso Perduto” by Marcello Cini, and of course N. Wiener’s “Cybernetics” and “The Human Use of Human Beings”.