F. BONSACK critically evaluates F. VARELA's concept in the following terms:
"His system is as follows: there is, basically, a causal net internal to the system; the components interact mostly among themselves. He speaks of "organizational closure". Of course, he makes clear that "closure" is not equivalent to "closedness". It is nonetheless true that the former looks much like the latter, and that the only noteworthy difference is that it admits "perturbations", to which the internal net of interrelations reacts globally, re-establishing equilibrium. What is generally called information, for example the sensations, is reduced in this way: there are no inputs, only perturbations.
"And the outputs? One looks for them in vain: the system closes on itself; it does not act outward. VARELA gives, on page 225 of his book "Principles of Biological Autonomy", a schema showing a net of interactions: an arrow coming from outside, but no arrow pointing outward. In his NEUCHATEL conference, he described at length an annular cellular automaton, which he presents as a model of an autonomous system. Again, arrows may be seen coming from outside, but none directed outward. He writes explicitly: "I say "closure" in the sense that any interaction of the system leads not to a response towards the outside (BONSACK's emphasis), but to a new interaction within the system itself". And further on: "The consequences of the system's operations are operations of the system". The system thus closes on itself.
"How is this curious stance to be understood?
"The most plausible hypothesis is that, for VARELA, the system's activity is centered on its own conservation, on a continuous autopoiesis, on the defense of its own integrity – which may be admitted.
"However VARELA seems to forget two aspects:
"The first is that, according to his own theory, autonomous units may fuse to form an autonomous unit of superior order, for example a group of individuals, a society, a culture. Within such a frame, the inferior units must necessarily produce an output, providing something to the other units or performing some function within the frame of the superior unit. Otherwise, no superior unit could exist; there would be only a collection of inferior units, all closed on themselves, without links or interactions.
"The second aspect that VARELA seems to forget is that, among the retroactive causal chains which permit the maintenance of the individual's integrity, many do not remain internal, but cross through the world. Consider, for example, the search for food!
"This entails two consequences:
"Firstly: as the chain crosses outward, there is an output when it exits from the system and at least one input when it returns.
"Secondly: if the chain is to be efficient, a knowledge of the world must exist: of the objects and of the causal relations that are to be used. It is thus unavoidable, in one way or another, that the system be equipped with a world representation permitting it to simulate the objects' behavior. It is thus unavoidable that it rely on genuine information inputs (which are something different from perturbations) in order to build that world representation. Yet VARELA rejects not only any input, but any world representation altogether. He writes (I translate):
"In this perspective, what we call "representation" is not a correspondence with some kind of given state of matter, but rather a coherence with the continuous maintenance of its own identity. Such regularities ("behavioral regularities") which we choose to call "symbolic" are not operative for the system, because it is we who establish a correspondence from a viewpoint which is not within the operation of the system" (end quote).
"One has the impression, reading VARELA, that he attaches a positive value to anything internal to the system, while all that is external is negatively valued: for example, information becomes a "perturbation". For him, the outside is the enemy of the autonomous system" (BONSACK, 1990, pp. 110-112).
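The ring automaton at issue in this debate can be sketched in code. The following is a minimal illustration only, not VARELA's actual rule: cells on a ring update from their immediate neighbours (here by a hypothetical majority-vote rule chosen for simplicity), the outside can only perturb a cell's state, and the class deliberately exposes no output channel.

```python
# Minimal sketch of an annular (ring) cellular automaton in the spirit of
# the example discussed above. The majority-vote update rule is a
# hypothetical stand-in, not VARELA's actual rule.

class RingAutomaton:
    def __init__(self, states):
        self.states = list(states)  # binary cell states arranged on a ring

    def step(self):
        """One internal update: every consequence of an operation is
        another operation of the system; no output channel exists."""
        n = len(self.states)
        self.states = [
            1 if self.states[i - 1] + self.states[i] + self.states[(i + 1) % n] >= 2
            else 0
            for i in range(n)
        ]

    def perturb(self, i, value):
        """The only coupling to the outside: a perturbation of one cell.
        The net then reacts globally, re-establishing its equilibrium."""
        self.states[i] = value


ring = RingAutomaton([1, 1, 1, 1, 1, 1])
ring.perturb(2, 0)   # an "arrow coming from outside"
ring.step()          # internal interactions absorb the perturbation
print(ring.states)   # -> [1, 1, 1, 1, 1, 1]: equilibrium restored
```

Note that the class mirrors Bonsack's observation exactly: `perturb` is an arrow pointing inward, `step` is purely internal, and nothing in the interface points outward.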
P. CARIANI, inspired by G. PASK, relates organizational closure to "the self-construction of the observer". He writes: "When a device gains the ability to construct its own sensors, or in Mc CULLOCH's terms "this ability to make or select proper filters on its inputs", it becomes organizationally closed. The device then controls the distinctions it makes on its external environment, the perceptual categories which it will use. On the action side, the device acquires the ability to construct its own effectors, and with them gains control over the kinds of actions it has available to influence the world. The self-construction of sensors and effectors thus leads to an epistemic autonomy, where the organism or device itself is the major determinant of the nature of its relations with the world at large… This basic concept of structural closure and its consequence, functional autonomy, underlie many of the closely related notions of semantic closure (PATTEE), autopoiesis (MATURANA, VARELA), self-modifying systems (CSANYI, KAMPIS), self-reproducing automata (von NEUMANN), anticipatory systems (ROSEN) and the recurrent "nets with circles" of Mc CULLOCH and PITTS" (1993, p.30).
In synthesis, organizational closure can, and must, be acquired, even if on the basis of a primeval biological level of closure.
Finally, CARIANI makes the following, very significant, comment about programmed devices: "They fail to achieve epistemic autonomy relative to their designers because all possibilities have been prespecified by their designers" (p.30). This means that externally imposed algorithms preclude the formation of internal ones.
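Cariani's contrast can be sketched as follows. Everything in this sketch is an illustrative assumption, not his model: the "sensor" is reduced to a threshold filter, and "constructing its own sensor" is reduced to placing that threshold from observed signals rather than having it fixed by the designer.

```python
import random

# Hypothetical sketch of the contrast above: a programmed device whose
# perceptual category (a fixed threshold) is prespecified by its designer,
# versus a device that constructs its own filter from the signals it meets.

def prespecified_sensor(signal, threshold=0.5):
    # All possibilities fixed in advance by the designer.
    return signal > threshold

class SelfConstructingSensor:
    def __init__(self):
        self.threshold = 0.0   # no designer-given category yet

    def adapt(self, signals):
        # The device builds its own filter: here, illustratively, it
        # places the threshold at the mean of its observed environment.
        self.threshold = sum(signals) / len(signals)

    def sense(self, signal):
        return signal > self.threshold

random.seed(1)
environment = [random.uniform(10.0, 20.0) for _ in range(100)]

device = SelfConstructingSensor()
device.adapt(environment)

# The designer's threshold of 0.5 classifies every signal in this range
# alike, drawing no distinction; the self-constructed filter still does.
print(all(prespecified_sensor(s) for s in environment))   # -> True
print(any(not device.sense(s) for s in environment))      # -> True
```

The point of the sketch is only that the prespecified filter cannot revise the distinctions it draws, whereas the self-constructing one determines its own perceptual category from its environment, which is the minimal sense of Cariani's "epistemic autonomy".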
G. PASK clarified these aspects as early as 1958 and 1961(b). Possibly too early to be well understood!
To cite this page, please use the following information:
Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]