1) "The content of a message apt to trigger some action" (J. de ROSNAY, 1975, p.168).
In this view, as stated by W.R. WINBURN, information is "the difference that makes a difference" (1991, p.558).
The definition given by J. de ROSNAY is a most general and pragmatic one.
It raises, however, some intriguing questions, as shown by the following example: a sculptor starts his or her work on some block of stone. The "information" is what he or she has in mind (let us not inquire here where it came from!). The work progresses through the sculptor's "message" to the stone, as some face or figure is slowly hewn out of it. The message is transmitted through the sculptor's nervous and muscular system and the chisels.
The "receiver" is the stone, but it is quite a passive one, not an active one. Such is the mystery of the creative act: it is a kind of message from the creator to some passive material, which thereafter acquires a power and becomes a bearer of information, as the creator communicates with himself or herself and with others through the "informed" material, which is now "informative".
On the other hand, many receivers can be directly activated by a message (see for example "selective content of information").
In J. de ROSNAY's words: "Aristotle introduces an important distinction between both meanings of the word information. On the one hand, information is understood as "acquisition of knowledge" (it is the act of informing oneself by observing some object or nature). On the other hand, information means "power of organization" or "creative action"" (p.168).
2) A more or less organized set of data transmitted from one intelligent system to another.
Information, sometimes confused with its communication, is not merely basic to systemic methodology, but altogether a central and many-sided systemic concept.
1- Information exists within informed systems, i.e. organized entities able to translate some basic data into coherent action.
2- Information has a specific human meaning.
3- Information "is descriptive: it is contained in answers to questions that begin with such words as What, Which, Who, How many, When and Where" (J. GHARAJEDAGHI, 1985).
4- It has moreover a semantic meaning (Mac KAY's "logon"), different from its quantitative measure.
5- However, information can also be measured and, in this quantitative sense, used to dispel some degree of uncertainty.
6- It can be related to the increase or local decrease of entropy production, i.e. typify degrees of systemic organization.
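Point 5 above, the quantitative sense, can be illustrated with SHANNON's measure of uncertainty. The following Python sketch is purely illustrative (it is not part of the original entry, and the probability values are invented): entropy measures, in bits, the uncertainty about a source's outcomes, and the information received upon learning an outcome equals, in this technical sense, the uncertainty dispelled.

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; terms with p = 0 are taken as 0."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries the maximal uncertainty for two outcomes: 1 bit.
h_fair = entropy([0.5, 0.5])      # 1.0 bit

# A biased coin is more predictable, so there is less uncertainty to dispel.
h_biased = entropy([0.9, 0.1])    # about 0.469 bit

# Learning the actual outcome dispels all of it: the information gained,
# in this quantitative sense, equals the prior entropy of the source.
```

Note that, as the entry stresses, this measure says nothing about meaning: the two coins differ only in their statistics, not in what their outcomes signify.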
The concept of information has been quite enriched, but, alas, also muddled as a result of the development of computer science (informatics).
Confusion started when C. SHANNON published his "Mathematical Theory of Communication", which nearly everybody erroneously calls a "Theory of Information". As an electrical engineer working for a telephone company, SHANNON's basic purpose was to investigate the communication of information through a channel, which at the time meant telephone wiring.
On this specific aspect Mac KAY writes: "A further question was where, if anywhere, the notion of "meaning" fitted into the scheme of things. SHANNON's analysis of the "amount of information" in a signal, which disclaimed explicitly any concern with its meaning, was widely misinterpreted to imply that the engineers had defined a concept of information per se that was totally divorced from that of meaning" (1969, p.5).
As to what information really means, we may quote T.S. ELIOT, who asked: "Where is the wisdom we have lost in knowledge? Where is the knowledge we have lost in information?". Possibly, some decision-makers would add: "Where is the information we have lost in data?". Indeed, during the last 40 years the technology of information transmission and processing has seriously obscured the problem of meaning, in its different guises.
Data cannot transform themselves into information. Whatever is offered by a newspaper, a radio or T.V. set, or a computer is merely raw material, which acquires significance only when integrated within a frame of reference. Thus, only a modicum of previous knowledge, offering such a frame, can transform data into information. In turn, this information may then transform or expand pre-existing knowledge, for example through a Popperian process of falsification (for which metarules are needed). Finally, the proper use of well-informed knowledge is ELIOT's wisdom, as a meta-frame of reference.
A massive flood of data, even ordered in some data base, merely overwhelms the best minds if it is not organized within a well-defined frame of reference and sustained by clearly stated purposes.
Using adequate physical and semantic codes, we communicate our observations and our interpretations to other people, in order to:
- confirm, correct or invalidate what we did register or conclude;
- obtain feedback in order to complement our own results;
- seek a consensus leading to common understanding and collective action.
Only after such constantly renewed exchanges may we speak of "information", referring to confirmed and circulating meanings admitted by some specific group of people. Moreover, information becomes knowledge only when organized within a net of interconnected meanings whose coherence is generally recognized after due testing and confirmation by numerous people over a long span of time. This is the social frame of reference.
As a still tentative synthesis: "information" is a many-sided concept, conveying by natural or artificial means collective sets of quite varied representations acquired by consensus within specific groups of people.
It necessarily implies a network of interactive elements, whose interactions first create rules and thereafter obey them. This supposes change, i.e. a space-time flow. Such an understanding is less psychologically oriented, while still unavoidably anthropocentric.
As such, the concept can be used in many different ways, if we keep in mind that its value is relative and that we must define clearly, in every case, what frame of reference we are using.
3) "That which adds to a representation" (Mac KAY, 1969, p.163).
Mac KAY comments: "… this leaves open the possibility that the information may be true or false. When a representation alters, we define the new information as true if the change increases the extent of correspondence between the representation and the original.
"The information is said to be false if the change diminishes the extent of this correspondence. Strictly, the truth or falsehood is an attribute of the resultant representation, but it is customary to attribute it to that which has given rise to the change in the representation" (p.163).
4) "… that which does logical work on the organism's orientation" (Mac KAY, 1969, p.96).
Mac KAY adds in this case: "(Whether correctly or not, and whether by adding to, replacing or confirming the functional linkages of the orienting system)" and… "Thus we leave open the question whether the information is true or false, corrective or confirmatory, and so on" (Ibid).
5) "A measure of selection among a given set of possibilities" and also "Sometimes a measure of the extent to which uncertainty is reduced" (G. PASK, 1961, p.115).
"A message that produces a change in any of the receiver's probabilities of choice" (Adapted from R.L. ACKOFF & F.E. EMERY, 1972, p.144).
These last authors state: "Because of the pervasiveness of the use of information in the restricted (technical) sense of SHANNON's information theory (SHANNON and WEAVER, 1949), it might seem preferable to use another term here. But since the way that we use information here conforms more closely to common usage than does SHANNON's, if a change is required it would seem preferable to change SHANNON's term" (Ibid). Let us remember that this part of SHANNON's theory is of a quantitative nature.
The same authors quote C. CHERRY who wrote: "In a sense, it is a pity that the mathematical concepts stemming from HARTLEY have been called "information" at all. The formula for Hn is really a measure of one facet only of the concept of information; it is the statistical rarity or "surprise value" of a source of signs" (1957, p.50).
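CHERRY's "surprise value" can be made concrete: the rarer a sign, the more surprising its occurrence, and Hn is simply the statistical average of these surprise values over the source. A minimal Python sketch (illustrative only; the probabilities are invented, not drawn from the cited authors):

```python
from math import log2

def surprisal(p):
    """Surprise value of a sign occurring with probability p: -log2(p) bits."""
    return -log2(p)

# Common signs carry little surprise value; rare ones carry much.
s_common = surprisal(0.5)     # 1 bit
s_rare = surprisal(1 / 64)    # 6 bits

# Hn is the average surprise value over the whole repertoire of signs:
probs = [0.5, 0.25, 0.125, 0.125]
h_n = sum(p * surprisal(p) for p in probs)   # 1.75 bits
```

This makes CHERRY's point visible: the formula measures only statistical rarity, one facet of the concept of information, and is indifferent to what the signs mean.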
A. RAPOPORT states: "The concept of information is as central in cybernetics and communication engineering as the concept of energy is in classical physics.
"Energy has been the unifying concept underlying all physical phenomena involving work and heat. Information became the unifying concept underlying the working of organized systems, i.e. systems whose behavior was under control so as to achieve some preset goals" (1966, p.5).
As may be appreciated, this very fundamental aspect is merely implicit in PASK's and in ACKOFF and EMERY's definitions.
P.S. HENSHAW explains: "Information is an entity apart from the means by which it is processed, the symbols by which it is represented, or the responses made to it. Messages contain intelligence (dispositional, organized, usable and available information), but as such do not themselves constitute intelligence.
"Information can be transmitted and stored by means of impulses and symbols, but it exerts an influence only after removal from the encoded message and interaction with other information, as can happen in a cell, a brain or a computer…
"Information as such is not amenable to quantification in a manner comparable with that which applies to entities having force, mass and charge.
"In the last analysis, information consists of data, especially unrelated facts and statistics. With organization of data, intelligence is created and with further organization of organized data, intelligence is developed and advanced" (1975, p.18).
It seems that HENSHAW uses the term "intelligence" in the sense of "understanding".
Moreover, information "… can be created and destroyed" (p.18). This is also ambiguous: what can be destroyed are raw data, organized sets of data, the results of the treatment of data, or even some forms of understanding of these results.
In effect, some original rules for the organization of perceptions are needed in order to obtain an organization of data, and both kinds of organization must be constructed as properties of a progressively self-constructed network.
To cite this page, please use the following information:
Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]