
Norbert Wiener (2000), Cybernetics: or Control and Communication in the
Animal and the Machine (1948), Cambridge, Mass.: MIT Press.

Note: cybernetics is a general term but I could also call this area of
research in organised systems, information theory or communication
theory or systems theory - the beginnings of an understanding of
complex systems.

Herein, the entire understanding of control and communication (in the
animal and machine) is characterised by the term 'Cybernetics' for
Wiener (2000) - taken from the Greek word meaning 'steersman'. Writing
in 1948, but recognising a longer history concerning governors (for
instance, the first article on feedback mechanisms by Maxwell in 1868),
the link to controlling mechanisms can be found in the etymological
roots of the terms themselves (governor is a Latin corruption of the
Greek word for steering) - the earliest references being to the
steering of ships and the feedback therein (2000: 11-12).

The link between philosophy and mathematical logic is crucial to an
understanding of the development of cybernetics too, in the increasing
mechanisation of processes for computation (by this, I mean the more
general understanding of computation as the procedure of calculating;
determining something by mathematical or logical methods). Leibniz in
particular is important in combining 'universal symbolism' and 'a
calculus of reasoning' into the 'construction of computing machines in
the metal' (Wiener, 2000: 12). The link is thus drawn between
mathematical logic and the mechanisation of processes of thinking.

Factory:

An interesting aspect of Wiener's work on cybernetics is that it
carries a moral thread in accepting that new developments in
mechanisation have 'unbounded possibilities for good and for evil'
(2000: 27). He is thinking of an obvious instance such as the
deployment of the atomic bomb, but more specifically the idea of the
automatic factory and assembly line production. He sees this as a
distinct reality, as a 'non-metaphorical problem' (2000: 27), imagining
a workforce of mechanical slaves to perform human labour. He remains
undecided as to whether this is a good thing or a bad thing, but thinks
any assessment cannot be simply made in terms of the market (or money
saved as a result of mechanisation) but must include an understanding
of the conditions of labour. For him, any level of 'competition'
between machine slave labour and human labour is a certain acceptance
of the conditions of slave labour even if on the surface it appears to
decrease human suffering.

Wiener traces the possibilities from the first industrial revolution -
'the devaluation of the human arm by the competition of machinery' to
the 'modern' industrial revolution where the devaluation will become of
the human brain (2000: 27; he is thinking of the brain of the skilled
scientist and administrator in particular). In such a scenario, 'the
average human being of mediocre attainments or less has nothing to sell
that it is worth anyone's money to buy' (2000: 28). Therefore a society
has to be forged that works on a set of principles other than those of
market forces. This, for Wiener, is crucially important for anyone with
an interest in labour conditions such as labour unions. In other words,
labour requires a broader understanding of its conditions to include
social, political, economic and technical questions.

Automata:

For Wiener, it is the desire to produce automata that is expressed at
each stage of human and technological development - 'the living
technique of the age' (Wiener, 2000: 39). If technical development can
be traced from the age of clocks in the seventeenth and early
eighteenth centuries to a later eighteenth and nineteenth century age
of steam engines to the twentieth century of control and communication,
the clock mechanism continues to be a good object to describe such
changes - and to describe analogue and digital processes.

More precisely, in the time of Newton, it is the clockwork automaton that
combines technical achievement with philosophical ideas. For instance,
Descartes considers non-human living things as automata at this time as
a way of reconciling the idea of them having no souls (Wiener, 2000:
40). In this thinking, mind and matter are seen to be autonomous
entities but little attention is given to the dynamic interrelation of
the two. This is why the work of Leibniz is particularly influential
for Wiener, as someone engaged with the dynamics of mind and matter.
Leibniz introduces the idea of 'monads', semi-autonomous entities that
correspond to each other through a pre-established harmony (of God;
2000: 41). These monads are compared to clocks 'wound up so as to keep
time together from the creation for all eternity', with such perfect
workmanship of the Creator that they do not run out of time with each
other (as would be the case with human produced clocks; 2000: 41). They
reflect one another but this is not a causal relation, in that they are
autonomous to all intents and purposes and closed from influence from
the outside world. Automata, for Leibniz, are constructed like
clockwork.

In the nineteenth century, machine and natural automata (plants and
animals) were considered rather differently in parallel to the
predominant technology of 'energy'. Living organisms were thus seen in
a limited fashion to be 'heat engines', burning fuel, but again with
little attention to the complexities of operations - that take account
of external and internal factors (such as energy flow, metabolism, and
incoming and outgoing messages akin to sense organs). To Wiener:
'In short, the newer study of automata, whether in the metal or flesh,
is a branch of communication engineering, and its cardinal notions are
those of the message, amount of disturbance or "noise" - a term taken
from the telephone engineer - quantity of information, coding
technique, and so on.' (2000: 42)
In this thinking, an analogue is struck between the human and machine
systems, but rather differently than simply seeing living things as
machines. Storage and transfer of information might be described in
physiological terms, but this is not to say one simply stands for the
other or is equivalent but that a description in these terms might lead
to complex understandings of the processes at work. In some ways this
seems to be a return to a mechanistic world-view but a thoroughly
agnostic one, and one complicated by a fuller description of living
matter as organism.

The relationship to time is also important for Wiener as he sees
input-output as a consecutive relation of past-future (where does the present
stand in such a conception?). According to Wiener (in his chapter
'Newtonian and Bergsonian Time'), the modern conception of automata
conforms, not to a Newtonian model, but to a Bergsonian one - in
keeping with the description of living organisms. Bergson emphasises
the inadequacy of a Newtonian description of biology: 'the difference
between the reversible time of physics, in which nothing new happens,
and the irreversible time of evolution and biology, in which there is
always something new.' (Wiener, 2000: 38; residing somewhere between
the matter of Newton and the anthropomorphism of 'vitalism').

Feedback (see Bill Nichols notes too):

Much of the Wiener book is concerned with mapping these principles of
describing the living organism in all its complexity (and the
mathematics lies beyond my comprehension and gives me a big headache).
One important aspect is 'feedback' that describes a certain
intervention in the transmission and 'return' of information. Wiener
describes this rather unpredictable aspect as a human link in a chain
of events but also as an automated link with human intervention (2000:
96). His example is a thermostat, switching on or off depending on the
temperature of a particular space at a point in time. If all works
well, the temperature might remain constant. The governor of a steam
engine is a classic example of a mechanical version of the same
principle, regulating velocity depending on the load the machine bears
and keeping its operations constant. The point for Wiener is that
voluntary movement in humans is regulated in much the same way, and
that human 'disorders' can be used to demonstrate faulty feedback in
this way (such as Ataxia, and this can be further explained in
mathematical terms but I prefer an act of faith in this regard). In
such a scenario, a 'compensator' (something that can be controlled from
the outside because the load fluctuates) is required as well as an
'effector' (the input-output relations) in order to compensate for the
faulty information feedback and to reinstate control (2000: 113). This
is grossly oversimplified (especially in this case, but also in the
book) especially when translated to the human organism but the overall
principles clearly hold some insights. [see Steve Grand's Creation:
life and how to make it (2000) for a more contemporary but all the same
simplified perspective]
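Wiener's thermostat can be sketched as a short feedback loop. The room model below (the heat gain and loss rates, the set point) is entirely invented for illustration; it only shows the principle of the output being fed back as the next input to hold the temperature roughly constant.

```python
# Minimal sketch of negative feedback: the thermostat compares the
# measured temperature against a set point, switches the heater on or
# off, and the effect of that switching becomes the next measurement.

def thermostat_step(temperature, set_point):
    """Return True (heater on) when below the set point."""
    return temperature < set_point

def simulate(start_temp, set_point, steps, heat_gain=0.5, heat_loss=0.3):
    """Invented toy room: heater adds heat_gain per step, the room
    leaks heat_loss per step when the heater is off."""
    temps = [start_temp]
    temp = start_temp
    for _ in range(steps):
        heater_on = thermostat_step(temp, set_point)  # feedback decision
        temp += heat_gain if heater_on else -heat_loss
        temps.append(temp)
    return temps

temps = simulate(start_temp=15.0, set_point=20.0, steps=50)
# The trace climbs towards the set point and then oscillates in a
# narrow band around it: the roughly 'constant' temperature Wiener
# describes, maintained by the feedback itself.
```

A faulty feedback version (delaying or scaling the measurement) would show the overshooting and oscillation Wiener compares to ataxia.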

Feedback operates between the eye and muscles to make sense of the
difference between objects (make meaning in other words) - a visual-
muscular feedback system. This is relatively simple in the operation of
a flatworm and thus easy to see the parallel between living organisms
and artificial mechanisms. In the human organism (or more specifically
in Wiener's example with human vision, 2000: 133-143), these operations
are decidedly complex with interlinked subordinate feedbacks that work
together in complex ways (like a computing machine to Wiener) - such
that the organised whole is more than the sum of its parts ('gestalt').
The point of course is to better understand the human organism to build
better artificial mechanisms, and in turn in the spirit of feedback,
through this process to better understand the human organism (Wiener is
thus hopeful about the potential for sensorial prosthesis). He adds a
warning to those who may draw 'specific conclusions from the
considerations of this book do so at their own risk' (2000: 144; he is
concerned with the correlation of cybernetics and psychopathology). The
visual cortex (or indeed brain) and the computer have much in common
but are not one and the same or reducible to one another. However,
their behaviours are similar and reveal much about each other in their
chain of operations, perhaps especially in understanding errors and
malfunctions (for instance, in distinguishing functional and organic
disorders).

By this comparison, it can be seen that it is not simply an empty
physical structure of the computer that corresponds to the brain but
'the combination of this structure [hardware] with the instructions
given it at the beginning of a chain of operations [software] and with
all the additional information stored and gained from outside in the
course of this chain [feedback]' (2000: 146). In other words, hardware
and software operate through complex interrelation and correlation of
form and contents. The human brain as the most complex and longest
chain of operations in the animal world is thus particularly prone to
disorder and breakdown (reparation extends the analogy through the
deployment of disk-doctor when perhaps disk-psychoanalyst would be
additionally required to satisfy the functional and organic errors).

Stupid (not-intelligent) system

This analogy is further extended in the chapter 'Information, Language
and Society' to other organisational structures that are constituted
through smaller organisational units (2000: 155; what has become known
as 'cell theory'). This is apparent in examples from Leibniz to Hobbes;
from an understanding of the human organism as a mass of smaller living
organisms to the Leviathan nation-state made up of lesser parts and
individuals. It is the social organisation that allows for the
qualitative as well as quantitative difference in elevating the
collective over the individual. Social animals such as humans (but also
other herds, packs, nests, hives, colonies) organise themselves into
efficient collective groupings that extend the individual to that of
the overall system they are part of (like a nervous system). His
example, and many since (such as Kevin Kelly), is the beehive where:
'All the nervous system of the beehive is the nervous tissue of some
single bee. How then does the beehive act in unison, and at that in a
variable, adapted, organised unison? Obviously, the secret is in the
interconnection of its members.' (2000: 156) The interconnections are
complex, and vary in complexity from the operations of the hive to that
of civil society (from the Holy Roman Empire to the United States of
America; extended again to the disorderly order of Hardt and Negri's
'Empire') that acts upon the intricacies of language and communication
technologies for the transmission of information. These language skills
are also in addition to the subtle communications skills that are
intrinsic to the human or animal organism (ref. Chomsky on language?).

Wiener describes the body politic as a particularly inefficient
organism in this respect, demonstrating 'an extreme lack of homeostatic
processes' (homeostasis being the process by which the independent
elements of a system reach a relatively stable equilibrium). This
belief, he
claims, has risen to an official act of faith in the United States
where free competition is regarded as a homeostatic process much
against evidence (what Wiener calls a simple-minded theory - a
description of free market capitalism itself). In this view the
individual capitalist is regarded as a public servant who deserves the
profits gained from their actions - rather than a selfish individual
who steals surplus profit and disrupts the social equilibrium. He says:
'A group of non-social animals, temporarily assembled, contains very
little group information, even though its members may possess much
information as individuals. This is because very little that one member
does is noticed by the others and is acted on by them in a way that
goes further in the group. [...] There is thus no necessary relation
in either direction between the amount of racial or tribal or
community information and the amount of information available to the
individual.' (2000: 158)

The market is a game (exemplified by Monopoly) with clear winners and
losers (these operations have been studied elsewhere by von Neumann and
Morgenstern in their general theory of games; in John von Neumann and
O. Morgenstern, Theory of Games and Economic Behaviour, Princeton:
Princeton University Press, 1944). In computer games in particular,
there are clear analogies with politics and war where the individual
strives for reward and the annihilation of competition, and clearly no
emergent homeostasis is demonstrated. For Wiener:
'We are involved in the business cycles of boom and failure, in the
successions of dictatorship and revolution, in the wars which everyone
loses, which are so real a feature of modern times' (2000: 159).

It is perhaps an exaggeration to think of the individual player as
completely ruthless as implied but there are disturbing tendencies and
analogies to the political and social system all the same. There is a
'policy of lies - or rather, of statements irrelevant to the truth'
(2000: 159) that encourage certain choices in the game of consumer
capitalism and neo-liberal democracy - encouraging the 'fool' to buy
certain products, and vote for particular candidates - to the profit of
the 'merchants of lies'.

The solution to this rather bleak description for Wiener relies on an
understanding of the weight of public opinion, and the development of
uniform levels of intelligence and behaviour within social groups.
This can be demonstrated in small groups in particular where relative
homeostasis can be discerned. The system self-organises into a
relatively equitable one. Against this, larger communities protect
these interests by privacy, property rights and individualism. As has
been proven by recent history and is clearly evident in the workings of
contemporary politics: 'the control of the means of communication is
the most effective and most important' factor in this respect (2000:
160). This logic holds because each individual or organism is a conduit
for the 'acquisition, use, retention and transmission of information'
(2000: 161). Communications technologies hold tremendous power in the
game of lies and in the transmission of half-truths.

Wiener presents a passionate critique of contemporary society where
'natural and human resources are regarded as the absolute property of
the first business man enterprising enough to exploit them' (2000:
161). Through the means of communication, certain 'state capitalist'
[my interpretation] tendencies emerge:
'the elimination of the less profitable means for the more profitable;
the fact that these means are in the hands of the very limited class of
wealthy men, and thus naturally express the opinions of that class; and
the further fact that, as one of the chief avenues to political and
personal power, they attract above all those ambitious for such power.
That system which more than all others should contribute to social
homeostasis is thrown directly into the hands of those most concerned
in the game of power and money, which we have already seen to be one of
the chief anti-homeostatic elements in the community. [... Tragically]
the State is stupider than most of its components.' (2000: 161-2)
The system is not intelligent after all. Is an intelligent system
simply wishful-thinking?

Certainly Wiener presents a scenario that is counter to much of the
wishful writings of social and political theory in general. For Wiener
any optimism for social change must take account of systems theory -
and indeed, we might add that systems theory should take account of
cultural theory. Part of the problem for Wiener is the 'loose coupling'
with the phenomena under study - 'too small to influence the stars and
too large to care about anything other but the mass effects of
molecules, atoms, and electrons' (2000: 163). Indeed, as modern science
has demonstrated, everything that we study is 'dispersed and distorted'
by the act of studying it - an investigation of the stock market is
likely to upset the stock market for instance (note: this is often
characterised as the relationship of the observer to the observed that
has since become a more common internal critique of the disciplines
Wiener is referring to such as anthropology, ethnography, politics and
the social sciences - where subjectivity in addition to objectivity is
accounted for. Von Foerster raised this reflexivity question in more
detail of how to account for the role of the observer on the behaviour
of the system, Heinz von Foerster (1981) Observing Systems, Seaside,
Cal.: Intersystems Publications). Interestingly, and despite his
undoubted faith in science, this is not a blind faith, or false
optimism (but one tempered by realism) where the non-verifiable, the
non-scientific work of the historian must also be taken account of.

At the end of this chapter, there is a note that extends the interest
in games theory, asking whether it is possible to construct a
chess-playing machine (2000: 164 - note: coding can be compared to
chess in terms of strategy). The link to the work of the historian
might be further drawn here in alluding to Benjamin's 'Theses on the
Philosophy of History'.
(link to Benjamin allegory here)
Chess is a particularly good example as in the human imagination, it
appears to embody the ultimate intellectual challenge as well as the
classic battle of opposites - think of the Cold War and the
(ideological) contests between Fischer and Spassky (USA and the USSR
respectively) where political divisions are played out and sublimated
into the game itself. Kasparov in fact, at the time world champion,
played a computer and lost, preferring to believe that it was not
possible for the machine to win without human trickery (in Woods, 2002:
102).

It would be relatively simple to construct a machine to follow the
rules of the game without a sense of the merit of the play but to
construct one with a will to win would be a more complex operation. To
do this the machine would require an understanding of the opposing
player's operations as well. This is altogether possible by
constructing a machine that would calculate all its own and opponent's
admissible moves and as a result assign valuations to these moves. It
would operate at a level of proficiency equivalent to the majority of
the human race (but fall short of Maelzel's fraudulent machine of 1818
- see section on the chess player). An ordinary digital machine could
easily be adapted to the rules of the game. But how to make it even
better, and less stupid?
Characteristic of living systems, is the ability to learn and the
ability to reproduce. Thus all living organisms adapt through evolution
to conditions, changing behaviour that better suits the conditions to
allow for continued survival and fitness better suited to the
environment. The levels of this vary considerably depending on the
complexity of the organism. The question for Wiener is to what degree
machines can be seen to do this. Clearly, to some degree, man-made
machines can learn and can reproduce themselves.

The learning aspect can be demonstrated in games theory, where
strategies and tactics to win or not lose the game are developed
through experience. This is not a complete strategy but an
approximation, such as the von Neumann 'approximate theory' where the
player acts with caution to avoid defeat. However, many examples of
tactics would demonstrate that caution is only one such tactic and
suitable to particular circumstances, and against particular opponents.
Thus, to develop a sophisticated chess-playing machine, it would need
to adapt its behaviour and tactics according to the particularities of
the circumstance (bearing in mind how literal its operations are). It
would need to learn by its failures and successes across a series of
games and adapt itself accordingly so as not to reproduce exactly the
same moves under the same circumstances but to do so against learned
criteria of success and failure. The learning process must be divided
into a number of stages and not simply be subject to uniform evaluation
and responses. In other words, it requires two orders of programming:
one linear, and one in which the past is used to determine the
first. This predictive element again has significance in terms of the
work of the historian - working both in a linear and non-linear way. So
too the learning machine, must be programmed by experience or non-
linear feedback (2000: 177) and not reduced to its limited, linear
functionality.
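The 'two orders of programming' might be sketched as follows: a first-order routine that picks a strategy from a table of valuations, and a second-order routine that revises that table from the outcomes of past games. The game, the strategy names, and the win rates below are all invented for illustration:

```python
# Sketch of two orders of programming: play by the current valuations
# (first order), then let past results rewrite the valuations
# (second order, i.e. non-linear feedback from experience).
import random

random.seed(0)  # fixed seed so the run is reproducible

# First order: choose among strategies according to learned scores.
scores = {"cautious": 0.0, "aggressive": 0.0}

def choose():
    return max(scores, key=scores.get)

# Second order: feed each game's result back into the valuations.
def learn(strategy, won, rate=0.1):
    scores[strategy] += rate if won else -rate

# Invented environment: 'aggressive' happens to win more often here.
win_prob = {"cautious": 0.4, "aggressive": 0.7}

for game in range(200):
    # Mostly exploit the current table, occasionally try the other move
    # so that both entries keep being re-evaluated.
    s = choose() if random.random() > 0.1 else random.choice(list(scores))
    learn(s, random.random() < win_prob[s])

# After enough games the machine no longer repeats the weaker choice
# under the same circumstances: the past has reprogrammed the present.
```

The machine is still literal, as Wiener warns: it only learns against the criteria of success it is given.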

Self-propagating machines also work through the correlation of linear
and non-linear processes, wherein an analogy is struck with living
systems. Wiener's example is a non-linear transducer where the output
is determined by the past of the input, but where the adding of inputs
does not result in the corresponding outputs: 'One property of
transducers, linear or non-linear, is the invariance with respect to a
translation of time.' (2000: 178) He points to the analogy with
genetics: 'when a gene acts as a template to form other molecules of
the same gene from an indeterminate mixture of amino and nucleic acids,
or when a virus guides into its own form other molecules of the same
virus out of the tissues and juices of its host. I do not in the least
claim that the details of these processes are the same, but I do claim
that they are philosophically very similar phenomena' (2000: 180).
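Wiener's non-linear transducer can be sketched numerically. The particular rule below (squaring the sum of the current and previous samples) is an invented example, but it exhibits both properties he names: the output is determined by the past of the input and is invariant under a translation of time, yet adding two inputs does not add their two outputs.

```python
# Sketch of a non-linear transducer: output depends on the past of the
# input (the previous sample), and superposition fails.

def transducer(signal):
    """y[n] = (x[n] + x[n-1])**2, with x[-1] taken as 0."""
    out, prev = [], 0
    for x in signal:
        out.append((x + prev) ** 2)
        prev = x  # the past of the input carries into the next output
    return out

a = [1, 2, 3]
b = [4, 5, 6]

# Non-linearity: transducing the sum differs from summing the outputs.
sum_of_outputs = [ya + yb for ya, yb in zip(transducer(a), transducer(b))]
output_of_sum = transducer([xa + xb for xa, xb in zip(a, b)])

# Time invariance: delaying the input simply delays the same output.
delayed = transducer([0] + a)
```

A linear transducer would make the two sums agree; it is exactly this failure of superposition that Wiener aligns with living, self-propagating systems.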

Binary code (ref. digital dialectic):

Wiener describes two main types of computing machines: the 'analogy
machine' where the data is represented by measurements and 'numerical
machines' where the 'data is represented by a set of choices among a
number of contingencies, and the accuracy is determined by the
sharpness with which the contingencies are distinguished, the number of
alternative contingencies presented at every choice, and the number of
choices given' (2000: 117 - for more on contingency, see Gödel's
formulation). In the case of the numerical machines, each choice is
between two alternatives, as it is constructed on a binary scale of '1'
or '0'. Instructions for
computing, similarly operate in a binary mode - combining contingencies
by using algorithms that follow this logic. One of the simplest is
'Boolean', based on the dichotomy between 'yes' and 'no'. Thus, data
follows both an arithmetical and logical binary form, as a set of
choices between two conditions - for instance, in the case of switching
between 'on' or 'off' in a series of relays. These relays work as a
series of actions as part of a set of iterative processes (akin to
historical relations) that include 'memory' (the ability to use past
results for future operations) which is the kind of analogous
operations that allow Wiener to link the processes of natural and
artificial machines (computing machines and the nervous system of
animals). However, 'the brain, under normal circumstances, is not the
complete analogue of the computing machine but rather the analogue of a
single run on such a machine' (2000: 121 - this is rather like the best
advances in artificial life making only the crudest analogue of
plankton).
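This binary picture - data as a set of yes/no choices, instructions as Boolean combinations of those choices, and relays whose 'memory' feeds a past result into the next operation - can be sketched in a few lines. The function names are mine, not Wiener's notation:

```python
# Sketch of the 'numerical machine': binary choices, Boolean logic,
# and a relay with memory of its past state.

def to_binary_choices(value, bits=8):
    """A number as a list of yes/no choices, most significant first."""
    return [(value >> i) & 1 for i in reversed(range(bits))]

def AND(a, b):
    return a & b

def OR(a, b):
    return a | b

# A Boolean 'instruction' built by combining contingencies.
combined = OR(AND(1, 0), 1)

def relay_run(inputs):
    """A relay whose next state depends on the input AND its own past
    state: a past result is used for the future operation ('memory')."""
    state, trace = 1, []
    for x in inputs:
        state = AND(x, state)
        trace.append(state)
    return trace

bits = to_binary_choices(5)      # the choices encoding the number 5
trace = relay_run([1, 1, 0, 1])  # one 'no' latches the relay off
```

The latched relay is the crude analogue of memory that lets Wiener compare iterative machine processes with the nervous system.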

An important principle that again displays the analogue is that a
machine working on a repetitive cycle in this paradoxical manner
answering 'yes' and 'no' would never arrive at an equilibrium. The idea
of a machine that 'learns' becomes the way out of this paradox working
on the principles of 'similarity, contiguity (where something evokes
something else), as well as cause and effect' (2000: 127). In this way,
a computing machine might display 'conditioned reflexes' as a 'nervous
computing machine' (2000: 130) - more than simply a machine in action,
which combines relays and storage mechanisms. This is clearly a
description of a learning machine that might arrive at a solution but
only (like a dialectician) as a result of an 'iterative process of
successive approximations. The process must be repeated a very large
number of times [...]' (2000: 130) - if indeed a final resolution is
desirable at all (in the case of dialectics). For Wiener, care is
required in setting up the problem in the first place so as not to
compromise any later solution - this is a useful reminder for much of
the applications of contemporary computation.
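Solution by an 'iterative process of successive approximations' can be illustrated with a classic instance that is my example rather than Wiener's: Heron's method for square roots, where a simple corrective step is repeated many times until the answer settles.

```python
# Sketch of successive approximation: each pass refines the previous
# guess, and the process is simply repeated a large number of times.

def sqrt_approx(n, steps=20):
    guess = n  # a deliberately poor starting guess
    for _ in range(steps):
        guess = (guess + n / guess) / 2  # the corrective step
    return guess

root = sqrt_approx(2.0)
# After enough repetitions the guess stops changing appreciably:
# the machine 'converges' on the solution.
```

Note how the set-up matters, as Wiener cautions: a starting guess of zero would make the very first step fail, compromising any later solution.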

For Wiener, 'Information is information, not matter or energy. No
materialism which does not admit this can survive at the present day.'
(2000: 132)