Cybernetics in a Post-Structuralist Landscape

by Simon Biggs, 1987

Abstract

An essay on the historical/philosophical relationship between Cybernetics (as developed through the work of Norbert Wiener and Alan Turing) and Structuralism (in the broad sense, including the work of Wittgenstein and Chomsky), and the later development or position of Cybernetics in a Post-Structuralist context - that which is often described as Post-Industrial or Post-Modern (as found in the work of Jean-Francois Lyotard and Jean Baudrillard). The essay explores the work of a number of artists of both historical and contemporary interest (including Iannis Xenakis, Alvin Lucier, Richard Teitelbaum and Felix Hess) relative to this general theme.

Introduction

Cybernetics was developed in part through the work of Norbert Wiener and Alan Turing from the 1930's through to the 1950's. Wiener's ideas focused on the characteristics of control and communication in a variety of systems, including machines, animals, humans and social groups. He employed, in part, James Watt's automatic steam governor (essentially a self-regulating valve mechanism in which the engine's output feeds back to control its own steam supply) as a metaphor to describe similar processes of automatic self-referential control operating in more sophisticated automated machines, biological systems, the mind and social structures.
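
The governor metaphor can be reduced to a minimal negative-feedback loop: the system's measured output is fed back to correct its own input. The sketch below, with purely illustrative names and constants (none drawn from the essay), captures only that schematic behaviour.

```python
# Minimal negative-feedback loop in the spirit of Watt's governor:
# the system's measured output is fed back to adjust its own input.
# All names and constants here are invented for illustration.

def governor_step(speed: float, target: float, gain: float = 0.2) -> float:
    """Return a correction proportional to the error between output and target."""
    error = target - speed
    return gain * error

def run(target: float = 100.0, steps: int = 20) -> None:
    speed = 0.0
    for t in range(steps):
        speed += governor_step(speed, target)   # self-correcting adjustment
        print(f"step {t:2d}: speed = {speed:6.2f}")

if __name__ == "__main__":
    run()   # speed settles towards the target without external intervention
```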

Turing postulated the idea of a re-programmable machine, one whose functional parameters could be programmed into the system - a machine that could be any machine and control any machine. Turing further developed this concept to include the notion of a machine that could program itself, in effect creating a recursive, self-regulating and self-referential system.
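
Such a table-driven machine can be sketched in a few lines: the interpreter is fixed, while the supplied program determines which machine it "is". The binary-increment program below is a purely illustrative example, not drawn from Turing's own notation.

```python
# A minimal table-driven machine in the spirit of Turing's universal machine:
# the same interpreter can "be" any machine, depending on the program it is given.
# The program shown (binary increment) is illustrative only.

def run_machine(program, tape, state="start", head=0, max_steps=1000):
    cells = dict(enumerate(tape))                 # sparse tape; blank = " "
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, " ")
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip()

# Program: move to the right end of a binary number, then add one.
increment = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", " "): (" ", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", " "): ("1", "L", "halt"),
}

print(run_machine(increment, "1011"))   # -> 1100
```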

Wiener's concept of self-regulating systems and Turing's idea of the self-programmable machine were further developed during the 1940's and 1950's by scientists, engineers and theorists who initially wished to model the nature of intelligence, both to come to a better understanding of it and in order to develop an "artificial intelligence". Much of this work was carried out in the USA, where successive governments funded related research, recognising its potential military and industrial applications. Major institutes specialising in this research thus emerged during the 1970's and 1980's.

The term Cybernetics came to stand for much of the work occurring in this field, particularly that of artificial intelligence (A.I.) research, although activity also carried over into sociology, psychology and eventually a significant proportion of the technological development of the period that has led to many of our everyday technological objects. A.I. drew not only on Wiener and Turing but also, to an extent, on work carried out in linguistics and psychology - in particular the work of Wittgenstein and Chomsky, which had been developed contemporaneously.

A.I. researchers recognised quite early in their work that central to the processes of intelligence and control were those of communication and language. Intelligence was seen to be the exercise of control, both internally and externally, through language, language being seen in turn as both the content and the medium through which control was realised.

Most, but notably not all, A.I. research was carried out through and applied to computer systems. The computer functioned as the concrete realisation of the Turing Machine, and by the 1950's it had far outstripped its conceptualiser's vision. Research in this area continued throughout the 1960's - again, chiefly in the USA, but also in the USSR and Europe. During this period a number of artists seriously turned their attention to what they saw as intriguing ideas and processes with potentially far-reaching implications. Of course, as with all new ideas, it also attracted many who were more interested in the effects of this research, and thus their work was largely about effect alone, without substance or inspiration (only proving that art cannot be produced to formula).

One of the first artists to turn their attention to A.I. and Cybernetics was Iannis Xenakis. For Xenakis music was simply organised sound, and his fascination centred on the possibilities of this organisation and its relation to aesthetic experience. He developed a number of automated or semi-automated compositional systems derived from work in Cybernetics, thus foregrounding the systems themselves, including "game" based compositional techniques (based on Game Theory, a branch of mathematics concerned with the logical structure of problem solving) and stochastic methodologies (techniques arriving at statistically derived random structures). Xenakis's work was in marked contrast to the cosmetically similar "random" music of John Cage, which originated out of ideas associated with the neo-Dada of Fluxus and was heavily influenced by Eastern philosophy as well as traditional African music and Jazz. Xenakis based his work on mathematics and seemed motivated by an almost classical, purist desire to find the "perfect" musical composition, placing his work within the well-defined tradition of European Avant-garde classical music.
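
A stochastic approach of this general kind can be illustrated schematically: musical events are drawn from probability distributions rather than chosen note by note. The sketch below uses invented distributions and parameters and makes no claim to reproduce Xenakis's actual procedures.

```python
# A toy "stochastic" score generator in the spirit (not the letter) of
# statistically derived composition: pitches, durations and dynamics are
# sampled from probability distributions. All parameters are illustrative.

import random

def stochastic_score(events: int = 16, seed: int = 0):
    random.seed(seed)
    score = []
    for _ in range(events):
        pitch = int(random.gauss(60, 7))        # MIDI-like pitch, clustered around middle C
        duration = round(random.expovariate(2.0), 2)  # many short notes, few long ones
        loudness = random.choice(["pp", "mf", "ff"])
        score.append((pitch, duration, loudness))
    return score

for event in stochastic_score():
    print(event)
```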

During the 1960's a number of other artists began to work in this area, often with sound or what came to be known as "sound-sculpture". Given that the ideas they were working with were process-based, this called for time-based media. At that time media such as video, robotics and various electro-mechanical systems were either largely unavailable or prohibitively expensive. However, the cheap electronic musical instrument, or synthesiser, had become popularly available, along with various peripherals that were, in effect, specialised small computers. Although some interesting work did occur outside the sound media - for example, the work of Edward Ihnatowicz and Robert Breer (robotics), James Seawright (interactive sculpture) and Vera Molnar (visual arts) - for the economic reasons alluded to above the larger part of this activity was in the sonic arts. Amongst the artists working in this field at that time were David Tudor (often collaborating with John Cage), Robert Whitman, Steve Reich, David Rosenboom, Alvin Lucier and Richard Teitelbaum.

Cybernetics and Structuralism

The bringing together of Cybernetic and linguistic theory (particularly that of Wittgenstein and Chomsky) was, at the time, a powerful mixture. The late Empiricism of a high-Industrial culture (although post-Industrial tendencies were already evident), typified by Popperian rationalism, was well served and expressed by both the apparent logic implied in each discipline and their almost dream-like potential. Some of the wilder claims of A.I.'s adherents included suggestions that the "meaning of life" was to be found in such totalising research. As we all know, this was eventually determined by Monty Python's The Meaning of Life.

Chomsky's and Wittgenstein's ideas regarding language were certainly different, but both agreed that the codes that constitute a linguistic net are specific and that they are logical in their development. In the first belief they held positions distinct from their Structuralist contemporaries (particularly as found in the work emerging from the Prague School, following on from that of De Saussure), but in the latter they were mostly in general agreement. Regardless of the specificity of language - the fixed or unfixed relationship between the signifier and the signified - the various approaches of the time all centred on the logical, genealogical, internal relationships of language systems: the idea that languages evolve according to set rules and that the relationships between terms can therefore be described logically and historically. In some ways this approach can be seen as a hang-over from Nineteenth Century Comparative Linguistics, which was largely occupied with describing the genealogy of languages and the classification of terms.

For the Cyberneticist this form of linguistic theory, and that of Structuralism proper, was just what was needed to flesh out their ideas, remembering that they recognised the primacy of language in all systems of control and thus, by extension, intelligence.

To this end a great deal of A.I. research focused on language, and particularly on understanding genealogical rules for the development of signifying systems. The initial objective was to develop a programmable formulation of parameters that would allow the computing system a degree of autonomy in the development of its own processes of signification. The value of this approach was seen to lie not only in the modelling of the structures of language (and thus systems of control and behaviour) but also in the modelling of the processes of learning (at the time considered in the field to be a "gestalt" process of linguistic development related to external stimuli). A.I. also drew on emerging discoveries in genetics, finding in the DNA model another metaphor to reinforce the A.I. model. The Darwinian implications were not lost on A.I. researchers either.

Cybernetics and Post-Structuralism

An attempt to describe Post-Structuralist thought as a coherent tendency in ideology and philosophy would be misplaced, given the characteristic plurality of a period notable for divergent praxis. During the 1960's Structuralism did not so much decay as fragment, producing an ephemera of ideological shrapnel that has since come to form the irregular topography of an epoch often referred to as Post-Industrial or Post-Modern. This is a period concerned not so much with redefinition as de-definition, a stripping or deconstructing of meaning.

As such the high Modernist framework within which Cybernetics was constructed has fractured. The crisis in Modernism that occurred in the 1960's and 1970's saw a collapse in interest amongst philosophers and artists in the paradigms upon which Cybernetics largely depended for ideological support. The reasons for this abandonment of what had been a dynamic tendency are complex, but central to them was the general loss of faith in the Modernist paradigms that constituted orthodoxy from the late Nineteenth Century until the mid-Twentieth.

However, although Cybernetics was rejected by many thinkers and practitioners, and especially by many on the Left, the possibilities suggested by it continued to be pursued in the "hard" sciences. This has since directly led to the development of, amongst other things, the Personal Computer and the Cruise Missile. Both of these developments grew directly out of A.I. research, the PC deriving from work on high-level software and User-Interface research and the Cruise from work in remote sensing, artificial vision and servo-control systems.

Given these, and other developments, it would seem that Cybernetics as a practice is still very much with us today. In point of fact, it has probably had a more profound and lasting effect in the 1980's than in the previous thirty years. The danger here is that those who perhaps should have continued to address the implications of A.I. - that is to say, those with a different outlook, who tended to be attracted to Post-Structuralism - have seemed to ignore it at their own, and others', peril.

Notably, a few thinkers have sustained the development of ideas in relation to the possibilities and implications of Cybernetics, but within the general field of Post-Structuralist thought. Amongst them are Jean-Francois Lyotard and Jean Baudrillard. Lyotard has developed the notion of "l'immateriaux", which is intended to address a culture that has dematerialised, coming to be constituted in immaterial signals. Lyotard sees this process as intrinsic to the rapid evolution of information technologies - computers, satellites, holography, etc.

Related to the ideas of Lyotard are those of Baudrillard, who has developed the concept of "Sign Value". Baudrillard regards this as a recently evolved meta-discourse in our economic and cultural milieu, a discourse central to the contemporary production of meaning and value which recontextualises the Marxist concepts of "Use Value" and "Exchange Value". The idea of "Sign Value" seeks to postulate an ethic and aesthetic of production and consumption based on a system's components' capacity to signify in relation to the system's totality. As such, the intent of production becomes not so much to make things we need or want to use, or that can support an exchange-based economy, but rather to produce things that will function as signifiers relative to their producers and/or consumers.

Like Lyotard, Baudrillard sees this development as directly related to technological change and the shift in the nature of communication within our culture. Baudrillard contextualises this line of thought as a form of political economy which, like Marx's before him, is seen as inclusive of social value systems in general.

Both Lyotard and Baudrillard can be seen to have drawn to some degree on the work of Michel Foucault. A clear connection can be discerned between the ideas of Lyotard and Baudrillard and Foucault's concept of the panopticon. The panopticon, inspired by Bentham's ideas on control and discipline in Nineteenth Century prisons, functions for Foucault as a metaphor for how we, as individuals and as a social body, implement automatic control systems. This idea can be seen as closely related to those of Wiener in particular, although Foucault's approach and style are far removed from Wiener's. However, like Wiener, Foucault sees the panopticon metaphor as central to much of our recent technological development, and it is this possibility that both Lyotard and Baudrillard have drawn on, thereby, in a fashion, continuing a critique of a cybernated culture.

Similar to Lyotard's and Baudrillard's development are the practices of a number of artists, amongst them Alvin Lucier, Richard Teitelbaum and Felix Hess. Both Lucier and Teitelbaum were active at the close of the 1960's and can be seen to have developed, in relation to and from the prevalent Structuralist and Cybernetic approaches of the time, in directions more cognisant of recent post-Modern thought. Hess became active as an artist during the 1970's and 80's, and therefore this development is less apparent in his work, although his background in the physical sciences heavily informs a practice predicated upon ideas arising from the same traditions. All three artists work with sound, fascinated by the processes involved in communication and control, and yet each has taken a distinct path in their practice.

Lucier has done much in the past twenty years to open up our definitions of what we might call music, not only in new approaches to composition but also in performance. In doing this he has often worked with technology and, like the other artists discussed here, focused on problematising the artist's role in relation to the audience. Lucier's "Music on a Long Thin Wire" forces the viewer to confront both their own presence and their immediate environment. The piece itself consists of a long thin mono-filament passed through a magnetic "pick-up" at each end, which function to amplify the wire's interaction with its immediate environment. Stimuli can include air temperature, pressure, humidity, light and movement (including human), and the slightest change in the environment can profoundly affect the wire's state and thus the character of the emergent soundscape. The implications of this work are numerous, as it functions to deconstruct the relationships between artist, viewer, artwork and environment and render them in a fluid state.

Teitelbaum has for some years been exploring musical collaboration with automata, not dissimilar to Conlon Nancarrow's use of the Pianola, but in a far more complex and spontaneous manner. A Teitelbaum concert might consist of a musician sitting at a piano, upon which they improvise around a theme. The player is surrounded by other similar pianos, all of which are being played by electro-mechanical systems under computer control. The computer is programmed to interact with the human musician, just as a Jazz ensemble features spontaneous interaction between human musicians, with the computer and musician in a constant improvisatory exchange that leads the computer to change its responsive parameters throughout the performance.
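
The improvisatory exchange described above can be caricatured as a call-and-response loop: the machine's reply depends on the human's phrase, and its own parameters adapt as the performance proceeds. The sketch below is an invented illustration of that loop, not a description of Teitelbaum's actual systems.

```python
# A toy call-and-response improviser: the machine echoes a human phrase with
# small variations and adjusts its own "responsive parameters" according to
# what it has just heard. All mapping rules are invented for illustration.

import random

class Improviser:
    def __init__(self, spread: int = 2, seed: int = 1):
        self.spread = spread            # how far replies wander from the input
        random.seed(seed)

    def respond(self, phrase):
        # Echo the phrase with small random transpositions.
        reply = [note + random.randint(-self.spread, self.spread) for note in phrase]
        # Adapt: busier human playing makes the machine wander further next time.
        self.spread = max(1, min(6, len(phrase) // 2))
        return reply

machine = Improviser()
human_phrases = [[60, 62, 64], [60, 62, 64, 65, 67, 69, 71, 72], [55, 59]]
for phrase in human_phrases:
    print("human:", phrase, "-> machine:", machine.respond(phrase))
```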

For Teitelbaum these computer collaborators are akin to Shelley's Frankenstein, or to the older proto-Frankensteinian myth of the Golem. Indeed, the composer has even written an opera on the Golem theme. In this can be seen the artist's obsession with the processes of creation and reproduction and humanity's desire to reproduce itself, not only sexually but, if possible, through its artefacts and through disciplines such as alchemy and A.I. The psycho-analytic aspects of this process - the artefact as a mirror of ourselves - describe a field that addresses itself directly to these issues, as also developed by Lyotard in dealing with technology as the central expression of the post-Modern; the computer as another expression of the Mirror Phase.

Felix Hess has been working on communication patterns amongst animals, particularly frogs, for many years, and this has led him to release some of his field recordings of frogs, made at far-flung sites around the world, in the form of "found" musical works. Hess does not claim these recordings are music as such - his initial interest was scientific (he is a physicist by profession) - or for that matter anything more than field recordings of frogs; however, they have been received as music. This disparity between intention and consumption has more than an echo of Baudrillard's thoughts on the simulacrum.

As a scientist Hess began to model the emergent communication patterns he discovered through the recordings, initially on paper but later in real space. He took this to its logical conclusion by constructing a network of small electronic devices, each of which could both hear and emit sound. Each box, a frog-like unit, was programmed so that it could distinguish between sounds in a certain "good" frequency range (sounds like those it made itself) and sounds that were hostile (deeper sounds, actually closer to the human voice). Good sounds would encourage a unit to emit sounds whilst hostile sounds would inhibit it. The result is a complex, although in principle very simple, ecology of sound that constantly swirls around the space where it is installed, as various units trigger other units in particular locales of the space and human visitors knowingly or unknowingly interact with the work.
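
The excitation/inhibition behaviour of the units can be modelled schematically: each unit's urge to call rises when it hears frog-like sounds and falls when it hears voice-like sounds. The simulation below is an invented illustration of that principle, with arbitrary thresholds and rates; it is not Hess's circuitry or code.

```python
# A toy simulation of the excitation/inhibition ecology described above.
# Each "frog" unit raises its urge to call when it hears frog-like sounds
# and lowers it when it hears hostile, voice-like sounds. All parameters
# are invented for illustration.

import random

class FrogUnit:
    def __init__(self):
        self.eagerness = random.random()          # urge to call, 0..1

    def hear(self, frog_calls: int, hostile_sounds: int) -> None:
        self.eagerness += 0.1 * frog_calls        # encouraged by its own kind
        self.eagerness -= 0.3 * hostile_sounds    # inhibited by deeper sounds
        self.eagerness = max(0.0, min(1.0, self.eagerness))

    def maybe_call(self) -> bool:
        return random.random() < self.eagerness

def simulate(units: int = 8, steps: int = 10, visitor_noise: float = 0.1) -> None:
    frogs = [FrogUnit() for _ in range(units)]
    for t in range(steps):
        hostile = 1 if random.random() < visitor_noise else 0   # a visitor speaks
        calls = sum(frog.maybe_call() for frog in frogs)
        for frog in frogs:
            frog.hear(calls, hostile)
        print(f"t={t:2d}  calls={calls}  visitor={'yes' if hostile else 'no'}")

random.seed(3)
simulate()
```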

This work has been shown in galleries as sound installation and performed at concert-like events in theatres. The position of the artist is doubly problematised here: Hess denies that what he is making is art, and that he is an artist at all, presenting himself instead as a dispassionate observer modelling found behaviour, while at the same time the work ignores or defers many of the basic characteristics that we associate with the exhibition or performance.

This querying of the authorial role takes on Derridean proportions, as the creator is negated both by the manifestation of the work and by their attitude towards it and their audience. This relationship is further complicated by the "frogs" responding to aural activity in the installation or performance space, with the "artist" having perhaps a finer appreciation of why things are as they are but, in the end, having a role relative to the final work little different from that of any member of the audience.

It is perhaps the plurality of intent and interpretation that these artists have introduced into their work that separates them from the Cybernetic art of the 1960's, as they engage not only with the technology and its underlying principles in areas such as A.I. but also with contemporary work in psycho-analytic theory, philosophy and the general field of post-Structuralist discourse. Each of these artists questions our relationship with our artefacts and our productions in a post-Industrial context, in a way that reflects not only Lyotard's l'immateriaux or Baudrillard's "Sign Value" but also the broader and more specific social implications of technology. A fascination with language and control, with communication and power, is central to their work, contextualised within a broader framework of post-Structuralist thought, whether consciously or unconsciously, and is thus more responsive to a deconstructing world.

1. Coined by Norbert Wiener in his 1948 book "Cybernetics: or Control and Communication in the Animal and the Machine".

2. A number of texts have covered this field, including Jonathan Benthall's "Art, Science and Technology", Jasia Reichardt's "Cybernetics and Art" and works by other authors such as Douglas Davis and Gene Youngblood. Annual updates on work in the field can be found in the Ars Electronica (Linz, Austria) catalogues, published since 1981.

3. Xenakis, born in Greece but resident in Paris for much of his professional life, worked primarily with musical composition but also collaborated on architectural projects (most notably with Le Corbusier on the Philips Pavilion for the Brussels Expo), as well as poetry and the visual arts. His book "Formalized Music: Thought and Mathematics in Composition" is a dense and comprehensive profile of his ideas and methods.

4. DNA can be regarded as a self-regulatory system capable of modifying its internal language-like structure in response to both internal and external stimuli. Darwinian natural selection operating on random genetic mutation was interpreted as another example of Cybernetic processes functioning at the level of biology and population.

5. Writers such as Richard Gregory (UK) and John Haugeland (USA) are still producing influential work in the philosophy of A.I. Haugeland's "Mind Design", although a well balanced collection of essays on A.I., documenting both its successes and failures, still functions within, and is an apologia for, the elemental Modernist paradigms that underlie Cybernetics. Generally speaking, Cybernetics has become the orthodoxy in the computer sciences, as evidenced by the role of institutions such as MIT and Stanford University and the pre-eminent positions of ideologists such as Marvin Minsky.

6. In books such as "The Post Modern Condition" and "Driftworks" Lyotard has addressed the implications of a technological culture and how this impacts upon how we see ourselves and value our knowledge. Lyotard also curated "Les Immateriaux", a major exhibition at the Georges Pompidou Centre, Paris, which sought to develop his ideas into the objects of his discourse - high art, design, technology and consumer items - with the objective of illustrating his central concept of a dematerialising culture constituted in its signals and messages rather than its things.

7. Jean Baudrillard's "For a Critique of the Political Economy of the Sign". His earlier "The Mirror of Production" laid the general groundwork for this essential critique of Marxism and popular culture.

8. Michel Foucault, "The Order of Things", "The Archaeology of Knowledge" and "Discipline and Punish".

9. Mark Poster's "Foucault, Marxism and Technology" is an in-depth study of these aspects of Foucault, and although no direct reference is made to Wiener or Cybernetics it is difficult not to draw connections.

copyright Simon Biggs 1986