Strong Theory, as in strong AI or AL (Artificial Intelligence or Artificial Life), has already yielded the next emergent high-level cognitive reorganization. From the perspective of the new sciences of the artificial, reality is one of many possible configurations of self-organizing complex physical systems coadapting in parallel with complex symbolic systems. In this view, biological (carbon-based) life, universal constants, and human consciousness are merely a subset of the "possibility" of forms. The simulated (computed) environment, once considered a "virtual reality," is now regarded as a laboratory for the exploration of rule-based systems and behavior within multiply pathed evolutions. The following is a brief survey of domain applications of this theory, and of how art is integrated throughout as a science of design, but could be more consciously developed as Propositional Realities (PR).
The inquiry system, Strong Theory, expounds an open or adaptive model. This means the inquiry system itself will be regressively altered by its object. A new vocabulary is applied to "art" (PR): complexity, adaptiveness, evolution/morphogenesis, genome/phenome, environment, simulation, and nonlinear model. The conclusion of this paper is dependent upon the dynamic interaction of ST and art (STart). In this context, ST challenges the writer and reader not merely to pay lip service to "open/adaptive" form, but to perform it -- in the case of the writer, as a demonstration of "nonlinear" techniques (e.g. embedded litomes [literature + genome]), and in the case of the reader, as a suspension of expected academic "signals." Finally, ST's methodology allows for the axiomatization of the obsessively argued subject -- the unification of art and science.
Long Contents
Introduction
The Santa Fe Institute proceedings, at the moment twenty-seven volumes, explore the science of complexity as it relates to many disciplines (see appendix 1). I first came across these volumes when doing research in the field of Auditory Display. AD is the technique of mapping data characteristics to sound parameters, utilizing the sensitivity of the ear to subtle temporal fluctuations, or using sound as a complementary perceptual channel for "viewing" large and complex data structures. I realized immediately that this was a more general schema for "music" theory, a perception-based theory, and further, that it provided a basis for an inclusive theory of all art via (interface) design theory. I didn't realize then that it was derived from a unified field theory, an epistemological reformation.
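The core AD technique named above, mapping data characteristics to sound parameters, can be sketched in a few lines. This is an illustrative Python sketch (not the document's SuperCollider, and not any AD standard); the function name and frequency range are assumptions chosen for the example.

```python
# Minimal sketch of parameter-mapping sonification, the core idea of
# Auditory Display: each data value is mapped to a sound parameter
# (here, pitch in Hz). The mapping and range are illustrative.

def map_to_frequency(data, lo_hz=220.0, hi_hz=880.0):
    """Linearly rescale a data series onto a frequency range."""
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0  # avoid division by zero for flat data
    return [lo_hz + (x - lo) / span * (hi_hz - lo_hz) for x in data]

freqs = map_to_frequency([0.0, 0.5, 1.0])
# 0.0 -> 220.0 Hz, 0.5 -> 550.0 Hz, 1.0 -> 880.0 Hz
```

In a real display these frequencies would drive oscillators; the point is only that the data's temporal structure survives the mapping and becomes audible.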
The Artificial
Artificial Science is a general domain concerned with the simulation/realization of complex adaptive systems (CAS), as in the newly emergent field of Artificial Life. Artificial Life is concerned with generating lifelike behavior (Langton) in computed environments. In the AL field, an important distinction is made between lifelike behavior and the material (not necessarily carbon-based) which embodies that behavior. There is a blurring of the distinction between the simulation of life, regarded as symbolic (life-like), and the realization of life itself (Pattee). Strong Theory states that lifelike behavior may be embodied in radically abstracted, algorithmic form: computer programs. The criteria for lifelike behavior will be explored.
Distributed Cognition
A brief history of simulation is drawn from a recent article in Presence magazine. It indicates that there exists a human tendency toward technological simulation, and that this has significance for cognitive evolution. This leads to an exploration of contemporary technologies for networked/symbolic environments, in the form of corporate/industrial networking tools (Meindl). What "function" is being fulfilled by this wave of distributed cognitive technology? Some technical groundwork of VR and computer-human interface design will be discussed (Buxton, Steuer).
Propositional Realities
Morphogenesis is a natural way to define changing music structure, in that this art has historically habituated humans to the function of transition- and modulation-based environments. An article by David Rosenboom discusses the application of elements of complexity theory to musical realities. It is written in a style easily extrapolated to include artificial (electronically embodied) environments. This serves as a bridge to personal artistic experiments in algorithmically scripted realities.
nonVRBL
I use a sound synthesis package called SuperCollider to implement a basic genetic algorithm as described by Langton. This is implemented within a VRML environment. The code is included and annotated.
STart
Introduction
In our age the computer has altered the intellectual, social, and institutional map. Yet Herbert Simon, a prominent psychologist, points out that it is obviously not the hardware that excites the modern imagination, but the use of the computer as a design tool; that the proper study of mankind is the science of design (Simon 96). It is in this context that I refer to "art."
In the Santa Fe proceedings, mentioned above, the disciplines that utilize the modern inquiry system of complexity, aided by computers, are exclusively the sciences. The artist slips in the back door only under the guise of AD (Auditory Display [music]) or Nonlinear Modeling and Forecasting (visualization). It is for this reason that I embrace the emerging artistic movements centered around computers, and those who apply a new rigor in theory and practice.
The computer is the new speculum mundi, and is generating the same atmosphere of discovery that accompanied the early use of the microscope and the telescope. Analytical sciences, such as mathematics, are reluctantly becoming experimental disciplines as they observe in wonder the evolution of complex numerical environments. It is no different for the musician who runs a genetic algorithm with unexpected auditory results.
The computer is an electronic laboratory suited to a different methodology of speculation. It has been referred to as "order for free," or "run it and see." It presents a new psychological challenge for homo cyberneticus, and many are reluctant -- especially when strong theories of artificial life, intelligence, and art (creative intelligence!) are brought into the picture. We are obviously an insecure society, with fears rooted in a racial inferiority complex toward the cyborgian/enhanced creature. Amid the inspirational is where we will make our last stand!
Beyond the fears of individuals and institutions are pioneers who are thriving amid the fast-developing tools and slow-developing ideas. The use of computational prosthetics for the exploration of consciousness has only been glimpsed. The art of the artificial medium may be, simply, the conscious design of consciousness. Yet society as a whole seems somehow linked in an overall creative effort where the tools, the infrastructural efforts, and the organizational management paradigms are equally as fascinating as the content.
The Artificial
Artificial means man-made; nature is considered untouched by human hands, or minds. The artificial world has tended to be identified with straight lines and perfect mathematical proportions, and this has been looked upon either as more perfect than nature or, contrarily, as second-hand. Formal theory in the age of complexity, however, has become more sophisticated, implementing folding, catastrophic, or emergent architectures (Jencks); "cosmogenic" architectures. It is still artificial; unapologetically so (the "2nd Nature").
According to ST, the artificial world has subsumed the natural. The buzzword is "possible...." Nature, the universe, is but one possible configuration of matter, universal constants, and consciousness; but there are other possible evolutions, and these can be simulated/realized within an artificial incubator. Copernican revolution: the artificial is not within the natural, but the natural is within the artificial.
Complexity theory is the mechanics of artificial science, a dynamic organizational scheme. There have been three main conceptions of complexity which have evolved in this century: 1. holism, Gestalts, and creative evolution; 2. information, feedback, cybernetics, and general systems; 3. (current) chaos, adaptive systems, genetic algorithms, and cellular automata (Simon 96).
A definition of complexity, as described by Singer, requires numerous interacting agents with emergent, aggregate, nonlinear behaviors. Complex Adaptive Systems (CAS) should exhibit self-organizing, self-fixing, evolving qualities. Finally, novelty emerges as a result of bottom-up, rule-governed procedures. These rules apply to both computer-based and natural systems. I cannot provide an explanation of all components of complexity theory, but will concentrate on a few aspects.
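The claim that novelty emerges from bottom-up, rule-governed procedures is easiest to see in a cellular automaton, one of the current-era examples named above. The following Python sketch (an illustration, not from the source) runs Wolfram's elementary rule 30: each cell obeys a purely local rule, yet the aggregate pattern becomes complex and hard to predict.

```python
# Bottom-up, rule-governed emergence in miniature: an elementary
# cellular automaton (rule 30). Each cell's next state depends only
# on itself and its two neighbors, yet the row develops intricate,
# seemingly random structure over many steps.

def step(cells, rule=30):
    """One synchronous update of a row of 0/1 cells; edges wrap."""
    n = len(cells)
    out = []
    for i in range(n):
        left, mid, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (mid << 1) | right   # neighborhood as 0..7
        out.append((rule >> idx) & 1)            # look up the rule's bit
    return out

row = [0, 0, 0, 1, 0, 0, 0]   # single live cell as the seed
row = step(row)
# one step spreads the single cell: [0, 0, 1, 1, 1, 0, 0]
```

Iterating `step` and printing each row reproduces the familiar triangular rule-30 pattern; nothing in the local rule hints at that aggregate form.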
Complexity theory resolves the [life = art/art = life] equation with a "continuous coadaptive" model. For example, are we using computation as an aid in understanding biology (evolution, cognition), or are we using biology as a metaphor for work on computation (neural nets) (Gell-Mann)? Thus CAS lead to general organizational principles, from pre-biotic chemical reactions, to ecological communities, to complex symbolic systems, to computer hardware, to similar functional systems on other planets. This is a function-centric paradigm rather than a materialistic one.
The application of this theory to biology has had far-reaching implications via the emergent study of artificial life. I explore artificial life in some detail as an example of one application of ST in a particular domain, biology. The key criteria of what defines life-like behavior are abstracted from a biological model, but are complemented by artificial theory -- what fundamental principles might evolve life forms other than biological/carbon ones?
Artificial Life: as described by Farmer and Belin (Emmeche, p12):
1. Life is a pattern in space/time, rather than a specific material object.
2. Life is self-reproducing.
3. Life is associated with information storage, self-description (genome).
4. Life survives by metabolic processes (it feeds on ordered energy and gives off disordered energy, heat -- thermodynamics, entropy).
5. Life enters into a functional relationship with its environment (phenome).
6. Life can die (dependencies).
7. Life maintains a dynamic stability (self-fixing).
8. Life evolves.
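Several of these criteria can be caricatured in a few lines of code. The Python class below is an illustrative toy, not anything from the source: it has a genome (criterion 3), a crude metabolism and the capacity to die (4, 6), and reproduction with occasional copying error (2, 8). It is deliberately not a claim about what would suffice for life.

```python
# A toy "organism" exhibiting, in severely abstracted form, a few of
# the Farmer/Belin criteria. All specifics (energy budget, mutation
# rate) are arbitrary choices for the sketch.

import random

class Organism:
    def __init__(self, genome, energy=10):
        self.genome = list(genome)   # criterion 3: information storage
        self.energy = energy

    def metabolize(self, food):
        self.energy += food - 1      # criterion 4: feeds on ordered energy

    def alive(self):
        return self.energy > 0       # criterion 6: life can die

    def reproduce(self, rng):
        # criterion 2: self-reproduction, with occasional copying error
        child = list(self.genome)
        if rng.random() < 0.1:
            i = rng.randrange(len(child))
            child[i] = 1 - child[i]  # flip one bit: variation (criterion 8)
        self.energy //= 2            # parent splits its energy with the child
        return Organism(child, self.energy)

rng = random.Random(0)
parent = Organism([1, 0, 1, 1])
child = parent.reproduce(rng)
```

The open question the paper raises next is exactly whether fulfilling such functional criteria, in however sophisticated a form, is enough.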
The question remains: if an algorithm fulfills these functions, can it be determined that this is a real life form, or further, that it is a conscious life form? A set of four criteria on the cognitive activity abstracted from CAS is provided by Gell-Mann.
1. Experience can be thought of as a set of data, usually input-output, with inputs including system behavior and outputs including effects on the system.
2. The system perceives regularities in the experience; the remaining information is treated as randomness.
3. Experience is not simply recorded, but incorporated into a schema. Mutation processes give rise to rival schemata. Each schema provides descriptions, predictions, and prescriptions for action.
4. The results obtained by a schema then feed back to affect its standing with respect to the other schemata with which it is in competition.
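Gell-Mann's four criteria can be made concrete with a minimal sketch. In the Python illustration below (my construction, not Gell-Mann's), two rival schemata -- trivially simple predictors with made-up names -- describe a stream of experience, and each prediction's success feeds back into the schema's standing (criterion 4).

```python
# Sketch of schema competition: rival schemata perceive (or fail to
# perceive) the regularity in a stream of experience, and feedback
# from their predictions adjusts their relative standing.

schemata = {
    "repeat-last": lambda history: history[-1],      # predicts no change
    "alternate":   lambda history: 1 - history[-1],  # predicts a flip
}
standing = {name: 0 for name in schemata}

experience = [0, 1, 0, 1, 0, 1]      # the regularity to be perceived
history = [experience[0]]
for observed in experience[1:]:
    for name, predict in schemata.items():
        if predict(history) == observed:   # criterion 4: feedback
            standing[name] += 1
    history.append(observed)

best = max(standing, key=standing.get)
# on alternating experience, the "alternate" schema wins every round
```

Criterion 3's mutation of schemata is omitted here; adding it would mean occasionally generating perturbed copies of the predictors and letting the same feedback loop sort them out.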
An important concept in complexity theory, as described by Simon, is hierarchy (Simon 95). This refers to the aggregate levels of a system. In other words, certain activities are insignificant (weak) relative to a higher level of a system; e.g. quantum uncertainty has little effect at the level of the organism. I am particularly interested in this idea in relation to symbolic thought. What is the function of thought at the level of the organism's interaction with its environment?
Strong theory in the domain of intelligence has made a decision on the mind-matter question. If a system -- a computer -- can do symbol processing, it can think. Neurons and chips are physically different but functionally similar. Thus there is a symbolic level of theory above the hardware or neuronal level (Simon 95). I would like to focus on the relationship of personal and cultural symbolic complexity as embodied in PCs and global distributed computing.
Distributed Cognition
Distributed environments are just a matter of IP addressing in the age of global supercomputing. There is a concerted human effort to design and implement (own!) the ultimate, integrated, fully networked OS; a PC (personal consciousness) linked to the "everything." Main concerns for designers of these systems are representation (3D vs. 2D), metaphors (spatial [cyberspace, navigator/explorer]), and organization (integrated OS vs. multiple programs). Is it important to know where computation is taking place (client/server) as long as it is displayed on your monitor? These are some of the questions facing designers of software for distributed/cooperative cognition within organizations (Meindl).
Two domains might approach this design issue from different points of view. The technological/industrial approach takes version 2.0 and optimizes its structure for version 3.0, building layer upon layer of solutions. In the academic domain the focus is not on product, but on theory-based design -- the wetware's connected to the symbolic system, the symbolic system's connected to the software, etc. In 1975 the first attempts to move toward research on the process (a theory) of design began at the Design Research Center at Carnegie Mellon University. There, design theory was aimed at broadening the capabilities of computers to aid design, and 3D CAD programs were first developed.
Distributed computing workspaces have lately become objects of design themselves, and they are created and evolve within the environment that they are designing. "Spider" is a software environment for distributed cognition (Boland). It is an integrated suite of spreadsheets, databases, communications (email), and group workspaces (virtual whiteboard); it is a real-time environment. Spider is based on a paradigm of arriving at decisions by generating questions and interpretations, rather than on "information systems that build a pipeline that will deposit the required data at the proper time to the appropriate decision maker" (Boland p250). The design alters management (decision-making) organization on a fundamental level.
Spider is an example of 2D simulation, and a practical use of multiple user environments. It points to future technologies that may be 3D immersive, extremely sophisticated work environments, with smart facilitating agents that stimulate the users toward novel solutions to practical design (artistic) decisions. These systems would simulate multiple evolutions from given data (seeds), and choose an optimized solution for a desired goal. There is no question that a general cognitive enhancing station is on its way (Netscape Communicator, JAVA machine, Windows95), and will be utilized for any disciplinary function.
There are different scenarios for simulation environments -- immersive, augmented reality -- with the aim of presence/telepresence/chronopresence (applying inductive or deductive logic: show me where this reality was derived from, show me where this reality will arrive, in this type of environment). In a current issue of Presence magazine there is a history of American "simulation entertainments": cycloramas, panoramas, historic recreations, mechanical rides from amusement parks, and world's fair exhibits that provided group experiences of faraway places (a trip to the moon) (Maloney). Issues discussed were suspension of disbelief, passive/vicarious experience, and propaganda uses for simulations. Today computer-generated virtual environments continue this trend, even if economically motivated to exploit destructive or perverse tendencies (video games).
How about an environment which explores something as seemingly innocuous as different configurations of sounds, shapes, and colors? The reason art may have the bad reputation of being a "soft" discipline is that people have not realized that these sounds, shapes, and colors were experimental implementations on technologies, and in media, that would later be used to satisfy this desire for ever-increasing resolution and presence in simulated environments.
Propositional Realities
Representation is a key issue in the cognitive sciences and the science of design. The aim of representation is to make a solution transparent by adequately representing the system. The success of a new organizational paradigm is determined by the number of new solutions generated by it. What significance, then, does the idea of the blurring of simulation and reality have in regard to representation? And what about when the goal is not to find solutions but to find questions? I think we will soon have a discipline dedicated to mapping the future (the possible), though it will eventually give up because the possibilities are infinite. A general theory of the future will then be implemented, and any one instance of the future will be explained by this theory. The issue will become decision and design.
David Rosenboom, in a paper on propositional music, views music composition as "the proposition of musical realities -- complete cognitive models of music -- using propositional musical language accompanied by a propositional language of music theory" (Rosenboom). He differentiates between speculative and experimental music: "experimental has become distorted by historical and stylistic associations." This may have some significance in a regressive rewriting of music history, distinguishing between the types of composers who are known because they were prolific in one compositional style and those who were famous for their less prolific but innovative influence.
A new compositional environment may be able to program in parameters of morphogenesis, such as stasis (don't evolve for a couple of seconds). An intuitional command of seeding the environment will be important for future composers. Rosenboom's compositional tips:
Note: Chreod -- a topological construction that describes the projection or spread of an evolving, morphogenic process into a region of space-time, given a defined beginning known as the initiation set of the chreod. This is related to the idea of a morphogenic field, which also attempts to describe the spatial diffusion or region of influence of a morphogenic process.
This genetic algorithm implements functions of reproduction, variation (random), and selection. Conceptually, there is a decoding process from the genome/algorithm to the phenome/instantiation. The result is then evaluated by the environment. Plan A has not been implemented at this time. See plan B; then go to the link nonVRBL if you have a VRML-capable browser.
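The conceptual loop just described -- reproduce, vary at random, let the environment select -- can be sketched generically before turning to the plans. The Python below is my illustration, not the author's SuperCollider implementation: the bit-string genome and the "environment" (a fitness function that happens to favor all-ones) are hypothetical stand-ins.

```python
# Minimal genetic-algorithm sketch: selection keeps the fitter half
# of the population unchanged (elitism), and each survivor emits one
# child with a single random bit flipped (variation).

import random

def evolve(pop_size=8, genome_len=10, generations=30, seed=1):
    rng = random.Random(seed)
    fitness = sum                                   # environment's evaluation
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]            # selection
        children = []
        for parent in survivors:                    # reproduction
            child = list(parent)
            i = rng.randrange(genome_len)
            child[i] ^= 1                           # random variation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

Because survivors are carried over unmutated, the best fitness in the population can never decrease from one generation to the next; that monotonicity is the only property the sketch guarantees.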
Plan A (Wilson)
A. one cell [list] becomes two offspring cells.
B. they share common genetic code.
C. they can differ from the parent and each other.
D. the cells can differ in size.
E. the environment is a factor in mutation.
X(trigger/if-conditional) + K(parent) => K(daughter)K(son)
Growth rules:
A. Identify rules which satisfy conditions.
B. Chosen rule emits offspring, signal.
C. Repeat.
Excitation rules (each rule has a weight):
A. Reproductive: A => AA, Wrep -- detects adjacency
B. Inhibitory: Sig + A => A, Winh -- triggered by signal, maintains status quo
C. Deletion: Sig + A => 0, Wdel -- triggered by signal, deletes
If Wr is large, and there are few cells, select (reproduce).
Wi and Wd increase relative to reproduction, eventually reaching equilibrium.
System's EquilibriumSize = (1/constant)(Wr/Wd).
Symmetry (L, R), Periodicity (tone, noise), Polarity (phase).
A => B(rand) B(rand) -- rand function for slight asymmetry
B => [D C]L or R[C D] -- handedness
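One way to read the weight scheme above is as a birth-death process in which reproduction (Wr) competes with deletion (Wd), reproduction losing force as the colony fills in, so that the population hovers near a size proportional to Wr/Wd, echoing Wilson's EquilibriumSize relation. The Python sketch below is my reading, not Wilson's model; the probability formula and parameters are assumptions for illustration.

```python
# Illustrative birth-death reading of the weighted excitation rules:
# each step, reproduce (A => AA) with a probability that shrinks as
# the population grows, otherwise delete one cell (Sig + A => 0).
# Equilibrium sits where the two pressures balance, near Wr/Wd.

import random

def simulate(Wr=40.0, Wd=1.0, steps=2000, seed=2):
    rng = random.Random(seed)
    size = 1
    for _ in range(steps):
        p_rep = Wr / (Wr + Wd * size)   # reproduction pressure falls off
        if rng.random() < p_rep:
            size += 1                   # A => AA
        elif size > 1:
            size -= 1                   # Sig + A => 0
    return size

final = simulate()
# with Wr/Wd = 40, the population drifts up from 1 and then
# fluctuates around a size on the order of 40
```

Raising Wr relative to Wd raises the equilibrium size, which is the qualitative content of the EquilibriumSize formula in the text.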
Plan B (Langton)
Implemented: Simple Linear Growth
This is a Lindenmayer system, which consists of sets of rules for rewriting strings of symbols.
0. A -- initial seed
1. CB => A -- cb replaces a
2. A => B -- a replaces b
3. DA => C -- da replaces c
4. C => D -- c replaces d
Each member of the list is evaluated from left to right, then output.
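Before the SuperCollider version, the same four-rule rewriting can be stated in a few lines of Python (my sketch, not part of the original implementation): every symbol in the string is replaced by its successor in parallel, left to right, once per generation.

```python
# Minimal Lindenmayer-system rewriter for the Plan B rules:
# a -> cb, b -> a, c -> da, d -> c, applied to every symbol at once.

RULES = {"a": "cb", "b": "a", "c": "da", "d": "c"}

def rewrite(string, rules=RULES):
    """One generation: replace every symbol by its successor string."""
    return "".join(rules.get(sym, sym) for sym in string)

gen = "a"                 # initial seed
for _ in range(3):
    gen = rewrite(gen)
# a -> cb -> daa -> ccbcb
```

In the SuperCollider code that follows, the same rewriting is interleaved with sound production: each symbol is also a list of frequencies, so every generation of the string is simultaneously a generation of the music.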
defaudioout L, R;
deftable t1 t1ps env;
defenvelope env_func;
--a is replaced by cb
--b .. a
--c .. da
--d .. c
var pattern, a=[420], b=[210 210], c=[140 120], d=[120 140];
start {
phenome = [a a a];
pattern = Sseq(phenome, 1000, \nil, `evaluation, \nil);
genome;
}
evaluation { arg phenome, count;
var length pos;
phenome.forEach ({arg item;
switch (item)
case a;
pos = phenome.indexOf(a);
phenome.remove(a);
phenome.add (pos, [c b]);
pos.post;count.post;phenome.post;
case b;
pos = phenome.indexOf(b);
phenome.remove(b);
phenome.add (pos, [a]);
pos.post;count.post;phenome.post;
case c;
pos = phenome.indexOf(c);
phenome.remove(c);
phenome.add (pos, [d a]);
pos.post;count.post;phenome.post;
case d;
pos = phenome.indexOf(d);
phenome.remove(d);
phenome.add (pos, [c]);
pos.post;count.post;phenome.post;
default;
^phenome
end.switch;
})
}
genome {
var dur, freq;
freq = pattern.value;
dur = (0.1 + 0.3.rand);
complexity(dur, freq);
[dur, thisFunc].sched;
}
complexity {
arg dur, freq;
var osc1 osc2 xenv prev f1 f2 a1 a2 len chan;
f1 = (freq + 20.0.rand);
f2 = (freq + 20.0.rand);
len = (0.5 + 3.0.rand);
osc1 = Acoscili(t1, f1, 1); -- use the detuned frequencies f1, f2
osc2 = Acoscili(t1ps, f2, 1);
xenv = Ktransient(env, len, 0.1, 0, `dspRemove);
a1 = (10.0 + 100.0.rand);
a2 = (12.0 + 100.0.rand);
chan = [L, R].choose;
{
prev = osc1.value(\nil, osc2.value(\nil, prev) * a2);
(prev * xenv.value).out(chan);
prev = prev * a1;
}.dspAdd(0);
}
Santa Fe Institute Proceedings
The Economy as an Evolving Complex Systems II
Edited by W. B. Arthur, S. N. Durlauf, and D. Lane
Proceedings Volume XXVII, 1997
Adaptive Individuals in Evolving Populations: Models and Algorithms
Edited by R. K. Belew and M. Mitchell
Proceedings Volume XXVI, March, 1996
Reduction and Predictability of Natural Disasters
Edited by J. Rundle, B. Klein, and D. Turcotte
Proceedings Volume XXV, 1995
Evolving Complexity and Environmental Risk in the Prehistoric Southwest
Edited by Joseph A. Tainter and Bonnie Bagley Tainter
Proceedings Volume XXIV, 1995
Maturational Windows and Adult Cortical Plasticity
Edited by B. Julesz and I. Kovacs
Proceedings Volume XXIII, 1995 [August availability]
The Mind, the Brain, and Complex Adaptive Systems
Edited by J. Singer and H. Morowitz
Proceedings Volume XXII, 1995
Spatio-Temporal Patterns in Nonequilibrium Complex Systems
Edited by P. Cladis and P. Palffy-Muhoray
Proceedings Volume XXI, 1995
The Mathematics of Generalization
Edited by David H. Wolpert
Proceedings Volume XX, 1995
Complexity: Metaphors, Models, and Reality
Edited by George A. Cowan, David Pines, and David Meltzer
Proceedings Volume XIX, 1994
Auditory Display: The Proceedings of ICAD '92, the International Conference
on Auditory Display
Edited by Gregory Kramer
Proceedings Volume XVIII, 1994
Artificial Life III
Edited by C. G. Langton
Proceedings Volume XVII, 1993
Understanding Complexity in the Prehistoric Southwest
Edited by G. Gumerman and M. Gell-Mann
Proceedings Volume XVI, 1994
Time Series Prediction: Forecasting the Future and Understanding the
Past
Edited by A. S. Weigend and N. A. Gershenfeld
Proceedings Volume XV , 1993
The Double Auction Market: Institutions, Theories, and Evidence
Edited by D. Friedman and J. Rust
Proceedings Volume XIV, 1993
The Principles of Organization in Organisms
Edited by J. E. Mittenthal and A. B. Baskin
Proceedings Volume XIII, 1992
Nonlinear Modeling and Forecasting
Edited by M. Casdagli and S. Eubank
Proceedings Volume XII, 1992
Evolution of Human Languages
Edited by J. A. Hawkins and M. Gell-Mann
Proceedings Volume XI, 1992
Artificial Life II
Edited by C. G. Langton, C. Taylor, J. D. Farmer, and S. Rasmussen
Proceedings Volume X, 1992
Artificial Life II Video Proceedings
Edited by C. G. Langton
Available as a set (book plus video) or separately, 1992
55492-5 (videotape)
Molecular Evolution on Rugged Landscapes: Proteins, RNA, and the Immune
System
Edited by A. S. Perelson and S. A. Kauffman
Proceedings Volume IX, 1991
Complexity, Entropy, and the Physics of Information
Edited by W. H. Zurek
Proceedings Volume VIII, 1990
Computers and DNA
Edited by G. I. Bell and T. G. Marr
Proceedings Volume VII, 1990
Artificial Life
Edited by C. G. Langton
Proceedings Volume VI, 1989
The Economy as an Evolving Complex System
Edited by P. W. Anderson, K. Arrow, and D. Pines
Proceedings Volume V, 1988
Lattice Gas Methods for Partial Differential Equations
Edited by G. Doolen et al.
Proceedings Volume IV, 1990
Theoretical Immunology (2 books: Part One and Part Two)
Edited by A. S. Perelson
Proceedings Volume II and III, 1988
Emerging Syntheses in Science
Edited by D. Pines
Proceedings Volume I, 1988
Bibliography
Boland, Tenkasi, Te'eni. (1994). Designing Information Technology to Support Distributed Cognition. (see Meindl). pp. 245-280.
Buxton, William. (1992). Telepresence: Integrating Shared Task and Person Spaces. Proceedings of Graphics Interface '92. 123-129.
Emmeche, Claus. (1994). The Garden in the Machine. Princeton University Press.
Gell-Mann, Murray. Complex Adaptive Systems. (see Morowitz).
Jencks, Charles. (1995). The Architecture of the Jumping Universe. Academy Editions.
Boje, David. (1996). Postmodern Management and Organization Theory. Sage Publications.
Kramer, Gregory. (1994). Auditory Display. Santa Fe Institute: Studies
in the Sciences of
Complexity. Addison-Wesley Pub.
Langton, Christopher, Ed. (1987). Artificial Life. Santa Fe Institute:
Studies in the
Sciences of Complexity. Addison-Wesley Pub.
McCartney, James. SuperCollider. Santa
Maloney, Judith. (1997). Presence, Vol. 6, No. 5, October 1997. 565-580. MIT Press.
Meindl, James. (1996). Cognition Within and Between Organizations.
Sage Publications.
Morowitz, Harold, Ed. (1995). The Mind, The Brain, and Complex Adaptive Systems. Santa Fe Institute: Studies in the Sciences of Complexity. Addison-Wesley Pub.
Oe, kf. Art as machine. (1997). http://www.create.ucsb.edu/harp/N/cogsci.html
Pattee, H. H. Simulations, Realizations, and Theories of Life. (see Morowitz).
Rosenboom, David. (1997). Propositional Music: On Emergent Properties in Morphogenesis and the Evolution of Music. Leonardo, Vol. 30.
Simon, Herbert. (1996). The Sciences of the Artificial. MIT Press.
Simon, Herbert. (1995). Near Decomposability and Complexity. (see Morowitz).
Singer, Jerome L. Mental Processes and Brain Architecture. (see Morowitz).
Steuer, Jonathan. Defining Virtual Reality.
http://www.cyborganic.com/~jonathan
Steuer, Jonathan. Conversation, Context, & Computers.
http://www.cyborganic.com/~jonathan
Wilson, Stewart. The Genetic Algorithm and Simulated Evolution. (see
Langton).