An extremely interesting paper has now made its public appearance:

- David Peak, Jevin D. West, Susanna M. Messinger, and Keith A. Mott,
"Evidence for complex, collective dynamics and emergent, distributed
computation in plants", Proceedings of the National Academy of
Sciences **101** (2004): 918--922 [link]

  *Abstract:* It has been suggested that some biological processes are equivalent to computation, but quantitative evidence for that view is weak. Plants must solve the problem of adjusting stomatal apertures to allow sufficient CO2 uptake for photosynthesis while preventing excessive water loss. Under some conditions, stomatal apertures become synchronized into patches that exhibit richly complicated dynamics, similar to behaviors found in cellular automata that perform computational tasks. Using sequences of chlorophyll fluorescence images from leaves of *Xanthium strumarium* L. (cocklebur), we quantified spatial and temporal correlations in stomatal dynamics. Our values are statistically indistinguishable from those of the same correlations found in the dynamics of automata that compute. These results are consistent with the proposition that a plant solves its optimal gas exchange problem through an emergent, distributed computation performed by its leaves.

The "cellular automata that compute" used as a baseline here are the ones grown as part of the Evolving Cellular Automata project. I happen to have worked on that project for a few years (making only the smallest of contributions), so I'm now going to talk your ear off about how cool my collaborators' work was, and how cool this new work is. I've explained cellular automata elsewhere, so I won't repeat myself here.

The motivation for the EvCA work was that there are some computations which are very easy if you have a centralized processor or memory, but very hard for decentralized systems. Suppose, for instance, I give you a string of random bits, and ask whether most of the bits are 0 or most of them are 1 ("density classification"). This is easy, if you can count and store your results in a central place; it's very hard if you can only do local operations, looking at, say, a bit and its immediate neighbors in the string. EvCA used genetic algorithms to evolve local rules --- cellular automata --- for these problems, measuring their fitness by (in essence) how likely they were to solve random instances of the task. The tasks themselves were completely trivial and uninteresting (a fact which eluded some people); what was interesting was seeing the tricks which evolution found to solve the problems in a decentralized fashion. (I won't draw any B.S. morals about politics or society from all this, but I will say I've long been tempted to send a pile of reprints to the Discovery Institute and the Templeton Foundation. Unfortunately that'd just make the honest ones' heads explode, and leave us to deal with the conscious charlatans --- a clear case of adverse selection.)
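To make the density-classification task concrete, here is a minimal sketch in Python --- not one of the evolved EvCA rules, but the hand-designed Gacs-Kurdyumov-Levin (GKL) rule, a classic local rule for this task. Each cell updates using only its own state and two cells on one side, yet the lattice usually relaxes to all-1s exactly when 1s were in the majority:

```python
import random

def gkl_step(s):
    """One step of the Gacs-Kurdyumov-Levin rule on a ring.
    A cell in state 0 takes the majority of itself and the cells
    1 and 3 steps to its left; a cell in state 1 does the same
    looking to its right. Purely local, no central counter."""
    n = len(s)
    out = []
    for i in range(n):
        if s[i] == 0:
            trio = (s[i], s[(i - 1) % n], s[(i - 3) % n])
        else:
            trio = (s[i], s[(i + 1) % n], s[(i + 3) % n])
        out.append(1 if sum(trio) >= 2 else 0)
    return out

def classify(s, steps=500):
    """Iterate the rule; ideally the lattice freezes at all-1s
    iff 1s were the initial majority, all-0s otherwise."""
    for _ in range(steps):
        s = gkl_step(s)
    return s

random.seed(0)
n = 149  # odd lattice size avoids ties, as in the EvCA experiments
s0 = [1 if random.random() < 0.6 else 0 for _ in range(n)]
final = classify(s0)
print(sum(s0) > n // 2, all(c == 1 for c in final))
```

The GKL rule classifies most random initial conditions correctly when the density is well away from 1/2, though (like every known local rule) it fails on some inputs near the tie point; the genetic algorithm's job in EvCA was to discover rules of comparable quality on its own.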

One of the more robust positive results was that, for many, many different
tasks, the cellular automata evolve a
system of computation based on "domains" and "particles". Domains are
extended regions in space and time where the CA is repeating the same pattern
over and over, possibly with a little unimportant noise incorporated into the
pattern. (More exactly, domains are patterns which are left unchanged by the
CA rule. For a yet more exact definition, see here
or here.) Domains contain
very little information, and do no computation. Particles are propagating
boundaries between domains, which repeat a given cycle of states over and over
and move through space at some constant velocity. They're like solitons (if
you're a physicist) or gliders (if you're an aficionado of
Conway's Game of
Life). When particles meet, they interact, producing new particles or
annihilating (looking rather like a Feynman diagram). Particles have a very
high density of information, and particle interactions are what end
up doing the computation. This is *emergent* computation: you can
describe what's going on in terms of the domains and particles, and doing so
is vastly
simpler and nearly as accurate as evaluating the low-level CA dynamics, but
the rules of the CA don't explicitly encode domains or particles at all;
they're generated entirely by local interactions.
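A bare-bones simulator makes the point vivid. Here is a sketch (the evolved EvCA rules have larger neighborhoods, but the mechanics are identical) of an elementary CA: the rule is nothing but a lookup table on three-cell neighborhoods, yet running the much-studied rule 110 from a random start produces a repeating background domain crossed by localized propagating structures --- particles:

```python
import random

def eca_step(s, rule):
    """One synchronous update of an elementary CA on a ring.
    Bit k of the Wolfram number `rule` gives the new state for the
    neighborhood whose three cells spell out k in binary."""
    n = len(s)
    return [
        (rule >> ((s[(i - 1) % n] << 2) | (s[i] << 1) | s[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Rule 110 from a random initial condition: after a short transient,
# a periodic background domain emerges, with particle-like boundaries
# moving through it. No engineering of the start state is needed.
random.seed(1)
width, steps = 64, 40
s = [random.randint(0, 1) for _ in range(width)]
for _ in range(steps):
    print("".join(".#"[c] for c in s))
    s = eca_step(s, 110)
```

Each printed row is one time step; reading down the page gives the space-time diagram in which domains and particles are visible to the naked eye.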

The idea of using particles to do computation goes back at least to the Game
of Life in the 1970s, and it's a pretty standard trick in studying CAs. (Here
is a non-EvCA paper
which does it, for instance.) Probably the most mathematically elaborate use
of particle computation is Matthew Cook's proof that one of the most basic
cellular automata rules (called "rule 110") is as powerful as a universal
Turing machine. Universal Turing machines are equivalent to a kind of
string-manipulation system called a Post tag system (named after the logician
Emil Post, who invented
them in the 1920s). Cook devised a variation on Post tag systems he called
cyclic tag systems, and showed that they, too, are computationally universal.
Finally, he showed how, if you engineer the proper arrangement of particles and
domains *just so* in rule 110, the particles implement a cyclic tag
system; and there you are. (You can read a sketch of Cook's proof in Stephen
Wolfram's book A New Kind of Science, and doing so is the
only *good* reason to read that monstrosity.) What was new in EvCA, and
in my humble opinion neat, was (a) showing that particle computation is
something which reliably gets evolved to solve this kind of problem, and (b)
showing that you could define particles endogenously, as it were, in terms of
domains. It's worth noting, too, that the particles in the EvCA rules don't
require this careful engineering; they form spontaneously, from completely
random initial conditions. You'd have to engineer things to *keep* them
from forming.
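Cyclic tag systems themselves are simple enough to state in a few lines of code. A sketch (the productions below are chosen for illustration, not taken from Cook's proof): at each step the leading symbol of the word is removed, and if that symbol was a 1, the current production is appended to the end; the productions are consumed in a fixed cyclic order.

```python
def cyclic_tag(word, productions, steps):
    """Run a cyclic tag system for `steps` steps, recording the word
    after each step. Pop the leading symbol; append the current
    production only if that symbol was '1'; cycle the productions."""
    history = [word]
    k = len(productions)
    for t in range(steps):
        if not word:
            break  # the system halts on the empty word
        head, word = word[0], word[1:]
        if head == "1":
            word += productions[t % k]
        history.append(word)
    return history

# Illustrative example: two productions, starting word "1"
hist = cyclic_tag("1", ["11", "10"], 6)
for w in hist:
    print(w or "(empty)")
```

That such a stripped-down rewriting game is computationally universal, and that rule 110's particles can be arranged to play it, is the substance of Cook's proof.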

(An earlier *negative* finding of the evolving CA project was that
the idea that systems must "evolve to the edge of chaos" to perform powerful,
adaptive computations is bollocks --- see here or here.
But once a notion so conformable to the zeitgeist becomes
endemic among the lumpenprofessoriat, it's hard for any amount of actual
science to uproot it.)

Now what does this have to do with plants? Well, as the abstract of Peak et al. explains, plants face an essentially computational problem as well, in deciding how to balance carbon dioxide uptake against water loss. This is really something they need to do globally, but the sensors and the decision-making are all local; so it's a distributed problem. What Peak et al. have shown is that the statistical properties of the plant stomata look very like those of CA evolved to solve global coordination problems, and that there seem to be persistent, localized, moving patches of activity --- particles --- involved in establishing the coordination. (In a two-dimensional system, like a leaf, one might also get extended lines or boundaries as emergent structures, as well as point-like particles, but they don't report such things.) This is very suggestive of some kind of distributed computation. One would like very much to be able to identify the computation more directly; I think I know a way to do that...

For another take, see this news piece in Nature (may require subscription), in which the usually-reliable Philip Ball (or his copy editor?) manages (1) to confuse the emergent particles with the basic cells of a CA, and (2) to say that Wolfram was the first to show cellular automata can "mimic computers", something established by John von Neumann and Stanislaw Ulam before Wolfram was born. (Via Melanie Mitchell, in e-mail.)

**Update while waiting for code to run, 3 February 2004**. I
think this is my most popular post ever. These are the links I know of which
haven't been picked up by trackback (below): Brad
DeLong; Metafilter; GR3M; The
Abstract Factory; Tobi
Schwarz; Language
Log. --- Ray
Girvan writes to point out that Philip Ball seems to have an unaccountable
fondness for Wolfram's tome, here attributing to
it ideas originating long before with Ed
Fredkin and collaborators. I'm surprised, because most of Ball's pieces
for Nature are actually well-informed, and his book on pattern
formation, The Self-Made Tapestry, is truly excellent. So
I'm puzzled
about what's gone wrong here.

*Trackback*:
Crumb Trail;
Stephen Laniel's Unspecified Bunker;
Signal + Noise;
Pedantry

Posted by crshalizi at January 21, 2004 22:10 | permanent link