November 21, 2003

Christof Koch, Zombie-Monger

Christof Koch's work on neural information-processing is excellent, and it's nice to see him getting profiled in the popular press (via Boing Boing). I'm a bit worried, though, that he may have bitten off more than he can chew in attacking consciousness. Of course, this is all filtered through a science journalist, so there's no knowing what degree of distortion and outright falsehood is involved in the report; really I should wait for Koch's book; but what fun is that?

As relayed by Wertheimer, and in the summaries on his homepage, e.g., this one-pager co-written with Francis Crick, Koch seems to be arguing that the function of consciousness is to allow planning and a decoupling of motor action from immediate stimuli, which (they say) could not be accomplished by purely unconscious, automatic, "zombie" neural systems. Wertheimer, in particular, makes it sound like Koch regards the regulation of behavior by the contents of short-term memory --- having a meaningful short-term memory --- as the criterion of consciousness.

I'm not going to get involved in the whole qualia issue, because Dennett has convinced me that it's a (subjectively!) red herring. What I will say is that, on technical grounds, this makes no sense at all. That is, I think it's obvious that "zombies", in Koch's sense --- reflexive neural circuits --- can plan, and have short-term memories. Newell and Simon exhibited machines capable of planning in the 1950s, and Miller, Galanter and Pribram's classic book Plans and the Structure of Behavior (1960) lays out an elaborate, computational theory of planning, crucially relying on a form of short-term memory. In fact, all of these systems are, technically, finite-state automata, so a 1969 theorem due to Patrick Suppes shows that there are stimulus-response systems which can learn to exhibit exactly such behavior. (Suppes later generalized the theorem so that it applies even to non-finite-state automata; the extension, and the application to planning systems, are described in his recent book --- which, in passing, is as good as something forty years in the making should be.) So there could be a zombie which met all of Koch's criteria for consciousness; Q.E.D. (This is, to repeat another of Dennett's points, why this whole way of thinking about consciousness in terms of "zombies" is a dead end --- or rather, in view of its persistence, an undead end.)
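To make the point concrete, here is a toy sketch (my own invention, not Suppes's construction or anything Koch discusses): a purely reactive finite-state machine whose internal states serve as a short-term memory, so that its motor action is decoupled from the immediate stimulus. A cue ("a" or "b") is stored in the state, and the matching action is emitted only when a later "go" stimulus arrives --- behavior regulated by the contents of short-term memory, yet implemented entirely by a fixed stimulus-to-transition lookup table.

```python
def make_zombie():
    """A reflexive 'zombie': pure state-transition table, no deliberation.

    Transition table maps (state, stimulus) -> (next_state, action).
    The states 'remember_a' / 'remember_b' are the short-term memory.
    """
    table = {
        ('idle', 'a'): ('remember_a', None),
        ('idle', 'b'): ('remember_b', None),
        ('idle', 'go'): ('idle', None),          # nothing remembered: no action
        ('remember_a', 'a'): ('remember_a', None),
        ('remember_a', 'b'): ('remember_b', None),  # a new cue overwrites memory
        ('remember_a', 'go'): ('idle', 'act_a'),    # act on the stored cue
        ('remember_b', 'a'): ('remember_a', None),
        ('remember_b', 'b'): ('remember_b', None),
        ('remember_b', 'go'): ('idle', 'act_b'),
    }
    state = 'idle'

    def step(stimulus):
        nonlocal state
        state, action = table[(state, stimulus)]
        return action

    return step

zombie = make_zombie()
actions = [zombie(s) for s in ['a', 'go', 'b', 'a', 'go']]
# actions == [None, 'act_a', None, None, 'act_a']
```

Note that the "go" response depends on what was seen earlier, not on the current stimulus alone; yet there is nothing here but reflex arcs. Suppes's theorem is the much stronger claim that such automata can be *learned* by stimulus-response conditioning, but this sketch at least shows the behavioral criterion is met by a lookup table.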

There is a long tradition in psychology and neuroscience of pointing to foresight, planning, the regulation of automatic processes, and the inhibition of reflex action, as important aspects of intelligence, or more generally the "higher mental functions". Sherrington had some classic, and wonderfully eloquent, statements of this view, which also influenced Russian neuropsychology, American cognitive science (especially, again, through Simon), etc. I am a bit afraid that Koch has fallen into the trap of confusing sophisticated adaptive behavior with consciousness --- or at least, has not set the threshold on the level of sophistication anywhere near high enough.

Now, Koch is a very bright man, and knows the literature in this field much better than I do, so I can only suppose one of two things: (a) something occurred to me immediately which eluded an actual expert; or (b) I have utterly failed to grasp what Koch has in mind, possibly because it has been distorted in the telling. My experience with journalists and popular science writing (and, of course, that natural humility with which my readers are so familiar) very strongly inclines me towards the second option. But we'll see when the book comes out.

Minds, Brains, and Neurons; Philosophy

Posted by crshalizi at November 21, 2003 22:49 | permanent link

Three-Toed Sloth:   Hosted, but not endorsed, by the Center for the Study of Complex Systems