Notes on Teilhard's Omega Point: Part 3, On Complexity, Consciousness, Thermostats and Panpsychism

Think of the following things:

A chair
A computer
A weed
A bacterium
A worm
A butterfly
A mouse
A dog
A chimpanzee
A human

Now think about taking a hammer to each item on the list, starting with the chair. At what point in the list do you start feeling morally queasy about randomly inflicting violence on each? I start to feel it at butterfly but it really cuts at dog. I'd be morally outraged if someone just randomly, for no good reason, killed a dog.

Generally speaking, our moral outrage correlates with the degree of consciousness we find in each entity. The greater the capacity for pain--physically or psychically--the greater the moral status.

Interestingly, this increase in consciousness (and concurrent moral standing) seems to be intimately related to the complexity of the entity. A worm is more complex than a bacterium and a mouse is more complex than a worm. On it goes to the most complex creature, man. Man is the most complex entity we know of in the universe and, interestingly, man also has the greatest range of consciousness. No one feels pain quite the way mankind does. In sum, there seems to be a rough correlation between material complexity and the range of consciousness.

In The Phenomenon of Man Teilhard calls this complexity/consciousness correlation the Law of Complexity and Consciousness. As he describes it (p. 60):

"Whatever instance we may think of, we may be sure that every time a richer and better organized structure will correspond to the more developed consciousness."

"The degree of concentration of a consciousness varies in inverse ratio to the simplicity of the material compound lined by it. Or again: a consciousness is that much more perfected according as it lines a richer and better organized material edifice."


In short, as material complexity increases so does consciousness.

The lingering question here is how we are to define "complexity." And is something "complex" going to be conscious by default? Or must there be a certain kind of complexity before we see consciousness arise?

David Chalmers in his influential book The Conscious Mind suggests that the "complexity" associated with consciousness is the kind involved in information processing. Chalmers' view is very abstract and sophisticated and I cannot do it justice here. In fact, a summary on my part is likely to be simplistic to the point of distortion. Please consult his book for a fuller treatment (particularly Chapter 8).

Risking much in summarizing, Chalmers' basic move is to replace a "material" view of the cosmos with an "informational" view, taking information as primary over matter. In this view, one increasingly common in scientific circles, the laws of the universe are software that "computes" the next step in the "program." The universe moves forward in time as one large computation. The current state of the universe--T1--is taken as input by the "program" (the physical laws of the universe), which "computes" the state of the universe at T2.
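To make this picture concrete, here is a toy sketch of my own (not Chalmers' formalism): the "law" is a fixed update rule that takes the state of a miniature "universe" at T1 and computes its state at T2. I've used elementary cellular-automaton Rule 110 purely as an arbitrary stand-in for physical law.

```python
# Toy model of "universe as computation" (illustrative only).
# The "universe" is a ring of binary cells; the "physical law" is
# cellular-automaton Rule 110, applied to every cell at once.

def step(state):
    """Apply the 'law' once: compute the universe at T(n+1) from T(n)."""
    rule = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
            (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}  # Rule 110
    n = len(state)
    return [rule[(state[(i-1) % n], state[i], state[(i+1) % n])]
            for i in range(n)]

t1 = [0]*10 + [1] + [0]*10   # the universe at T1: a single "particle"
t2 = step(t1)                # the universe at T2
t3 = step(t2)                # ...and so on, one large computation
```

Nothing about the rule itself matters here; the point is only the shape of the claim: state in, law applied, next state out.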

Chalmers speculates that wherever we see consciousness arise we see it associated with some kind of information processing. A dog or a bacterium takes in information from its surroundings, manipulates it, and produces some kind of "output" (a behavior or an internal change of state, like a memory). Interestingly, wherever we see these "biological computers," these "organic difference engines," we also find consciousness. And, as Teilhard noted, the more sophisticated the computer, the more information it can process, the greater the consciousness. Thus, in the human brain we find both the most complicated computer known to man and the most exquisite form of consciousness. In short, given Chalmers' work, Teilhard's Law of Complexity and Consciousness can be refined as the Law of Information-Processing Complexity and Consciousness.

This Law has some interesting consequences. To quote Chalmers (pp. 294-295):

"...we can think about what might happen to experience as we move down the scale of complexity. We start with the familiar cases of humans, in which very complex information-processing gives rise to our familiar complex experiences. Moving to less complex systems, there does not seem much reason to doubt that dogs are conscious, or even that mice are...

Moving down through the scale through lizards and fish to slugs, similar considerations apply. There does not seem to be much reason to suppose that phenomenology should wink out while a reasonably complex perceptual psychology persists. If it does, then either there is a radical discontinuity from complex experiences to none at all, or somewhere along the line phenomenology begins to fall out of synchrony with perception, so that for a while, there is a relatively rich perceptual manifold accompanied by a much more impoverished phenomenal manifold. The first hypothesis seems unlikely, and the second suggests that the intermediate systems would have inner lives strangely dissociated from their cognitive capacities. The alternative is surely at least as plausible. Presumably it is much less interesting to be a fish than to be a human, with a similar phenomenology corresponding to its simpler psychology, but it seems reasonable enough that there is something there.

As we move along the scale from fish and slugs through simple neural networks all the way to thermostats, where should consciousness wink out? The phenomenology of fish and slugs will likely not be primitive but relatively complex, reflecting the various distinctions they can make. Before phenomenology winks out altogether, we presumably will get some sort of maximally simple phenomenology. It seems to me that the most natural place for this to occur is in a system with a correspondingly simple 'perceptual psychology,' such as a thermostat. The thermostat seems to realize the sort of information processing in a fish or a slug stripped down to its simplest form, so perhaps it might also have the corresponding sort of phenomenology in its most stripped-down form. It makes one relevant distinction on which action depends; to me, at least, it does not seem unreasonable that there might be associated distinctions in experience."


This may seem to be a crazy conclusion to reach, that a thermostat can feel, but I'm with Chalmers on this one. I think it is clear that consciousness degrades as we move down through more primitive forms of information processing. Either this degradation occurs discontinuously (it "winks out") or smoothly. If the former, one must at least give some reason why consciousness should "wink out," as well as specify the rough location along the information-processing continuum where this is believed to occur. That seems a tall order, to me at least. Or you can simply take the most parsimonious explanation: consciousness degrades continuously from maximally complex (humans) to maximally simple (thermostats).
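It is worth seeing just how little a thermostat's "perceptual psychology" actually amounts to. Here is a minimal sketch (the class, setpoint, and numbers are my own illustration, not anything from Chalmers): the device makes exactly one relevant distinction, and its entire behavioral repertoire depends on that single bit of information.

```python
# A minimal thermostat: one distinction ("too cold" vs. "not too cold"),
# one action depending on it. This is the stripped-down information
# processing Chalmers has in mind. All names here are illustrative.

class Thermostat:
    def __init__(self, setpoint):
        self.setpoint = setpoint

    def react(self, temperature):
        # The device's whole "perceptual psychology" is this one bit.
        too_cold = temperature < self.setpoint
        return "heat on" if too_cold else "heat off"

t = Thermostat(setpoint=20.0)
print(t.react(18.5))  # the one distinction, one way
print(t.react(21.0))  # the one distinction, the other way
```

Whether that single bit of discrimination is accompanied by a maximally simple flicker of experience is exactly the question Chalmers leaves open.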

If we accept Chalmers' argument then a form of panpsychism results. As Chalmers says (pp. 297-299), "If there is experience associated with thermostats, there is probably experience everywhere: wherever there is a causal interaction, there is information, and wherever there is information, there is experience. One can find information states in a rock--when it expands and contracts for example--or even in the different states of an electron. So if the unrestricted double-aspect principle is correct [the information-processing/consciousness correlation--RB], there will be experience associated with a rock or an electron...this view has a lot in common with what is often known as panpsychism--the view that everything has a mind...I hope to have said enough to show that we ought to take the possibility of some sort of panpsychism seriously: there seem to be no knockdown arguments against the view, and there are various positive reasons why one might want to embrace it."

Teilhard, for one, did embrace a form of panpsychism. For Teilhard, as we saw with Spinoza and Chalmers, there is an intimate correlation between matter and consciousness. That is, where matter exists consciousness (however simple or degraded) must also exist. Phrased another way, consciousness is latent in matter. And, interestingly, this can't be considered a controversial or "crazy" point. It's fairly obvious that consciousness is latent, in some lawful way, in the evolution of matter. Just look at a snail. Or glance in a mirror. That's conscious matter staring back at you.

As Teilhard said (p.73), "pre-life has already emerged in the atom."

Next Post: Conclusion

This entry was posted by Richard Beck. Bookmark the permalink.

11 thoughts on “Notes on Teilhard's Omega Point: Part 3, On Complexity, Consciousness, Thermostats and Panpsychism”

  1. Richard,

    I agree with much of Chalmers' view, but where he runs into trouble is saying that consciousness goes all the way down the scale of complexity. It is fairly obvious that it stops somewhere along the way, just as living matter stops being alive once it loses a certain degree of associated properties.

    Even if I were to grant that consciousness in some form is available even to ant hills like Hofstadter claims, that still does not make ant hills moral agents.

    I hesitate to enter debates like this directly. Our experience in defining "intelligence" has been so riddled with potholes that confidently defining "consciousness" to the point where we can assign it to thermostats and politicians is problematic, to say the least. Hell, I'm not even sure we understand complexity well enough to distinguish between the biological and non-biological varieties.

  3. This parallels some of the singularity theorists, where all creation is seen as an information matrix. Also brings to mind an image of "rocks crying out."

    Great post, Richard, as always.

  4. I'm starting from your view that "our moral outrage correlates with the degree of consciousness we find in each entity" here, as I think it is the most testable statement in this appealing, but flawed narrative.

    I think the oddest thing about this view is the expectation that human morality is so logically precise and that there are underlying principles to it.

    Consciousness may be one of the factors that leads us to identify with the victim of this hypothetical hammer-attack.

    But I suspect most people would be more morally outraged were this hammer-attack happening to their child than to a random stranger - a factor that has nothing to do with consciousness. Similarly, is it less morally outrageous to attack someone in a coma than to attack someone in the street? And although I do not share their view, many people seem to think that a foetus (not as conscious or complex as a human being) is of equal status with human beings.

    Tribal and instinctive connections are more relevant in deciding who or what we identify with than simply the level of complexity or consciousness (although as we are complex and conscious it probably is one of the factors leading us to instinctively connect with other beings - but only one of them).

    I personally would be more worried about whether the behaviour of a mouse-assailant would be transferable to their relationship with other human beings than for the welfare of the mouse (same goes for a dog or any other non-human animal). However, I'm fairly sociopathic in this regard and there are many people who seem to identify more with animals than with humans (eg people who campaign against elephants being kept in cages in a circus but don't campaign against immigration detention centres).

    Now I think that there is something special about human beings (probably arising from our being created in the image of God) and that the rest of life is important but pales into insignificance compared to human beings. So on a theoretical level I'm willing to accept that you can place all non-human matter on a continuum, but on a tribal level the way I see matter is in terms of its use and importance to other human beings. A dog is more valuable and deserving of fewer hammer-attacks not because it is more conscious but because it provides a level of companionship to the lonely that I have never seen a rock provide. This is partly to do with complexity but also to do with character - a tiger is as complex as a mongrel but I wouldn't leave a tiger alone with a pensioner.

    What am I getting at? I think I'm trying to say that there may be some truth in there being an element of consciousness in all matter (though even then I don't like the word consciousness as to me it suggests self-awareness), but on a primary level at least, it just doesn't matter. Ultimately it's not one of the most important factors we use when forming judgements or making moral decisions and neither should it be.

    Do I get bonus points for writing this post in the "stream of consciousness" form? :p

  5. Chalmers' book was interesting to me, but the emphasis on 'information' I think is deeply problematic. This is because it cannot differentiate between 'information' processing that occurs sequentially (as in my Mac) and the massively parallel processing of my brain. Perceptual tasks are impossible for serial processing units (which is why 'facial recognition' software always uses virtual neural networks--and why they always have to be 'trained').

    I'm inclined to think that consciousness 'emerges' not because of 'information' processing, but because of the mode of its processing.

    Panpsychism is obviously true insofar as it simply says that consciousness arises out of the stuff of the world, but stronger theses are unwarranted.
    Just like 'liquid' no longer applies to one or two molecules of H2O, so also 'consciousness' does not apply to a two-state 'information' system such as a thermostat.

    As far as my 'moral queasiness' goes--I get queasy at worm. Though perhaps this is mostly because I wonder 'what kind of person would gratuitously kill a worm?'
    I should add that my sensibility in this matter has increased since becoming a vegetarian over a year ago...

    Peace,
    -Daniel-

    Let me add a couple of clarifications and comments that cut across many of your comments:

    Morality and Consciousness:
    It is true that consciousness and morality do not move in lockstep. But I think it very clear that they are, roughly speaking, correlated. I think that is a defensible claim, as many moral and legal issues--ranging from abortion to euthanasia to animal rights--do make appeals to issues of pain sensitivity, self-awareness and sensation thresholds. In sum, although consciousness doesn't dictate moral judgments it is hugely implicated in them. That is all I'm claiming.

    Phenomenological Thermostats:
    Although it may seem crazy that a thermostat expresses a kind of quantum of experience, upon reflection it is not strange at all. Look at the molecular workings of the neurons in the brain. It's just molecules shifting about. Physical stuff. Why should it be any less shocking that that stuff is conscious than that a thermostat is? Seeing a thermostat as conscious is no more crazy than seeing the physical matter of the brain as conscious. It's all just atoms!

    Think of it this way. Is a single neuron conscious? If I say yes it sounds as odd as the thermostat. A neuron just seems so, well, simple. Then why are millions and billions of neurons conscious? Aren't those just a bunch of interconnected thermostats?

    One might counter that consciousness is an emergent property that is not reducible to the building blocks. No doubt emergence is a part of all this. But what, exactly, is emerging? For Chalmers it is a more complicated information processing system and he's trying to create links between the two (consciousness and information processing). Yet, as the post discusses, information processing scales up, as does consciousness. Personally, I think there are phase transitions that take place, dramatic leaps forward that appear to create qualitative differences in the level of consciousness. There probably are emergent moments or thresholds (e.g., the onset of symbol usage). But these phenomenological "jumps" don't imply that consciousness, in raw form, was non-existent at lower forms of complexity.

    IMHO, at root the shock of the thermostat idea comes from the fact that it exposes our latent Cartesian assumptions as we muse about the brain (animal or human).

  7. Another clue that in consciousness lie some fundamental aspects of the universe comes from reflection upon some aspects of quantum physics. By performing certain types of actions having to do with, for example, light traversing narrow slits, one can affect the past. Here's a quote from Paul Davies' book "God and the New Physics," ch. 3, where he quotes physicist John Wheeler as saying, "The quantum principle shows that there is a sense in which what the observer will do in the future defines what happens in the past - even in a past so remote that life did not then exist." (The citation is an article entitled 'Genesis and Observership' in Foundational Problems in the Special Sciences, eds. R. E. Butts and K. J. Hintikka; Reidel, 1977.)

  8. It looks like you've finished your interesting series on Teilhard at least for now, so I'll toss in a couple observations. First, this idea of consciousness infusing matter is an issue addressed by the "speculative realists" and is kind of a hot topic in continental philosophy these days. Some of the leading figures are Graham Harman (American), Ray Brassier and Quentin Meillassoux (both French). I know this stuff only peripherally, but they generally support what you're talking about here.

    I wondered if you were going to critique Teilhard's idea of a long-term evolutionary trend toward greater complexity. Stephen Jay Gould says that this trend is an artifact of random movement away from what he calls the "left wall" of irreducible simplicity. There is no life form simpler than the unicellular organism (bacteria, etc.). Since these 1-celled creatures were the first ones to appear on the evolutionary scene, any variation in complexity has to move toward the right, toward greater complexity, since a move past the left wall into less complexity results in a return to the inanimate state. So you end up in the long run with a skewed distribution, with a big clump near the left wall and a long thin tail stretching out to the right.

    Gould further observes that the simplest of creatures are also arguably the most successful even now. It's not just the sheer numbers of creatures (more E. coli in one person's digestive tract over the course of a human lifespan than all the humans who have ever lived). In total biomass, the 1-celled creatures outweigh every other category of more complex living things.

    Your opening remarks about the hammer brought to mind some discussions I've had with Christians about whether they're more willing to support military action against non-Christians. If Christians' minds are supernaturally endowed with an ability to detect truths about God and nature that elude the unregenerate, then it could be argued that Christians have more complex consciousnesses than do non-Christians. Hence they might be more prone to wield the hammer against the infidels: less consciousness, lower moral status, less capacity for feeling pain.

    The French use the same word for "consciousness" and "conscience." This is true in Latin too: conscience = con-science = to know with.

  9. K,
    I'm running now so let me make a quick comment. First, thanks for the names, I'll need to be looking into them.

    Next, I have one more post in this series.

    I do agree with Gould's view (e.g., Full House), but I note he overlooks consciousness and its role in evolution in his account. As Teilhard would say, Gould is still only dealing with the Outside, only half the story.

    Finally, your last comment troubles me, as I see your point about a "Higher" level of consciousness doing violence to a "Lower" form. That implication is problematic. Two quick areas to think about in view of this issue are 1) a strong form of ahimsa and 2) avoiding the naturalistic fallacy.

    The beginning of this post grabbed my attention. For me, it's interesting because the form in which the "violence" is being inflicted upon each thing in the list (mainly once we reach the breathing things) is what affects me differently on a moral basis. Explanation:

    Starting with, say, the worm: killing a worm is not going to get me up in arms; however, if I witness someone literally winding up and swinging the hammer to pummel this obviously defenseless worm, I'm going to feel something different than if they simply set the hammer on top of the worm and apply some pressure, smashing it.

    This same sort of scenario would apply, and just be more intensified for me, once we get to the mouse. At that point the manner in which the pain and suffering is inflicted would make for more extreme differences in my feelings. As mentioned above, if the pain is being inflicted in a manner in which sheer dominance and force -- such as winding up and swinging with brute force -- is being used, it would disturb me more than, say, simply pushing the hammer down on the mouse's back to smash him.

    Hmm...I'm not sure; maybe it's that I get pulled into the intent/emotion behind a person's behavior, and that is what affects me differently. Maybe it just scares/disturbs me more when I feel someone has that level of anger/intent to inflict pain in such a powerful manner.
