Is subjective experience essential to true consciousness?

(Image: the traditional view of humans at the top of the species pyramid)

(reading time: 6 mins)

Humans are considered the only truly conscious animals because we do not just react to our environment; we do so through a rich, complex inner world. However, I think this complicated, self-aware, highly subjective experience is just one type of intelligent awareness. In fact, as I will argue here, there are even parallels between our subjective experience and a legacy computer system.

Most people would agree our species sits at the top of the evolutionary tree – the apex of the pyramid, perhaps – because we have the most complete, even enlightened, awareness in the animal kingdom. Analytic thinking, self-awareness, subjective experience and so on were long thought to be uniquely human traits, which makes their appearance in just one species an even greater mystery.

At this point in history, however, many scientists recognize that other animals might be described as conscious, because their behaviours and brain structures suggest a degree of subjective experience. No one has yet defined a way to draw the line between conscious and non-conscious animals, but recent scientific research has certainly pushed that likely ‘threshold of consciousness’ further and further down the pyramid.

However, any sort of threshold suggests a radical emergence of true consciousness – spontaneously appearing from more basic animal biology. To avoid this, some thinkers take a more nuanced approach, describing our subjective consciousness as the brightest light of animal awareness – a light which gradually fades into insignificance as we move down to animals with less complex brains.

But for a moment, consider the nature of the language used here: the idea of us having the brightest light of consciousness. Discussions of human consciousness frequently employ words like “rich, wondrous, mysterious, vivid, elusive”. These are all valid terms for the ‘mind movie’ we have playing in our heads.

However, it’s worth remembering that our inner world is produced by biological hardware which imposes some significant limitations – chronostasis, for example. Have you ever glanced up at a clock and noticed the second hand apparently freeze before the clock starts ticking normally? This happens because your visual input is suppressed during the rapid eye movement towards the clock, and your brain backfills that gap with the first image it sees afterwards – so the first second appears stretched. The current moment we experience is not only subjective, it’s not even the current moment at all – it’s really a prediction of what our brain expects the current moment to look like, because all that hunter-gatherer neurological hardware creates a significant time delay between light entering our eyes and that sensory input becoming our experience of the now.

So what is, on the one hand, a rich, vivid subjective inner world could equally be described as approximate, flawed, rough and ready! The language we choose reveals how we feel about our inner lives. And that’s something to be aware of, because unexamined language can mystify and mythologise, potentially leading us away from a full understanding of consciousness.

Subjective experience in historical context

Many thinkers regard first-person experience as the defining feature of consciousness, and it’s best characterized by David Chalmers’ hard problem of consciousness. But Chalmers is giving a philosophical response to a historical moment, because the hard problem follows on from Thomas Nagel’s 1974 essay ‘What Is It Like to Be a Bat?’. Both are reactions to the idea that strict materialism can answer all the questions of consciousness through more detailed study of the human brain. Nagel and Chalmers recognize that materialism is unlikely to answer the more fundamental question: how do conscious beings spontaneously appear in an otherwise unconscious universe?

The problem with the hard problem is that it is a fundamentally dualist perspective (something discussed in this previous post), which leads to the conclusion that there must in some sense be a ‘real me’ somewhere – either in my brain or somewhere else (ghost-in-the-machine style) – which creates and experiences our subjective world. The implication is that the inner world is more than the sum of its parts, and so in some way a greater truth, more than the total of the mundane sights, sounds and smells that trigger brain activity in our daily lives.

But as described above, in many ways our inner world is an incomplete picture of reality – a lesser truth rather than a greater one. I won’t labour this point with lots of examples of flawed human perception – Daniel Dennett has that well covered. However, from the industry I work in, computing, I think there is one useful analogy for human subjective experience.

The legacy consciousness

These days, when computers interact with one another they frequently do so via an interface that follows an agreed standard: a web service, or an API (an application programming interface). However, if you’re trying to put data into a legacy system, it will lack the right interfaces. So rather than reprogramming the legacy system, it’s common to use a type of program called a screen scraper.

Screen scrapers read the screen output the old system intended for a human. The program navigates through menus, reads text, types data into input boxes and presses buttons. Compared to systems interacting via a proper interface it is, of course, rather inefficient and error-prone. In effect, a virtual world – a kind of virtual computer and virtual user – is created between the two systems so that they can interact. And perhaps this is a metaphor for the subjective and imperfect first-person consciousness in our heads? Rather than our subjective consciousness being the brightest light of animal awareness, our brains’ frequently flawed first-person experience might be compared to a screen scraper, which out of necessity mediates, slows down and limits how we interact with the world.
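To make the contrast concrete, here is a minimal toy sketch (not any real scraping library – the class and method names are invented for illustration) of the difference between scraping a text screen meant for human eyes and calling a structured interface directly:

```python
class LegacySystem:
    """A legacy system whose only interface is a rendered text screen."""
    def __init__(self):
        self._balance = 120
        self._screen = "MAIN MENU\n1) BALANCE\n2) EXIT\n> "

    def read_screen(self) -> str:
        return self._screen

    def type_keys(self, keys: str) -> None:
        # Simulate keystrokes driving the menu, as a human user would.
        if keys == "1":
            self._screen = f"BALANCE: {self._balance} USD\nPRESS M FOR MENU\n> "
        elif keys.upper() == "M":
            self._screen = "MAIN MENU\n1) BALANCE\n2) EXIT\n> "


class ModernSystem:
    """A modern system exposing a structured interface: no parsing needed."""
    def __init__(self):
        self._balance = 120

    def get_balance(self) -> int:
        return self._balance


def scrape_balance(legacy: LegacySystem) -> int:
    """Act as the 'virtual user': navigate menus and parse human-oriented text."""
    if "MAIN MENU" in legacy.read_screen():
        legacy.type_keys("1")  # press the menu option, like a human would
    for line in legacy.read_screen().splitlines():
        if line.startswith("BALANCE:"):
            return int(line.split()[1])  # fish the number out of display text
    raise ValueError("could not parse balance from screen")


print(scrape_balance(LegacySystem()))   # the slow, fragile, mediated route
print(ModernSystem().get_balance())     # the direct route
```

Both routes recover the same value, but the scraper only works as long as the screen layout never changes – one reordered menu item and the parsing breaks, which is exactly the kind of fragility the analogy trades on.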

One fascinating example of an animal with a less mediated interaction with its world is the slime mold Physarum polycephalum. Slime mold is one of the simplest life forms imaginable; it has no brain or central nervous system, so it’s hard to conceive of it having anything like our subjective experience. Yet recent studies show it can anticipate future events and navigate mazes more successfully than many basic robots. And in another test, given a choice of 11 foodstuffs with varying balances of carbohydrate and protein, the slime mold chooses the combination best suited to its physiology – something we humans consistently fail to do!
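The diet experiment can be caricatured as a very simple optimisation: pick the foodstuff whose nutrient balance sits closest to a physiological target. The sketch below is purely illustrative – the food names, nutrient values and target ratio are all invented, not taken from the actual study:

```python
# Hypothetical target protein:carbohydrate ratio for the organism.
IDEAL_RATIO = 2.0

# Invented foodstuffs: name -> (protein grams, carbohydrate grams).
foods = {
    "blend_a": (4.0, 6.0),
    "blend_b": (8.0, 4.0),
    "blend_c": (2.0, 8.0),
}

def best_food(options, target):
    """Return the option whose protein:carb ratio is nearest the target."""
    return min(options, key=lambda name: abs(options[name][0] / options[name][1] - target))

print(best_food(foods, IDEAL_RATIO))  # the choice closest to the target ratio
```

The point of the caricature is how little machinery the "right answer" requires: a distance measure and a minimum, no inner world needed.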

Now don’t get me wrong: I’m not suggesting that slime mold is more intelligent than humans (or trivialising the obesity epidemic, for that matter), or even that slime mold is having a first-person experience like ours. And of course, intelligence is about more than just passing tests. But surely one key marker of intelligence is whether an organism consistently makes the most advantageous choice – as the slime mold does with its diet – not whether it goes through a complicated mental process and ends up making a bad choice, as humans often do.

Our complex inner lives frequently lead us to eat unhealthy food. But it’s also easy to claim this complexity as further evidence of our superiority, instead of conceding that an extremely simple life form actually outperforms us on this one test. The brainless slime mold is making intelligent decisions, so maybe the basic nature of intelligent awareness is not subjective first-person experience at all? Maybe that is a by-product of certain animal brains, rather than the key to the mystery? We know subjective experience is a part of human intelligence, but is it a requirement for all intelligence?

We are probably the most intelligent species, because we have shaped and controlled nature in a way that no other animal could. Yet we continue to do so unsustainably, overusing limited resources and polluting and poisoning a natural world on which we are completely dependent.

The human race is thriving and multiplying, but our inventiveness works against that evolutionary success, endangering our survival. Our complex subjective consciousness means we are still more concerned with abstract concepts – increasing GDP, or fighting pointless wars in the name of some ideology – than with protecting the only planet in the known universe that can actually sustain our species. In many ways, then, our complex subjective consciousness detracts from our intelligence.

So perhaps we should regard the abstracted world we carry in our heads as something the brain just does, rather than as the essence of consciousness itself?