
Consider the humble amoeba. It doesn’t think, it doesn’t feel. It simply is. Through sheer randomness, the individuals that could split and consume resources effectively propagated, and with that propagation, those abilities persisted. This is the raw, unfeeling engine of natural selection.
For billions of years, this was the whole story of biological evolution. And then something went wrong, or perhaps something miraculously right happened. The first flicker of “Self” emerged.
Cnidarians and ctenophores were among the very first animals to develop a nervous system: a happy accident in the evolutionary tree in which a simple nerve net sufficed for coordinating movement and processing sensory input. This was the starting point from which all complex animal nervous systems, including the human brain, would eventually arise.
Then the first bilaterians, creatures perhaps no more complex than a flatworm, developed the first brain-like structure. And within that structure, the first faint flicker of subjective experience emerged some 600 million years ago.
Let’s try to understand why this event is crucial for the emergence of a sense of “Self”. With the development of a brain, a being can process stimuli from its own body (need food) and stimuli from the environment (the ground is wet). It creates a model of its own body in the world, acknowledges its boundaries, and recognizes everything else as the outside. It might even store that information for future use, creating a model of itself that exists in the present but is anchored in the past and carries expectations for the future. In this sense, the brain isn’t a magical instrument; it is a biological computer for predicting outcomes in the service of survival, an organ refined by millions of years of natural selection.
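If you like to think in code, here is a deliberately crude sketch of that idea: an agent that holds a model of its own internal state, observes its environment, and runs a simple forward model to predict the outcome of each possible action before choosing one. Every name and number in it (SelfModel, predict_outcome, the energy values) is invented purely for illustration; it is a toy, not a claim about how any real nervous system works.

```python
# Toy sketch (all names hypothetical): an agent with a model of itself,
# a crude forward model for predicting outcomes, and a memory of the past.

class SelfModel:
    def __init__(self):
        self.energy = 0.4          # internal stimulus: low energy ~ "need food"
        self.memory = []           # past experience, anchoring the present in the past

    def predict_outcome(self, action, environment):
        """Crude forward model: estimate energy after taking an action."""
        if action == "forage":
            gain = 0.3 if environment["food_nearby"] else -0.1   # foraging costs energy if nothing is found
            return self.energy + gain
        if action == "rest":
            return self.energy - 0.05                            # resting still burns a little energy
        return self.energy

    def choose(self, environment):
        """Pick the action whose predicted outcome best serves survival."""
        actions = ["forage", "rest"]
        best = max(actions, key=lambda a: self.predict_outcome(a, environment))
        self.memory.append((environment, best))                  # store experience for future predictions
        return best


agent = SelfModel()
print(agent.choose({"food_nearby": True, "ground_wet": True}))   # -> "forage"
```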
Then comes something even more fascinating and harder to explain: “emotions.” While the exact mechanism is complex, we can see how evolution favored their emergence. Take fear: a creature that feels a desperate, overwhelming urge to flee is far more likely to survive than one that merely computes a ‘high probability of predation’. Emotions, these powerful internal states, heighten awareness and often act faster than deliberate reasoning. Animals in which these primal emotions emerged were much more likely to survive and propagate, passing that capacity down the evolutionary line.
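The same caveat applies to this second sketch, which only illustrates the speed argument: a hardwired “fear” rule answers instantly, while the reasoned path has to pay for an expensive prediction before it can respond. Again, every function and number here is hypothetical.

```python
# Toy sketch of why a hardwired emotional response can beat deliberation.
# Not a claim about how real brains implement emotion.

import random

def fear_reflex(percept):
    """Fast path: a fixed rule, no deliberation."""
    return "flee" if percept["looming_shape"] else None

def deliberate(percept, n_scenarios=10_000):
    """Slow path: estimate predation risk by sampling many imagined scenarios."""
    threat = sum(random.random() < 0.7 for _ in range(n_scenarios)) / n_scenarios
    return "flee" if threat > 0.5 else "ignore"

def react(percept):
    # The emotional shortcut is consulted first; the costly model
    # only runs when no primal response fires.
    return fear_reflex(percept) or deliberate(percept)

print(react({"looming_shape": True}))   # -> "flee", without paying for deliberation
```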
Think of the salmon, driven by an inexplicable force to swim upstream. We might call it ‘determination’ or a ‘drive to spawn,’ but for the salmon, it is likely a powerful, primal emotion, a feeling we humans have no name for, programmed by evolution to ensure genetic propagation.
So, consciousness wasn’t the goal. It was a side effect. A spectacular, world-altering side effect of building increasingly complex systems for survival.
This changes everything for AI. We’re so busy trying to design consciousness into a machine. But what if, like evolution, we simply focus on building systems of immense complexity and agency, and consciousness… just happens? And are we prepared for what follows when it does?