You’re at work and your lunch break is just starting. You gratefully put your desktop to sleep and close your eyes for a moment. It’s Wednesday, and you’re feeling the full effects of the hump. Eventually, you grab your lunchbox and head to the breakroom. You open the door and quickly realize it must be someone’s birthday. There’s a half-eaten cake on one of the tables with some plastic forks and paper plates. Next to the cake is a bag of apples. You mentally cringe: It’s decision time. You know full well what the healthier choice is. The choice your doctor, and your body, would applaud you for…but your taste buds are crying out for that confectionery delight, and they refuse to be silenced. You head over to the table, grab a plate, a fork, and, of course, a piece of cake. Taste buds: 1. Body: 0.
This is a relatively innocuous scenario, but it raises some interesting questions. Why is it so hard to say no to a piece of cake, even when we know it’s unhealthy? In fact, why is it that cake tastes so good in the first place? To take this inquiry even one step further, what is it that makes certain things, like cake, so hard to resist but other things, like apples, so easy to brush off?
The answers to these questions may lie in the fact that our brain, just as much as any other part of our body, is a product of evolution. At first glance, this statement may seem so commonsensical that it requires no further explanation. However, it suggests a surprisingly meaningful, though often overlooked, context for human psychology.
When learning about evolution, one often hears about how our eyeball, or jaw structure, or bipedalism, is a finely-tuned adaptation, developed over millennia by our ancestors. The specificity lent to these perspectives rarely translates to the brain. In human evolution, our brains are just big, and that’s what’s important. However, just like our eyes or our jawbones, our brains have likewise been designed by the processes of natural and sexual selection to address various challenges faced by our evolutionary ancestors. What’s more, where our other body parts function to do a variety of tasks (sight, digestion, locomotion), our brain is designed to do no less than generate all of that behavior.
Since the evolution of Homo sapiens around 300,000 years ago, nearly all of human history has been spent in small hunter-gatherer tribes on the African savanna. As such, this context makes up our Environment of Evolutionary Adaptedness (EEA), a relatively straightforward term which describes the ancestral environment to which our species is adapted. As it relates to our discussion, this can be read as the ancestral environment to which our brains are adapted.
To put it in other words: every thought you’ve ever had, every sight you’ve ever seen, every action you’ve ever taken, has been experienced, understood, and mediated by an organ that is adapted to a hunter-gatherer lifestyle, in a hunter-gatherer environment.
That being said, this process obviously did not begin with humanity. Rather, our brains carry vestiges deep inside them from the very first brain to ever arise in our evolutionary lineage. When a brain adapts, it doesn’t scrap its ancestral hardware altogether and reconstruct from nothing. Instead, the process is iterative. That’s not to say that pre-existing structures are exempt from modification. Evolution can and has caused changes in all parts of the brain. I merely intend to point out that the prior cognitive foundation is also built upon, with new and improved cognitive modules to face novel environmental challenges.
Naturally, the more recently evolved adaptations are not the only ones that affect behavior. According to one popular (but admittedly oversimplified) model of cognitive evolution, our ancestral cognitive foundation can be considered our “lizard brain,” controlling powerful base urges for survival and reproduction. As these urges are oftentimes the ultimate determinant of whether or not an organism’s genes will propagate into further generations, they’re naturally connected to highly powerful emotions. The ape who deliberated whether or not to run when face-to-face with a tiger did not survive long enough to have children. The early human who followed their “urges” and had a steamy time in the bushes with you-know-who spread their genes much further than the one who wondered if they were ready for that sort of commitment.
To tie this all together, this evolutionary perspective becomes so important when we acknowledge that we are no longer lizards. Hell, for the most part, we’re no longer even hunter-gatherers. Our brains, at all scales, are ill-equipped to interact with our modern-day circumstances. And what may have been useful for our genetic ancestors’ survival and reproduction is not inherently beneficial in the pursuit of planetary, social, or even personal wellbeing.
With all this in mind, we’re going to explore a handful of relevant, well-documented cognitive biases and other cognitive effects that result from this inherent mismatch between our evolved minds and our contemporary circumstances. I like to consider these biases somewhat like “hangovers” from our evolutionary history. They may have been fun back in the wild Saturday nights of the Pleistocene, but in the harsh Monday morning of 21st century reality, more often than not they’re just a headache that we’d like to do without.
We live in a world of constant stimulation. From the phones in our pockets, to the fast food junking up our cities and our veins, to the absurd amount of money spent each year on gambling — those of us who live in the global North have functionally constant access to an endless stream of stimulation. As one of the propaganda ads shown above proudly declares: “Nonstop you” (another merely states the word “EXPRESS” over and over, and if that’s not characteristic of American culture, I’m not sure what is).
In terms of our evolutionary history, it goes almost without saying that this is a very novel condition. A hunting and gathering lifestyle offered a much lower baseline of stimulation than what many of us are used to today. Excitement came from the thrill of the hunt, or conversation with friends, or even some of your favorite local plant-based psychedelics. Because of this, we are incredibly cognitively vulnerable in the face of these overwhelming contemporary stimuli.
“Supernormal stimulus” is a term which describes an unnatural, exaggerated trigger that hijacks an animal’s instincts beyond their evolutionary purpose. This is a well-documented phenomenon in many species. In one experiment, mother birds consistently abandoned their own eggs to roost upon a disproportionately large dummy egg with magnified colors that had been placed in their nest by an experimenter. In another study, highly territorial stickleback fish attacked wooden “male” fish models more aggressively than real males if the models’ undersides were redder. What lies at the root of all these behaviors is a word that most of us are familiar with: instinct. Which egg do I incubate? Who do I attack? What feels right to do?
As we discussed above, gut feelings such as these are primarily motivated by the “lizard brain.” In other words, these are not rational thought processes. Instincts are so powerful, and so effective, because they are motivated by emotions. Strong emotions. The mother bird prioritizes incubating her healthiest chick (with the biggest, most dramatically-colored egg), not because she sat and thought about it, but because she simply feels that it’s the right thing to do. In our EEA, these instincts were helpful (from a genetic point of view) as they motivated humans to pursue activities that led to the continued survival of their bloodline. However, just like the poor misguided mother bird and her beloved plaster egg, these instincts can be easily hijacked.
Today, our society suffers from an overwhelming amount of supernormal stimuli. By and large, this inundation is not voluntary. Instead, these mental pollutants have been pushed upon the populace by corporations who know just how effective these triggers can be at motivating behavior. What’s worse, instead of educating us about the hidden dangers behind these feel-good stimuli, society largely encourages us to follow these urges, hiding behind market-oriented justifications that remove any possibility of systemic culpability. Of course, there is nothing wrong with a little stimulation now and again, but how much is too much?
Media like Black Mirror and Blade Runner offer visions of future techno-dystopias where modern trends are exaggerated to grotesque extremes, but this merely obscures the fundamental irregularity of our contemporary circumstances. These are not normal times. Society is already dystopian. And the overwhelming presence of these supernormal triggers merely encourages greater passivity in those who already benefit most from inequitable systems of colonialism and economic exploitation. I’m inclined to mention soma, the allegorical drug from Aldous Huxley’s Brave New World that stimulated an addicted upper class while obscuring the ethical atrocities committed by the state in the name of “order.”
This is the world. These are our minds. At times, our brains might feel like putty in the hands of supernormal stimuli. That’s okay. Personally, as a human actively striving to understand and work with the inherent irrationality of my own mind, I hope that this information can serve as a useful tool with which you can begin to identify irrationality in your thought patterns. From there, once you know the origins of a thought, it becomes slightly easier to gain a degree of distance between the thoughts that happen to be in your mind, and the thoughts, or lack thereof, with which you choose to identify.