
Robbers Cave State Park
Not long ago an Argentinian hacktivist posed a simple question to his 53.6k Twitter followers, a question probably posed in a similar vein by Socrates in the agora more than two thousand years ago:

A life of social science research methodologies applied to college students, beer drinkers, Sea World visitors, potato chip lovers, hipster clothing wearers, soft drink buyers, computer owners and credit card carriers has prepared me more than most to answer this question. After all, I have made it my professional life’s practice to understand not just how people respond to stimuli, but why they do, and how to apply that learning to most profitable effect.
A big part of designing market research is eliminating the potential for biases to corrupt the findings — a two hundred thousand dollar double-blind taste test can be ruined by accidentally introducing an order effect that can’t be statistically controlled for. (I only wish I weren’t speaking from experience.) Learning to identify and control for inherent biases is a necessary but not sufficient condition for a well-designed research study that will yield valid, reliable results.
It turns out this works for life, too: knowing decision-making biases in all their forms (including, especially, attentional bias, denial and selective perception) enables you to control for them, and not ruin your very expensive study (or life) by failing to be aware of easily avoided or corrected-for errors in thinking.
We humans are simply not the rational creatures we think we are; we’re social, first and foremost. Humans evolved to be tribal; being part of a group is necessary not just for social learning, but for survival. A lone hunter could not hope to run down a kudu, a feat a group can easily manage.
So how is it that something so important to our flourishing as a species — being part of a group, or tribe — is simultaneously one of the primary weapons for tearing the social fabric apart?
Understanding how to defeat the natural tendency to ‘tribe up’ when the tendency is bringing about dangerous division has become the urgent question of the day.

Pink Floyd
The truth might set us free, but it won’t necessarily have any impact on cherished beliefs — even those that are demonstrably false. So what do you do in the face of steadfast beliefs that are nonetheless demonstrably wrong? What is the best response when tribal signaling is trumping truth?
The answer lies in our brain, that mystery of computational power that operates just as decisively whether its operator is conscious or unconscious.
The first thing to remember is that at the core of tribalism lies not truth, but belief. And the one thing you cannot do is reason anyone out of their beliefs. After all, beliefs are not arrived at through reason, and so cannot be dismantled by it.
We reason not to get to the truth but to stay safe — to stay part of the in-group. We use reason in order to get along with other people, to be part of a tribe, which in turn is crucial not just to our sociable natures but to survival itself.
With survival at stake it is easy to see why the context of the tribe, and the safety it represents, matters more than logic. Because tribes represent safety in the most fundamental sense (survival), agreeing with the tribe is a safe default position for group members, even when it doesn’t make sense to do so.
For example, consider one study which asked people whether, if they had a fatal disease, they would prefer a life-saving diagnosis from a computer 1,000 miles away, or the exact same diagnosis from a computer in their own town. A large majority preferred the same information if the source (a machine) was local, though such a preference is inherently nonsensical.
Why Facts Don’t Change Our Minds
One thing experts universally agree on — the least effective way to challenge a person’s cherished belief is to bombard them with information that contradicts it.
In fact the opposite is true: the more you can prove something is false, the more the true believer is likely to dig in and double down on defending their beliefs. This is due to common reasoning fallacies that all humans are subject to — it’s how our brains are built.
When taking in information, what persuades people is not its factual accuracy, but how much they want to believe it is true (i.e. how motivated they are in their belief) and how sincere they believe the person delivering the information to be.
Imagine a mouse that thinks the way we do. Such a mouse (and those that follow it), “bent on confirming its belief that there are no cats around,” would soon be dinner.
As a rule, strong feelings about issues do not emerge from deep understanding, but rather, intensity of belief. And beliefs are powerfully shaped so they agree with beliefs of the groups with which we most strongly identify.
Researcher Dan Kahan studies cultural cognition, or how cultural values and group identity shape perceptions of public risk and the policies meant to address it.
Kahan’s research has found that the more our views are challenged, the more dogmatic, defensive and closed-minded we become, engaging in an intellectual form of ‘circle the wagons, we’re under attack’ tribal unity. When confronted with evidence that contradicts our beliefs, we bring to bear a full arsenal of “weaponized irrationality”: belief perseverance, confirmation bias, and the backfire effect, among others.
Belief perseverance is the tendency to cling to one’s initial belief even after receiving disconfirming information. We enable belief perseverance by engaging in all kinds of flawed reasoning, chief among them confirmation bias — a form of data distortion in which we seek out and recall evidence that confirms our belief, and ignore or fail to recall disconfirming cases.
Confirmation bias has a physiological component — it literally feels good. Research described in Denying the Grave: Why We Ignore Facts That Will Save Us (Oxford University Press) shows that people experience genuine pleasure — a rush of dopamine — when processing information that supports their beliefs.

What we say we believe says something important about how we see ourselves, making disconfirmation of such beliefs a wrenching process — something that our minds stubbornly resist.
“It feels good to ‘stick to our guns’ even if we are wrong.”
Just as confirmation bias shields our beliefs when we actively seek information, we have other mental tricks to protect our beliefs when reality tries to dismantle them. The backfire effect defends our beliefs when we are presented with factual evidence that disconfirms our beliefs.
That’s right — corrections actually increase the strength of our misconceptions, if those corrections contradict personal beliefs. Not all of us, of course — just those of us with amygdalas.*
*all of us
Fire, Aim…Ready!
According to Matthew Feinberg, an Assistant Professor of Organizational Behavior at the University of Toronto’s Rotman School of Management, the process of belief formation is exactly the opposite of what one would expect: when confronted with a problem, people do not reason their way through alternatives to a logical conclusion. Instead, they “come to the conclusion first, and then the reasons they kind of pull out just to support their beliefs.”
As it happens, the reasons do not actually have to be well-reasoned; we’ll think they are, no matter what. The illusion of explanatory depth is the untested and incorrect conviction that one has a deeper understanding of the world than is actually the case — essentially confusing familiarity with knowledge because, tautologically speaking, we don’t know what we don’t know.

Research shows the tendency appears as early as second grade, and that it’s been around pretty much as long as mankind has been sharing thoughts with one another.
If you think you’re more rational than the average person, indulge in a quick thought experiment: turn on a recorder and explain how a toilet flushes. Now imagine someone having to build a toilet based on this explanation. Did you remember to include all 14 parts? Did you include a description of siphoning at work, maybe throw in Bernoulli’s equation for clarification?
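(For the record, the equation being name-dropped is the standard Bernoulli relation for an ideal fluid along a streamline; knowing that it exists is one thing, applying it to a siphoning bowl quite another:

$$p + \tfrac{1}{2}\rho v^2 + \rho g h = \text{constant}$$

where \(p\) is pressure, \(\rho\) the water’s density, \(v\) its flow speed, \(g\) gravitational acceleration and \(h\) height.)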

Most of us would realize very quickly that we cannot properly explain how a toilet flushes — five minutes of fumbling would debunk our illusion of explanatory depth.
It’s a conundrum of growth that one of the unintended consequences of progress is expanded ignorance. Throughout human history, as people invented new tools (the septic system; the light bulb; the combustion engine; the computer) for new ways of living, they by default created whole new realms of ignorance. It’s a necessary evil of progress, though: imagine if every member of society had to master the printing press in order to read a book.
Adapting to new technologies will always, on some level, require society to embrace incomplete understanding in order to thrive. This reliance on one another’s expertise is nothing new: learning how to hunt together as a group was probably a key development in humankind’s evolutionary history.

But the dark side of relying on one another’s expertise is our tendency to fall prey to the pervasive belief that we as individuals know way more than we actually do. The illusion of explanatory depth is not harmful when it comes to adapting to new technologies — we can use smartphones and drive cars without knowing the specifics of the mechanics that make these things possible. But it’s an entirely different matter to adopt a position on subjects — such as climate change, immigration or Brexit — without knowing what you are talking about.
We should realize that the baseline condition is not equality; it is bias.
~Arlie Russell Hochschild, “Strangers in Their Own Land: Anger and Mourning on the American Right”
Noticing What Is Noticed
Our capacity for critical thinking is easily derailed by what we notice, and how we feel, at any given moment. We humans are so susceptible to reasoning errors, it takes little more than the power of suggestion to sway us like a reed in the wind.
The suggestion doesn’t even have to be relevant — it just has to be made: that’s what MIT professor Dan Ariely demonstrated in research described in his book “Predictably Irrational: The Hidden Forces That Shape Our Decisions”.
In the study, research subjects bid on an item. But before they placed their bids, they were asked to jot down the last two digits of their Social Security numbers. Students whose Social Security numbers ended with the lowest figures — 00 to 19 — were the lowest bidders, offering on average $67. Students with the highest bids had the highest last-two-digit values — their bids averaged $198.
That’s right, students who jotted down higher irrelevant numbers bid THREE TIMES MORE than students who jotted down lower irrelevant numbers. This effect is called anchoring.
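The arithmetic, using the average bids reported above, checks out:

$$\frac{\$198}{\$67} \approx 2.96 \approx 3\times$$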
Similarly, critical thinking is severely curtailed in the face of strong negative emotions: when we are angry or fearful (illegal immigrants are terrorists!) the amygdala activates and overrides rational thought.
The Fault In Our Beliefs
Many of us make the Platonic error of reasoning: that is, we think we’re considering the facts and not letting emotion ‘get in the way’ of actively arriving at the rational conclusions that form the basis of our beliefs.
But in reality, facts have little bearing on what most people believe…and, far from being the polar opposite of rationality, our emotions are essential to our rationality. In fact, we couldn’t be rational beings without emotions.
If Descartes had known about the somatic marker hypothesis, he might have reframed his famous aphorism “Ego cogito ergo sum” (“I think, therefore I am”) as “I feel, therefore I think.”

cogito ergo sum
In the excellent book Descartes’ Error, neurologist António Damásio suggests rationality and emotionality are not only not opposites, they interdependently coexist. René Descartes’ “error” was the dualist separation of mind and body, rationality and emotion. Damásio proposes that rationality is not separate from emotion but actually requires emotional input. Or, as How We Decide author Jonah Lehrer puts it,
Every feeling is really a summary of data, a visceral response to all the information that can’t be accessed directly.
Experts and geniuses get that way through practice, and stay that way by paying attention to their feelings, i.e. letting their intuition inform their decisions. Clearly, when there is a deep well of experience composed of deliberate practice behind them, feelings are an accurate shortcut.
Consider Ted Williams, the greatest hitter who ever lived. According to The Physics of Baseball, a typical major league pitch takes about 0.35 seconds to travel from the hand of the pitcher to home plate — about the same amount of time as between one heartbeat and the next. It takes the average batter .25 seconds to activate his muscles and initiate a swing. The visual information of the ball entering the strike zone takes a couple of milliseconds to travel from the retina to the visual cortex.
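Run the subtraction on those figures and the size of the batter’s time budget becomes clear:

$$0.35\ \text{s (ball in flight)} - 0.25\ \text{s (to execute the swing)} = 0.10\ \text{s}$$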

.05 seconds in slo-mo
All of this leaves the batter with fewer than .05 seconds to perceive the pitch and decide if he should swing. Sounds like an impossible feat, but not all .05 seconds are equal. This is where feelings come in. For Ted Williams, winner of six American League batting titles, holder of a .344 lifetime batting average, and the last MLB player to hit .400, that .05 seconds was informed by making a science out of hitting.
He was known to practice his swing for hours every day, often using any household object as a bat. After games, he could be found taking extra batting practice. He studied pitchers’ statistics, memorizing their pitching repertoires and poring over the daily box scores for clues about each pitcher to, as he termed it, “update his mental databank.”
Which is a pretty accurate description of why Ted Williams was a better hitter than anyone else has ever been: during his .05-second decision window, Williams had far more information — more data points — to access than the average hitter. But rather than going through his vast database tick by tick, reasoning his way to the best way to hit a particular pitch, his emotions guided him instinctively, triggering the decision to swing with more consistent bat-on-ball contact than any hitter before or since.
Perhaps the best explanation social science has to offer when it comes to the question “Why does tribalism trump truth?” is realistic conflict theory.
Realistic conflict theory (RCT) explains how intergroup hostility can arise as a result of conflicting goals and competition over limited resources, and it also offers an explanation for the feelings of prejudice and discrimination toward the outgroup that accompany the intergroup hostility.
Feelings of resentment can arise when the groups see the competition over resources as zero-sum, or “winner take all.” The length and severity of the conflict is based upon the perceived value and scarcity of the given resource.
According to RCT, positive relations can only be restored if superordinate goals — i.e. goals that require the cooperation of two or more people or groups to achieve, which usually results in rewards to the groups — are in place.
Eagles and Rattlers in the Robbers Cave
The most famous social science experiment demonstrating realistic conflict theory had a name — The Robbers Cave Experiment — worthy of a Hardy Boys mystery, and it contains all of our sad little human reasoning errors in perfect microcosm: two dozen well-behaved Boy Scouts meet each other for the first time and devolve into a Lord of the Flies hatefest in just two weeks’ time. The study was conducted when President Trump was 8 years old, and more than sixty years later it is more relevant than ever to the United States of amygdalas.

The actual Robbers Caves, OK — (not quite Lord of the Flies)
In 1954, researcher Muzafer Sherif of the University of Oklahoma carried out a three-week study at a 200-acre summer camp in Robbers Cave State Park, Oklahoma.
The study consisted of 22 twelve-year-old boys divided into two groups balanced across physical, mental and social attributes. Each boy was picked up by bus and transported to a Boy Scouts of America camp in Robbers Cave State Park, where a three-stage experiment was conducted over a period of three weeks.
During the first stage, the groups were kept separate from each other in different parts of the camp. For the first few days each group engaged in activities that helped the group members bond. After six days, each group had named itself: the Rattlers and the Eagles.
As each group became distantly aware of the other group, their internal bonding took on an external focus, for example, becoming defensive about which camp facilities might be being ‘abused’ by the other group. The groups independently asked the camp staff (i.e. the researchers) to set up an intergroup competition.
The prospect of competing against the other group seemed to rally each group to work hard at excelling at any activity that might become competitive. In other words, the mere presence of an ‘other’ acted to reinforce group identity and cohesion.

for #winners only
During stage two of the experiment, the two groups were brought together for the first time. With the intention of introducing friction, the researchers announced a series of competitions with a winner-take-all prize structure: the winning team would receive a trophy, and each boy on the winning team a medal and a multi-bladed pocket knife, while the losing team would receive nothing.
As the groups prepared for the competition the ‘otherfication’ began in earnest: the Rattlers mounted their flag on a communal baseball field and made threats about what they’d do if an Eagle so much as touched it. In the mess hall the groups engaged in name calling and singing derogatory songs about each other, with some requesting the camp staff to arrange separate meal times so as not to have to eat with the other group.
Cabin raids and shows of disrespect for each other’s flag followed, with each group burning the other’s flag. When the Eagles won the contest, the Rattlers raided the Eagles’ cabins and removed any medals or pocket knives they could lay their hands on. Name calling escalated in quantity and intensity, with several boys nearly coming to blows. Noses were held when the ‘enemy’ was near, and both groups objected even to eating in the same mess hall at the same time.
In stage three (“the Integration Phase”) the researchers sought to dissipate the friction over the course of the third and final week of the experiment. However, a bean-counting contest, a fireworks evening and a film did not lead to a reconciliation between the Eagles and the Rattlers — feelings of animosity remained strong, with several encounters ending in food fights. The researchers found that the contrived competitions had created such real tensions that it would take far more than contrived recreation opportunities to achieve any reconciliation.

The researchers next introduced activities with superordinate goals, the attainment of which was beyond the resources and efforts of one group alone — i.e. requiring the two groups to work together toward a solution. A change of venue helped to inhibit recall of grievances associated with action that transpired at Robbers Cave.
The first superordinate goal to be introduced pertained to drinking water — the researchers arranged for the camp to experience a loss of water supply, requiring the campers to discover and address the source of the problem, even as they experienced the consequences of the shortage — thirst, inability to flush toilets, limited cooking ability affecting meals, etc. The second superordinate goal was selecting and covering the cost of entertainment — in this case, a movie. Two more included hauling a fallen tree from a road and freeing a truck carrying food for both groups that was stuck in a muddy rut.
The joint sharing of goals and achievement lessened intergroup tensions. After fixing the water supply problem, agreeing to a per-camper cost to cover most of the movie and taking care of the tree and truck issues, the previous animosity between the Rattlers and the Eagles dissipated, along with their group identities — the campers once again ate together in the mess hall without complaint. Intergroup friendships blossomed, and at breakfast and lunch on the last day the boys sat together, rattlers among eagles and eagles among rattlers.

no winners in this fight
When put to a vote, a majority elected to travel home from the camping trip on the same bus, and when it stopped for refreshments, the former “Rattler” leader suggested that the Rattlers’ prize for winning one of the camp contests be used to buy all the boys (Eagles too) milkshakes, which the campers cheered.
What then must we do?
When it comes to tribalism, if the truth won’t set us free…then who or what will? As it turns out, when it comes to the prison of beliefs, you have to really want to be free — truth is not the opening of the prison door, but a key only the prisoner himself can turn to unlock it.
The key to changing your belief system is changing your thoughts. The key to reducing political polarization is not just to expose people to information that conflicts with the beliefs they want to hang onto, but also to increase their receptivity to that information. Now that you know all the biases you’re up against, you can choose stratagems that optimize the chance for receptivity to your message.
- Choose the right messenger. People are much more likely to be convinced of a fact when it “originates from ideologically sympathetic sources,” so choosing messengers that look and sound like the audience is ideal. People will only imbue narratives with the power of belief when they’re coming from someone they trust.
- Use humor. Research demonstrates that humor enhances a speaker’s credibility and audience rapport, and aids message retention. Laughter — especially when it is unexpected — lowers resistance; shared laughter creates a sense of community/tribal belonging, however briefly.
- Define the ‘goodness context’. People value a service more when it is described in isolation or presented as part of the status quo than when it is presented as part of a larger good; i.e. people are more likely to vote budget dollars to upgrade fire safety equipment than to improve disaster preparedness. Frame new taxes as a ‘tax bonus’ and voters will vote yes; call it a ‘tax penalty’ and voters will be angry, even though the concepts are functionally equivalent.
- Question, don’t challenge. If you ask the average person to explain why they hold a given opinion, “They will come to realize the limitations of their own understanding,” said Frank C. Keil, a Yale University psychologist who studies intuitive beliefs and explanatory understanding. This won’t necessarily lead to a change in point of view, but “if you ask someone non-aggressively to walk you through their point of view, they’ll likely see the holes more.” One of the most effective de-biasing techniques is known as counter-explanation, i.e. asking the person to imagine or explain how the opposite belief might be true. This won’t change hearts and minds on the spot, but can introduce the all-important change of thought that precedes a change in beliefs.
- Be accepting of the person, if not their beliefs. Make them feel good about themselves and they’ll be more receptive to your message. “When people have their self-worth validated in some way, they tend to be more receptive to information that challenges their beliefs,” says Peter Ditto, a psychology professor at UC-Irvine who studies emotion and its connection to political and religious beliefs. This is partly because our mood determines a lot about how receptive we are to new information or ideas: If we’re happy and confident and at ease, if we feel our dignity is being respected, we’re more likely to be open-minded.
- Identify/pursue superordinate goals — i.e. solve a problem that requires united, cooperative action and offers a meaningful shared reward that may otherwise be unattainable. In the Robbers Cave experiment, after successfully creating groups that disliked each other, our researchers found themselves no more able to control their monster than Dr. Frankenstein. They tried and failed to ease tensions between the two camping groups, which now hated one another with the natural intensity and enmity of real rattlers and eagles. Appealing to their scouting commonality was useless; where before there were 22 campers, there was now a good ‘us’ and an intolerably bad ‘them’. Arranging for a shared pleasant experience only served to provide an arena for escalated fighting. Introducing a third, neutral group to become a ‘common enemy’ resulted in a recruitment competition — an arms race between the original two groups seeking to win hearts and minds. Establishing common goals — the same thing that so successfully created the groups to begin with — was the only thing that proved effective in uniting the two groups.