A few weeks back I was reading a report penned by Amay Hattangadi and Swanand Kelkar from Morgan Stanley. In that report, I came across an intriguing term: “echo chamber”. The authors wrote –
The most telling reaction post Brexit was from a London based friend who apart from lamenting the outcome went on to say that he didn’t know of a single person who was likely to have voted “Leave” and hence felt that the outcome was rigged. This is what we called the “echo chamber” in one of our earlier essays. We tend to be surrounded by people who are like us and share our world view. Social media accentuates this by tailoring our news and opinion feeds to match our pre-set views. To avoid falling into this homogeneity trap, one needs to seek out and dispassionately engage with people whose views differ from your own and that’s true not just for current affairs but your favourite stocks as well.
The term ‘echo chamber’ painted such a vivid picture in my mind that I decided to give it a permanent place in my mental attic. It has thus become an important node in my latticework of mental models.
The echo chamber effect feeds on a fundamental cognitive error called confirmation bias (sometimes referred to as commitment and consistency bias). The famous psychologist Robert Cialdini has written about this bias extensively in his seminal book, Influence: The Psychology of Persuasion.
Just like every other human bias, the roots of confirmation bias can be traced back to evolutionary biology.
Millions of years of evolution have wired the human brain to shun inconsistencies in its environment – either by avoiding or by resolving them. It’s extremely hard for us to harbour two conflicting pieces of information in our minds at the same time. The mental discomfort created by entertaining two or more contradictory arguments is known as cognitive dissonance.
The three-pound grey matter inside our skull has been designed by Mother Nature to conserve energy by getting rid of cognitive dissonance (by hook or by crook) – even if it means being delusional.
Charlie Munger calls this the Inconsistency Avoidance Tendency. The result of this tendency is what psychologists have termed confirmation bias.
Rolf Dobelli, in his book The Art of Thinking Clearly, writes –
The confirmation bias is the mother of all misconceptions. It is the tendency to interpret new information so that it becomes compatible with our existing theories, beliefs and convictions. In other words, we filter out any new information that contradicts our existing views (‘disconfirming evidence’). This is a dangerous practice.
Confirmation bias manifests itself in our behaviour by making us seek out whatever validates our prior beliefs. We tend to hang out with people who agree with our views. We selectively watch news channels that bolster our existing political inclinations. And when we come across a contradictory piece of information, our mind tends to either ignore it or dismiss it as wrong.
The biggest danger with confirmation bias is that, although it starts small, it compounds very rapidly. As we unconsciously construct an environment devoid of conflicts and contradictory information, we get embedded deep inside a cocoon of beliefs – beliefs prejudiced about how the world works.
There’s a vicious cycle at work here. The echo created by constant reinforcement and repetition of the same ideas inside our mental chamber turns us into someone who knows only one side of the argument. Modern information technology and the ease of access to information have further exacerbated this problem.
In their article in livemint, Swanand and Amay write –
..social media systematically finds ways to ensure that we are fed with more of what we find appealing. Our Facebook feed is filtered based on previous history of “likes”. Amazon suggests books to buy based on our pattern of previous purchases. Twitter suggests whose tweets we should “follow” based on those we are already following. The online world has magnified the decibel level of the reverberations in an echo chamber manifold.
The positive feedback loop amplifies the effect and results in a mind that can believe in anything no matter how implausible or irrational.
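This positive feedback loop can be sketched as a toy simulation (the numbers, topics, and like-based filtering rule below are my own illustrative assumptions, not anything from the essay): a feed that recommends topics in proportion to past “likes” drifts toward showing the user more and more of the one viewpoint they already favour.

```python
import random

random.seed(42)

TOPICS = ["left", "right", "centre"]

def run_feed(rounds=200):
    """Toy filter bubble: the feed samples topics in proportion to past likes."""
    likes = {t: 1 for t in TOPICS}   # start from a uniform prior
    user_pref = "left"               # the user slightly prefers one viewpoint
    for _ in range(rounds):
        total = sum(likes.values())
        # Recommender: show a topic with probability proportional to its likes.
        topic = random.choices(TOPICS, weights=[likes[t] / total for t in TOPICS])[0]
        # User: always likes matching items, rarely likes the others.
        if topic == user_pref or random.random() < 0.1:
            likes[topic] += 1
    return likes

likes = run_feed()
share = likes["left"] / sum(likes.values())
print(likes, f"preferred-topic share: {share:.0%}")
```

Because every like raises the odds of seeing the same topic again, the loop is self-reinforcing: after a couple of hundred rounds, the preferred topic dominates the feed even though the user’s initial preference was mild.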
Organized religions and cults have been the biggest beneficiaries of the echo chamber effect. People of the same religion flock together, share the same myths, and hold the same world view.
For centuries, the term black swan was used as a metaphor for something that didn’t exist or was impossible. People believed that all swans were white. No one had seen a black swan, and every time someone spotted a white swan they would cite it as evidence confirming the hypothesis that all swans are white.
However, one single observation to the contrary invalidated a belief derived from millennia of confirmatory sightings of millions of white swans, write Amay and Swanand. “But unfortunately, that is not the way we typically function. We do quite the opposite, which is to form our view and then spend the rest of the day finding all the information that agrees with our view.”
The best armour against confirmation bias is to actively look for disconfirming evidence. The best way to arrive at truth is the process of eliminating what’s untrue. This is known as the process of falsification.
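As a toy illustration of the difference between confirming and falsifying (the population and numbers are my own hypothetical example, not the author’s): piling up confirming sightings can never prove “all swans are white”, while a single deliberate search for a counterexample can refute it outright.

```python
# Hypothetical population: overwhelmingly white swans, with one black swan.
swans = ["white"] * 9_999 + ["black"]

def confirm(sample):
    """Confirmation: count sightings that agree with 'all swans are white'."""
    return sum(1 for s in sample if s == "white")

def falsify(population):
    """Falsification: hunt for a single disconfirming observation."""
    return next((s for s in population if s != "white"), None)

print(confirm(swans[:100]))  # 100 confirmations – yet the hypothesis is still unproven
print(falsify(swans))        # one counterexample is enough to refute it
```

A hundred confirmations leave the hypothesis exactly where it started, but the one disconfirming observation settles the question – which is why the disconfirming evidence is the only kind worth actively seeking.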
The father of evolutionary biology, Charles Darwin, was known to practise this diligently. Whenever he encountered an observation that did not fit his hypothesis, he would immediately make a note of it. He was aware that his brain would conveniently forget such exceptions if he didn’t take extra care to acknowledge and record them.
A year, according to Charlie Munger, is a wasted year if you haven’t destroyed one of your most cherished ideas. He likes to say –
We all are learning, modifying, or destroying ideas all the time. Rapid destruction of your ideas when the time is right is one of the most valuable qualities you can acquire.
On another occasion Charlie said –
Ask yourself what are the arguments on the other side. It’s bad to have an opinion you’re proud of if you can’t state the arguments for the other side better than your opponents. This is a great mental discipline.
As an investor, another important trick for avoiding confirmation bias is not to talk about your investment ideas in public. Many successful investors, including Mohnish Pabrai and Guy Spier, follow this principle. They understand that the more they discuss and defend their investments in public, the harder they pound those ideas back into their own heads, and the tougher it becomes to change their opinion later.
Guarding against confirmation bias doesn’t mean that one remains indecisive. Making a decision under the spell of confirmation bias and going ahead with a decision in spite of disagreeing with it are two different things. For that matter, you don’t always need all the evidence and full agreement to make a decision.
Warren Buffett and Charlie Munger have starkly different personalities, and both are ruthlessly independent in their thinking. It’s unlikely that they never disagreed with each other in their half-century of partnership. But that hasn’t stopped either of them from making decisions despite the disagreement.
If you’ve attended our annual meetings, says Buffett, “you know Charlie has a wide-ranging brilliance, a prodigious memory, and some firm opinions. I’m not exactly wishy-washy myself, and we sometimes don’t agree. In 56 years, however, we’ve never had an argument.”
Disagreeing and arguing about who’s right are two very different things. Arguing stalls decision making. Disagreeing doesn’t.
In his 2016 letter to shareholders, while talking about high-velocity decision making, Amazon’s CEO Jeff Bezos wrote –
If you have conviction on a particular direction even though there’s no consensus, it’s helpful to say, “Look, I know we disagree on this but will you gamble with me on it? Disagree and commit?”… I disagree and commit all the time. We recently greenlit a particular Amazon Studios original. I told the team my view: debatable whether it would be interesting enough, complicated to produce, the business terms aren’t that good, and we have lots of other opportunities. They had a completely different opinion and wanted to go ahead. I wrote back right away with “I disagree and commit and hope it becomes the most watched thing we’ve ever made.” Consider how much slower this decision cycle would have been if the team had actually had to convince me rather than simply get my commitment.
Note what this example is not: it’s not me thinking to myself “well, these guys are wrong and missing the point, but this isn’t worth me chasing.” It’s a genuine disagreement of opinion, a candid expression of my view, a chance for the team to weigh my view, and a quick, sincere commitment to go their way.
Warren Buffett once wrote –
What the human being is best at doing, is interpreting all new information so that their prior conclusions remain intact.
That’s why you need a devil’s advocate who can challenge your assumptions. Someone who can ask uncomfortable questions.
As an investor, it’s very important to have your own small group of intellectual peers to bounce ideas off. But be careful in selecting these folks, lest your sounding board turn into an echo chamber – that would be not just futile but outright dangerous for your decision-making process.