The null hypothesis of having fewer opinions.

“Sell your cleverness and buy bewilderment”

Rumi

In the age of social media, sharing our opinions with others is easier than ever. And emotionally charged subjects, such as politics, fuel our desire to get those opinions out there. We want to join in and give our take on things—we want to feel relevant.

But opinions are tricky things, and we have to be careful. We don’t always understand how they formed and why we want to share them. And once we do share them, our instinct is to stubbornly defend them, even in the face of contrary evidence. But how sure should we be? How much about the complex world do we really understand?

In this article, I want to convince you of the one opinion I’ve become most confident in over the last year: we should all have fewer opinions. It’s a mental framework I call the null hypothesis, and it’s been a game-changer for me.

On a personal level, it will relieve the psychological burden of needing to signal—or have—an opinion. On a practical level, you’ll actually be right more of the time, since the few opinions that you allow yourself to have will be more robust. And on the societal level, maybe humility like this will create healthier and more productive conversations.

The Null Hypothesis

The null hypothesis is a concept from inferential statistics, and it’s a way to avoid experimental bias. The null hypothesis is the default assumption that there is no effect or relationship between two phenomena—in other words, having no preconceived opinion. If there does turn out to be a relationship, the evidence will indicate that, and you’ll reject the null hypothesis.

Approaching epistemology in this way is known as falsificationism, a position in the philosophy of science introduced by Karl Popper and built around the criterion of falsifiability. The opposite of falsificationism is verificationism, which runs into the problem of confirmation bias: if you set out to prove your theory true, you might come to believe it for the wrong reasons. So the null hypothesis is a safety mechanism that defaults you to an “empty” hypothesis until the evidence clearly indicates otherwise.1
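To make the statistical version concrete, here is a minimal sketch in Python. It uses SciPy’s independent-samples t-test and made-up numbers that are purely illustrative, not data from anywhere in this article. The default assumption (the null) is that the two groups don’t differ, and we only abandon that default if the evidence clears a threshold we set in advance.

```python
# A minimal sketch of a null-hypothesis test (illustrative numbers only).
# H0 (the null): the two groups do not differ.
# We keep H0 by default and reject it only if the evidence demands it.
from scipy import stats

group_a = [5.1, 4.9, 5.3, 5.0, 5.2, 4.8]  # hypothetical measurements
group_b = [5.0, 5.2, 4.9, 5.1, 5.3, 5.0]

result = stats.ttest_ind(group_a, group_b)
alpha = 0.05  # conventional significance threshold

if result.pvalue < alpha:
    print(f"p = {result.pvalue:.3f}: the evidence speaks; reject the null")
else:
    print(f"p = {result.pvalue:.3f}: not enough evidence; keep the null (no opinion yet)")
```

The point isn’t the arithmetic. It’s the posture: no conclusion until the evidence clears a bar you committed to before looking.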

In more general terms, as a mental framework, the null hypothesis is the space between two (or more) opinions. It’s the quality of saying “I have not yet formed an opinion on that topic”. And perhaps you shouldn’t ever form an opinion on that topic, that is, unless you’re willing to put in the requisite research.

Even with a falsification approach to research, the strength of your opinion should depend on the depth of that research.

That’s because part of this mental framework is recognizing—becoming mindful of—the relationship between your knowledge and your opinions. The more robust your knowledge is, the stronger your opinions can be. Too often, though, the strength of our opinions is informed more by the emotional charge of the subject than by what we actually know. When we vocalize opinions from this place, we are only a few probing questions away from looking foolish.

So the null hypothesis is also a reminder that you should have a high threshold of knowledge and research—again research where you try to prove your preconceived notions to be false—before ever codifying a strong opinion on something.

But people rarely do this. Mindfully watch yourself the next time you’re in an argument and want to prove your point. You begin researching something you’re sure will make your opponent look foolish. Is your instinct to verify the position you’ve taken—verificationism and the problem of confirmation bias—or is it to try to falsify the opinion you already hold?

Especially during political disagreements, I too often catch myself being a verificationist. You might win in the short term, but it doesn’t mean you’re right.

Having the mental framework of the null hypothesis avoids this by keeping us in the space between conclusions. It means remaining agnostic until the data truly lead you to a conclusion. You will have fewer fights because you will vocalize fewer opinions. And the opinions that you do vocalize will be carefully researched, or at least their strength will be properly calibrated.

Bias

“Bias is the brain’s strategy for dealing with too much information”

Molly Crockett on Very Bad Wizards

I think about bias in a few different ways. One version is the more obvious and conscious bias that we all have. For example, you might have a bias towards or away from certain foods and music genres. You might hate classical music, which would clearly bias you against writing a fair review of a newly released classical music album. These kinds of preference biases are easier to admit, and we usually don’t demand reasons for having them.

From these more explicit biases of preference, we can slide into a more implicit type of bias. These can still be conscious, but we tend to mask their existence—from ourselves and from others. The most obvious example of this is political bias. People mask their political bias by asserting “I am being completely objective”. To the extent that complete objectivity is even possible, it’s rarely what people actually achieve. And such a person must know, on some level, that they are more critical of one side than of the other.

But there is a more disturbing layer to bias, even more implicit, that should induce humility. This is a completely subconscious bias in which your brain decides what data to promote into your conscious awareness. It is what Molly Crockett, Assistant Professor of Psychology at Yale University, has called “the brain’s strategy for dealing with too much information”.

There’s so much data around us at any moment that if we consciously received it all, we would be overwhelmed. The brain decides for us—not surprising, since 95% of our brain’s processing is subconscious—what information we become aware of. This is a kind of signal-to-noise filtering, and only certain information—useful information—passes the test.

And on top of that, there’s also the problem of interpreting that data. There is an infinite number of ways to interpret information, and the brain largely decides on that interpretation without us. So you should realize that you might literally see a different world than the person you disagree with.

The Laurel-Yanny audio clip and the white-and-gold versus blue-and-black dress were two internet sensations that perfectly illustrated this kind of subconscious bias, but unfortunately, most people didn’t get the humbling message: we are not in control of our brains.

Hidden Motives: Why We Share Our Opinions

Speaking of not being in control of our brains, there is a book that I want to point you toward that further develops this concept. In fact, one of the authors of this book, during the Q&A section of a Sam Harris podcast, gave the same advice of having fewer opinions.

That author was Robin Hanson, and the podcast was about his recent book The Elephant in the Brain: Hidden Motives in Everyday Life, co-authored with Kevin Simler.2


“In summary, our minds are built to sabotage information in order to come out ahead in social games. When big parts of our minds are unaware of how we try to violate social norms, it’s more difficult for others to detect and prosecute those violations. This also makes it harder for us to calculate optimal behaviors, but overall, the trade-off is worth it. Of all the things we might be self-deceived about, the most important are our own motives.”

 The Elephant in the Brain: Hidden Motives in Everyday Life

What is the elephant in the brain? Similar to the elephant in the room, it is the often unacknowledged and awkward truth about how our brains really work.

We might feel like we are transparent to ourselves, like we are in charge and understand ourselves. Putting the question of free will aside, we feel this way because we tell others—and ourselves—why we do certain things. But if 95 percent of our brain’s activity is subconscious, how can we be sure that we know ourselves?

The relevant takeaway from the book is that our brains hide information from us and manipulate the reality we experience. We—that is, the conscious, speaking version of ourselves—might feel like the president in charge of our brains, but in reality, we are more like the press secretary. And the press secretary’s job is to give acceptable reasons for why we do things. As Robin Hanson said on the Making Sense podcast, “You don’t actually know why you do things, but your job is to make up a good excuse”.

You give these excuses to other people in your social circles, but you’re also giving them to yourself.

And we are rarely giving the real or primary reasons for why we do things. We self-deceive because it makes it easier to manipulate others—to send a signal of our prestige or dominance.

And we don’t only hide our motives—we also invent counterfeit reasons. This post hoc rationalization is illustrated in experiments by Roger Sperry and Michael Gazzaniga on split-brain patients, but the same process happens more subtly in a normal whole-brain person. Our ego, our press secretary, evolved not to give the most truthful explanations for our actions, but rather the most plausible and prosocial ones.

The psychology highlighted by this book is honestly disturbing, and I personally interact with it in small doses in order to maintain my grip on reality. For me, the takeaway is to watch myself more closely when I want to vocalize an opinion. What are my real motives: solving a problem or signaling prestige? Telling the truth or signaling group membership?

The Relief of Not Knowing

“When we understand the truth of uncertainty and relax, we become free.”

Jack Kornfield

Much like a press secretary always has an answer, our ego always wants to say something, even when we have no relevant knowledge. It will dodge questions, change subjects, and pretend to know things. The motive, largely hidden from ourselves, is often to signal our intelligence and prestige to our social group.

Evolutionarily this makes sense—we don’t have time to research everything, and it’s safer to go along with groupthink rather than stand alone as an individual.

Another motive might be to reduce anxiety. The world is so complex, mysterious, and unpredictable. We don’t like this insecurity, and pretending to know things helps us to ignore it, to fool ourselves into feeling more secure. But being uncomfortable with uncertainty is a bad reason to have an opinion on something.

Before having a meditation practice, I was doing that all of the time. I had trouble saying “I don’t know” or “I guess I got that wrong”. I still catch myself being too confident or pretending to know, but I now aim at what Jack Kornfield calls “the wisdom of insecurity”. This is accepting the fact that life is chaotic and insecure—we can’t control things and we don’t know enough.

And it turns out that admitting this—not just verbally but really feeling it deep down—is a tremendous relief. It’s a relief because it’s the truth, and truth makes our minds more harmonious.


None of this is to say that we shouldn’t have any opinions. The null hypothesis mental framework is simply a reminder to form opinions in a slow and thoughtful way—to stay uncommitted while you research and to be truly guided by the evidence.

Don’t confuse emotional charge with reasoned evidence. Maintain your curiosity and be open to changing your mind. Learn, as Oscar Wilde said about the value of a university degree, to “play gracefully with ideas”.

You do know things, and you might even be an expert in certain areas. The key is to have self-awareness about the depth of your knowledge and to calibrate the strength of your opinions accordingly.

Have a few strong opinions in a select few areas. Form many weak opinions, but be willing to let go of them easily. And on the rest, enjoy the relief of not needing to have any opinion at all.