
A Short Guide to Sniffing Out Bullshit in Yourself and Others

The Story of the Arrogant Homeless Man

Two homeless guys sat down next to me at an open-air gym near the waterfront in Florianopolis, Brazil. For the next 30 minutes, as I continued my training, they conversed on a variety of subjects and ended up teaching me a life lesson I never want to forget.

The behavior of the first homeless guy was what you’d generally expect. He was slow in his speech, a bit dopey, and more or less lost as to what he was doing in the world or where he was going that day.

The second homeless guy, however, took me by surprise. He seemed sharp as a tack. He spoke clearly, confidently, and with certainty as he lectured the other. You could have mistaken his tone of voice for that of a university professor or an upper manager at any business. After a few minutes he began talking about SpaceX and Elon Musk. The first-ever launch of the Falcon Heavy rocket was scheduled for later that day, and this homeless guy was up to speed on the news.

But as I listened in between sets, I started to pick up on what was happening. As confident as this guy sounded, he was completely and totally making up everything he said. You may have met people like this before: complete confidence in their opinions, yet completely out of touch with reality.

When the dopey homeless guy asked him, “How far away is Mars?” the smart one said without pause, “It’s 14,500,836,353 kilometers.” “Wow, how far is that?” asked the dopey one. And the smart one replied with absolute conviction, “As far as Rio de Janeiro.”

This continued for some time, with the dopey one asking questions and the smart one giving complete nonsense answers with absolute conviction. Maybe he really believed what he was saying. Maybe in his mind he was the brilliant intellectual that society had unfairly cast to the street. I don’t know his life story or how he ended up there.

But it was at this point that I clearly realized why ancient teachings warn so often of the ‘fall of the proud’.

I realized just how dangerous the arrogance of unquestioning conviction in your thoughts really is. It wasn’t just drugs, or stupidity or hard times that could lead you to the street. You could also get there by being the hero of your own thoughts, the intellectual yet idiot.

Delusion is dangerous. And in today’s world, where we have access to any information we want at any time, our capacity to delude ourselves and others is at an all-time high.

We need to be careful. Below I’ve constructed a 24-item checklist for pinpointing common traps that can lead us to murky thinking, poorly chosen action, and the potential to become arrogant idiots in a world of vast access to information.

I hope that by reviewing this checklist we remember to be a little less certain and a little more doubtful of what we “know” as we navigate the world.

A 24-Item Checklist for Bullshit Finding

On Beliefs and Bets

1. You hold ANY belief so strongly that it becomes something beyond questioning.

Nothing is 100%. Including your religious and political views. You can be 99.99% sure if you think it’s justified. But the minute you refuse to consider the possibility that you could be wrong, you make yourself infinitely vulnerable in a future which may have different rules, discoveries and possibilities.

2. You refuse to acknowledge good ideas when they come from a person you don’t like.

3. You forget that you are biased towards that which you already believe.

It takes significantly more evidence to change your mind than to help you confirm what you already know.

4. You think that your ideas are in fact yours.

Remember that every idea in your head is there because people far smarter than you have been developing these ideas for all of human history. Without the ideas of others, you wouldn’t even know what a wheel is.

5. You’re intolerant of intolerance.

What do people at the extreme ends of the political spectrum have in common? Each is certain they are right, and that the other side is an idiot.

6. You won’t put your money where your mouth is.

If you’re actually confident in what you’re saying you should be willing to bet on it. Talk is cheap without consequences.

7. You claim your wins are because of your skill, and your losses are because of bad luck.

How genius you are for buying that stock before it shot up in price. But why didn’t you see the crash coming, sir?

On Opinions

8. You forget that opinions carry more weight when they could potentially cause you harm.

Example 1: The oil company executive who lobbies for carbon emissions reduction.

Example 2: The female professor who speaks out against radical feminism.

9. You forget that opinions carry less weight when they align with the “virtuous” crowd.

Pretty words are nice to hear. But, do you actually believe that? Do you act on what you say?

10. You assume your opinion is even remotely valid on more than a handful of subjects.

The popular rule of thumb says it takes 10,000 hours of practice to become an expert in a field. We really only have time to know a couple of things well. So if you hold a strong opinion on more than a select handful of topics, you need to ask yourself why you feel so strongly about things you know so little about.

On Arguing and Statistics

11. You defend an argument using the results of a single scientific study.

A false positive occurs when you mistakenly conclude that something is true when in fact it is false. This happens every day in science. In the majority of scientific fields, if the probability of getting your results by chance alone (the p-value) is less than 5%, that’s considered statistically significant and good enough to publish.

The problem is that when thousands of scientific studies are being run, and publishers are biased towards studies that report new findings, there will be many false positive results.

Technical Side Note:

Let’s explore this problem with a quick simulation.

What if we were to run a real experiment trying to prove a good hypothesis?

Let’s assume that this hypothesis actually does have some real underlying truth, but the data is noisy. By chance I might prove my hypothesis, or I might not. But if I ran that experiment 10,000 times, the results might end up something like the simulation below: with a real underlying relationship and some noise on top of it, we find our relationship with statistical significance a good percentage of the time. So one experiment might not tell us much, but by looking at the full distribution of results from the scientific literature we can see that there is a real relationship here.

In contrast, what if I were to do the same thing, but this time my hypothesis is total garbage? If I run 10,000 experiments where there is no underlying relationship, then by definition I will get a false positive 5% of the time. When you look at the whole picture you can see that there is no relationship here; the results are totally random. However, if you want to cherry-pick your study results, you’ll have 500 out of 10,000 garbage conclusions to use in your magazine article or marketing.
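Here’s a minimal Python sketch of that simulation. The effect size, noise level and sample size are my own illustrative assumptions, not values from any particular study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_experiments = 10_000  # experiments per scenario
n = 30                  # sample size per group (assumed)

def run_experiments(effect):
    """Run many two-group experiments; return the p-value of each."""
    p_values = []
    for _ in range(n_experiments):
        control = rng.normal(0.0, 1.0, n)
        treated = rng.normal(effect, 1.0, n)  # 'effect' shifts the true mean
        _, p = stats.ttest_ind(treated, control)
        p_values.append(p)
    return np.array(p_values)

# Scenario 1: a real but noisy effect; significant a good percentage of the time.
p_real = run_experiments(effect=0.5)
print(f"real effect: {np.mean(p_real < 0.05):.1%} of studies 'significant'")

# Scenario 2: a garbage hypothesis (no effect); still 'significant'
# roughly 5% of the time by chance alone.
p_null = run_experiments(effect=0.0)
print(f"no effect:   {np.mean(p_null < 0.05):.1%} of studies 'significant'")
```

The second printout hovers around 5% no matter how many experiments you run; those are the 500-odd false positives waiting to be cherry-picked.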

What this means is that, with so many studies being done under such intense pressure to publish new findings, you will be able to find and hand-pick a single scientific study or two that says whatever you want it to say.

Never rely on a single study. You always need to ask what the total body of scientific literature says about a subject to get a reasonable view.

12. Even worse: you cherry-pick your arguments from the media.

As bad as using single studies to prove your point is, using some random article from the media is even worse. The media tends to report only on surprising “outlier” studies.

13. You make your argument by swapping out the thing you are arguing for with something similar but a little different.

Example: A pharmaceutical company claims that their medicine cures colds because it kills 99% of germs in a lab study. Germs in lab != germs in human.

14. You confuse correlation with causation.

Everyone understands that ice cream sales don’t cause more murders, even though the two are correlated (both rise in hot weather). But we still fall prey to the spurious correlations that pop up in many observational studies.
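As a toy sketch (all numbers invented), here’s how a lurking third variable like temperature makes two causally unrelated series correlate:

```python
import numpy as np

rng = np.random.default_rng(1)
days = 365

# Temperature drives both series; neither causes the other (toy numbers).
temperature = 20 + 10 * np.sin(np.linspace(0, 2 * np.pi, days)) + rng.normal(0, 2, days)
ice_cream_sales = 5.0 * temperature + rng.normal(0, 20, days)
murders = 0.3 * temperature + rng.normal(0, 1.5, days)

# Strong correlation, zero causation: temperature is the lurking confounder.
r = np.corrcoef(ice_cream_sales, murders)[0, 1]
print(f"correlation(ice cream sales, murders) = {r:.2f}")
```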

15. You find signal in the noise with back rationalization.

We love to invent patterns and feel smart for finding them after the fact. If you assume the past will equal the future, try drawing a linear regression on a stock chart and see how far that gets you.
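A small sketch of that exercise: fit a trend line to the first half of a simulated random-walk “price” series, where by construction there is no signal to find, and see how badly it predicts the second half:

```python
import numpy as np

rng = np.random.default_rng(2)

# A "stock price" that is a pure random walk: there is no trend to find.
prices = 100 + np.cumsum(rng.normal(0, 1, 500))

# Back-rationalize: fit a line to the first half and call it "the trend".
t = np.arange(250)
slope, intercept = np.polyfit(t, prices[:250], 1)
print(f"fitted 'trend': {slope:+.3f} per step (pure luck)")

# Extrapolate that trend over the second half and measure how wrong it is.
t_future = np.arange(250, 500)
forecast = slope * t_future + intercept
print(f"mean forecast error: {np.abs(forecast - prices[250:]).mean():.1f} points")
```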

16. You trust a statistic without knowing its error band.

Economists predict 4% growth in 2019. What this actually means is that economists predict between -2% and 10% growth with 95% confidence. When experts don’t tell you the error bands of their predictions, it’s usually because they are embarrassed by them.
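The back-of-envelope arithmetic, assuming a standard error of about 3 percentage points (my assumption, chosen to reproduce the band above):

```python
# What "4% growth" means once you include the error band.
point_forecast = 4.0   # headline prediction, in percent
standard_error = 3.0   # assumed, in percentage points
low = point_forecast - 1.96 * standard_error
high = point_forecast + 1.96 * standard_error
print(f"95% interval: {low:+.1f}% to {high:+.1f}%")  # roughly -2% to +10%
```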

17. You use a single non-standard case to try to refute a more general observation.

Example: You know a single parent who does a fantastic job raising their kids. You conclude that single-parent households in general are just as effective for raising kids as two-parent households.

18. You attack the character of the person you’re arguing with because you can’t refute their ideas.

19. You take quotes out of context.

On Actions and Education

20. You use your big goals as an excuse to not do the little things that are right in front of you.

You prefer to be a hero in your own mind rather than actually get moving somewhere realistic. You dream about the millions instead of making the $10,000.

21. You complain about a situation without comparing it to its realistic alternatives.

What’s the baseline? Is there a better realistic option?

22. There’s a misalignment between what you think you deserve and what you offer.

You receive in proportion to what you give, not what you want. This applies to your salary as well as what you get from a relationship.

23. You believe strongly in an idea that you can’t or won’t test in the real world.

All models are wrong, but some are useful. You only find the useful ones by using them.

24. You study something with no value and think you’re smarter because of it.

Education unchecked by the real world can lead to crystallized conviction in idiotic theories.

You have a PhD in WHAT?

Nassim Taleb calls this being an IYI (Intellectual Yet Idiot).

Closing: A Special Word on Preconceptions and Bayesian Thinking

A poker player sits down to play a game of cards. Across the table sits an old man, quiet, neatly dressed, with his poker chips stacked in piles of precisely the same size.

Immediately, the poker player makes a judgement about this old man. He will likely play a straightforward game. He won’t bluff too much. And if he bets big, he probably has something good.

Next to the old man is a different character. He’s young, overweight, sloppy, and calling out to the waitress to bring him another drink.

Again, the poker player makes a judgement about the young guy. He’ll likely be a bluffer. If he bets big, he’ll probably have nothing. And if he has something good, he’ll probably play it slow to try to trick his opponents into putting more money in the pot.

Are these split-second judgements an act of prejudice? Is it wrong to judge people or situations based on so little knowledge? Or is it simply good poker?

Anyone who’s played a lot of real money poker knows that making these types of judgements is not only justified but necessary if you want to be successful.

Making initial judgements about people or situations based on little information, previous experience, market norms, common sense and statistics is not unethical. It’s a proper first step for forming your models.

Specifically, it’s called decision making with a Bayesian prior, named after the statistician Thomas Bayes. And it’s arguably the most honest way to see the world.

A Bayesian prior is a “probabilistic” assumption you make BEFORE the game starts. And using Bayesian priors over time will get you to the best answers faster than if you treated every situation as a blank slate.

However, here’s the problem!

Preconceptions become a problem if you fail to update your judgement as new information unique to that situation comes forth.

Back to the poker example. If after one hour of play you saw that the old man had in fact tried to bluff you 3 times, you would need to update your model. This particular old man is not like the others you’ve played with before. He’s a unique individual, and the more you get to know his playing style, the less important the Bayesian prior becomes and the more important your unique experience with him becomes in judging his play.

If you stuck to your previous assumption that old, neatly dressed men don’t bluff, and refused to change your strategy in the face of new information, this old man would eat you for lunch.

So, in the face of new information you happily modify your opinion about him.

This is an extremely useful way of thinking about all situations in life, not just poker. And I’d like to outline the process below:

How to think Bayesian 101

  1. We always need to make an initial assumption based on past experience, instinct, statistics, the market, common sense, cultural norms, or whatever.
  2. The strength of that assumption depends on the extent of your past experience, knowledge of the situation, or strength of the statistic.
  3. We then must update that assumption as new information arises (a small worked sketch follows this list).
  4. If the new information is very different and stronger than our previous assumption we must update our model strongly.
  5. If the new information is weak or confirmatory, maybe we just make a small adjustment.
  6. Your model of the world is built on the past and updated by the present.
  7. At no time can we ever believe 100% in what we “know”. Because even if what we “know” is based on VERY strong evidence, it is still based on the past in a world that’s constantly changing.
  8. If we forget this fact, one day that old man who hasn’t bluffed in 30 years will see that we’ve become arrogant beyond return, and he’ll take every penny we’ve got with one big bluff.
  9. 99% OK. 100% no.
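To make the updating loop concrete, here’s a minimal Beta-Binomial sketch of the poker example. The prior counts are illustrative assumptions about how heavily past experience with neat old men should weigh:

```python
from scipy import stats

# Prior: past experience says neat old men rarely bluff. Beta(2, 18) encodes
# "about a 10% bluff rate, backed by the equivalent of ~20 observed hands".
alpha, beta = 2.0, 18.0
print(f"prior mean bluff rate: {alpha / (alpha + beta):.0%}")

# New information: in the first hour THIS old man bluffed 3 of his 10 big bets.
bluffs, big_bets = 3, 10
alpha += bluffs
beta += big_bets - bluffs

# Posterior: the prior still matters, but the live evidence has moved us.
print(f"posterior mean bluff rate: {alpha / (alpha + beta):.0%}")
low, high = stats.beta.interval(0.95, alpha, beta)
print(f"95% credible interval: {low:.0%} to {high:.0%}")
# Note it is never 0% or 100%: the model always leaves room to be wrong.
```

After ten big bets the estimate has already moved from 10% towards this old man’s actual behavior, and with every further hand your live experience outweighs the prior a little more.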

Recommended Reading

For good books on thinking statistically and uncovering bullshit in our daily lives, check these out.

Skin in the Game: Hidden Asymmetries in Daily Life – Nassim Taleb

The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t – Nate Silver

Statistics Done Wrong – Alex Reinhart

How to Lie with Statistics – Darrell Huff

Life is tough. We make decisions every day based on incomplete information, using models to interpret the world that may or may not be relevant to the situation at hand.

This article is simply meant to highlight some of the errors in our thinking, and some ways that we fool ourselves and others. There are obviously many more. So many more in fact that I’m sure I’m fooling myself in some way right now without realizing it. But that’s the point. We will always fool ourselves.

However, only when we admit that we have all of these biases and delusions, and that our model of the world is incomplete and changing, can we live in a manner that resembles someone trying to find truth.

“Follow the man who seeks the truth; run from the man who has found it.”

My hope is that at least one or two of these items helped you to find an error in your thinking and that we can continue to grow, learn and evolve together.

And never become the arrogant homeless person who lectures about Mars and Rio de Janeiro.

Cheers,

Tyler

PS: Leave a comment below or send me an email anytime at tyler@tylerjwatkins.com
