Bias
A bias is anything happening in the brain that causes a rift between the reality we receive as input and how we interpret that reality. It is often the result of the brain's attempt to simplify information processing.
- "A cognitive bias is a systematic error in thinking that occurs when people are processing and interpreting information in the world around them and affects the decisions and judgments that they make."
Why they happen
- When you use a formula you don't understand, it is effectively a black box to you. The same thing happens when the brain makes errors in logic. We feel, for instance, that the death rate from hurricanes is higher than the death rate from heart attacks, but we don't necessarily know where this feeling comes from. We are just handed the answer, and our instinct is to trust it.
- Generally, intuition is a way of saying, "I sense similarities between this problem and other ones I have worked on. Before I work on this problem, I have some expectation about the answer."
- given that intuition is an evolutionary construct, a rule of thumb is to trust it more in natural environments and less in unnatural ones.
Think of cognitive biases as little army men that work against you. When we make a great decision, it is because fewer army men were present. Bad decisions arise because there were many.
When the solution to a problem does not seem proportionate to the problem itself, people tend to resort to extremes.
- ex. When people are told a serious threat like the coronavirus is coming, but that all they need to do is wash their hands, they see the solution as too innocuous to be a useful measure against something so impactful. The action doesn't seem proportional to the threat, so they act irrationally, buying up toilet paper.
Conflicting messages
- when one authority says there is no problem, and another says there is a problem, it causes people to worry, which begets irrational behavior
- ex. During COVID-19, the WHO said to take extra precautions, while Trump said not to worry about it. Conflicting opinions from authorities cause people to worry.
- when people are stressed, their reason is hampered. They then look at what others are doing (buying toilet paper) and follow suit.
- there is a lot of value in "social" knowledge.
Illusory Superiority
we judge ourselves by our intentions, and judge others by their actions. in effect we have a lower benchmark for ourselves than we do for others
Fallacy of conjunction
we are heavily influenced by vivid and readily available evidence.
- the influence is so deep that we are willing to make judgments that violate simple logic
- ex. Linda is 31 years old, single, outspoken, and very bright. she majored in philosophy. as a student she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
- question: which is more probable: that Linda is a bank teller, or that Linda is a bank teller and is active in the feminist movement?
- the laws of probability guarantee that the first option is at least as likely, yet the majority of people choose the second
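- the underlying rule is the conjunction rule: P(A and B) can never exceed P(A). a minimal sketch with made-up counts (the population size and teller counts are purely illustrative):

```python
# toy illustration with made-up numbers: the set of feminist bank
# tellers is a subset of bank tellers, so its probability can never
# be larger than that of bank tellers alone
population = 100_000
bank_tellers = 500               # hypothetical count
feminist_bank_tellers = 50       # necessarily a subset of the 500

p_teller = bank_tellers / population
p_both = feminist_bank_tellers / population

assert p_both <= p_teller        # P(A and B) <= P(A), always
print(p_teller, p_both)          # 0.005 0.0005
```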
Retrospective Distortion
millions of factoids circulate leading up to an event. only a few of them turn out to be relevant to our understanding of what happened. as a result, we take those few factoids and use them to justify why the event happened, claiming that it was predictable
ex. people living through the beginning of WWII didn't have any inkling as to how momentous the event would turn out to be. it is only with the benefit of hindsight that we can see that.
- bond prices around the start of the war reflect this. bond prices are a good reflection of sentiment toward a government
Substitution
- when posed a difficult question, or one that requires some degree of analysis, people will generally substitute it with an easier question whose answer comes to mind more readily
- ex. “The question we face is whether this candidate can succeed. The question we seem to answer is whether she interviews well. Let’s not substitute.”
- When something becomes hard to think about, people transfer the discomfort of the thought to the object of their thinking.
- ex. “Happiness these days” is not a natural or an easy assessment. A good answer requires a fair amount of thinking. However, the students who had just been asked about their dating did not need to think hard because they already had in their mind an answer to a related question: how happy they were with their love life. They substituted the question to which they had a readymade answer for the question they were asked.
Loss Aversion
Loss aversion is only a factor when the participants plan to use or "consume" the item in question. Ex. Trading $100 for shoes doesn't cause loss aversion because the $100 was seen as an instrument of exchange in the person's mind; it was not to be consumed in any other way. Put another way, the purchase was already decided. However, trading 12 vacation days a year for $10,000 cash causes loss aversion because those vacation days were to be consumed. The same holds in the reverse direction for this example.
Mental shotgun
- it's difficult for people not to think of more than you ask them (e.g., asked what you think of Donald Trump's hair, it's difficult not to also consider his overall competency, his effectiveness as a president, etc.)
WYSIATI (What you see is all there is)
You cannot help dealing with the limited information you have as if it were all there is to know. You build the best possible story from the information available to you, and if it is a good story, you believe it.
Halo Effect (?)
ex. You meet a woman named Joan at a party and find her personable and easy to talk to. Now her name comes up as someone who could be asked to contribute to a charity. What do you know about Joan’s generosity? The correct answer is that you know virtually nothing, because there is little reason to believe that people who are agreeable in social situations are also generous contributors to charities. But you like Joan and you will retrieve the feeling of liking her when you think of her. You also like generosity and generous people. By association, you are now predisposed to believe that Joan is generous. And now that you believe she is generous, you probably like Joan even better than you did earlier, because you have added generosity to her pleasant attributes.
Zeigarnik effect
- people remember incomplete or interrupted tasks better than they remember completed ones. In other words, a desire to complete a task can cause a person to remember it until it has been completed; full execution leads to forgetting it altogether.
Fundamental Attribution Error
Attributing a negative behavior in others to their character, while attributing the same behavior in ourselves to circumstance.
- ex. At a buffet, one person says to another, "Let's stock up before all the hoarders get here," as if preemptive hoarding is different from hoarding.
- ex. People on the collectivist Left often discount the evils historically associated with socialism, attributing them to totalitarianism or dictatorship, while attributing the evils historically associated with capitalism to its very nature, which they identify as the selfish profit motive. People on the libertarian Right, meanwhile, often dismiss the evils historically associated with capitalism, attributing them to corruption or government interference, while describing the historical evils associated with socialism as the inherent features of an evil ideology that tramples over individual rights.
Publication Bias
occurs when a disproportionate number of researchers publish studies with significant results compared to those who publish null results. This skews the literature toward significant results and gives a false picture of how many tests are significant and how many are null (see the sketch below).
- people naturally don't want to publish null papers
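- a hedged toy simulation of the mechanism: run many studies of a true null effect and "publish" only the significant ones. The 1,000-study count and the p < 0.05 cutoff are illustrative assumptions:

```python
import random

random.seed(0)

# simulate 1,000 studies of a true null effect; under the null
# hypothesis, p-values are uniformly distributed on [0, 1]
p_values = [random.random() for _ in range(1000)]

# in this toy model, only "significant" results (p < 0.05) get published
published = [p for p in p_values if p < 0.05]

print(f"studies run (all truly null): {len(p_values)}")
print(f"published (p < 0.05):         {len(published)}")  # roughly 50
# a reader of the published literature sees only significant results,
# even though every underlying effect was zero
```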
Round trip fallacy
- def - The confusion of absence of evidence that unexpected, high impact events (Black Swans) have occurred, or will occur, with evidence of absence of such events (no possible Black Swans). ... But it would be erroneous to infer that there is evidence of the absence of all such events.
- ex. "almost all terrorists are muslims" and "almost all muslims are terrorists" are vastly different statements. if 99% of terrorists are muslim, that still amounts to only about 0.001% of all muslims being terrorists (see the arithmetic sketch below)
- ex. "conservatives are generally stupid" is vastly different from "stupid people are generally conservative"
- ex. "there is no evidence of cancer" vs "there is evidence of no cancer"
- ex. in science, "absence of evidence" vs "evidence of absence": scientists who thought formula was as good as breastmilk
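- the base-rate arithmetic behind the terrorist example above, as a quick sketch (both counts are hypothetical, chosen to be consistent with the ~0.001% figure):

```python
# hypothetical, illustrative numbers only
muslims = 1_500_000_000                 # rough global Muslim population
terrorists = 15_000                     # assumed worldwide terrorist count
muslim_terrorists = 0.99 * terrorists   # "99% of terrorists are muslim"

share = muslim_terrorists / muslims
print(f"{share:.6%}")   # 0.000990%, i.e. about 0.001% of all Muslims
```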
Confirmation Bias
Once you notice something, confirmation bias causes you to keep noticing it and see a pattern, while ignoring things that don’t fit the narrative.
Seeing only events and never the rules
an experimenter gives a sequence of 3 numbers: 2, 4, 6, and says that this sequence was derived from a rule. the objective is for the subject to find the rule by presenting their own sequences of numbers (ex. 4, 6, 8). if a sequence follows the rule, the experimenter confirms it. once subjects are satisfied that they understand the rule, they state it. for example, many say "the numbers are always 2 apart." however, the true rule was simply "the numbers are ascending".
this is an interesting experiment because the subjects had a rule in mind based on a (weakly) established pattern: the numbers rise by 2 each time. their instinct is to confirm this theory by asking the experimenter "is 4, 6, 8 right? is 10, 12, 14 right?". what they never realize is that the true rule is much simpler, and that they are generalizing from far less information than should be necessary. this problem of creating a general rule from a strict set of observations is called naive generalization (a small sketch below illustrates the point).
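a sketch of the task (both rules below are illustrative stand-ins): confirming tests all pass and teach the subject nothing, while a single disconfirming test separates the narrow hypothesis from the true rule.

```python
# hidden_rule is the experimenter's actual rule; subject_hypothesis
# is the subject's overly narrow guess
def hidden_rule(a, b, c):
    return a < b < c                      # "the numbers are ascending"

def subject_hypothesis(a, b, c):
    return b - a == 2 and c - b == 2      # "the numbers rise by 2"

# confirming tests: chosen to fit the hypothesis, and they all pass,
# so the subject grows (falsely) confident
for triple in [(4, 6, 8), (10, 12, 14), (100, 102, 104)]:
    assert hidden_rule(*triple)

# a disconfirming test is what actually separates the two rules
print(hidden_rule(1, 2, 3))               # True  -> hypothesis too narrow
print(subject_hypothesis(1, 2, 3))        # False
```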
Combating Confirmation Bias
You have to seek out the contrarian opinion to the one you want to hold. If you want to buy a new game that looks cool, you naturally want to like it. You will look for things that confirm your decision to buy the game, which of course produces the effects of confirmation bias.
- What you should be doing instead is googling things like "anno is ____" to see what comes up. Search "anno sucks" and see what comes up. What do the game's detractors say about it? Repetitive? "Ah, that's not really the game style I enjoy."
Narrative Fallacy
- not theorizing takes effort; theorizing is the default
- human behaviour is to make sense of raw facts by making narratives. it takes considerable effort to resist making a story while taking in facts
- ex. consider a "whodunit" novel. the author paints the picture such that each character has a plausible chance of being the killer. in fact, if you were to assign a probability to each person, you would find that the sum adds up to well above 100% (see the sketch at the end of this section)
- humans pull memories along causative lines
- ie. memories change over time to fit more comprehensive and cohesive narratives
- Any time we draw a causative link (ie. saying "this happened because this happened"), we have to ask if someone's survival was at stake.
- ex. "he won lots of money at the casino because he is skilled"
- when survival is involved, the notion of "because" is severely weakened. The condition of survival drowns out all possible explanations.
- in other words, be careful not to look for explanations when survival is a factor. More often than not, it can simply be attributed to luck and randomness.
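- the whodunit sketch referenced above, with made-up plausibilities:

```python
# plausibility judged suspect by suspect (made-up numbers); coherent
# probabilities over mutually exclusive suspects must sum to 1
suspects = {"butler": 0.40, "heiress": 0.35, "gardener": 0.30, "doctor": 0.35}
print(sum(suspects.values()))   # about 1.4, i.e. 140%: not a valid distribution
```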
Ludic Fallacy
- def - the tendency to stick too closely to the rules of a game, causing any unexpected occurrences to break the paradigm that you have built, making you unable to respond in an effective way.
- ex. consider boxers who learn to fight, but only within the confines of the rules laid out by boxing. If boxers were to have a street fight, they would be severely disadvantaged, since the paradigm of fighting they have created is defined by the rules of boxing
- ex. consider body builders who do all of their training in the confines of a gym. Once they exit the gym, they are disadvantaged in trying to lift anything that does not have the same motion.
- the ludic fallacy is a major reason why mathematical forecasting models are ineffective. The confines of the model are too narrow (ie. they are based on platonified forms)
- we will never have all the information, and small variations in data may result in massive changes.
Two-Track Analysis (Charlie Munger)
- When analyzing any situation where decision making by people is involved, consider:
- How would they act if they behaved rationally?
- How might they succumb to the pull of a number of irrational and psychological biases?
- ex. making inferences based on small sample sizes
- ex. remaining irrationally committed to something we've said in the past
When too much incremental information can be a bad thing
- when you get information too rapidly (with little incremental time between pieces of information), you tend to form more hypotheses along the way. This causes you to recognize what is coming far later than someone who is getting information less frequently.
- ex. imagine showing a blurry picture of a fire hydrant to two different groups (blurry enough that it can't be recognized). To the first group, you show 10 progressively less blurry versions of the image, and to the second, only 5 (in other words, the first group sees a smaller incremental difference between each step). In this example, the 4th step for the first group is the same image as the 2nd step for the second group. As it turns out, the second group recognizes the hydrant sooner than the first group, even though both are seeing the same images (step 4 vs. 2, 6 vs. 3, etc.). This is because we form hypotheses along the way about what we are seeing and get too narrowly focused on those assumptions, which hampers our ability to work out what we are actually looking at.
- The problem is that our ideas are sticky, and once we produce a theory, we are unlikely to change it. Therefore, those who delay forming a theory are better off. When you form an opinion on the basis of weak evidence, you have difficulty interpreting subsequent information that contradicts this theory.
- this is due to confirmation bias and belief perseverance (the tendency not to reverse opinions you already hold).
Source of Knowledge
Question where your knowledge came from. If the source is not reputable, there is absolutely no reason you should maintain ANY prior opinions that you had on the subject.
- ex. When asked about nuclear power plants, my first impression is "they are safe and probably a good thing to have". However, I don't recall where that sentiment came from. Did it come from an unbiased individual, or from someone who was mistaken? Without that knowledge, it is really not fair to hold any sort of opinion on nuclear power. As such, I should have a clean slate, as if I'd never even heard of it in the first place.
UE Resources
- [Overcoming Bias](https://www.overcomingbias.com/)
Related
Children
- Anchoring
- Cognitive Dissonance
- Hindsight
- Loss Aversion
- Perspective
- Sensationalism
- Stereotyping
- Survivorship Bias