Cognitive bias refers to individuals consistently making irrational decisions, often intuitively or unknowingly. Many humans exhibit cognitive biases in certain logical, economic, or interpersonal situations. Researchers suspect that many of these biases are adaptive, developed over time to aid human decision making, especially in social situations.
Understanding these biases can help individuals make better decisions and recognize situations in which they are being manipulated.
Anchoring is the cognitive bias in which a person is first shown a number, whether deliberately or subtly, and then asked to perform some estimate or action. People shown a low number typically anchor toward lower numbers, and people shown a high number tend to anchor higher.
In a classic experiment, participants are given the same sequence of numbers, but in different orders, and deliberately not enough time to compute the product of those numbers. One group is shown the sequence starting from the high end (8 × 7 × 6 × 5 × 4 × 3 × 2 × 1) and the other group the same numbers reversed; each is given only a few seconds to guess at the product. Participants in the first group, which started, or was anchored, at a high number, guessed far higher on average than participants in the second group, and both groups' guesses fell well short of the correct answer of 40,320.
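The correct product is the same regardless of the order of the factors; a quick sketch verifies it:

```python
import math

# The product 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 is simply 8 factorial,
# no matter which direction the sequence is read.
descending = [8, 7, 6, 5, 4, 3, 2, 1]

product = 1
for n in descending:
    product *= n

assert product == math.factorial(8)
print(product)  # 40320
```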
Anchoring is a common tactic in sales and marketing, for instance in clothing prices and discounts. In many Western countries, retailers use a pricing strategy known as high-low pricing. Clothing items are given a high retail price, which serves as a high anchor, and then frequent and varied sales are run with large percentage discounts. This tactic anchors the shopper to a high number and wows them with a large discount, even though the retailer was planning all along to sell the items at an average effective price well below retail.
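The arithmetic behind high-low pricing can be sketched with made-up numbers (the anchor price, discount schedule, and sales volumes below are illustrative, not from any real retailer):

```python
# Hypothetical high-low pricing scenario: all figures are assumptions.
anchor_price = 100.00                    # high retail "anchor" price
discounts = [0.30, 0.40, 0.50, 0.25]    # rotating sale discounts over the year
units_sold_per_sale = [50, 80, 120, 40] # more units tend to move at deeper discounts

revenue = sum(anchor_price * (1 - d) * u
              for d, u in zip(discounts, units_sold_per_sale))
units = sum(units_sold_per_sale)
effective_price = revenue / units

# The shopper compares each sale price against the $100 anchor,
# but the retailer's average effective price is far lower.
print(f"Average effective price: ${effective_price:.2f}")
```

Every individual sale feels like a bargain relative to the anchor, yet the weighted average selling price here comes out well under the posted retail price.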
The conjunction fallacy is a logical fallacy in which individuals assume that a set of specific conditions is more probable than a single, more general one. Amos Tversky and Daniel Kahneman were among the first to identify this phenomenon, through a problem known as the Linda Problem.
Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Which of the following scenarios is more probable?
1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.
In Tversky and Kahneman's experiments, the majority of test takers got this question wrong. The description of Linda seems almost designed to encourage a test taker to think that she is a feminist, so when presented with that choice, many take it, even though a conjunction of specific facts is always less probable than any one of those facts alone.
Mathematically speaking, this is equivalent to the basic fact of probability that P(A and B) ≤ P(A) for any events A and B. A deeper look at these ideas can be found in Bayes' Theorem.
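A small Monte Carlo sketch makes the inequality concrete. The event probabilities below are arbitrary assumptions, chosen only to illustrate that the conjunction can never come out more frequent:

```python
import random

random.seed(0)

# A and B are independent toy events with assumed probabilities;
# the labels loosely echo the Linda problem but the numbers are made up.
trials = 100_000
count_a = 0
count_a_and_b = 0
for _ in range(trials):
    a = random.random() < 0.05   # assumed P(A): "bank teller"
    b = random.random() < 0.60   # assumed P(B): "active feminist"
    count_a += a
    count_a_and_b += a and b

p_a = count_a / trials
p_a_and_b = count_a_and_b / trials
# Every (A and B) outcome is also an A outcome, so this always holds.
assert p_a_and_b <= p_a
print(p_a, p_a_and_b)
```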
Prospect theory, also known as decision weighting, describes how people choose between risky alternatives, specifically when the probabilities of the outcomes are known or can be estimated. In general, people overestimate the probability of unlikely events (for instance, a third-party candidate being elected president) and overweight those unlikely events in their decisions.
Daniel Kahneman conducted an experiment in which participants were asked to make two decisions concurrently: 1st. Choose between:
a) a sure gain of $240
b) a 25% chance at $1,000
2nd. Choose between:
c) a sure loss of $750
d) a 75% chance to lose $1,000
The majority of people chose A then D; indeed, this is most people's quick gut reaction. However, choosing A and D gives the chooser a combined 75% chance to lose $760 and a 25% chance to gain $240. The more rational choice is B and C, which gives a 75% chance to lose $750, losing $10 less than A and D, and a 25% chance to gain $250, gaining $10 more than A and D.
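The combined outcomes above can be checked by working out the expected value of each pairing:

```python
# Expected value of each combined choice, using the probabilities above.
# A + D: sure +$240, plus a 75% chance of losing $1,000.
# B + C: sure -$750, plus a 25% chance of winning $1,000.

def expected_value(outcomes):
    """outcomes: list of (probability, net payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

a_and_d = [(0.75, 240 - 1000),  # D loses: net -$760
           (0.25, 240)]         # D avoids the loss: net +$240
b_and_c = [(0.25, 1000 - 750),  # B wins: net +$250
           (0.75, -750)]        # B loses: net -$750

print(expected_value(a_and_d))  # -510.0
print(expected_value(b_and_c))  # -500.0
```

In expectation, B and C comes out $10 ahead of the intuitive A and D choice.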
In contests of luck, results at the extremes will eventually regress back to the mean, or average. In experiments, it is often the case that if an extreme result occurs the first time, an average result will occur the second; conversely, if the second result is extreme, the first was likely average.
Consider the case where two people are standing on a line, and another line is drawn behind them. They are each handed a coin and told to throw it as close to the other line as they can without looking. If they attempt this only once, one of them will do better than the other, and if that is left as the only result, the winner may conclude that they are better at this task than the other coin tosser.
However, if the test is conducted a second time, the results often switch. The person who was unusually accurate the first time will do worse, and the person who did poorly the first time will do better. Both are regressing to the mean.
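A simulation can sketch why the results switch. The model below assumes both tossers have identical skill, so accuracy is pure luck (the error distribution is an arbitrary choice for illustration):

```python
import random

random.seed(42)

# Two tossers with identical skill: each toss's error is pure luck.
def toss_error():
    return abs(random.gauss(0, 1))  # distance from the target line

reversals = 0
trials = 10_000
for _ in range(trials):
    r1 = (toss_error(), toss_error())   # round-1 errors for players 0 and 1
    r2 = (toss_error(), toss_error())   # round-2 errors
    winner1 = 0 if r1[0] < r1[1] else 1
    winner2 = 0 if r2[0] < r2[1] else 1
    reversals += winner1 != winner2

# When outcomes are pure luck, the round-1 winner loses round 2
# about half the time: the first result carries no predictive power.
print(reversals / trials)
```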
This is one of the reasons that, in scientific settings, researchers repeat their experiments. Over enough trials, the results average out, washing out the influence of luck on any single run.
The sunk cost fallacy represents a contrast between microeconomic theory and intuitive decision making. Microeconomics holds that once costs have been incurred and cannot be recovered, agents should ignore those costs in making future decisions. However, behavioral economics has shown that agents do take these costs into account.
A company is behind schedule and over budget ($10 million more on top of the $10 million already spent) on an IT project that is estimated to generate $100 million in profit. To finish the project, the company will have to spend another $10 million. However, this is all it will have to spend for the year, and a manager in another department proposes dropping the existing project and investing in a new project that is estimated to generate higher returns.
In many cases, the company chooses to continue with the existing project, suffering from loss aversion and factoring sunk costs into its decision making.
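A forward-looking comparison can be sketched with the figures from the scenario. The new project's profit estimate is not given in the text, so the $120 million below is a hypothetical placeholder:

```python
# Forward-looking comparison: sunk costs are excluded on purpose.
sunk = 20_000_000            # already spent (original $10M budget + $10M overrun)
remaining_cost = 10_000_000  # what it still takes to finish the existing project

existing_profit = 100_000_000
new_project_profit = 120_000_000  # ASSUMED estimate for the rival proposal
new_project_cost = 10_000_000

existing_net = existing_profit - remaining_cost   # forward-looking net of finishing
new_net = new_project_profit - new_project_cost   # forward-looking net of switching

# The rational choice depends only on the forward-looking numbers;
# adding `sunk` to either side would change both totals equally.
print("switch" if new_net > existing_net else "continue")  # switch
```

Under these assumed numbers, switching wins; the $20 million already spent is irrelevant to the comparison, because it is lost either way.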
Kahneman, D. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux, 2011.