Ming the Mechanic
The NewsLog of Flemming Funch

Sunday, May 9, 2004

 Common errors in reasoning
In a comment on FutureHi, Michael Anissimov mentioned a number of pervasive errors in reasoning that are common to practically all human beings. Well, I knew about those kinds of thinking fallacies, but I didn't know all the "official" terms. In psychological research, a number of these fallacies have been given names and studied in some depth.

Availability Bias

Situations in which people assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind.

People inadvertently assume that readily-available instances, examples or images represent unbiased estimates of statistical probabilities.

E.g. if you've mainly been around a certain type of people, you easily get to believe they represent a typical cross-section of the population. At least, your estimates of various characteristics and beliefs will be biased towards the profile of the people you know. "To a hammer, everything is a nail".

Conjunction Fallacy

When two events can occur separately or together, the conjunction, where they overlap, cannot be more likely than the likelihood of either of the two individual events. However, people forget this and ascribe a higher likelihood to combination events, erroneously associating quantity of events with quantity of probability.

Here's an example:

Bill is 34 years old. He is intelligent, but unimaginative, compulsive and generally lifeless. In school, he was strong in mathematics but weak in social studies and humanities.

Which statement is more probable:
A. Bill is an accountant that plays jazz for a hobby, or
B. Bill plays jazz for a hobby?

In a survey, 92% of people answered A, which is completely wrong: everyone covered by A is also covered by B, so A cannot be the more probable statement.
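The conjunction rule is easy to verify mechanically. Here is a minimal Python sketch; the population counts are made-up numbers purely for illustration, but whatever numbers you choose, the people who are both accountants and jazz players are a subset of the jazz players, so the conjunction can never be more probable.

```python
# A tiny made-up population (assumed counts, purely illustrative).
population = (
    ["accountant, jazz"] * 3 +   # accountants who play jazz
    ["accountant"] * 17 +        # accountants who don't play jazz
    ["jazz"] * 5 +               # jazz players who aren't accountants
    ["neither"] * 75
)

n = len(population)
p_jazz = sum("jazz" in person for person in population) / n   # 8 / 100
p_both = population.count("accountant, jazz") / n             # 3 / 100

print(f"P(jazz) = {p_jazz}, P(accountant and jazz) = {p_both}")
# The conjunction is never more probable than the single event.
assert p_both <= p_jazz
```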

Wason selection task

Research has shown that people find it very difficult to decide what information is necessary in order to test the truth of an abstract logical reasoning problem. The Wason Selection Task is often used to examine this issue.

A typical experiment using the Wason Selection Task will present some rule and ask subjects to check whether the rule is being violated. Consider the rule: If a card has a D on one side, it has a 3 on the other side. Subjects are aware that in the particular set of cards, each one has a letter on one side and a number on the other side. Four cards are shown, for example D, K, 3 and 7:

Very few people can correctly pick the two cards to turn over to verify the rule. The correct cards are D and 7; most likely, you picked D and 3. Seeing what is on the reverse of the 7 card can lead to falsifying the rule if a D shows up. Seeing what is on the reverse of the 3 card cannot falsify the rule. It can confirm the rule, but not falsify it.
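The falsification logic can be sketched in a few lines of Python (a minimal sketch, assuming the four visible faces are D, K, 3 and 7, as in common versions of the task): a card needs to be turned over only if its hidden side could reveal a violation of the rule.

```python
# Rule: "if a card has a D on one side, it has a 3 on the other."
# A card must be turned over only if its hidden side could falsify
# the rule.

def must_turn(visible_face: str) -> bool:
    if visible_face.isalpha():
        # A letter card can only violate the rule if the letter is D
        # and the hidden number turns out not to be 3.
        return visible_face == "D"
    # A number card can only violate the rule if the hidden letter
    # is D while the visible number is not 3.
    return visible_face != "3"

cards = ["D", "K", "3", "7"]
print([card for card in cards if must_turn(card)])  # ['D', '7']
```

Turning over the 3 can only confirm, never falsify, which is why it drops out.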

Support Theory

Support Theory comprises various models based on Amos Tversky's research, and is also related to availability bias, representativeness bias and anchoring.

Support Theory has an empirical base of results showing that different descriptions of the same event often produce different subjective probability estimates. It explains these results in terms of subjective evaluations of supporting evidence. [...]

According to the ‘framing effect’, people’s understanding of a problem is profoundly influenced by how the problem is presented.

For example, support for an option seems to increase the more the option is broken down into smaller components. And naturally, if an option is particularly highlighted (anchored), people will tend to choose it over the others, whether or not that is logical or probable.

Another interesting tidbit:

"This framework questioned the assumption of "homo oeconomicus", that is, of human beings motivated by self interest and capable of rational decision making behavior."

Masses of people are so easy to mislead (advertising, politics, media) that there's certainly no guarantee they'll make rational decisions, even though that assumption is the basis of our economic system.

Representativeness Heuristic

People tend to judge the probability of an event by finding a ‘comparable known’ event and assuming that the probabilities will be similar.

As a part of creating meaning from what we experience, we need to classify things. If something does not fit exactly into a known category, we will approximate with the nearest class available.

Overall, the primary fallacy is in assuming that similarity in one aspect leads to similarity in other aspects.

The gambler’s fallacy, the belief in runs of good and bad luck, can be explained by the representativeness heuristic.
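A quick simulation makes the point (a minimal sketch, assuming a fair coin): even immediately after a run of five heads, the next flip still comes up heads about half the time.

```python
# Simulate many fair-coin flips and look at what happens right after
# every run of five heads in a row.
import random

random.seed(1)
flips = [random.random() < 0.5 for _ in range(500_000)]  # True = heads

# Collect the outcome immediately following each five-heads streak.
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if all(flips[i:i + 5])]

p_next = sum(after_streak) / len(after_streak)
print(f"P(heads right after five heads in a row) = {p_next:.3f}")
# The result stays close to 0.5: the coin has no memory.
```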

People will also ‘force’ statistical arrangements to represent their beliefs about them; for example, a set of random numbers will be carefully mixed up so that no similar numbers are near one another.


If I meet someone with a laid back attitude and long hair, I might assume they are Californian, whereas someone who is very polite but rigid may be assumed to be English.

People will often assume that a random-looking sequence in a lottery is more likely to be drawn than an arithmetic sequence of numbers, even though every specific sequence is equally likely.

If I meet three people from a company and they are all aggressive, I will assume that the company has an aggressive culture and that most other people from that firm will also be aggressive.

There are a lot more theories and terms and models, of course. See, for example, this list of psychological theories, explained in simple terms.

Obviously, the human mind isn't overly suited for making logical decisions, or for correctly estimating the probability of events. It might seem a bit surprising that we even manage to keep ourselves alive and accomplish complicated technological feats. At least it explains why we often make decisions that don't serve us, and why we easily elect the wrong people to lead us. Of course, it helps greatly if we can stay conscious of the various ways we are likely to fool ourselves, so that we can avoid them, as much as possible, when we're trying to make important decisions.
[ | 2004-05-09 10:22 | 15 comments | PermaLink ]  More >

 Goldilock Pricing
Via Seb Paquet, Goldilock Pricing by Narasimha Chari:
The traditional product segmentation is to offer two versions: a high-end version and a low-end version. However, in some circumstances, it is preferable to offer three versions: low-end, mid-range and high-end. The rationale is that people tend to exhibit 'extremeness aversion' and will tend to choose the mid-range offering. Consider the following experiment (from Hal Varian's paper on Versioning Information Goods):
Simonson and Tversky [1992] describe a marketing experiment in which two groups of consumers were asked to choose microwave ovens. One group was offered a choice between two ovens, an Emerson priced at $109.99 and a Panasonic priced at $179.99. The second group was offered these ovens plus a high-end Panasonic priced at $199.99.

By offering the high-end oven, Panasonic increased its market share from 43% to 73%. More remarkably, the sales of the mid-priced Panasonic oven increased from 43% to 60% apparently because it was now the 'compromise' choice. According to Smith and Nagle [1995], "Adding a premium product to the product line may not necessarily result in overwhelming sales of the premium product itself. It does, however, enhance buyers' perceptions of lower-priced products in the product line and influences low-end buyers to trade up to higher-priced models."
In other words, adding a 'premium' version to the product line actually boosts the sales of the mid-priced version. The newly-introduced premium version steals market share from the mid-range version, but this is more than offset by the market share that the mid-range version gains at the expense of the low-end version - this is the Goldilocks effect. Note that this is purely the result of a cognitive bias - there is no objective rationale for such trading-up.

This may explain the tall/grande/venti segmentation: even though few will order the venti, its mere presence on the menu will induce some buyers to trade up from the tall to a grande. Similarly, it makes sense to add expensive wines to the wine list that realistically no one is going to order.
Seems to be another example of a Support Theory style of human thinking fallacy. When a set of choices is presented in a certain way, we make different choices than if it were presented in a different way. The grande cup of coffee remains the same size, but we feel differently about it depending on whether it is the middle choice or the top choice.
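The microwave numbers quoted above can be checked with a little arithmetic. In this sketch the Emerson shares are inferred, assuming each group of shoppers splits entirely between the ovens it was offered:

```python
# Market shares from the Simonson and Tversky experiment quoted above.
# The Emerson figures are inferred by assuming shares sum to 100%.
two_ovens = {"Emerson $109.99": 57, "Panasonic $179.99": 43}
three_ovens = {"Emerson $109.99": 27, "Panasonic $179.99": 60,
               "Panasonic $199.99": 13}

panasonic_before = two_ovens["Panasonic $179.99"]
panasonic_after = sum(share for oven, share in three_ovens.items()
                      if oven.startswith("Panasonic"))

print(f"Panasonic total share: {panasonic_before}% -> {panasonic_after}%")
# prints "Panasonic total share: 43% -> 73%"
```

The premium model itself sells only 13%, yet its presence moves 30 points of the market from the low-end Emerson toward Panasonic.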
[ | 2004-05-09 13:23 | 19 comments | PermaLink ]  More >
