Biased Biases

During my introduction to psychology class at Ursinus College (because, well, I had to take something that semester), the professor covered a unit on cognitive biases. These are typically neat, and I spend a non-negligible portion of every week reading blogs on becoming less wrong [1] and overcoming biases [2].

One of the biases, however, really bugged me. It's called the 'representativeness bias.' Here's an example from an overview at About.com:

For an illustration of judgment by representativeness, consider an individual who has been described by a former neighbor as follows: "Steve is very shy and withdrawn, invariably helpful, but with little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail." How do people assess the probability that Steve is engaged in a particular occupation from a list of possibilities (for example, farmer, salesman, airline pilot, librarian, or physician)? ... In the representativeness heuristic, the probability that Steve is a librarian, for example, is assessed by the degree to which he is representative of, or similar to, the stereotype of a librarian.

- Amos Tversky & Daniel Kahneman, Judgment Under Uncertainty: Heuristics and Biases

But where's the mistake here? Well, it amounts to the opposite of what is known as the base rate fallacy. Basically, Tversky and Kahneman are answering an unconditional question, while the question posed seems to indicate we should answer using all of the evidence provided. For those versed in STAT100-level probability, that means the question asks for \(P(L | E)\), where \(L\) is the event that Steve is a librarian and \(E\) is the event that the various characteristics (Steve is shy, etc.) are true. This is precisely what a person should do: incorporate the evidence they observe when coming to a conclusion. But Tversky and Kahneman want the person to answer with \(P(L)\), the base-rate (unconditional) probability that Steve is a librarian.
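
To make that concrete, here is a minimal sketch of what "using the evidence" looks like under Bayes's Theorem. Every number below (the base rate of each occupation and how likely the shy, detail-oriented description is for each) is made up purely for illustration:

    # A minimal sketch of computing P(L | E) with Bayes's Theorem.
    # All numbers are hypothetical, chosen only to illustrate the mechanics.

    # Assumed base rates P(occupation); these are not real statistics.
    prior = {
        "farmer": 0.40,
        "salesman": 0.30,
        "airline pilot": 0.05,
        "librarian": 0.05,
        "physician": 0.20,
    }

    # Assumed P(E | occupation): the chance that someone in each occupation
    # matches the shy, tidy, detail-oriented description E.
    likelihood = {
        "farmer": 0.10,
        "salesman": 0.02,
        "airline pilot": 0.05,
        "librarian": 0.40,
        "physician": 0.10,
    }

    # Theorem of Total Probability: P(E) = sum over o of P(E | o) * P(o).
    p_evidence = sum(likelihood[o] * prior[o] for o in prior)

    # Bayes's Theorem: P(o | E) = P(E | o) * P(o) / P(E).
    posterior = {o: likelihood[o] * prior[o] / p_evidence for o in prior}

    for occupation, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"P({occupation} | E) = {p:.3f}")

With these made-up numbers, the description lifts the probability that Steve is a librarian from a 5% base rate to roughly 23%, which is exactly the sense in which the evidence ought to be used.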

I don't think using evidence should count as a bias.


  1. I should mention here that I really don't like Eliezer Yudkowsky's rampant pro-Bayesianism. I read this when I was an impressionable youth. Fortunately, a good course in probability theory disabused me of the misconception that there's anything magical about Bayes's Theorem: it's just an application of the definition of conditional probability and the Theorem of Total Probability (see the short derivation after these notes). Bayesianism, on the other hand, is a topic for a different day.

  2. Fun bonus fact: if you search for Overcoming Bias on Google, you'll find that Google has mistakenly chosen a picture of Robin Hanson's son to represent him, here. At least, they did until I hit the 'Wrong' button on their site. Time will tell if they update their web crawler.
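
For completeness, here is the derivation mentioned in note 1, in generic notation. The definition of conditional probability (for events of nonzero probability) gives

\[ P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B \mid A) = \frac{P(A \cap B)}{P(A)}. \]

Equating the two expressions for \(P(A \cap B)\) and dividing by \(P(B)\) yields Bayes's Theorem,

\[ P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}, \]

and the Theorem of Total Probability simply expands the denominator over a partition \(A_1, \dots, A_n\) of the sample space:

\[ P(B) = \sum_{i=1}^{n} P(B \mid A_i)\,P(A_i). \]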