Consider the following:
Lisa is 30 years old, single, bright, and outspoken. She majored in philosophy in college, where she became very concerned with issues of discrimination and social justice. How likely would you judge each of the following possibilities to be?
- Lisa is a bank teller.
- Lisa sells insurance.
- Lisa is a bank teller and active in the feminist movement.
If you have the intuitions of most people surveyed in psychology experiments, you would report a low likelihood for the first option and a higher likelihood for the third option. It intuitively fits, right? In actuality, the third option cannot be more likely than the first: the probability of A and B both happening can never exceed the probability of A alone (see here for an explanation), because every case where A and B both occur is also a case where A occurs. So why do we intuitively think otherwise?
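If you like, you can check the inequality with a quick simulation. The sketch below is illustrative only: the 5% and 60% figures are made-up assumptions, not survey data. Whatever numbers you plug in, the conjunction can never come out more frequent than the single event, because a "teller and feminist" case only gets counted when the "teller" case does too.

```python
import random

random.seed(0)

# Hypothetical base rates for illustration (not real data):
P_TELLER = 0.05    # chance Lisa-like person is a bank teller
P_FEMINIST = 0.60  # chance she is active in the feminist movement

trials = 100_000
teller_count = 0
conjunction_count = 0

for _ in range(trials):
    is_teller = random.random() < P_TELLER
    is_feminist = random.random() < P_FEMINIST
    if is_teller:
        teller_count += 1
        if is_feminist:
            # Counted only when is_teller is also true, so this
            # tally can never exceed teller_count.
            conjunction_count += 1

p_a = teller_count / trials        # P(bank teller)
p_ab = conjunction_count / trials  # P(bank teller AND feminist)
assert p_ab <= p_a
print(f"P(teller) = {p_a:.3f}, P(teller and feminist) = {p_ab:.3f}")
```

Swap in any probabilities you like; the assertion holds by construction, which is exactly the point the Lisa problem trips us up on.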
In the 1960s and 1970s, famed psychologists Daniel Kahneman and Amos Tversky found that people aren't usually as rational as we think. We rely on mental shortcuts called heuristics that save time and energy and are often useful, but are prone to systematic error. It is a tradeoff between speed and accuracy, and it makes us bad intuitive statisticians in predictable ways.
In this case, the mental juicer in our analogy delivers a cup of intuition about what is likely. The process that makes the juice, though, is not proper statistical calculation. It is a shortcut called "representativeness": we judge each option by how representative it is of, or similar to, an imagined prototype. Here, the image of Lisa as a bank teller and feminist is more representative of how we imagine someone like her, so we think it's more likely--even though it is not. Many judgments of probability can be shown to be wrong for this reason.
This heuristic leads people to expect that causes will resemble (be representative of) their effects; that outcomes will resemble the process that created them; that big effects will have big causes; and so on. While this is true often enough to be useful, it is often not true: tiny viruses can cause massive epidemics, and a single cigarette can start a raging forest fire (small cause, big effect). Likewise, people incorrectly expect sons to match their fathers' height more often than the laws of chance would predict (the outcome does not resemble what created it).
The representativeness heuristic has been argued to underlie magical thinking: in superstition and in homeopathy, people believe that "like cures like"--for example, tree bark that causes malaria-like symptoms supposedly cures malaria (causes should resemble their effects). Intuitively, many people think something as complex and intelligent as humans must have been caused by something complex and intelligent, rather than by a systematic but "blind" natural process (outcomes should resemble the process that created them). And many people think that disasters like earthquakes must have deeper spiritual causes, rather than mere shifts of tectonic plates (big effects should have big causes).
But all of these intuitions rely on a mental shortcut known to produce systematic errors. As such, they cannot be taken at face value; it takes scientific investigation and critical analysis to determine probabilities and causes. (That said, if any of these intuitions still have some pull on you, hold on until we get to later installments--I believe representativeness is one contributor to these intuitions, but there are others as well.)