A hasty generalization fallacy, also called secundum quid, jumping to conclusions, or anecdotal evidence, is a logical error in which you reach a conclusion that is not supported by logic or sufficient evidence. The key here is the generalization. It is similar to a stereotype, where a small sample size leads to an incorrect deduction.
For example: the DMV (Department of Motor Vehicles) employee is moving so slowly, and I've been waiting forever. The hasty generalization is:
All government workers are lazy.
Slothful Induction Fallacy
By contrast, slothful induction dismisses a logical conclusion despite significant evidence or logic.
It’s just a coincidence.
A smoker may dismiss the risk of cancer or death from smoking.
Everything gives you cancer nowadays.
This is an illogical conclusion, given that smoking causes 80% to 90% of all lung cancers, according to the CDC. Smokers die 10 years earlier on average, which is why life insurance premiums are much higher if you are a smoker.
Overwhelming Exception Fallacy
The overwhelming exception fallacy is a generalization that is mostly accurate until a closer examination of the data or scenarios eliminates a significant portion of the result set. For example:
The world is peaceful. There hasn't been a world war in 75 years. Only a handful of localized conflicts: the Vietnam War, the Korean War, the Iran-Iraq War, the war in Afghanistan, civil wars in Yemen, Syria, and Libya, the Iraq War, the Boko Haram insurgency. But other than that, we are at peace.
As of 2020, there are approximately 45 armed conflicts in progress.
Fallacy of Unrepresentative Samples
The unrepresentative samples fallacy refers to drawing a conclusion from samples that are biased or unrepresentative of the true data set.
Take the racial unrest in the United States as an example. According to a September 2020 national poll released by the UMass Lowell Center for Public Opinion, 44% of people believe police shootings raise important issues about race that should be discussed. However, 49% of white respondents said the shootings are receiving undue attention, while only 14% of black respondents said so. It matters who is included in the survey.
A more preposterous example: you meet three Chinese people at a conference and discover they are all scared of dogs. Wow, most Chinese people must fear dogs. There are approximately 1.4 billion people in China; three conference attendees tell you almost nothing about them.
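The problem with the conference sample can be sketched in a few lines of code. The rates below are hypothetical illustrations (the article gives no real figures): the point is that a sample drawn from a skewed source stays wrong no matter how large it gets.

```python
import random

random.seed(0)

# Hypothetical rates for illustration only: suppose 5% of the overall
# population fears dogs, but attendees of this particular conference
# happen to come from a subgroup where 60% do.
POPULATION_RATE = 0.05
CONFERENCE_RATE = 0.60

def estimate(rate: float, n: int) -> float:
    """Estimate a proportion from a random sample of size n."""
    return sum(random.random() < rate for _ in range(n)) / n

# A large random sample of the whole population lands near the truth.
print(f"Population estimate (n=10,000): {estimate(POPULATION_RATE, 10_000):.1%}")

# A sample drawn only from conference attendees is biased; collecting
# more of it does not fix the skewed source.
print(f"Conference estimate (n=10,000): {estimate(CONFERENCE_RATE, 10_000):.1%}")
```

More data from a biased source only makes you more confident in the wrong answer; representativeness, not size, is what was missing.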
Misleading Vividness Fallacy
The misleading vividness fallacy is a hasty generalization in which a small number of vivid, often personal, experiences drive someone to a conclusion rather than a consideration of the statistics. Joe tells the story of the rude customer service person at the cable company. Jill chimes in:
Me too! Last week I called, and they were completely useless. The conclusion is that the customer service is awful.
Typically, a call center agent handles 50 calls a day, and some companies employ 1,000 or more agents, equating to 50,000 customer service calls a day. Two bad calls are not statistically significant, yet they are very impactful. People tell 9 to 15 others about a bad customer experience, and 86% say a bad customer experience is why they quit doing business with a company. Illogical or not, your customer service needs to measure up.
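The arithmetic behind that claim is easy to check with the figures quoted above:

```python
# Figures from the text above.
calls_per_agent_per_day = 50
agents = 1_000
bad_calls = 2

daily_calls = calls_per_agent_per_day * agents
bad_share = bad_calls / daily_calls

print(f"Daily calls handled: {daily_calls:,}")    # 50,000
print(f"Share that were bad: {bad_share:.3%}")    # 0.004%
```

Two vivid anecdotes represent four-thousandths of one percent of a single day's call volume, yet they can dominate the conversation.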
Statistical Special Pleading
The statistical special pleading fallacy attempts to reclassify or reinterpret results or data to suit the desired conclusion. The common term is cherry-picking. Cherry-picking, a fallacy in its own right, is slightly different (see the next section).
An excellent example is the arguments made about the mortality rate of Covid-19.
It’s no more deadly than the flu.
True, if you have no underlying conditions, you have a 0.9% chance of death from Covid-19, while the mortality rate for the flu is about 1% across everyone infected. But for Covid-19, the mortality rate is between 5.5% and 10.5% if you have underlying conditions such as cancer, asthma, diabetes, or cardiovascular disease, and 60% of Americans have an underlying condition. Covid-19 is also ten times more infectious.
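A quick back-of-the-envelope calculation shows why restricting attention to the healthiest subgroup misleads. This is only a sketch using the figures quoted above, and it assumes the midpoint of the quoted 5.5%-10.5% range applies uniformly to the 60% with underlying conditions:

```python
# Figures quoted in the text; the blended rate is an illustration only.
rate_no_conditions = 0.009                    # 0.9% mortality, no conditions
rate_with_conditions = (0.055 + 0.105) / 2    # midpoint of 5.5%-10.5%
share_with_conditions = 0.60                  # 60% of Americans

overall = (rate_no_conditions * (1 - share_with_conditions)
           + rate_with_conditions * share_with_conditions)
print(f"Population-wide mortality under these assumptions: {overall:.1%}")
```

Under these assumptions the blended rate lands around 5%, several times the roughly 1% flu figure; quoting only the no-conditions subgroup is the special pleading.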
Cherry Picking Fallacy
The cherry-picking fallacy is often used interchangeably with statistical special pleading. The subtle difference is that cherry-picking suppresses or omits data and evidence. The metaphor: a person picking cherries selects only the best cherries and leaves out the bad, or even the ordinary, ones.
Accident Fallacy
The accident fallacy refers to applying a generalization to a group despite obvious exceptions. See also Fallacy of Accident in Syllogisms. One of the best-cited examples is from the United States Declaration of Independence:
…all men are created equal.
Yet people are obviously born into circumstances (rich, poor, healthy, diseased) that immediately render them unequal, at least when it comes to opportunity.
Converse Accident Fallacy
The converse accident fallacy is the opposite of the accident fallacy: an exception is applied to the general grouping. See also Fallacy of Converse Accident in Syllogisms.
A real-world example: people with certain health conditions do not need to wear a mask in public during the pandemic; therefore, everyone should be able to go out in public without a mask. A more theoretical case: cancer patients are allowed marijuana; therefore, everyone should be allowed marijuana. Taking this one step further, there is a debate about legalizing heroin for terminal cancer patients. Using converse accident, we can conclude that everyone should then be allowed to use heroin.
Package-Deal Fallacy
The package-deal fallacy refers to grouping items together by tradition when, in fact, some of the items are exceptions.
Dave likes surprises. Let’s put a dead skunk in his bed.
The package-deal fallacy is best highlighted by the tribalism of American politics. If you are a Republican, you favor less government control, so you must favor a woman's right to choose. As a Democrat, you want the government to impose restrictions on guns and pollution, so you must favor making abortion illegal. Both statements are equally false.
Availability Bias
Availability bias is more a process than a fallacy and is defined as the human tendency to treat the thoughts that come to mind most easily as more representative than they really are. It allows for quick, and sometimes poor, conclusions.
People also remember vivid events more readily. Someone who lost a loved one in a plane crash may be scared to get on a plane. Statistically, the lifetime odds of dying in a car crash are 1 in 114. The odds of dying in a plane crash are 1 in 9,821, or about once in every 16 million flights.
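Using the odds quoted above, the gap between the perceived and actual risk is easy to quantify:

```python
# Lifetime odds quoted in the text above (approximate figures).
car_crash_odds = 1 / 114
plane_crash_odds = 1 / 9_821

ratio = car_crash_odds / plane_crash_odds
print(f"Dying in a car crash is about {ratio:.0f}x more likely "
      f"than dying in a plane crash.")   # about 86x
```

The drive to the airport is far riskier than the flight, yet availability bias makes the flight feel more dangerous.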
Survivorship Bias
Survivorship bias is the error of believing that the people or things that survived some process or event are more representative of reality than they are.
Take a soldier returning from a third tour in Afghanistan, where they saw heavy fighting. This person may disregard the dangers of a peaceful society, such as strolling through a rough neighborhood or excessive speeding. These dangers hardly compare to fighting a war, yet they are just as deadly.
Applied to business, this is often cited as part of the reason for the market crash of 2008. In a scene made famous by the movie The Big Short, the CEO of Bear Stearns argues on stage that everything is fine, opposite trader Mark Baum, who predicted the crash. There was no way a company as prestigious and "smart" as Bear Stearns could be in significant trouble. It was sold in a fire sale for $2 a share, a third of the price of its shares when the company went public in 1985.
Other common logical fallacies: