
5 Logical Fallacies from the Second Clinton-Trump Debate

A logical fallacy is a flaw in reasoning that leads to false assertions. Let’s look at 5 logical fallacies from the second presidential debate between Hillary Clinton and Donald Trump, held on Sunday, October 9, 2016.

Red Herring Fallacy

“Yes, I’m very embarrassed by it. I hate it. But it’s locker room talk, and it’s one of those things. I will knock the hell out of ISIS. We’re going to defeat ISIS. ISIS happened a number of years ago in a vacuum that was left because of bad judgment. And I will tell you, I will take care of ISIS.” Donald Trump on the 2005 video that was released prior to the debate.

A red herring fallacy misleads or distracts from the main issue. In the example given, Donald Trump uses a red herring, the fight against ISIS, to distract from the main point: the comments he is heard making on a recording from 2005. The example is also a form of appeal to ridicule, using ISIS to make his taped comments appear unimportant or ridiculous by comparison, as well as an appeal to emotion.

Appeal to Emotion (fear)

“So I believe that this election has become in part so — so conflict-oriented, so intense because there’s a lot at stake. This is not an ordinary time, and this is not an ordinary election. We are going to be choosing a president who will set policy for not just four or eight years, but because of some of the important decisions we have to make here at home and around the world, from the Supreme Court to energy and so much else, and so there is a lot at stake. It’s one of the most consequential elections that we’ve had.” Hillary Clinton.

An appeal to emotion is the manipulation of emotion in order to win an argument. In this example, Hillary Clinton creates a false sense of urgency and fear to support her point that this election is different. The examples she uses do not prove that point: the winner will be elected for four years; the appointment of a new Supreme Court justice is not exceptional, as Obama, Bush, and Bill Clinton each appointed two; and energy has been on the table for at least the past two decades. The example also contains a circular argument fallacy: the main point, that this election is important, is true by definition, since all presidential elections are important.

Ad Hominem

“It was locker room talk, as I told you. That was locker room talk. I’m not proud of it. I am a person who has great respect for people, for my family, for the people of this country. And certainly, I’m not proud of it. But that was something that happened. If you look at Bill Clinton, far worse. Mine are words, and his was action. His was what he’s done to women. There’s never been anybody in the history of politics in this nation that’s been so abusive to women. So you can say any way you want to say it, but Bill Clinton was abusive to women.” Donald Trump

An ad hominem attacks the opponent’s character, personality, race, or family rather than the content of the argument. In this example, Donald Trump attacks Bill Clinton, his opponent’s husband, instead of answering the argument levied against him about his recorded comments concerning women. Ad hominem fallacies are also a form of red herring. This example also contains a tu quoque (“you too”) fallacy: the claim that if Bill Clinton did something similar or worse, then what Trump did is “not so bad.”

Ignoratio Elenchi (Or Missing the Point)

“TRUMP: Oh, you didn’t delete them?
CLINTON: It was personal e-mails, not official.
TRUMP: Oh, 33,000? Yeah.
CLINTON: Not — well, we turned over 35,000, so…”

Ignoratio elenchi occurs when a person presents an argument that might or might not be true but does not address the point being made. In this example, Donald Trump questions Hillary Clinton about the 33,000 emails that were deleted from her private email server. She answers by saying that they turned over 35,000 emails. Independently of whether this is true, her answer does not address the 33,000 deleted emails and misses the point being made.

Fallacy of the Single Cause

“We have to get rid of the lines around the state, artificial lines, where we stop insurance companies from coming in and competing, because they want — and President Obama and whoever was working on it — they want to leave those lines, because that gives the insurance companies essentially monopolies. We want competition. You will have the finest health care plan there is. […] You’re going to have plans that are so good, because we’re going to have so much competition in the insurance industry.” Donald Trump on what is wrong with Obamacare and how his plan will be better.

Fallacy of the single cause, or causal oversimplification, assumes a single cause (or, in some cases, a few) for an outcome when the matter is much more complex. In this example, Donald Trump argues that removing the state lines that restrict insurance companies will create competition, and that competition will produce the finest health care plan there is. This is an oversimplification of a much more complex problem (different incomes, different pathologies, the price of health care, and so on). It is worth noting that during the whole exchange on the topic, this was the only argument made, which renders a possibly valid point an oversimplification. The example also includes overstatement, which is not a logical fallacy but a literary device often used in politics.

TIP Sheet
FALLACIES and PROPAGANDA

It is important to be able to evaluate what you read and hear. If you could not sort the credible from the incredible, the serious from the playful, the essential from the nonessential, the world would be full of conflicting and bewildering messages. Critical thinking enables you to distinguish fact from opinion and sound reasoning from faulty reasoning.

One kind of faulty reasoning is a fallacy, a breakdown of logic. A fallacious argument is one that tries to argue from A to B, but because it contains hidden assumptions or factual irrelevancies, reaches an invalid conclusion.

Another kind of faulty reasoning results from substituting emotion for thought. Propaganda is an indirect message appealing primarily to emotion. It is aimed at forming opinions rather than increasing knowledge. Propaganda intends to persuade without offering a logical reason to adopt a particular view or take a particular action. While the word itself carries rather a negative connotation (implying intent to mislead or deceive) the techniques can be used in good causes as well—a Cancer Society fundraiser, for example. At least some propaganda techniques are used occasionally by non-profit organizations, advertisers, churches, news organizations, governments, and instructors.

For good or ill, makers of propaganda typically select facts most appropriate to their purpose and omit facts that do not help them achieve that purpose. Because some propaganda uses facts (albeit selectively), it can look like a reasoned argument. But because it fails to make a logical case, propaganda is often fallacious as well as emotional.

Fallacies and propaganda devices are slippery by nature; they overlap, are often used in combination, and do not always fit neatly into one category or another. Following are examples.

Ad Hominem Fallacy

An ad hominem fallacy redirects the discussion of an issue to a discussion of one of the subjects: his or her personal failings, inconsistency, or bias. For example, in a discussion of the pros and cons of privatizing Social Security, it would be an ad hominem attack simply to declare your opponent a parasite feeding on the lifeblood of the working class. Similarly, you would be guilty of an ad hominem attack if you exclaimed, “How can you claim to be a born-again Christian?” (pointing out an inconsistency of position) or “Of course you would say that—you’re a Libertarian!” (pointing out personal bias). None of these responses addresses the actual pros and cons of the proposal to privatize Social Security.

A preemptive ad hominem attack, launched before the discussion fairly begins, is called “Poisoning the Well” (below).

Ad Nauseam Fallacy

Hitler’s Propaganda Minister, Joseph Goebbels, said that if you repeat something often enough, people will come to believe it. Ad nauseam repetition is the stuff urban legends are made of. More harmful than urban legends is the indoctrination (“brainwashing”) practiced by cults, which seeks to create belief by directing sheer, ad nauseam repetition relentlessly at an exhausted subject.

Fallacy Based on Appeal to Belief, Common Practice, or Expertise

An appeal to belief suggests that, since most reasonable people (of your sort) believe something, you should believe it, too. “Educators fear that vouchers will undermine funding for public schools.” Educators in particular might be inclined to conclude that this fear is universal among educators, and identify with the belief because they identify with the group.

Appeals to common practice suggest that most everyone does it, and that it is therefore okay: “Nobody in California comes to a full stop. That’s why they call it ‘the California Stop.’”

An appeal to expertise dares you to pit your own ignorance against experts making a value claim (a Grammy award-winning country singer publicly endorses a political candidate, for example). Specialists agreeing on objective claims about matters within their field of expertise may reasonably be believed, but specialists making value judgments outside their field might not. (For more, see the TIP Sheet “Objective and Subjective Claims.”)

Phrases like the following might signal an appeal to belief, expertise, or common practice:

“Ask anyone…”
“Everyone knows…”
“Any reasonable person would…”
“It’s obvious that…”

Fallacy Based on Appeal to Indignation or Anger

The problem with the appeal to indignation or anger is that anger is a poor substitute for reasoning; worse, anger clouds thinking. Certainly anger is sometimes justified, when it follows an argument that gives me grounds for anger, but anger is not the way to reach that conclusion. And anyway, sometimes we are angry for irrational reasons.

For example: “The governor is cutting education funds!” may cause a reaction of anger, followed by the conclusion, “The governor is a dolt and a fascist!” (see “Name-Calling,” below). Anger short-circuits the reasoning process by which we might have examined the figures and learned (much to our surprise) that education funding is up from last year, just not up as much as originally hoped.

Fallacy Based on Appeal to Pity

Who has not seen the heartbreaking pictures of starving orphans with the request for donations to ease their plight? A valid appeal to pity has a direct link between the object of pity (starving children) and the desired action (the money that can help feed them).

Appeal to pity becomes a fallacy when there is no logical link between the arousal of pity and the desired action: “Please give me a break, Professor. If I don’t pass this class I’ll lose my financial aid.” Whether this student is qualified to pass the class depends on the quality and quantity of work completed, not on financial need. There is no relevant connection between the student’s need on the one hand, and the quality of work she has (or has not) performed on the other.

Fallacy Based on Appeal to Popularity or Bandwagon

Appeal to popularity exploits the human desire to belong to a group. While there is nothing wrong with belonging to a group, some decisions are not group decisions, should be made without taking a head count, and should be held to even if they are unpopular. For example, my softball team’s consensus should probably not be the determining factor in my decision to take a cut in pay in return for more flexibility in my working hours. My softball team’s disapproval (they think I should go for the higher salary) should be irrelevant.

“Jumping on the bandwagon” is supporting a cause or position merely because it appears to be popular or winning. Politicians who waver from one position to another are sometimes trying to protect their jobs by appealing to the greatest number of voters based on changing poll information. Of course, it is reasonable sometimes to change one’s mind; it is just usually preferable to base the change on reason.

Fallacy Based on Appeal to Ridicule

Sarcasm is always hostile. Ridicule, sarcasm, and the “horse laugh” are the persuasion techniques of bullies. Appeal to ridicule tries to convince you to accept an argument in order to avoid becoming the butt of the joke. The laugh might be at the person (see “Ad Hominem,” above) or at his position (see “Straw Man,” below). Whether it is blatant or subtle, ridicule essentially denies discussion.

Apple Polishing Fallacy

Apple polishing is connecting compliments to unrelated issues. It urges someone to accept a claim in the afterglow of pleasure at the compliment. “You are by far the best math teacher I’ve ever had. You made me love math! I think I may change my major to math. I want to be just like you. Oh—I haven’t finished that chapter review; you don’t mind if I hand it in late?”

Fallacy Based on Begging the Question

Begging the question is a type of circular reasoning that presents as a true premise an assertion that actually requires its own proof. This leads to a “conclusion” that has already been pre-supposed or implied. For example, a student complains bitterly that he failed the English composition exit exam. “The test is bogus! I’ve taken it twice now, and they won’t pass me!” The student concludes that the test is bogus. His implied premise is that his essay is, in fact, good enough to pass. But whether his essay is good enough to pass is, itself, the question. Asserting that it is (or implying that it is) is not sufficient to prove that it is. (Application of the test rubric, by trained test scorers—twice—would appear to conclude that it is not.)

Distribution Fallacies: Composition and Division

Distribution fallacies arise two ways: In the composition fallacy, I know the characteristics of the whole, and (wrongly) attribute those characteristics to each of its parts. Stereotyping of individuals may result from a composition fallacy. Suppose I have read statistics that attribute very strong business ambition to a certain demographic group. I then attribute business ambition to each person of that group, without respect for individual variations (See also “Hasty Generalization,” below).

In its converse, the division fallacy, I know the characteristics of an individual or part, and (wrongly) attribute those characteristics to the whole. However, the parts are not necessarily representative of the whole. Broad stereotyping of a group may result from the division fallacy. Suppose I have become acquainted with a man from Taiwan who is an extremely talented electrical engineer. I mistakenly conclude that all Taiwanese, as a group, are amazing electrical engineers.

False Dilemma Fallacy

The false dilemma fallacy limits responses to those which serve the ends of its user (“Yes or no?”), implicitly denying the middle ground (“Maybe…”) or any qualified response (“Yes, but…”). In a discussion on illegal immigration from Mexico to the U.S., for example, there is a vast middle ground of discussion between the extremes of building a very high fence on the one hand, and allowing uncontrolled immigration with its associated high social costs on the other. To suggest that a discussion on immigration is simply a choice between the two extremes is a false dilemma.

Political surveys frequently pose questions positing false dilemmas, using loaded questions that are worded in such a way that the desired attitude is suggested in the question. The responder is thus “steered” toward the desired response.

Guilt by Association Fallacy

The guilt by association fallacy muddles the process of fairly evaluating an idea, person, or group by interjecting irrelevant and misleading negative material. Think of it as a kind of red herring (below). This fallacy tries to connect ideas or persons to something unsavory or untrustworthy in order to discredit them. Take, for example, this claim made by an individual running against the incumbent for the office of sheriff: “Our sheriff has breakfast every single week with the head of a white supremacist group.” The fact that this weekly breakfast is attended by thirty-five local businessmen, including the head of the white supremacist group, is selectively omitted here (see also “Stacking the Deck,” below). Be on guard when you see phrases like these:

“has ties to…”
“connected with”
“is the parent company/subsidiary of…”
“was a member of…”

Hasty Generalization Fallacy

A hasty generalization draws an unwarranted conclusion from insufficient evidence (see also “Distribution Fallacies,” above). It often happens when the sample is too small to support the conclusion. For example, data compiled from 1,000 households, all in Nebraska, would probably be insufficient to accurately predict the outcome of a national election. Hasty generalization occurs, as well, when the sample is too biased to support the conclusion: ten million households, selected from all fifty states but all of them Libertarian (or Democrat, or Republican), would be too biased to predict the result of a national election.

Another kind of hasty generalization occurs as the result of misleading vividness. For example, a man in Tonga who watched many American daytime dramas (“soap operas”) might conclude that people in the U.S. were more materially prosperous, more well-dressed, more vicious, more self-centered, and more immodest than most of them are.

Name-Calling

Name-calling is not so much a fallacy (see “Ad hominem,” above) as it is a propaganda technique, a crude attempt to make mud stick. It does not pretend to thoughtfully address issues or prove anything.

“The femi-Nazis want to eliminate gender.”
“We don’t need any more right-wing theocrats (or leftist, activist judges) on the bench.”
“This administration is a bunch of lying, neo-Fascist colonialists (or liberal, atheistic social engineers).”
“America is the Great Satan.”

Negative Proof Fallacy or Argument from Ignorance

The negative proof fallacy declares that, because no evidence has been produced, none exists. For example, a theist might declare, “You atheists cannot prove God does not exist, therefore God exists.” On the other hand, the atheist declares, “You theists cannot prove that God exists, therefore God does not exist.” First of all, the lack of proof one way or the other does nothing to prove or disprove God’s existence; the theist and the atheist may just as well make simple assertions: “God exists,” and “God does not exist.” Second, the general rule is that the person who asserts something has the burden of proof to provide evidence to support that something. It is not sufficient to make the assertion and then shift the burden of proof to the listener to prove you wrong.

Poisoning the Well Fallacy

Poisoning the well is an ad hominem attack (see “Ad hominem,” above) on a person’s integrity or intelligence that takes place before the merits of a case can be considered. It redirects a discussion to the faults of one of the parties. “Of course she will oppose the tort reform bill—she’s a lawyer, isn’t she?” The speaker suggests that something (greed? law school?) has corrupted the lawyer’s thinking to a degree that she cannot evaluate the tort reform bill on its own merits. Now the discussion is no longer about the merits of the tort reform bill at all, but about the personal qualities and motives of the other party.

Questionable Cause Fallacy

A questionable cause fallacy is the result of incorrectly identifying the causes of events either by oversimplifying or by mistaking a statistical correlation for a cause. Oversimplifying occurs when complex events with many contributing causes are attributed to a single cause: “School shootings happen because bullying makes students ‘snap.’” Since at least some bullied students do not shoot their classmates, this would appear to be oversimplified. Mistaking correlation for cause can happen when unrelated events occur together, either by chance or because they are actually both results of yet another cause. For example, does eating more ice cream in the summer cause an increase in crime rates? The rates for ice cream eating and violent crime both rise each summer. However, the statistical correlation notwithstanding, eating ice cream does not cause violent crime to increase. Rather, ice cream eating and crime may rise together as a result of a common cause—hot weather.

There may be an evolutionary advantage to discerning cause and effect; we’re hard-wired to look for causes, even if wrongheadedly. The tendency for humans to “see” a pattern where none exists is called the clustering illusion, demonstrated by the Texas sharpshooter fallacy: the Texas sharpshooter fires a hundred shots randomly into the side of a barn, then paints a target around the thickest cluster of holes. He has taken statistically non-significant, random data and attributed to it a common cause. The perennial attempts to find meaningful, hidden codes in the works of Shakespeare or the Bible (or the works of Leonardo da Vinci) illustrate the clustering illusion.

Fallacy Based on Scare Tactics or Appeal to Fear

Scare tactics appeal to emotion rather than evidence. For example, the following appeal to fear requires me to accept a certain standard of “businesslike” attitude and behavior on grounds of fear:

This company expects a high level of commitment; be here in the office finishing the proposal through the weekend. Don’t forget your employee review is Monday.

Whether we can agree on what an acceptably “high level of commitment” is will have to wait for another time, since my immediate problem is avoiding the implied threat of a poor employee review. A warning differs from scare tactics because a warning is relevant to the issue. For example, it would be foolish to ignore the following warning:

No late research papers will be accepted; you cannot pass the class without a passing grade on this paper.

Slippery Slope Fallacy

Slippery slope arguments conclude that, if an eventual, logical result of an action or position is bad, then the original action or position must be bad. It is characteristic of a slippery slope argument that the final badness arrives in increments; thus, even though the badness could have been foreseen and may even have been inevitable, most of us are taken by surprise. Slippery slope arguments are common in social change issues.

A slippery slope argument is not necessarily a fallacy. It becomes a fallacy when the chain of cause and effect breaks down. If I cannot show you the reason why a causes b, and b causes c, and c causes d, my argument is fallacious. Typically, the (fallacious) chain of “logical” intermediate events becomes improbably long and unlikely, exponentially increasing the unlikelihood of the final result.

Often a historical precedent is cited as evidence that the chain of increasingly bad effects will occur. For example, there exist several historical precedents for the confiscation of guns by oppressive governments: in both Marxist and Nazi states, for example, the government confiscated the guns of citizens based on the registration records of those guns. Gun aficionados in the U.S. argue that gun registration anywhere will inevitably lead to confiscation of guns by the government. However, as a simple matter of logic, a historical precedent alone does not make those results necessary (or even probable) in every case. Gun registration opponents could avoid the slippery slope fallacy if they could show the reason why registration leads inevitably to confiscation. Or, if gun registration opponents could demonstrate that the principle that favors registration of guns is the same principle that would allow confiscation of guns, the fallacy may be avoided.

Smokescreen or Red Herring Fallacy

The smokescreen fallacy responds to a challenge by bringing up another topic. Smokescreen or red herring fallacies mislead with irrelevant (though possibly related) facts:

“We know we need to make cuts in the state budget. But do you really think we should cut funds for elementary physical education programs?”

“Well, we have to cut something. This governor has turned a million-dollar surplus into a billion-dollar shortfall. And what’s up with that shady software deal he made?”

Why red herring? A herring (a salt-cured fish) has a very strong odor; drag one across the trail of an animal and tracking dogs will abandon the animal’s scent for the herring’s. (A satirical fictitious example is the “Chewbacca Defense,” first used on the television show South Park in 1998, in which a lawyer convinces a jury to convict (and later to acquit) by misleading them with irrelevant facts about Wookiees; for more information, see http://en.wikipedia.org/wiki/Chewbacca_Defense.)

Stacking the Deck Fallacy

Stacking the deck is a fallacy of omitting evidence that does not support your position. Like the card shark who “stacks” the cards before dealing to favor his own hand, a person who stacks the deck of an argument is cheating. All the cards favor him, because he has arranged them so. On the contrary, honest treatment of a discussable issue should evaluate evidence both for and against one’s own position. Stacking the deck can achieve short term wins against the unwary, but eventually a card stacker’s credibility will suffer.

For example, look again at the claim made above (see “Guilt by Association”) by the person running against an incumbent for the office of sheriff: “Our sheriff has breakfast every single week with the head of a white supremacist group.” This assertion might win the challenger some votes; note that the incumbent cannot truthfully deny it and independent investigation would confirm it. However, the challenger has selectively omitted the fact that this weekly breakfast is attended by thirty-five local businessmen, including the head of the white supremacist group. Should someone effectively expose the cheat before the election, the challenger will appear in an unfavorable light.

Straw Man Fallacy

If I take a complex position and oversimplify it in order to more easily overcome it, I am guilty of the straw man fallacy. In a sense, a straw man fallacy is a type of red herring (above), in that it misleads by misrepresenting the opponent’s position. Complex ideas about family planning are misrepresented below:

“The nuns taught us family planning was a sin. We would have to keep having babies because babies were a blessing, no matter how many of them already were in the house.”

The speaker may go on to demonstrate how unnatural and medieval such ideas are, defeating the straw man he himself has created. Instead of defeating his opponent’s best argument, he takes a weak argument, falsifies it, and then “overcomes” it. Political parties often demonize each other, painting extreme caricatures of each other’s actual, more thoughtful, positions in order to argue more easily (in marketable sound-bites) against them.

The straw man fallacy works best if the characterization closely resembles the real one; that is, if I choose to misrepresent your position, I should not be too obvious about it.

Subjectivist or Relativist Fallacy

The subjectivist or relativist fallacy turns all disagreements into matters of opinion by claiming that what is true and what is false are different for different people. In matters of fact, such a claim cannot be made without making nonsense. In matters of opinion, however, there exists no single standard to prove one or another claim conclusively true or false. The rightness or wrongness of either opinion might very well depend on where you stand. But even for matters of opinion, if there is no “truth,” then any opinion seems as good as any other, and all opinions seem essentially groundless and equally meaningless. On the contrary, it is a virtue of a “good” opinion that it has a basis in evidence and logic. For more, see the TIP Sheets “Objective and Subjective Claims” and “Deductive, Inductive, and Abductive Reasoning.”

A subjectivist fallacy on a factual matter puts an end to rational discussion:

A: “The earth is roughly spherical, and goes once around the sun in about 365 days.”
B: “That may be true for you, but it’s not true for me. And I have a right to my opinion.”

But a subjectivist claim even on a subjective matter (“It may be wrong for you, but it’s right for me!”) makes further discussion difficult by eliminating the common ground for discussion.

Fallacy Based on Transfer

Transfer is an emotional appeal that associates something or someone known (and therefore implied “good”) with a product, idea, or candidate. Celebrity endorsements are an example. Well-known objects and images can be used the same way, evoking an emotional response that sponsors hope will transfer to their cause. For example, images of Arlington National Cemetery might be used to help raise money for veterans’ groups. The image of Mount Everest is often used to help sell outdoor products (though relatively few of the products’ users will ever actually attempt to climb Everest).

Wishful Thinking Fallacy

Wishful thinking is choosing to believe something is true because we really, really want it to be true. We usually perpetrate wishful thinking on ourselves, but it also underlies many “positive thinking” philosophies: to believe in your dream is to make it come true. Though wishful (positive) thinking followed by action may help me achieve my goals, wishful thinking alone achieves nothing. Wishful thinking promotes comfort, but it does not promote reasoning.

© 2005 by Rosemary McKeever
