Fallacies are so pervasive in human reasoning that this fact can make us pessimistic about our rationality: there are so many logical errors we routinely make. However, the very fact that we can recognize them should lead us back to optimism, since recognition means we can get better at avoiding them. With practice, we can certainly improve at dealing with these mistakes when they inevitably appear, whether they arise intentionally or not. Critical thinking is hard work, but it is a skill we can all improve!
What Are Fallacies?

In philosophy, fallacies are reasoning mistakes so pervasive that they have earned their own names. Essentially, fallacies render arguments erroneous. They can be intentional or not; we are not always aware when we are committing them or falling prey to them. And while we may never be able to avoid them entirely, from either side of an argument, we can practice being better on guard against them.
The list is long; at least one hundred are quite common. Here are 17 that everyone should know.
1. The Appeal to Ignorance

Perhaps one of the most pervasive fallacies in thinking is the appeal to ignorance. It is often used for claims that have not been, or cannot be, proven. The fallacy can run in one of two directions, but it begins with the premise that something has either 1) not been proven or 2) not been disproven, and then concludes that either 1) it must therefore be false or 2) it must therefore be true. In either direction, the conclusion does not follow from the premise.
Here is a timeless example: Nobody has proven God’s existence. Therefore, God must not exist. Or: Nobody has disproven God’s existence. Therefore, God must exist.
In this type of argument, no positive evidence is presented, which is precisely what is needed to support a conclusion. We only have the vague category of “nobody,” which lends no specific credibility, reliability, or authority. Moreover, the argument offers only a negative premise: a lack of evidence. All it tells us is that some unspecified group of people has not succeeded in proving God’s existence. Perhaps that can make us a bit skeptical, but, most importantly, it is not sufficient to support the conclusion.
Attempts to prove or disprove something usually need to be done systematically, critically, and by qualified experts, though not always. Something that requires mere observation would not need specially qualified experts. If we argue that Mark has never received a speeding ticket, so he probably doesn’t speed, this requires nothing more than observation or discussion. But there are cases where we must defer to qualified experts.
2. The Appeal to Emotions

The commonplace saying that “our emotions can sometimes get the best of us” summarizes this error. “Emotions” here covers many kinds, but those stirred by fear or pity are particularly potent. Appeals to emotion thus have several sub-types, distinguished not just by the emotion involved but also by how it functions.
For instance, does the emotion of fear invigorate a prejudice, provoke an insecurity, or make one want to jump on a bandwagon? The goal, whether consciously promoted or not, is to try to sway someone in an argument’s direction by manipulating one’s emotions.
Now, it must be noted that an appeal to emotion is not fallacious in and of itself; it is fallacious when used deceptively, for example, by hiding a flaw in an argument, clouding one’s judgment, or steering one away from the real underlying issue. Good judgment is therefore needed to consider the elements of the argument and its context to decide whether a fallacy is being committed.
A common form of this fallacy is an appeal to fear. This may come in the form of a threat or the presentation of some presumed negative future scenario that one would want to prevent. In this last instance, this can also relate to another type of fallacy known as the slippery slope, in which one may provoke fear that if we take a certain action, it will result in another presumed inevitable result that we want to avoid.
3. The Appeal to Authority

As noted with the appeal to emotion, an appeal to authority is not fallacious in and of itself; in fact, a proper appeal to authority is one of the better tools we have for good, critical thinking and logical reasoning — even taking into account that experts can at times be wrong.
Thus, the concern here is to look for whether the authority used in an argument to try to persuade one on an issue is good and proper. We must inquire into several key characteristics to avoid this fallacy, such as the following. Is the person an expert? Is this person in the right field? How trustworthy and honest have they been; what do the records show? How consistent have their arguments been? What do other related experts say on the same topic; is there agreement? Each of these inquiries (and this list is not exhaustive) can be expanded upon with more detailed questions, but to put it concisely, we must ask: “says who?”
There is sometimes a connection with the appeal to emotions here, as when an appeal to authority stirs insecurity about one’s own lack of knowledge or inflates confidence in someone else. This happens, for instance, when celebrities are used to promote products in which they have no proper expertise, that is, no relevant credentials or experience.
Sometimes, people may even use the term “experts” vaguely and make empty statements like “experts say that…” Other times, the wrong entity is used to attempt to back a position, such as “most dentists recommend using Tylenol for ankle injuries.” An important takeaway is that some of the best information we can have, as critical thinkers and logical reasoners, is knowing what information is good and what sources are good.
4. The Appeal to the People

As with the others discussed here, this can take various forms. Generally, part of human nature seems to lean toward conformity with others, especially with the majority.
As with the other fallacies, this can cloud our judgment and impair critical thinking, and the resulting pressure can produce discrimination and unwarranted biases. That said, we should not automatically assume the majority is always wrong; we need to consider the merits of the argument presented. The way to avoid this fallacy is straightforward: proportion your conclusion to the evidence.
One root cause of this fallacy is tribalism, which itself comes in many forms: sports fandom and political allegiances, to name two. These tribal loyalties can become so strong that we become blind to the errors of our own groups. In sports, we may see many more infractions by the “other team.” In politics, we may not fully understand the positions our own party subscribes to, for example, missing the science-backed evidence for climate change because we assume we are “on the right side.”
This connects to another fallacy, the is/ought fallacy: just because the majority is a certain way does not automatically mean it should be that way. For example, even if human history shows that part of human nature leans toward violence, it does not follow that we should be violent; an is does not by itself entail an ought. (A related but separate error, the naturalistic fallacy, mistakenly assumes that something “natural” equates to “good.” Both are discussed below.)
5. The Appeal to Traditions

In a similar vein to appealing to the majority, people sometimes treat traditions as authoritative when they may not be. People frequently draw on past “traditions” as evidence, arguing, for instance, “When I was a child, we did not have seat belts, and I survived,” as if that were sufficient evidence, which it is not. Among the main errors here: the sample size is tiny (one person), and there is overwhelming evidence that seat belts save lives.
So, what is going on here, essentially, is a faulty appeal to practices of the past as if their duration as “traditional” were enough, in and of itself, to justify their continued use when more current evidence may point to the contrary.
A tradition need not be from the past, however. It can also take the form of an expected future tradition that one “should get on board with.” When no justified reasoning is presented, a fallacy is committed: the only reason offered is that it is the “expected future tradition,” assumed to be right. This can relate to fallacious appeals to emotion, as when pressure is applied to persuade people to “get on the right side of history.” Again, when no good reasoning is presented, we must demand good evidence before deciding, which may or may not show that tradition to be good. So the main task here is to investigate whether any such abuses are being committed.
6. The Burden of Proof

While this can be a tricky fallacy to spot, the idea is to ask whether one side carries a burden of proof, that is, whether that side needs to support its claim directly with evidence. If that evidence is missing, a fallacy has occurred. The fallacy takes two forms: either one side does not recognize that the other side carries the burden of proof, or one side does not correctly recognize where the burden of proof lies.
As an example of the first case, imagine the following dialogue: Person A argues that the new company policies are terrible without explaining why, and Person B responds that the company is trying to balance different needs as best it can. Person B needs to challenge the missing why from Person A.
As an example of the second case, imagine the following dialogue. Person A argues that we should not believe in ghosts because there is no evidence for them. Person B responds that Person A cannot draw this conclusion because ghosts have never been disproven. Here, Person B misplaces the burden of proof, which lies with those who claim ghosts exist.
7. The Is/Ought Fallacy

This is a very common mistake: arguing that because something is a certain way, it therefore should be that way. The fallacy occurs when we make this leap without presenting reasons why it should be that way, relying only on the fact that it is.
Just because children can be selfish or competitive does not mean they should be. Just because history seems to demonstrate there is a violent streak throughout humankind does not mean that we should be violent. These are obvious cases. Others may be more subtle, depending on the details.
For instance, just because technology is capable of creating something, it does not directly follow that it should. Some cases are easy to miss, and others may seem intuitive, but the main point to remember is not to build an argument solely on a leap from an is to an ought. Keep in mind that the leap can also run in the other direction, arguing an is from an ought; the error is the same, just in reverse.
8. The Naturalistic Fallacy

This fallacy commonly involves making an ethical judgment about something based on its supposed naturalness: something considered natural is judged good, and something considered unnatural is deemed bad. There is some similarity with the previous fallacy, but the focus here is on “natural” standing in for “good.” Just because something is unnatural, it does not automatically follow that it is bad. This continues to be a frequent argument against homosexuality: that it is not “normal” and therefore is wrong.
We have to ask ourselves what we mean by “natural” when considering such arguments. Do we mean “natural” in the sense of “common” or “average” (“normal”), or in the sense of “occurs in nature and does not break natural laws”? Something can be natural in the second sense yet uncommon in the first, in which case calling it “unnatural” really just means it is not common, and being uncommon alone does not make something bad.
Take again the example of homosexuality as a case in point. Homosexual couples may be a minority; it is not the average, but that is just a question of statistics and not ethics (lots of things are “abnormal” in terms of statistics, and we do not equate that to being bad, like being a billionaire). Homosexuality does occur in nature, so we cannot say it is unnatural in the second sense. So, both cases represent examples of the naturalistic fallacy.
9. Oversimplification

This fallacy is just what it sounds like: simplifying something to the point of reaching an unwarranted conclusion. We fail to factor in complexities and argue that something is simpler than it really is, when complicated scenarios require complicated explanations. We do this all the time, perhaps most often because we like to feel we understand what is, in fact, a very complicated topic.
Many of these fallacies have similarities and overlaps. For example, when one argues that poverty is the result of laziness, this is an oversimplification (poverty is a very complicated situation that is caused by many different relevant factors). And, if one then additionally tries to argue that welfare systems are a waste because these lazy people just take advantage of them, then this is a kind of oversimplification that can lead to a hasty generalization, which is also a main cause of discrimination.
Oversimplifications can be made about our motivations and our systems, and they can be made purposefully, or not. Politicians, for instance, sometimes deliberately oversimplify things to divide people further and get people on their side feeling more strongly in agreement as a result. In this case, there may also be some overlap with another kind of fallacy, called the “false dilemma,” where someone presents a situation as if there were only two options when there are really more (also oversimplifying the situation, generally speaking).
To avoid this, we need to recognize when a situation is complicated and consider all the elements involved. Failing to appreciate the complexity of cause and effect is a common instance of this fallacy: a set of factors may jointly cause something, yet one erroneously presents the case as having a single cause.
10. The Red Herring Fallacy

This fallacy is one that works through distraction. Imagine being in an argument when suddenly a stinky fish is thrown in, which has nothing to do with the argument at hand—that is a red herring fallacy: distracting with something irrelevant.
Herring is a particularly strong-smelling fish, and the name reportedly comes from the practice of dragging one across a trail as a ruse to throw pursuers off the scent. There are several reasons we may commit this fallacy, consciously or not. We may see the weakness in our argument and want to divert attention away from it. Or we may not realize the point is irrelevant and genuinely think it belongs in our argument.
It is common to hear, in arguments against gun control, that guns do not kill people; people do. Thus, it is not the guns that are the problem; people are the problem. This example demonstrates how they can be tricky to spot because it may actually lead to a warranted claim, but that claim is not supported by the reasons given, or it does not address the issue at hand.
So, in this example, the issue under discussion is gun control: the regulation and safety of guns, not the behavior of people. By shifting attention to people’s psychology, the argument turns away from the real issue at hand. This fallacy, like many others, often occurs in political debates.
11. Straw Man Fallacy

If you imagine what a straw man is like, you will note that it is flimsy, something even a quick gust of wind can knock over. That is what happens with this type of fallacious argument: the arguer distorts the opposing argument into a misrepresentation that is easy to knock down. This can occur in subtly different ways, but the safeguard is to confirm that the argument is represented as it originally stands.
For instance, imagine a group of workers arguing for better working conditions, and the administration’s response is a list of reasons against that demand, but it turns out that those reasons they offer against it really have nothing to do with what the workers are asking for. Imagine they simply want better ventilation, but the administration lists all sorts of reasons why that would be expensive, ignoring the very relevant and cheap option of, say, opening more windows or purchasing more fans, not just a costly AC system.
This also happens often in politics. It can be a very persuasive but misleading way to distort or oversimplify an argument or be a distraction.
12. The Slippery Slope Fallacy

You have surely heard someone argue that if we do X, it will inevitably lead to Y, and since Y is an undesirable outcome, we should not do X. However, if there is any chance that Y is not an inevitable outcome of X, yet we present it as such, then we are committing the slippery slope fallacy.
Again, this, as with many others, frequently occurs in politics. To return to the debate over gun control, imagine someone arguing that if we have too much gun control, then we will have a police state. Conversely, if we have too little gun control, many people will have rocket launchers in their backyard. Both opposing positions commit the slippery slope fallacy because both present an X leading to a Y when that might not be inevitable.
This fallacy is also a variation of an oversimplification, an oversimplification about a causal chain of events, like dominos. This can have very dire consequences; a case in point in history was the fear of a communist takeover in the world in the twentieth century, leading to multiple violent conflicts and wars.
13. The Weak Analogy Fallacy

We know apples and oranges are very different, and this fallacy essentially consists of presenting them as if they were similar: falsely comparing apples to oranges. An analogy compares two things, so if the comparison is weak, it risks committing this fallacy. When considering an analogy, we need to ask whether the comparison is strong, which we can gauge, most essentially, by checking that there are enough relevant similarities and no relevant dissimilarities.
A classic example is one presented by climate change deniers, who argue that there have always been variations in climate, so today is no different. This, too, is an oversimplification that ignores the differences between periods of history. There may have been periods of increased heat, but many other factors were present, some of which persist today and some of which do not. A major difference in our current period is the scale of human-made contributions to the warming of the planet, such as pollution from our many modes of transportation; after all, the airplane was only invented at the start of the twentieth century.
14. The Suppressed Evidence Fallacy

Numbers and statistics are good examples of things that can easily confuse people and thus be used for manipulation. The suppressed evidence fallacy occurs when crucially important, relevant evidence is held back, which can happen for various reasons, consciously or not. In other words, the argument presumes to have presented all the relevant evidence when it has not.
This happens a lot in politics and advertisements. In politics, we sometimes see politicians present evidence for reasons not to vote for an opponent yet suppress the evidence for why they could be a good choice — and they do so purposefully to make themselves look more appealing. Some advertisements may emphasize all the good that can come from something when there could be something critically bad that can also result, but that data is suppressed.
15. The Begging the Question Fallacy

The begging-the-question fallacy is another error of presumption (though it can take different forms): something is presumed within the argument so that only one conclusion is possible, and in an erroneous manner. A loaded question is a common vehicle. For example, imagine someone asking someone else: “Have you stopped taking drugs?” Both a yes and a no answer would force that person to admit they have taken drugs. Of course, clearing this up can be easier than with some other fallacies; here, one could simply answer, “I have never taken drugs.”
This, too, can be a frequent occurrence in political debates when one politician wants to try to back another into a corner with loaded questions. There are often words chosen that carry strong connotations as well to further have this effect — for instance, imagine a politician asks another if they still favor such “unnecessary spending” by the government.
Thus, while this fallacy can be confused with similar ones, the key is that a question is begged, that is, left unanswered, and the arguer must be pressed to clarify their position to avoid the fallacy. This relates to the fallacy of arguing in a circle. For example, arguing that abortion is murder because murder is bad begs the question of how abortion counts as murder. Murder is obviously bad, but this arguer would need to explain how abortion is a form of murder to avoid the fallacy in argumentation.
16. Confirmation Bias

Perhaps this is one of the most common fallacies. We all fall prey to it from time to time because we hold values we want to uphold, we do not like to be wrong, and changing our views can be hard. With this fallacy, one searches, consciously or not, only for information that confirms one’s existing beliefs and ignores evidence that goes against them. It is a form of cherry-picking data to suit our biases.
This one is especially important today: the increasing prevalence of misinformation and disinformation, along with algorithms that automatically feed us more of what we already look for, makes this bias all too frequent. Active critical thinkers sift through all the evidence on a topic before drawing a conclusion; they follow the evidence neutrally, not in a way that simply confirms their biases.
The misinformation about vaccines is a good case in point. In the late nineties, a study of one very small sample led to an article arguing for a link between autism and vaccines. That claim has since been debunked by multiple experts and credible authorities, and the article retracted, but you can still find a small subset of arguments for it. The danger lies in ignoring the important and relevant evidence that goes entirely against it. Again, fallacies can have dire consequences, such as not getting a vaccine.
This is also a case of oversimplification of cause and effect. Direct causal connections are quite hard to discern, as often there are multiple factors involved, and it may just be a correlational relation — correlation does not always mean causation. The cause of autism is complicated (it has no known single cause), and it is an oversimplification to point to just one factor, even if it were presented as just a main cause.
17. Equivocation

This fallacy can be subtle, but it often should not slip by us. Language is powerful; we cannot imagine our human world without the ability to communicate with words. But language can also be used to mislead, misrepresent, and distort. Equivocation is an error made with language, specifically with the definition of a term used in argumentation. When someone equivocates on a term, they shift its meaning during the argument. A word or phrase is used ambiguously, and the argument may seem sound when it is not. This can happen with any word that has multiple meanings: rich in money versus rich in flavor, or a star in the sky versus a Hollywood star.
While we should ask questions when warranted, we do not want all this to make us overly skeptical. There are times when our emotions lead us to do good things and reason well. There are times when the majority and traditions are correct. Sometimes our intuitions yield real insight. Knowing that these fallacies exist and being on guard against them should also give us confidence that we are not always poor thinkers. In the end, we must look at the merits of the argument itself and make sure we are not succumbing to unjustified and unwarranted conclusions.