In the early 1980s, the psychologists Amos Tversky and Daniel Kahneman researched what they dubbed the overconfidence effect. (Kahneman was later awarded the Nobel Memorial Prize in Economic Sciences for this line of work; Tversky died before the prize was given.) This is a well-established error in reasoning in which a person's subjective confidence in his or her judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Decades of continuing research on overconfidence have further supported this effect. Yet the feeling of certainty is so powerful that, for many people, nothing, not facts, science, nor reason, can get them to admit that there is even a slight chance that they could be wrong.
Before we get much deeper into this discussion of certainty, it is important to recognize the colloquial use of the term "certain." People often claim to be certain when it is really shorthand for "I am extremely confident." This kind of certainty shares the same problems as overconfidence. What we will be referring to in this article, however, is what could be considered philosophical certainty, in which the person claiming certainty is not open to evidence that they could be wrong.
As stated, one's level of confidence is subjective: it is personal to the individual and independent of any objective reality. Confidence has to do with knowledge, that is, with what we think we know. The study of knowledge is known as epistemology. Confidence, therefore, is a claim about our knowledge, NOT a claim about objective reality (what might be referred to as an ontological claim).
This concept is best understood by an example. Imagine that there are two witnesses to a crime. One is certain that the criminal took off in a car and headed south on the highway; the other is certain the criminal headed north. Unless the criminal was a quantum particle, the witnesses cannot both be right. Even though both are 100% confident (i.e., certain), at least one of them is objectively wrong in his or her claim.
For claims that can be known, such as which direction someone headed, confidence level is generally a decent heuristic for assessing the objective truth of the claim being made. For example, claims followed by "I am really not too sure about that" tend to be false more often than claims followed by "I am sure about that." However, for claims about objective reality that cannot be known (i.e., that are unfalsifiable), the relationship between confidence level and objective truth is unknown and cannot be assumed to be a decent heuristic.
Confidence level is generally a decent heuristic for assessing the objective truth of a claim because of the feedback we get from claims that can be proven false (i.e., falsifiable claims). If you claim that you are certain about X and then find out that you were wrong about X, you will be less likely to make similar claims of certainty in the future. Each realization that your confidence was misplaced acts as a corrective mechanism for your overall judgment of confidence. Of course, self-serving biases and the overconfidence effect still apply: no matter how many times you are shown to be overconfident in your evaluations, overconfidence will persist to some degree.
When it comes to unfalsifiable claims, such as ambiguous claims about invisible beings helping you make life decisions, there is no such feedback loop. If you want to believe badly enough, any poor life decision you make will be rationalized as part of some mysterious plan the being has in store for you. This inability to recognize poor decisions or bad advice from these alleged supernatural sources never prompts you to question the certainty you have regarding the existence of these beings. Thus, no corrective mechanism exists for unfalsifiable claims of certainty.
Perhaps the biggest error in reasoning one can make when it comes to certainty is confusing a subjective or personal feeling with objective reality, a process I will refer to as cognitive overreaching.
Imagine your friend says she has a headache. You annoyingly ask, "Are you certain you have a headache?" She answers with an emphatic "Yes!" Your friend is entitled to her certainty in this case because she is making a claim about her subjective self, a feeling-based claim. If she were to say that a demon in her head was causing the headache, she would now be making a claim about the objective existence of a being based on her subjective experience. In an attempt to explain or understand feelings, we often inject personal or cultural narratives to build a plausible story. The certainty we have for the feeling is then irrationally extended to the narrative we have built around the feeling. This is cognitive overreaching.
When we overreach, we are actually making a series of claims with certainty, each of which carries some non-zero margin of error that is ignored. If our friend is certain she has a headache, that is a single claim about a subjective experience of which she is the ultimate authority. So far so good. But by claiming that she is certain a demon is causing the headache, she is also claiming that demons exist, that demons can cause headaches, and that a demon is in fact causing this particular headache.
With each claim added, the probability of the ultimate claim being true decreases. The rational position is to adjust your level of confidence according to that probability. Unfortunately, due to cognitive biases such as the overconfidence effect, as well as reasoning errors such as the conjunction fallacy, few people make these adjustments to their confidence level; they insist instead that they are certain.
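To see why chained claims must lower overall confidence, here is a minimal sketch in Python. The individual probabilities are made-up illustrative numbers, not estimates of anything real, and the claims are treated as independent purely for simplicity:

```python
# Illustrative only: every probability below is an assumed number,
# chosen to show how conjunctions compound, not a real estimate.
claims = [
    ("I have a headache", 0.99),             # subjective report: near-certain
    ("Demons exist", 0.10),                  # ontological claim
    ("Demons can cause headaches", 0.50),    # mechanism claim
    ("A demon caused THIS headache", 0.50),  # specific causal claim
]

# Treating the claims as independent, the probability that the whole
# story is true is the product of the individual probabilities.
joint = 1.0
for claim, p in claims:
    joint *= p
    print(f"{claim}: p = {p:.2f} -> joint so far = {joint:.4f}")

# A conjunction can never be more probable than its least probable part,
# which is exactly what the conjunction fallacy gets wrong.
assert joint <= min(p for _, p in claims)
```

However generous the assumed numbers, the product shrinks with every added claim, which is why certainty about the full narrative is never warranted even when certainty about the feeling is.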
A naturalistic claim of certainty is one based on information obtained in ways that do not violate any known laws of the universe. For example, if you are certain that you have a doctor's appointment scheduled for tomorrow, the information that led to your certainty was obtained by processing stimuli through one or more of your senses (e.g., the doctor's office just called you to confirm). This involves the well-known biological and psychological processes that convert stimuli into information encoded and stored in the brain, where at some point in the future the information is recalled and decoded. There are many points of interference on this journey from stimulus to recalled information; so many, in fact, that claiming certainty about anything objective (making any kind of ontological claim with certainty) borders on the absurd. But what if there were some way for information to bypass all of our imperfect biological machinery? Enter the supernatural claim of certainty.
There are those who claim that they are certain of things because one or more invisible beings are communicating with them through some unknown sense. For some, this invisible being is a god. For some, these invisible beings are "spirits." And for some (although much rarer), they are aliens, the government, or time travelers from the future. In an attempt to bypass the flawed human brain, one might claim that these beings communicate directly with one's "spirit," or that the being's spirit resides within. This could be considered the dualist view, in which humans are actually immaterial spirits or souls temporarily occupying physical bodies. This still does not solve the problem of certainty: as long as the spirit or soul interacts with the physical body and the imperfect human brain, any supposedly communicated knowledge is still filtered, which should call one's certainty into question.
As we have seen, claims of certainty regarding one's subjective states are justified because each of us is the ultimate authority on how we feel. But what happens when people make claims of certainty about the objective world that we all share?
The allure of certainty is not difficult to understand. When one claims certainty, people are more likely to believe him or her. When one is certain, the psychological discomfort of uncertainty is nonexistent. And when one claims certainty and "explains" it with "I can't explain it. I just know," they absolve themselves of the responsibility to offer any explanation or reasoning for that certainty. The problem is, if we accept this methodology as a legitimate way of knowing, we quickly see how unreliable it is when millions of people use it to make conflicting claims with certainty. When Islamic extremists are certain that Allah wants them to kill infidels, the rest of the world dismisses them as delusional. When Christian parents refuse to treat their sick children with medicine because they are certain that Jesus will heal them, and the children die, the rest of the world accuses the parents of being confused or misled. But when those doing the dismissing are certain about the knowledge their own invisible being is sharing with them, they are, of course, right. This is special pleading.
Ironically, all claims of certainty rest on the meta-claim that this methodology is a reliable way to know things with certainty. If one is not certain that the methodology is 100% reliable, one cannot be certain about any claims resulting from it. The mere fact that people claim certainty about all kinds of mutually exclusive propositions should be a clear indicator that the methodology is not merely imperfect but outright unreliable. It is like being certain about a measurement when the instrument you are using to take the measurement is known to be flawed.
Show a little humility. Acknowledge that your personal feelings are not necessarily accurate representations of the objective world. Realize that knowledge depends on imperfect biological processes that have been demonstrated to lead to overconfidence and a false sense of certainty. Accept that just because your unfalsifiable claims can't be disproved, it does not follow that they are true. Finally, understand that the feeling of certainty you have for your claims is the same feeling of certainty that millions of others have for their contradicting claims. Then, and only then, will there be room to use evidence and reason to change minds and better understand the objective world that we all share.
Reason: Book I - A Critical Thinking-, Reason-, and Science-based Approach to Issues That Matter is based on the first two years of The Dr. Bo Show, where Bo takes a critical thinking-, reason-, and science-based approach to issues that matter with the goal of educating and entertaining. Every chapter in the book explores a different aspect of reason by using a real-world issue or example.
Get the book, Reason: Book I - A Critical Thinking-, Reason-, and Science-based Approach to Issues That Matter by Bo Bennett, PhD.