Evaluating what you accept as true

Alice challenged the queen who believed six impossible things before breakfast.

Introduction


What do you hold to be true?[1] Why did you choose these beliefs? Do you act according to those beliefs? Perhaps you believe particular widely-held values provide an excellent standard for judging right from wrong, good from bad, and important from trivial. Perhaps you hold other values and believe something else. Knowing yourself requires a careful examination of your own values and beliefs. What are they? How did they originate? What are they based on? Why do you hold these beliefs? Are they based on reliable evidence? Are your goals and actions consistent with your beliefs? How do your beliefs align with your values? How have they evolved over your lifetime? How do they help you live a gratifying life?

Adopt a robust theory of knowledge and use it to carefully choose your own values and beliefs.

Forming beliefs Dialogue

Objectives


The objectives of this course are to:

  • explore how beliefs are formed;
  • reevaluate existing beliefs;
  • exercise critical thinking;
  • progress toward true beliefs.

This course is part of the Emotional Competency curriculum. This material has been adapted from the EmotionalCompetency.com page on beliefs, with permission of the author.

If you wish to contact the instructor, please click here to send me an email or leave a comment or question on the discussion page.

Definitions


A belief is

  • a statement, assertion, or theory you accept as true;
  • a basis for deciding, choosing, and acting.

Myths and Misconceptions


Many people profess beliefs that are obviously false. Here are some of the more destructive and common examples:

  1. I had no choice.
  2. He made me do it.
  3. That's just how I am.
  4. It's all my parents' fault.
  5. It's all your fault.
  6. If we don't talk about it, the issue will disappear.
  7. The past constrains the future.
  8. Denial is a solution.

Discard these unhelpful and false beliefs along with unhelpful primal rules that may be harming your decision making.

Assumptions


An assumption is an unfounded belief. Assumptions are unchallenged, unquestioned, unexamined, and very often untrue. Many terms describe unfounded beliefs, including rumors, myths, legends, folklore, blind faith, and old wives' tales. Our biases, prejudices, ignorance, and experiences manifest in our assumptions. Apply your theory of knowledge to challenge rumors and assumptions before basing decisions on them. Stay curious. Don't be gullible; don't be fooled.

Firm Beliefs


Possibilities and speculations may become firm beliefs after curiosity, inquiry, and exploration transform assumptions into opinions and opinions into facts. This is the substance of wisdom.

Each of us approaches a new idea, information, rumor, proposal, or explanation with a particular presumption. This presumption can range from a very unlikely, dismissive, and skeptical stance to a very likely and accepting stance. It is plotted on the vertical axis of the diagram, ranging from unlikely at the bottom, through possible in the middle, to likely at the top.

To determine the truth of a belief we assess the correspondence of this belief to reality. As we become more curious about the proposal, we can learn more about the evidence that supports or contradicts its accuracy. Our understanding of the evidence begins to increase as a result of our inquiry and exploration. As more and more information becomes available, we become better informed and create a more accurate understanding and assessment of the situation. This accumulation of evidence is plotted on the horizontal axis in the following diagram. It ranges from unexamined on the left to examined on the right.

In the language of Bayesian inference, the vertical axis represents the prior probability: the likelihood assigned to the original presumption before any investigation. The horizontal axis represents the likelihood of observing particular evidence in light of the presumption.

 
We form beliefs based on our original likelihood estimates, modified by the body of evidence our investigations uncover.

The colors on the grid indicate more reliable and authentic regions in blue, and less authentic regions in red.

The most authentic path is the blue region across the center of the diagram. Beginning on the left, a new idea is proposed, and we begin with the neutral presumption that it is possible. We suspend judgment and even resist forming an opinion until we can gather more facts. As we begin to ask questions and explore the evidence, we learn enough to begin to form an opinion—a preliminary or tentative belief. If the evidence is scarce, ambiguous, or contradictory we may not be able to gather enough support for or against the idea to confirm a particular belief. If the evidence is clear for one position or the other, we can form a belief, and perhaps even a firm belief. Along this path you are diligent, you know what you know and how you know it. This path applies your well-founded theory of knowledge and leads toward wisdom.

In short, use Bayesian inference to update your understanding as evidence emerges. Reject dogma by choosing prior probabilities greater than zero and less than one to reflect some level of uncertainty.
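As a concrete sketch of this updating rule, here is a small Python illustration of Bayes' rule. The likelihood values are invented for the example; they are not part of the course material.

```python
# A minimal sketch of Bayesian belief updating.
# The likelihood values below are invented for illustration.

def update(prior, likelihood_true, likelihood_false):
    """Apply Bayes' rule: return P(claim | evidence).

    prior            -- current probability the claim is true (keep 0 < prior < 1)
    likelihood_true  -- P(this evidence | claim is true)
    likelihood_false -- P(this evidence | claim is false)
    """
    numerator = likelihood_true * prior
    marginal = numerator + likelihood_false * (1 - prior)
    return numerator / marginal

# Begin with the neutral presumption that the idea is merely "possible",
# then observe three pieces of evidence, each twice as likely if the
# claim is true as if it is false.
belief = 0.5
for _ in range(3):
    belief = update(belief, likelihood_true=0.8, likelihood_false=0.4)

print(round(belief, 3))  # rises from 0.5 toward a firm belief (about 0.889)
```

Note that a prior of exactly 0 or 1 never moves, no matter what evidence arrives; choosing priors strictly between those extremes is the numerical expression of rejecting dogma.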

As an example, consider how your belief in the existence, importance, and causes of global warming may have evolved. Perhaps you first heard of the issue a few years ago and did not give it much thought. After hearing about it a few more times, you may have become curious. You probably did not know enough about the issue to form an opinion, so you suspended your judgment. Alternatively, you may have heard an opinion from a credible source and adopted that position as your own. As you learned more and more about the issue, perhaps you began to believe the issue was real, and important, but did not yet believe it was caused by human activities or that it would be consequential in your lifetime. You remain curious, you see the movie An Inconvenient Truth, you attend geology and environmental science lectures, read books on the topic, discuss your understanding and doubts with informed friends, follow the issue in the news, and read some scientific papers on the topic. Eventually you come to believe, then firmly believe, the problem is urgent, important, and caused by human activity.

But we often take other paths toward establishing our beliefs. We may be skeptical and begin with the assumption that the idea cannot be true. We defer our beliefs until more information is available. We demand proof. This is a cautious course, and it is prudent unless we act as if our skeptical assumptions are well-founded beliefs. As we gather some evidence supporting the idea, we remain doubtful. As further inquiry and exploration uncover more supporting evidence, we may eventually begin to believe. Alternatively, we may hold stubbornly to our disbelief, dismissing, discounting, or distorting evidence contrary to our original presumptions. We are obstinate, holding onto our disbelief despite clear evidence supporting the new idea. This is the territory of the flat earth society, Holocaust denial, moon landing conspiracy theorists, and other closed-minded people who choose to deny clear evidence. Ignorance and misbelief often thrive here.

A more foolish path is often taken. Here a gullible person is ready to believe almost anything. Rather than pose critical inquiries or examine evidence, they believe the rumors, hoaxes, myths, legends, fantasies, innuendos, and other preposterous claims, ideas, accusations, and proposals. Rumors are passed on, gossip is treated as fact, and too often the truth is never uncovered or even sought. Even as evidence mounts contrary to the idea, they remain hopeful, perhaps even detached, defiant, or contemptuous. If further evidence is gathered, perhaps opinions can mature into well founded beliefs. But too often the idea is firmly held onto despite clear contradictory evidence. This is the fantasy land of blind faith, alien abductions, demonic possession, and channeling.

Consider the range of beliefs people have regarding life after death. Direct evidence for or against life after death is minimal or non-existent. However, many people hold firmly to this belief. Elaborate and detailed descriptions of the afterlife are studied, propagated, discussed, defended, and often relied on. Other people simply dismiss the whole idea for lack of evidence. Passionate arguments on this topic are commonplace, and it is remarkable how determined people can be in defending their own assumptions and opinions.

Know how you know. Don't be seduced by assumptions; challenge them instead. Don't ignore or dismiss evidence; be guided by it. Don't rely on blind faith; inquire and explore.

Assignment


Complete the Wikiversity course on Seeking True Beliefs.

Difficulties


Several characteristics of human nature and our minds make it easy to mistake false beliefs for true beliefs. These include cognitive biases, confirmation bias, motivated reasoning, the soldier mindset, social proof, the illusory truth effect, overconfidence, the illusion of explanatory depth, the Dunning–Kruger effect, and other intellectual vices.

Each of these is briefly introduced below. The hyperlinks contain much more information for interested readers.

Cognitive Biases


A cognitive bias is a systematic pattern of deviation from norm or rationality in judgment. Individuals create their own "subjective reality" from their perception of the input. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, and irrationality.

Many cognitive biases have been studied. This extensive list of cognitive biases introduces many. Several of the more prevalent biases that often lead to misbeliefs are listed below.

Confirmation Bias


Confirmation bias is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values. People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs.

Biased search for information


One manifestation of confirmation bias is a biased search for information.

Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis. Rather than searching through all the relevant evidence, they phrase questions so as to receive an affirmative answer that supports their theory. They look for the consequences they would expect if their hypothesis were true, rather than for what would happen if it were false.

The Wason selection task demonstrates our tendency to favor confirmation rather than falsification of our present beliefs.
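To make this concrete, here is a small sketch of the classic version of the task (the card values A, K, 4, 7 and the helper names are our own framing, not part of the course):

```python
# Sketch of the Wason selection task. Each card has a letter on one
# side and a number on the other. Rule to test: "if a card shows a
# vowel, its other side shows an even number."
#
# Only cards whose hidden side could FALSIFY the rule need turning:
# a visible vowel (the hidden number might be odd) or a visible odd
# number (the hidden letter might be a vowel). A visible consonant or
# even number can never falsify the rule, yet the even card is the
# confirming choice people tend to make.

VOWELS = set("AEIOU")

def must_turn(card):
    """Return True if this visible face could conceal a counterexample."""
    if card.isalpha():
        return card.upper() in VOWELS      # vowel: hidden side might be odd
    return int(card) % 2 == 1              # odd number: hidden side might be a vowel

cards = ["A", "K", "4", "7"]
print([c for c in cards if must_turn(c)])  # prints ['A', '7']
```

Most participants choose A and 4, seeking confirmation; the logically required, falsifying choices are A and 7.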

Motivated Reasoning


Motivated reasoning is a cognitive and social response in which individuals, consciously or sub-consciously, allow emotion-loaded motivational biases to affect how new information is perceived. Individuals tend to favor evidence that coincides with their current beliefs and reject new information that contradicts them, despite contrary evidence.

Motivated reasoning overlaps with confirmation bias. Both favor evidence supporting one's beliefs, at the same time dismissing contradictory evidence. However, confirmation bias is mainly a sub-conscious (innate) cognitive bias. In contrast, motivated reasoning (motivational bias) is a sub-conscious or conscious process by which one's emotions control the evidence supported or dismissed. For confirmation bias, the evidence or arguments can be logical as well as emotional.

As a defense against motivated reasoning, author Julia Galef describes the scout mindset as “The motivation to see things as they are, not as you wish they were.”[2] In contrast to the scout mindset, the soldier mindset is a motivation to attack differing points of view or defend a position in each argument we encounter.

Social Proof


Social proof is a psychological and social phenomenon wherein people copy the actions of others in choosing how to behave in a given situation. The term was coined by Robert Cialdini in his 1984 book Influence: Science and Practice.

Social proof is used in ambiguous social situations where people are unable to determine the appropriate mode of behavior, and is driven by the assumption that the surrounding people possess more knowledge about the current situation.

The effects of social influence can be seen in the tendency of large groups to conform.

Illusory truth effect


The illusory truth effect is the tendency to believe that false information is true after repeated exposure. This phenomenon was first identified in a 1977 study at Villanova University and Temple University. When truth is assessed, people rely on whether the information is in line with their understanding or if it feels familiar. The first condition is logical, as people compare new information with what they already know to be true. Repetition makes statements easier to process relative to new, unrepeated statements, leading people to believe that the repeated conclusion is more truthful. The illusory truth effect has also been linked to hindsight bias, in which the recollection of confidence is skewed after the truth has been received.

Overconfidence


The overconfidence effect is a well-established bias in which a person's subjective confidence in their judgments is reliably greater than the objective accuracy of those judgments, especially when confidence is relatively high. Overconfidence is one example of a miscalibration of subjective probabilities. Throughout the research literature, overconfidence has been defined in three distinct ways: (1) overestimation of one's actual performance; (2) overplacement of one's performance relative to others; and (3) overprecision, expressing unwarranted certainty in the accuracy of one's beliefs.

After choosing a particular belief, people are often overconfident in the accuracy of that belief.
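Miscalibration can be made tangible by comparing average stated confidence with the fraction of judgments that actually turn out correct. The figures below are invented for illustration only:

```python
# Rough sketch of a calibration check: compare mean stated confidence
# against actual accuracy. The judgment data are invented examples.

judgments = [  # (stated confidence, was the answer actually correct?)
    (0.9, True), (0.9, False), (0.8, True), (0.95, False),
    (0.7, True), (0.85, False), (0.9, True), (0.8, False),
]

confidence = sum(c for c, _ in judgments) / len(judgments)
accuracy = sum(ok for _, ok in judgments) / len(judgments)

print(f"mean confidence: {confidence:.2f}")   # 0.85
print(f"actual accuracy: {accuracy:.2f}")     # 0.50
print(f"overconfidence:  {confidence - accuracy:+.2f}")
```

A mean confidence of 0.85 against an accuracy of 0.50 signals substantial overconfidence; well-calibrated judgments would show the two numbers close together.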

Illusion of explanatory depth

 
Intellectual Virtues Overcome Vices

The illusion of explanatory depth is a cognitive bias whereby people tend to believe they understand a topic better than they actually do. The effect has been observed in only one type of knowledge, called explanatory knowledge, in this case defined as "knowledge that involves complex causal patterns". It has not been observed in procedural, narrative, or factual (descriptive) knowledge. Evidence of the illusion has been found in everyday mechanical and electrical devices such as bicycles, as well as in mental disorders, natural phenomena, folk theories, and politics, with the most studied effect being political polarization.

If you suspect that someone is rigidly holding to some belief, without having a good understanding of that belief or how they came to hold that belief, ask them to explain this in depth. They may become aware that their certainty exceeds their understanding.

Dunning-Kruger Effect


The Dunning–Kruger effect is a cognitive bias in which people with limited competence in a particular domain overestimate their abilities. This describes the tendency of misinformed people to hold firmly to false beliefs.

Countermeasures


Develop intellectual virtues to overcome the pitfalls described above and pursue true beliefs.

Flipping Positions


How does passionate love so often turn into bitter divorce? The firm belief of “I love my wife” can eventually and precipitously become “I really hate her.” Here is a hypothesis:

A cautious style of decision making, shown in the blue region in the diagram above, is to reserve judgment; wait until you have gathered and evaluated lots of representative and relevant evidence, then carefully form an opinion. As you gather more evidence that opinion becomes a firm belief. But the more common style is to presume the decision early, then to filter and distort evidence to support that decision. This is shown along the top red band, extending from “gullible” to “fantasy” in the above diagram.

Consider how this decision-making style might apply to the belief: “I love her.” You meet a woman and are enamored with her. Passion helps you quickly decide she is perfect and you love her with all your heart. You enjoy time together and are willing to ignore or explain away any of her shortcomings. Even when she stays out late, comes home drunk, tells transparent lies, and gambles away the family savings you distort the evidence to support your position of “she is the perfect woman for me.”

Eventually the accumulation of evidence prevails. Your opinion changes, perhaps because of overwhelming evidence, or just a change of heart. Your viewpoint suddenly flips from “I look at the evidence in a positive light” to “I look at this in a negative light.” Suddenly the evidence fits better with the new viewpoint. The spin quickly unravels. Now your opinion is “she is a bitch” and you have all the respun evidence to prove it, and you can also spin some more.

Furthermore, you are a bit humiliated because you held onto your “I love her” position too long, well beyond what the evidence could support. You are ashamed to think “How could I have been so blind, so stupid, not to see what was really happening.”

Similar shifts in thinking can quickly transform pride into shame or guilt; envy, jealousy, or compassion into contempt or gloating; hope into sadness, fear or joy; and fear into relief.

People are all human. We each have many outstanding qualities and many shortcomings. Establish an authentic, balanced, complex, integrated, evidence-based, and evolving understanding of your lover and yourself. Take the bad with the good and continue to refine and strengthen your relationship.

Beliefs Vary


Beliefs vary considerably from one person to the next. The website ThisIBelieve.org maintains a fascinating collection of thousands of essays proclaiming the beliefs[3] of many thoughtful people. Perhaps you will enjoy reading some.

Professed Beliefs and Actual Beliefs


We can only determine what someone else professes to believe. We can never know what they truly believe. Comparing their behavior with their professed beliefs can provide clues to their true beliefs.

Assignment


Increase the alignment of your beliefs with reality by taking actions selected from the following list.

  1. Think more clearly
    1. Face Facts.
    2. Evaluate evidence skillfully.
    3. Evaluate journalism standards.
    4. Seek true beliefs.
    5. Know how you know.
    6. Think scientifically.
    7. Align your worldview with reality.
    8. Use Socratic methods.
    9. Expect intellectual honesty from yourself and others.
Recommended reading

Students interested in learning more about forming beliefs may be interested in the following materials:

  • Wolpert, Lewis (July 17, 2008). Six Impossible Things Before Breakfast: The Evolutionary Origins of Belief. W. W. Norton & Company. pp. 256. ISBN 978-0393332032. 
  • Tavris, Carol (August 4, 2020). Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. Mariner. pp. 464. ISBN 978-0358329619. 
  • Kashdan, Todd (April 21, 2009). Curious?: Discover the Missing Ingredient to a Fulfilling Life. William Morrow. pp. 352. ISBN 978-0061661181. 
  • Burton, Robert. On Being Certain: Believing You Are Right Even When You're Not. Griffin. pp. 272. ISBN 978-0312541521. 
  • Gray, Dave (September 14, 2016). Liminal Thinking: Create the Change You Want by Changing the Way You Think. Two Waves Books. pp. 184. ISBN 978-1933820460. 
  • Ariely, Dan (September 17, 2024). Misbelief: What Makes Rational People Believe Irrational Things. Harper Perennial. pp. 320. ISBN 978-0063280434. 

Notes

  1. This material is adapted from the EmotionalCompetency.com website with permission from the author.
  2. Galef, Julia (April 13, 2021). The Scout Mindset: Why Some People See Things Clearly and Others Don't. Piatkus. ISBN 978-0349427645.
  3. See http://thisibelieve.org/dsp_Browse.php