Confirmation bias in religion


Overview

  • Confirmation bias — the tendency to seek, notice, and remember evidence that supports existing beliefs while discounting or reinterpreting disconfirming evidence — operates with particular force in religious contexts, where believers systematically remember answered prayers and forget unanswered ones, interpret ambiguous events as divine signs, and find prophetic fulfillment in texts through a pattern logicians call the Texas sharpshooter fallacy.
  • Cognitive mechanisms identified by researchers including Justin Barrett (hyperactive agency detection), Deborah Kelemen (promiscuous teleology), Leon Festinger (cognitive dissonance), and Ziva Kunda (motivated reasoning) together explain why religious belief tends to be self-reinforcing: each anomalous event can be reinterpreted within the framework, and presenting disconfirming evidence can paradoxically strengthen rather than weaken conviction through the backfire effect.
  • Recognizing the role of confirmation bias in religious belief is an epistemological observation, not a theological argument — to argue that a belief is held because of bias is to commit the genetic fallacy unless accompanied by independent evidence against the belief’s content; the same cognitive mechanisms that sustain false beliefs can in principle sustain true ones.

Confirmation bias is the tendency to seek out, notice, interpret, and remember information in ways that confirm beliefs already held, while discounting, overlooking, or reinterpreting evidence that would challenge them. First systematically described in the psychological literature by Peter Wason’s card-selection experiments in the 1960s and consolidated as a field-defining concept by Raymond Nickerson’s 1998 review, confirmation bias ranks among the most robustly documented findings in cognitive psychology — observable across education levels, cultures, and domains of belief.2 Its relevance to religion is not incidental. The structure of religious belief — unfalsifiable metaphysical claims, emotionally significant interpretations of personal experience, socially reinforced commitment — creates conditions in which confirmation bias operates with unusual force and persistence. Understanding how this works requires examining both the general mechanisms of biased cognition and the specific features of religious belief that amplify them.

This article is concerned with the epistemological question of how religious beliefs are maintained and reinforced, not with the metaphysical question of whether any religious beliefs are true. The same cognitive machinery that sustains false beliefs can in principle sustain true ones; explaining the psychological origin of a belief says nothing about its truth value. That distinction — the genetic fallacy caveat — is taken up in the final section. The concern here is with how cognitive science of religion illuminates the specific mechanisms through which confirmation bias operates in religious contexts, and what that implies for religious epistemology.14, 22

Confirmation bias: the core mechanism

Nickerson’s 1998 review catalogued confirmation bias as a family of related tendencies rather than a single mechanism.2 The core variants are: selective search for confirming evidence (looking for reasons a hypothesis is true rather than reasons it might be false); selective exposure (preferentially seeking information sources likely to affirm existing beliefs); selective recall (remembering confirming instances more readily than disconfirming ones); and biased assimilation (subjecting confirming evidence to weaker scrutiny than disconfirming evidence). Each of these operates largely below the level of conscious awareness. People do not typically feel that they are engaging in biased reasoning; they experience themselves as responding to evidence. The bias operates at the level of attention, encoding, and retrieval rather than at the level of deliberate inference, which makes it unusually difficult to correct through conscious reflection alone.10

Ziva Kunda’s influential 1990 account of motivated reasoning sharpened this picture by distinguishing between accuracy goals (the desire to reach correct conclusions) and directional goals (the desire to reach a particular conclusion).3 When directional goals are active, people use cognitive processes — memory search, inference, causal attribution — selectively to construct justifications for the conclusion they are motivated to reach, while maintaining the subjective experience of reasoning objectively. Directional goals are activated whenever beliefs are identity-constitutive, socially reinforced, or emotionally significant — all features that characterize religious commitment at its most intense. The result is that the more important a belief is to a person’s sense of self, the stronger the motivated reasoning that protects it, and the weaker the evidentiary challenge needed to trigger a defensive response.3, 15

Prayer, answered and unanswered: counting hits, ignoring misses

Among the most concrete and empirically tractable manifestations of confirmation bias in religion is the selective accounting of petitionary prayer. When a believer prays for a specific outcome — recovery from illness, safe passage through a storm, success in an examination — and the desired outcome occurs, the event is registered as an answered prayer and typically remembered with some emotional vividness. When the desired outcome does not occur, several cognitive moves are available that prevent the non-event from counting as evidence against prayer’s efficacy: the outcome was “God’s will”; the prayer was not sufficiently sincere; there was a hidden benefit in the denial; the request was insufficiently specific; or the person asking was not spiritually qualified. The asymmetry is structurally guaranteed: successes confirm the belief, failures are absorbed by the theology. Over a lifetime of prayer, a believer accumulates a selective archive of vivid confirming instances from which the null results have been systematically filtered.2, 15
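The arithmetic of this selective archive can be made concrete with a toy simulation. All of the numbers below are arbitrary illustrative assumptions, not empirical estimates: outcomes occur at some fixed base rate regardless of prayer, vivid hits are usually remembered, and absorbed misses are usually forgotten. Even so, the recalled record dramatically overstates the success rate.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

N_PRAYERS = 1000
P_OUTCOME = 0.30      # assumed chance the desired outcome occurs anyway
P_RECALL_HIT = 0.90   # hits are vivid and usually remembered
P_RECALL_MISS = 0.10  # misses are reinterpreted and usually forgotten

recalled_hits = recalled_misses = 0
for _ in range(N_PRAYERS):
    hit = random.random() < P_OUTCOME
    remembered = random.random() < (P_RECALL_HIT if hit else P_RECALL_MISS)
    if remembered:
        if hit:
            recalled_hits += 1
        else:
            recalled_misses += 1

# The success rate as reconstructed from memory, not from the full record:
perceived = recalled_hits / (recalled_hits + recalled_misses)
print(f"true success rate:      {P_OUTCOME:.0%}")
print(f"perceived success rate: {perceived:.0%}")
```

Under these assumed recall rates the perceived success rate comes out near 80 percent against a true rate of 30 percent, without any single act of deliberate misreporting: the distortion lives entirely in what gets remembered.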

This pattern is not unique to naive believers. The clinical literature on intercessory prayer documents the same structure at the institutional level. Herbert Benson and colleagues’ 2006 STEP trial — the largest randomized controlled trial of intercessory prayer, involving 1,802 cardiac surgery patients — found no significant benefit from third-party prayer on primary clinical outcomes, and found that patients who knew they were being prayed for had a slightly higher rate of complications.11 The absence of a positive effect does not register in popular religious discourse in proportion to its evidential weight: positive anecdotes circulate widely, while null results in peer-reviewed journals are invisible to the communities whose practices they evaluate. The popular perception that prayer “works” is thus sustained not by the aggregate evidence but by the selective circulation of confirming cases. For a fuller treatment of the empirical literature on prayer, see Prayer and empirical evidence.11

Apophenia, agency detection, and divine signs

A closely related phenomenon is the tendency to perceive meaningful patterns — especially agency-laden patterns — in random or ambiguous events. Michael Shermer has termed this general tendency patternicity: the finding of meaningful patterns in meaningless noise.16 In religious contexts it takes the specific form of interpreting coincidences, unexpected events, or natural regularities as communications from a divine source. A parking space found at the last moment, a song that “happened” to play at a meaningful time, a stranger who appeared with exactly the needed advice — these are stock examples of events that religious believers commonly interpret as divine signs or answered prayers, while the much larger number of unremarkable coincidences that receive no such interpretation passes unnoticed.

The cognitive mechanism behind this tendency is what Justin Barrett called the hyperactive agency detection device (HADD): a cognitive system that disposes humans to perceive intentional agents behind ambiguous stimuli.6 Barrett, building on Stewart Guthrie’s earlier work, argued that HADD evolved as an adaptive solution to the asymmetric costs of false negatives versus false positives in agent detection: missing a real predator or rival was more costly than falsely attributing agency to wind or falling rocks. The resulting system is calibrated toward over-detection. In a religious framework, this tendency is directed and amplified: believers are taught to look for God’s action in everyday events, and HADD’s output — the intuition that agency is present — is interpreted through that framework rather than subject to critical scrutiny.6, 7 The perception of divine signs thus reflects the interaction of a universal cognitive bias with culturally transmitted interpretive frameworks, a combination that the cognitive science of religion has investigated extensively.14, 22

Deborah Kelemen’s research on promiscuous teleology adds a second layer to this account. Kelemen found that children and adults under cognitive load systematically prefer teleological explanations of natural phenomena — explaining objects and events in terms of purposes and functions rather than physical causes.8, 9 When this bias is active, the natural world is spontaneously perceived as purposively arranged, making it easy to interpret natural regularities as signs of divine design and natural events as divine communications. The perception is not reached by reasoning from evidence to conclusion; it is a default output of intuitive cognition that specific kinds of training can suppress but rarely eliminate entirely.9, 18

The Texas sharpshooter fallacy in prophetic fulfillment

The Texas sharpshooter fallacy describes the error of identifying a target after the fact and then claiming to have hit it. The name derives from the image of a marksman who fires randomly at a barn wall, then draws a target around the tightest cluster of bullet holes and presents himself as an accurate shot. Applied to prophecy, the fallacy describes the practice of identifying a fulfillment after the event has occurred, selecting from a large pool of vague or multivalent prophetic texts the ones that can be made to match, and then presenting the match as evidence of supernatural foreknowledge.15

The conditions for this fallacy are structurally present in every major religious prophetic tradition. Ancient texts are sufficiently numerous, sufficiently ambiguous, and sufficiently general in their imagery that an interpreter with a specific event in mind can nearly always find a passage that, with suitable reading, can be construed as anticipating it. The hits are foregrounded; the vastly larger number of prophetic passages that have not been applied to any specific fulfillment, or that were applied and found wanting, are quietly set aside. Interpretation is post hoc and selective throughout. The evidential force of such “fulfilled prophecies” is therefore negligible unless controlled for the base rate of attempted applications and the false-positive rate of interpretive matching — neither of which is typically provided in apologetic presentations of the argument from prophecy.15, 17
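The base-rate point can be sketched numerically. In this toy model, whose pool size and per-passage match probability are invented for illustration, each vague passage has only a one percent chance of being construable as fitting any given event, yet with hundreds of passages available to the interpreter an apparent fulfillment is close to guaranteed.

```python
# Toy model of post hoc prophecy matching. N_PASSAGES and P_MATCH are
# arbitrary illustrative assumptions, not measurements of any corpus.
N_PASSAGES = 500   # vague passages available to the interpreter
P_MATCH = 0.01     # chance any one passage can be read as fitting an event

# Probability that at least one passage "fits" a given event by chance alone,
# and the expected number of such apparent fulfillments:
p_at_least_one = 1 - (1 - P_MATCH) ** N_PASSAGES
expected_hits = N_PASSAGES * P_MATCH

print(f"P(at least one apparent fulfillment) = {p_at_least_one:.3f}")
print(f"expected apparent fulfillments per event = {expected_hits:.1f}")
```

With these numbers the chance of finding at least one "fulfillment" for an arbitrary event exceeds 99 percent, and about five candidate passages fit on average. This is why the evidential force of a match cannot be assessed without the base rate of attempted applications.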

Daniel Kahneman’s account of availability bias — the tendency to judge the frequency or probability of events by the ease with which examples come to mind — explains why this pattern is so persuasive.10 Memorable prophetic fulfillments are vivid, emotionally significant, and frequently rehearsed in religious communities; prophetic failures are unmemorable by construction. The resulting asymmetry in availability produces an inflated subjective sense of the hit rate, which then feeds back into confidence in the prophetic tradition as a whole.10, 17

Anchoring bias and the role of religious upbringing

Tversky and Kahneman’s classic work on anchoring demonstrated that people’s judgments are disproportionately influenced by an initial value — the “anchor” — even when they consciously try to adjust away from it.19 In the domain of religious belief, the anchor is set by childhood enculturation. A person raised in a specific religious tradition acquires not merely doctrinal beliefs but an entire interpretive framework — a set of background assumptions about the nature of reality, the structure of morality, and the proper methods for answering ultimate questions — within which all subsequent evidence is evaluated. This framework functions as an anchor in the technical sense: evidence that fits the framework is assimilated readily, while evidence that challenges it must overcome not just the specific belief in question but the entire background against which it is assessed.13, 19

The practical consequence is that the religious tradition a person was raised in is the single strongest predictor of the religious tradition they will hold as an adult — far stronger than any feature of the evidence for competing traditions. The outsider test for faith, developed by John Loftus, asks believers to apply the same critical scrutiny to their own religious tradition that they spontaneously apply to the claims of competing traditions.13 The asymmetry in critical standards — rigor toward rivals, credulity toward one’s own tradition — is a direct expression of anchoring bias interacting with motivated reasoning. The childhood anchor makes the home tradition feel natural and self-evident while rival traditions feel obviously in need of justification.3, 13

Cognitive dissonance and the backfire effect

Leon Festinger’s theory of cognitive dissonance, developed in the 1950s, holds that people experience psychological discomfort when they hold beliefs that are inconsistent with one another or with their behavior, and that they are motivated to reduce this discomfort — typically not by changing the most fundamental belief but by reinterpreting or dismissing the inconsistent element.5 Festinger’s most celebrated investigation of this mechanism in a religious context was his 1956 study of the Seekers, a small American UFO cult whose leader, Dorothy Martin (disguised in the book as Marian Keech), prophesied that a great flood would destroy the Earth on December 21, 1954, and that a flying saucer would rescue the faithful. When no flood occurred and no saucer arrived, Festinger and his colleagues observed how the group responded.

The response was not deconversion. The group experienced an initial period of confusion, then reinterpreted the disconfirmation: God had spared the Earth because of the group’s faithfulness. Far from abandoning the belief, many group members became more committed and more publicly evangelical after the prophecy failed than they had been before.4 This pattern — belief strengthened rather than weakened by disconfirmation — is what Festinger’s theory predicts under specific conditions: when the belief is held with great conviction, when the believer has made public or costly commitments to it, and when social support from fellow believers is available in the moment of disconfirmation. All three conditions are commonly met in religious communities confronting empirical challenges to central doctrines.4, 5

Brendan Nyhan and Jason Reifler identified a related phenomenon they called the backfire effect: in some domains, presenting people with factual corrections to their false beliefs produces increased belief in the original false claim rather than decreased belief.12 While subsequent research has qualified the generality of the backfire effect — it appears more robust for identity-constitutive beliefs than for casual opinions — the core phenomenon is well-documented in religious contexts: direct challenge to deeply held religious beliefs frequently produces the impression that the challenger must be motivated by bad faith, spiritually blinded, or missing crucial background knowledge, none of which require engagement with the challenge on its merits.12, 15 The challenge is reinterpreted as confirming the tradition’s own account of why unbelievers fail to see the truth, which is itself a self-sealing mechanism.14

The genetic fallacy: a necessary caveat

The analysis developed in this article — that religious beliefs are systematically reinforced and maintained through cognitive biases including confirmation bias, motivated reasoning, anchoring, apophenia, and the backfire effect — is an epistemological observation about how such beliefs are held, not a metaphysical argument about whether any of them are true. Conflating the two is an instance of the genetic fallacy: the error of evaluating the truth of a belief by its causal origin rather than by its evidential support.20 A belief can be held initially for poor reasons, reinforced by cognitive bias, transmitted through indoctrination, and sustained by motivated reasoning — and still be true. Conversely, a belief can be held for excellent reasons, subject to rigorous critical scrutiny, and acquired through careful inquiry — and still be false. The psychology of belief formation is logically independent of the truth value of the belief formed.20, 21

Alvin Plantinga’s reformed epistemology presses this point from a theistic direction.21 Plantinga argues that even if theistic belief is produced by a cognitive mechanism — the sensus divinitatis — this does not undermine its warrant unless that mechanism is malfunctioning. On Plantinga’s account, a belief produced by a properly functioning cognitive faculty in an appropriate environment has warrant regardless of whether it was reached through inference from evidence. The CSR observation that religious belief is produced by natural cognitive mechanisms is, on this view, compatible with those beliefs being true and warranted. Whether the specific mechanisms identified by CSR — HADD, teleological reasoning, confirmation bias — constitute properly functioning truth-tracking faculties or systematic distortions is precisely where the philosophical debate lies.21, 22

What the confirmation bias literature does establish, independently of the genetic fallacy concern, is an epistemological standard for religious belief: beliefs that are insulated from disconfirmation by a suite of reinforcing cognitive mechanisms deserve heightened scrutiny, not because their content is necessarily false, but because the psychological processes that maintain them are unreliable guides to truth in any domain. A scientist who conducted research on their pet hypothesis while systematically discounting anomalous data, interpreting ambiguous results as confirmatory, and reacting to critical peer review by becoming more confident in their original position would be recognized as engaged in bad epistemic practice — regardless of whether the hypothesis happened to be correct. The standards appropriate to such cases in empirical inquiry apply, mutatis mutandis, in theology and religious epistemology as well.2, 3, 13

References

1. Gould, S. J. The Mismeasure of Man (revised ed.). Norton, 1996.
2. Nickerson, R. S. Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology 2(2): 175–220, 1998.
3. Kunda, Z. The Case for Motivated Reasoning. Psychological Bulletin 108(3): 480–498, 1990.
4. Festinger, L., Riecken, H. W. & Schachter, S. When Prophecy Fails: A Social and Psychological Study of a Modern Group That Predicted the Destruction of the World. University of Minnesota Press, 1956.
5. Festinger, L. A Theory of Cognitive Dissonance. Stanford University Press, 1957.
6. Barrett, J. L. Why Would Anyone Believe in God? AltaMira Press, 2004.
7. Guthrie, S. E. Faces in the Clouds: A New Theory of Religion. Oxford University Press, 1993.
8. Kelemen, D. Are Children ‘Intuitive Theists’? Reasoning About Purpose and Design in Nature. Psychological Science 15(5): 295–301, 2004.
9. Kelemen, D. & Rosset, E. The Human Function Compunction: Teleological Explanation in Adults. Cognition 111(1): 138–143, 2009.
10. Kahneman, D. Thinking, Fast and Slow. Farrar, Straus and Giroux, 2011.
11. Benson, H. et al. Study of the Therapeutic Effects of Intercessory Prayer (STEP) in Cardiac Bypass Patients. American Heart Journal 151(4): 934–942, 2006.
12. Nyhan, B. & Reifler, J. When Corrections Fail: The Persistence of Political Misperceptions. Political Behavior 32(2): 303–330, 2010.
13. Loftus, J. W. The Outsider Test for Faith: How to Know Which Religion Is True. Prometheus Books, 2013.
14. Boyer, P. Religion Explained: The Evolutionary Origins of Religious Thought. Basic Books, 2001.
15. Shermer, M. The Believing Brain: From Ghosts and Gods to Politics and Conspiracies — How We Construct Beliefs and Reinforce Them as Truths. Times Books, 2011.
16. Shermer, M. Patternicity: Finding Meaningful Patterns in Meaningless Noise. Scientific American 299(6): 48, 2008.
17. Gilovich, T., Griffin, D. & Kahneman, D. (eds.) Heuristics and Biases: The Psychology of Intuitive Judgment. Cambridge University Press, 2002.
18. McCauley, R. N. Why Religion Is Natural and Science Is Not. Oxford University Press, 2011.
19. Tversky, A. & Kahneman, D. Anchoring and Adjustment in Judgment Under Uncertainty. In Kahneman, D., Slovic, P. & Tversky, A. (eds.), Judgment Under Uncertainty: Heuristics and Biases, Cambridge University Press: 14–21, 1982.
20. Flew, A. The Genetic Fallacy. In Flew, A. (ed.), Logic and Language (Second Series), Blackwell: 183–207, 1953.
21. Plantinga, A. Warranted Christian Belief. Oxford University Press, 2000.
22. Barrett, J. L. (ed.) The Oxford Handbook of the Cognitive Science of Religion. Oxford University Press, 2022.