People who believe in conspiracy theories may be more likely to exhibit specific cognitive biases also found in individuals with subclinical delusional thinking, according to a pair of new studies published in Applied Cognitive Psychology. The research found that cognitive tendencies such as jumping to conclusions, emotional reasoning, and anomalous perception were significantly associated with both general and specific conspiracy beliefs.
Conspiracy theories are beliefs that certain powerful groups are secretly conspiring to achieve harmful goals. Although such theories are not supported by strong evidence, they persist across cultures and political contexts. Events like the January 6 U.S. Capitol riot and the public embrace of conspiracy narratives by influential figures have made it increasingly important to understand the psychological foundations of conspiratorial thinking.
The researchers behind the new work wanted to explore whether some of the same cognitive patterns found in delusional ideation—typically studied in the context of psychosis—might also be present in individuals who are prone to conspiracy theories. Although conspiracy beliefs are not considered clinical delusions, both involve believing things that lack evidentiary support and often resist counterevidence. By focusing on cognitive biases commonly associated with psychosis, the researchers hoped to clarify whether these patterns help explain why some people are more likely to believe in conspiracy theories.
“I team teach a course at Southern New Hampshire University about the psychology of conspiracy theories,” explained Professor Peter Frost, the corresponding author of the study. “My colleague and I came across research showing that teaching people to recognize logical fallacies and cognitive biases can decrease people’s belief in conspiracy theories. That led some student researchers and I to question whether some people who are prone to cognitive biases are more likely to believe in conspiracy theories. The other authors on the paper are undergraduates from SNHU.”
The researchers conducted two studies using online surveys distributed through social media platforms such as Facebook, Reddit, and Instagram. Both studies used the Cognitive Bias Questionnaire for Psychosis (CBQp), a 30-item tool designed to measure cognitive biases related to delusional thinking. Participants were not informed of the questionnaire’s link to psychosis to avoid biasing their responses.
In the first study, the researchers collected data from 200 participants, the majority of whom were white and female, with an average age of 37. Participants also completed the Generic Conspiracy Beliefs Scale (GCBS), which measures a person’s general tendency to endorse conspiracy theories across domains like government, health, and extraterrestrials. (For example, “A small, secret group of people is responsible for making all major world decisions, such as going to war.”)
The second study followed a similar procedure with a new sample of 182 participants, averaging 41 years old. This time, the researchers used a custom questionnaire that asked participants to rate their agreement with 14 specific contemporary conspiracy beliefs. These included theories related to global elites, immigration policy, and public health. (For example, “Moon landings have never happened, and the proofs have been fabricated by NASA and the US government,” “The current U.S. border policy and/or certain groups of people are secretly doing things to change the racial mix of the country,” and “The earth is actually flat and/or hollow.”)
In both studies, participants answered questions from the CBQp assessing seven types of cognitive biases: anomalous perception (perceiving stimuli that are not present), catastrophizing (expecting the worst), dichotomous thinking (seeing things in black-and-white terms), emotional reasoning (relying on feelings over logic), intentionalizing (assuming others’ actions are intentional and malicious), jumping to conclusions (making snap judgments without sufficient evidence), and interpreting ambiguous events as threatening.
In the first study, the researchers found that five of the seven cognitive biases were positively associated with generalized conspiracy beliefs. The strongest relationship was with jumping to conclusions, which had a correlation coefficient of 0.67. Anomalous perception also showed a strong relationship (r = 0.54), followed by moderate correlations with intentionalizing, interpreting ambiguous events as threatening, and emotional reasoning. Catastrophizing and dichotomous thinking did not show statistically significant correlations in this sample.
Together, the cognitive biases explained more than half (56%) of the variation in participants’ general tendency to believe in conspiracies. This finding supports the idea that people who are more susceptible to certain thinking errors are also more likely to endorse conspiratorial ideas, even if they are not experiencing clinical symptoms of delusion.
In the second study, which focused on specific conspiracy beliefs, the researchers found a similar pattern. Jumping to conclusions was again the strongest predictor (r = 0.57), followed by anomalous perception (r = 0.45) and intentionalizing (r = 0.32). Emotional reasoning and threat interpretation also showed modest correlations, while catastrophizing and dichotomous thinking did not reach statistical significance.
Although the second study explained a smaller portion of the variance in conspiracy belief (about 37%), the pattern of findings closely mirrored those from the first study. This suggests that the same cognitive biases may underlie both general beliefs in conspiracies and more context-specific conspiracy theories tied to current events.
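For readers who want a concrete sense of the statistics being reported, the following is a minimal sketch (not the authors' actual analysis code) of how the two kinds of numbers fit together: pairwise Pearson correlations between each bias subscale and a conspiracy-belief score, and the proportion of variance explained when all subscales are entered into a single regression. The DataFrame and column names here are hypothetical placeholders, since the studies' data and variable names are not public.

```python
import pandas as pd
from scipy import stats
from sklearn.linear_model import LinearRegression

# Hypothetical subscale names standing in for the seven CBQp bias scores.
BIAS_COLS = [
    "anomalous_perception", "catastrophizing", "dichotomous_thinking",
    "emotional_reasoning", "intentionalizing", "jumping_to_conclusions",
    "threat_interpretation",
]

def summarize(df: pd.DataFrame, outcome: str = "conspiracy_belief") -> None:
    """Print r (and p) per bias subscale, then R^2 for all biases combined."""
    # Pairwise correlations, e.g. jumping to conclusions vs. belief score.
    for col in BIAS_COLS:
        r, p = stats.pearsonr(df[col], df[outcome])
        print(f"{col}: r = {r:.2f}, p = {p:.3f}")

    # Variance explained when all seven biases predict belief together,
    # analogous to the ~56% (study 1) and ~37% (study 2) figures reported.
    model = LinearRegression().fit(df[BIAS_COLS], df[outcome])
    r2 = model.score(df[BIAS_COLS], df[outcome])
    print(f"All biases combined: R^2 = {r2:.2f} ({r2:.0%} of variance)")
```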
“I was not sure how this study would turn out,” Frost told PsyPost. “We kept seeing research showing that belief in conspiracy theories, like QAnon, tends to occur for emotional reasons. It offers justification and scapegoating that, at least in the short run, addresses emotional issues. Our research shows that some cognitive biases, like jumping to conclusions and intentionality, are strongly correlated with conspiratorial thinking. Even if conspiracy theories serve some emotional need, it is easier to justify them if you possess certain cognitive biases.”
The results contribute to a growing body of research showing that conspiracy thinking is not simply a matter of ignorance or personality but may stem from underlying cognitive patterns. While belief in conspiracies is not itself a sign of mental illness, the study found that biases commonly observed in people with subclinical delusional ideation—such as interpreting random or ambiguous information as meaningful or hostile—are also present among conspiracy believers.
“There is a part of our population that, although not clinically delusional, engage in thinking that is associated with delusional thinking,” Frost explained. “They tend to jump to conclusions without carefully considering evidence. They perceive things that others don’t, like ghosts or spirits. They suspect intention where there is no evidence for it. They might substitute emotional reactions for reason when making important decisions. These are the risk factors that make some people more susceptible to believing in conspiracy theories.”
“Before social media and the internet, we had to mix with people that might not share our same views. Today, if you believe that some people are actually lizards—the reptilian conspiracy theory—you can surround yourself with like-minded people in a chat. Even though only 4% of people tend to believe in this theory, encountering others who have a similar belief can embolden a cognitive bias that might not have manifested had the person been in more mixed company.”
While the studies offer valuable insights, they have some limitations. Both relied on self-reported data and non-random online samples, which may not be fully representative of the general population. The participants were disproportionately white and female, limiting the generalizability of the findings across different demographic groups.
Another limitation is that the study design was correlational. It cannot determine whether cognitive biases cause conspiracy beliefs or if belief in conspiracies reinforces these biases over time. Longitudinal or experimental research would be needed to explore questions of causality.
“We don’t know if cognitive biases make certain people more susceptible to conspiracy theory belief or if going down the rabbit hole of conspiracy theories results in greater susceptibility towards cognitive biases,” Frost noted. “There may also be a bidirectional influence.”
Despite the caveats, the research offers new evidence that conspiratorial thinking is closely tied to specific patterns of cognitive bias. While not diagnostic of mental illness, these biases reflect systematic ways of processing information that can leave people more vulnerable to unverified and often harmful beliefs.
“The ultimate goal for me is to understand the causes behind why conspiracy theory belief spreads,” Frost said. “Once we understand the causes better, it would be easier to help prevent it or remediate it. Too many of my students tell me how belief in conspiracy theories has led to rifts in their families. This is a trend, now in the mainstream, that is causing significant stress for far too many families and friends.”
“I also believe that our research suggests that too many people are susceptible to cognitive biases and logical fallacies. Critical thinking or at least recognition of illogical arguments could help prevent people from believing in disinformation. A democratic society relies on basic logic and critical thinking. Our educational system needs to teach this at an early age, along with media literacy.”
The study, “Cognitive Biases Associated With Specific and Generalized Beliefs in Conspiracy Theory,” was authored by Peter J. Frost, Alyssa Simard, Lauren Iraci, Serena Stack, Carolyn Gould-Faulkner, Abby Alexakos, Manny Fernandez, and Shubham Oza.