Abstract: Despite hundreds of studies examining belief in conspiracy theories, it is still unclear who—demographically—is most likely to believe such theories. To remedy this knowledge gap, we examine survey data containing various operationalizations of conspiracism across diverse sociopolitical contexts. Study 1 employs a 2021 U.S. survey (n = 2021) to examine associations between sociodemographic characteristics and beliefs in 39 conspiracy theories. Study 2 similarly employs a survey of 20 countries (n = 26,416) and 11 conspiracy theory beliefs. Study 3 reports results from a 2020 U.S. survey (n = 2015) measuring perceptions about which groups are engaging in conspiracies. Study 4 interrogates data from nine U.S. surveys (2012–2022; n = 14,334) to examine the relationships between sociodemographic characteristics and generalized conspiracy thinking. Study 5 synthesizes Studies 1–4 to provide an intersectional analysis of conspiracy theory belief. Across studies, we observe remarkably consistent patterns: education, income, age (older), and White identification are negatively related to conspiracism, while Black identification is positively related. We conclude by discussing why conspiracy theories may appeal most to historically marginalized groups and how our findings can inform efforts to mitigate the negative effects of conspiracy theories.
- Award ID(s): 1908407
- PAR ID: 10181336
- Date Published:
- Journal Name: International Conference on Social Media and Society
- Page Range / eLocation ID: 184 to 192
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like This
-
Richey, Sean Eric (Ed.) The public is convinced that beliefs in conspiracy theories are increasing, and many scholars, journalists, and policymakers agree. Given the associations between conspiracy theories and many non-normative tendencies, lawmakers have called for policies to address these increases. However, little evidence has been provided to demonstrate that beliefs in conspiracy theories have, in fact, increased over time. We address this evidentiary gap. Study 1 investigates change in the proportion of Americans believing 46 conspiracy theories; our observations in some instances span half a century. Study 2 examines change in the proportion of individuals across six European countries believing six conspiracy theories. Study 3 traces beliefs about which groups are conspiring against “us,” while Study 4 tracks generalized conspiracy thinking in the U.S. from 2012 to 2021. In no instance do we observe systematic evidence for an increase in conspiracism, however operationalized. We discuss the theoretical and policy implications of our findings.
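The kind of over-time comparison this study describes can be illustrated with a simple two-proportion test between an early and a recent survey wave. The numbers below are invented and the specific test is an assumption for illustration, not the paper's actual method.

```python
# Hypothetical sketch: does the share of respondents endorsing one conspiracy
# theory differ between an early and a recent survey wave? Counts are made up.
from statsmodels.stats.proportion import proportions_ztest

believers = [310, 298]      # endorsements in wave 1 and wave 2 (placeholder)
respondents = [1000, 1000]  # sample size of each wave (placeholder)

z_stat, p_value = proportions_ztest(believers, respondents)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")  # large p-value: no evidence of change
```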
-
Online discussions frequently involve conspiracy theories, which can contribute to the proliferation of belief in them. However, not all discussions surrounding conspiracy theories promote them, as some are intended to debunk them. Existing research has relied on simple proxies or focused on a constrained set of signals to identify conspiracy theories, which limits our understanding of conspiratorial discussions across different topics and online communities. This work establishes a general scheme for classifying discussions related to conspiracy theories based on authors' perspectives on the conspiracy belief, which can be expressed explicitly through narrative elements, such as the agent, action, or objective, or implicitly through references to known theories, such as chemtrails or the New World Order. We leverage human-labeled ground truth to train a BERT-based model for classifying online conspiracy theory discussions, which we then compare to the Generative Pre-trained Transformer (GPT) for detecting online conspiratorial content. Despite GPT's known strengths in expressiveness and contextual understanding, our study reveals significant flaws in its logical reasoning, while our classifier demonstrates comparable strengths. We present the first large-scale classification study using posts from the most active conspiracy-related Reddit forums and find that only one-third of the posts are classified as positive. This research sheds light on the potential applications of large language models in tasks demanding nuanced contextual comprehension.
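A minimal sketch of the kind of BERT-based classification step described above, using the Hugging Face libraries. The toy examples, the two-class label scheme, and the hyperparameters are assumptions for illustration, not the authors' pipeline.

```python
# Rough sketch: fine-tune a BERT classifier on human-labeled posts.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Tiny placeholder for the human-labeled ground truth (0 = debunking, 1 = promoting).
labeled = Dataset.from_dict({
    "text": ["Chemtrails are real, wake up!", "No, contrails are just condensation."],
    "label": [1, 0],
})

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Truncate long Reddit posts to BERT's 512-token input limit.
    return tokenizer(batch["text"], truncation=True, padding="max_length")

labeled = labeled.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                           num_labels=2)
args = TrainingArguments(output_dir="ct_classifier", num_train_epochs=1,
                         per_device_train_batch_size=2)
Trainer(model=model, args=args, train_dataset=labeled).train()
```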
-
Widespread conspiracy theories, like those motivating anti-vaccination attitudes or climate change denial, propel collective action and bear society-wide consequences. Yet, empirical research has largely studied conspiracy theory adoption as an individual pursuit, rather than as a socially mediated process. What makes users join communities endorsing and spreading conspiracy theories? We leverage longitudinal data from 56 conspiracy communities on Reddit to compare individual and social factors determining which users join the communities. Using a quasi-experimental approach, we first identify 30K future conspiracists (FC) and 30K matched non-conspiracists (NC). We then provide empirical evidence of the importance of social factors across six dimensions relative to the individual factors by analyzing 6 million Reddit comments and posts. Specifically, among social factors, we find that dyadic interactions with members of the conspiracy communities and marginalization outside of the conspiracy communities are the most important social precursors to conspiracy joining, even outperforming individual factor baselines. Our results offer quantitative backing to understand social processes and echo chamber effects in conspiratorial engagement, with important implications for democratic institutions and online communities.
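The matching step in the quasi-experimental design above can be sketched roughly as follows. The features, sample sizes, and the nearest-neighbor matching choice are illustrative assumptions rather than the paper's exact procedure.

```python
# Illustrative sketch: pair each future conspiracist (FC) with the most similar
# non-conspiracist (NC) based on activity features computed before joining.
# Feature values are random placeholders; the study matched 30K users per group.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
fc = rng.random((3_000, 5))   # e.g. account age, comment rate, subreddit count, ...
nc = rng.random((12_000, 5))  # pool of candidate control users

scaler = StandardScaler().fit(np.vstack([fc, nc]))
fc_std, nc_std = scaler.transform(fc), scaler.transform(nc)

# One nearest NC control per FC user (with replacement, for simplicity).
_, idx = NearestNeighbors(n_neighbors=1).fit(nc_std).kneighbors(fc_std)
controls = idx[:, 0]  # indices of the matched non-conspiracists
```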
-
Social media provide a fertile ground where conspiracy theories and radical ideas can flourish, reach broad audiences, and sometimes lead to hate or violence beyond the online world itself. QAnon represents a notable example of a political conspiracy that started out on social media but turned mainstream, in part due to public endorsement by influential political figures. Nowadays, QAnon conspiracies often appear in the news, are part of political rhetoric, and are espoused by significant swaths of people in the United States. It is therefore crucial to understand how such a conspiracy took root online, and what led so many social media users to adopt its ideas. In this work, we propose a framework that exploits both social interaction and content signals to uncover evidence of user radicalization or support for QAnon. Leveraging a large dataset of 240M tweets collected in the run-up to the 2020 US Presidential election, we define and validate a multivariate metric of radicalization. We use that metric to separate users into distinct, naturally emerging classes of behaviors associated with radicalization processes, from self-declared QAnon supporters to hyperactive conspiracy promoters. We also analyze the impact of Twitter's moderation policies on the interactions among different classes: we discover aspects of moderation that succeed, yielding a substantial reduction in the endorsement received by hyperactive QAnon accounts. But we also uncover where moderation fails, showing how QAnon content amplifiers are not deterred or affected by the Twitter intervention. Our findings refine our understanding of online radicalization processes, reveal effective and ineffective aspects of moderation, and call for the need to further investigate the role social media play in the spread of conspiracies.
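As a rough illustration of the multivariate approach described above, one could standardize several per-user engagement signals and group users into behavior classes. The signals, the placeholder data, and the number of classes here are assumptions, not the authors' validated metric.

```python
# Illustrative sketch: combine per-user signals into a multivariate representation
# and cluster users into behavior classes. Signal values are random placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical signals: share of tweets with QAnon keywords, share of retweets
# of known QAnon accounts, daily posting rate.
signals = rng.random((10_000, 3))

X = StandardScaler().fit_transform(signals)
classes = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
# Each cluster can then be inspected (e.g. self-declared supporters vs. hyperactive
# promoters) and compared before and after moderation interventions.
```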