This content will become publicly available on May 31, 2025
Online discussions frequently involve conspiracy theories, which can contribute to the proliferation of belief in them. However, not all such discussions promote conspiracy theories; some aim to debunk them. Existing research has relied on simple proxies or a constrained set of signals to identify conspiracy theories, which limits our understanding of conspiratorial discussions across topics and online communities. This work establishes a general scheme for classifying conspiracy-related discussions based on the author's perspective on the conspiracy belief, which can be expressed explicitly through narrative elements, such as the agent, action, or objective, or implicitly through references to known theories, such as chemtrails or the New World Order. We leverage human-labeled ground truth to train a BERT-based model for classifying online conspiracy theories (CTs), which we then compare to the Generative Pre-trained Transformer (GPT) for detecting online conspiratorial content. Despite GPT's known strengths in expressiveness and contextual understanding, our study reveals significant flaws in its logical reasoning, while also demonstrating comparable strengths in our classifiers. We present the first large-scale classification study using posts from the most active conspiracy-related Reddit forums and find that only one-third of the posts are classified as positive, i.e., as promoting a conspiracy belief. This research sheds light on the potential applications of large language models in tasks demanding nuanced contextual comprehension.
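The classification scheme above can be illustrated with a toy rule-based sketch: a post counts as conspiratorial either implicitly (it names a known theory) or explicitly (it combines the narrative elements of agent, action, and objective). The keyword lists below are hypothetical placeholders, not the paper's actual features, which come from a trained BERT model.

```python
# Toy sketch of the labeling scheme: implicit references to known theories,
# or an explicit narrative of agent + action + objective.
# All keyword sets are hypothetical illustrations.
KNOWN_THEORIES = {"chemtrails", "new world order", "flat earth"}
AGENTS = {"government", "elites", "big pharma"}
ACTIONS = {"covering up", "poisoning", "controlling"}
OBJECTIVES = {"population control", "profit", "power"}

def is_conspiratorial(post: str) -> bool:
    text = post.lower()
    # Implicit: a direct reference to a known conspiracy theory
    if any(t in text for t in KNOWN_THEORIES):
        return True
    # Explicit: all three narrative elements are present
    has_agent = any(a in text for a in AGENTS)
    has_action = any(a in text for a in ACTIONS)
    has_objective = any(o in text for o in OBJECTIVES)
    return has_agent and has_action and has_objective
```

A supervised classifier generalizes this idea by learning such signals from labeled examples rather than fixed lists.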
- Award ID(s): 2318461
- PAR ID: 10533047
- Publisher / Repository: AAAI
- Journal Name: Proceedings of the International AAAI Conference on Web and Social Media
- Volume: 18
- ISSN: 2162-3449
- Page Range / eLocation ID: 340 to 353
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Online communities play a crucial role in disseminating conspiracy theories. New theories often emerge in the aftermath of catastrophic events. Despite evidence of their widespread appeal, surprisingly little is known about who participates in these event-specific conspiratorial discussions or how these discussions evolve over time. We study r/conspiracy, an active Reddit community of more than 200,000 users dedicated to conspiratorial discussions. Focusing on four tragic events and 10 years of discussions, we find three distinct user cohorts: joiners, who never participated in Reddit but joined r/conspiracy only after the event; converts, who were active Reddit users but joined r/conspiracy only after the event; and veterans, who are longstanding r/conspiracy members. While joiners and converts have a shorter lifespan in the community than veterans, joiners are more active during their shorter tenure, becoming increasingly engaged over time. Finally, to investigate how these events affect users' conspiratorial discussions, we adopt a causal inference approach to analyze user comments around the time of the events. We find that discussions after the event exhibit signs of emotional shock, increased language complexity, and simultaneous expressions of certainty and doubt. Our work provides insight into how online communities may detect new conspiracy theories that emerge following dramatic events, and stop them before they spread.
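The three cohorts can be sketched as a simple function of a user's first-activity timestamps relative to the event. This is a hypothetical simplification for illustration; the paper's cohort construction may use additional criteria.

```python
from datetime import date

def classify_cohort(first_reddit_post: date,
                    first_conspiracy_post: date,
                    event: date) -> str:
    """Assign a user to joiner / convert / veteran (illustrative rule).

    Assumes cohorts are determined solely by the user's first Reddit
    activity and first r/conspiracy activity relative to the event date.
    """
    if first_conspiracy_post < event:
        return "veteran"   # already in r/conspiracy before the event
    if first_reddit_post < event:
        return "convert"   # on Reddit before, joined r/conspiracy after
    return "joiner"        # new to both Reddit and r/conspiracy post-event
```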
-
Online discussion platforms provide a forum to strengthen and propagate belief in misinformed conspiracy theories. Yet they also offer avenues for conspiracy theorists to express their doubts and experiences of cognitive dissonance. Such expressions of dissonance may shed light on who abandons misguided beliefs and under what circumstances. This paper characterizes self-disclosures of dissonance about QAnon, a conspiracy theory initiated by a mysterious leader "Q" and popularized by their followers, "anons", in conspiratorial subreddits. To understand what dissonance and disbelief mean within conspiracy communities, we first characterize their social imaginaries: a broad understanding of how people collectively imagine their social existence. Focusing on 2K posts from two image boards, 4chan and 8chan, and 1.2M comments and posts from 12 subreddits dedicated to QAnon, we adopt a mixed-methods approach to uncover the symbolic language representing the movement, expectations, practices, heroes, and foes of the QAnon community. We use these social imaginaries to create a computational framework for distinguishing belief and dissonance from general discussion about QAnon, surfacing in the 1.2M comments. We investigate the dissonant comments to characterize the dissonance expressed along QAnon social imaginaries. Further, analyzing user engagement with QAnon conspiracy subreddits, we find that self-disclosures of dissonance correlate with a significant decrease in user contributions and, ultimately, with their departure from the community. Our work offers a systematic framework for uncovering the dimensions and coded language related to QAnon social imaginaries and can serve as a toolbox for studying other conspiracy theories across different platforms. We also contribute a computational framework for identifying dissonance self-disclosures and measuring the changes in user engagement surrounding dissonance. Our work provides insights into designing dissonance-based interventions that can potentially dissuade conspiracists from engaging in online conspiracy discussion communities.
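A framework that separates belief, dissonance, and general discussion can be sketched as pattern matching over community-specific phrases. The lexicons below are hypothetical; the paper derives its dimensions from the QAnon social imaginaries rather than a fixed phrase list.

```python
import re

# Hypothetical phrase lexicons for illustration only.
DISSONANCE_PATTERNS = [
    r"\bi('m| am) (starting to )?doubt",
    r"\bnothing (ever )?happened\b",
    r"\bwas (all )?a lie\b",
]
BELIEF_PATTERNS = [
    r"\btrust the plan\b",
    r"\bwwg1wga\b",
    r"\bthe storm is coming\b",
]

def label_comment(comment: str) -> str:
    """Label a comment as dissonance, belief, or general discussion."""
    text = comment.lower()
    if any(re.search(p, text) for p in DISSONANCE_PATTERNS):
        return "dissonance"
    if any(re.search(p, text) for p in BELIEF_PATTERNS):
        return "belief"
    return "general"
```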
-
Widespread conspiracy theories, like those motivating anti-vaccination attitudes or climate change denial, propel collective action and bear society-wide consequences. Yet empirical research has largely studied conspiracy theory adoption as an individual pursuit, rather than as a socially mediated process. What makes users join communities endorsing and spreading conspiracy theories? We leverage longitudinal data from 56 conspiracy communities on Reddit to compare individual and social factors determining which users join the communities. Using a quasi-experimental approach, we first identify 30K future conspiracists (FC) and 30K matched non-conspiracists (NC). We then provide empirical evidence of the importance of social factors across six dimensions relative to the individual factors by analyzing 6 million Reddit comments and posts. Specifically, among social factors, we find that dyadic interactions with members of the conspiracy communities and marginalization outside of the conspiracy communities are the most important social precursors to conspiracy joining, even outperforming individual factor baselines. Our results offer quantitative backing to understand social processes and echo chamber effects in conspiratorial engagement, with important implications for democratic institutions and online communities.
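The matched FC/NC design can be sketched as greedy one-to-one matching on pre-treatment covariates. This is an illustrative stand-in for the paper's quasi-experimental setup: the covariates (account age, comment count) and equal weighting are assumptions.

```python
def match_controls(fc_users, nc_pool):
    """Greedily pair each future conspiracist (FC) with the closest
    non-conspiracist (NC) on two covariates.

    Each user is a tuple (name, account_age_days, n_comments).
    Equal weighting of the covariates is an illustrative assumption.
    """
    pool = list(nc_pool)
    pairs = []
    for name, age, n_comments in fc_users:
        # L1 distance over the two covariates
        best = min(pool, key=lambda u: abs(u[1] - age) + abs(u[2] - n_comments))
        pool.remove(best)  # match without replacement
        pairs.append((name, best[0]))
    return pairs
```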
-
The disruptive offline mobilization of participants in online conspiracy theory (CT) discussions has highlighted the importance of understanding how online users may form radicalized conspiracy beliefs. While prior work has researched the factors leading up to joining online CT discussions and provided theories of how conspiracy beliefs form, we have little understanding of how conspiracy radicalization evolves after users join CT discussion communities. In this paper, we provide an empirical model of the radicalization phases of online CT discussion participants. To unpack how conspiracy engagement is related to radicalization, we first characterize users' journeys through CT discussions via conspiracy engagement pathways. Specifically, by studying 36K Reddit users through their 169M contributions, we uncover four distinct pathways of conspiracy engagement: steady high, increasing, decreasing, and steady low. We further model three successive stages of radicalization guided by prior theoretical works. Specific sub-populations of users, namely those on the steady high and increasing conspiracy engagement pathways, progress successively through the radicalization stages. In contrast, users on the decreasing engagement pathway show distinct behavior: they limit their CT discussions to specialized topics, participate in diverse discussion groups, and show reduced conformity with conspiracy subreddits. By examining users who disengage from online CT discussions, this paper provides promising insights into the conspiracy recovery process.
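The four engagement pathways can be sketched by fitting a trend to a user's per-period engagement counts: a clear upward or downward slope gives "increasing" or "decreasing", and otherwise the average level separates "steady high" from "steady low". The least-squares slope test and thresholds are illustrative assumptions, not the paper's actual method.

```python
def engagement_pathway(series, high=10.0, trend=0.25):
    """Label a per-period engagement series with one of the four pathways.

    `high` (mean engagement cutoff) and `trend` (slope cutoff) are
    hypothetical thresholds for illustration.
    """
    n = len(series)
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    # Ordinary least-squares slope of engagement over time
    slope = sum((i - xbar) * (y - ybar) for i, y in enumerate(series)) / \
            sum((i - xbar) ** 2 for i in range(n))
    if slope > trend:
        return "increasing"
    if slope < -trend:
        return "decreasing"
    return "steady high" if ybar >= high else "steady low"
```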
-
The prevalence and spread of online misinformation during the 2020 US presidential election served to perpetuate a false belief in widespread election fraud. Though much research has focused on how social media platforms connected people to election-related rumors and conspiracy theories, less is known about the search engine pathways that linked users to news content with the potential to undermine trust in elections. In this paper, we present novel data related to the content of political headlines during the 2020 US election period. We scraped over 800,000 headlines from Google's search engine results pages (SERP) in response to 20 election-related keywords—10 general (e.g., "Ballots") and 10 conspiratorial (e.g., "Voter fraud")—when searched from 20 cities across 16 states. We present results from qualitative coding of 5,600 headlines focused on the prevalence of delegitimizing information. Our results reveal that videos (as compared to stories, search results, and advertisements) are the most problematic in terms of exposing users to delegitimizing headlines. We also illustrate how headline content varies when searching from a swing state, adopting a conspiratorial search keyword, or reading from media domains with higher political bias. We conclude with policy recommendations on data transparency that allow researchers to continue to monitor search engines during elections.
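Extracting headlines from a fetched SERP can be sketched with the standard-library HTML parser. The assumption that result headlines sit in `<h3>` elements reflects common Google SERP markup but is fragile, since the real page structure changes over time.

```python
from html.parser import HTMLParser

class HeadlineParser(HTMLParser):
    """Collect the text content of <h3> tags from an HTML page.

    Assumes (hypothetically) that SERP result headlines are <h3> elements.
    """
    def __init__(self):
        super().__init__()
        self.in_h3 = False
        self.headlines = []

    def handle_starttag(self, tag, attrs):
        if tag == "h3":
            self.in_h3 = True
            self.headlines.append("")

    def handle_endtag(self, tag):
        if tag == "h3":
            self.in_h3 = False

    def handle_data(self, data):
        if self.in_h3:
            self.headlines[-1] += data

def extract_headlines(html: str):
    parser = HeadlineParser()
    parser.feed(html)
    return [h.strip() for h in parser.headlines]
```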