

Title: Rise of QAnon: A mental model of good and evil stews in an echo chamber
The QAnon conspiracy posits that Satan-worshiping Democrats operate a covert child sex-trafficking operation, which Donald Trump is destined to expose and annihilate. Emblematic of the ease with which political misconceptions can spread through social media, QAnon originated in late 2017 and rapidly grew to shape the political beliefs of millions. To illuminate the process by which a conspiracy theory spreads, we report two computational studies examining the social network structure and semantic content of tweets produced by users central to the early QAnon network on Twitter. Using data mined in the summer of 2018, we examined over 800,000 tweets about QAnon made by about 100,000 users. The majority of users disseminated rather than produced information, serving to create an online echo chamber. Users appeared to hold a simplistic mental model in which political events are viewed as a struggle between antithetical forces—both observed and unobserved—of Good and Evil.
Award ID(s):
1827374
NSF-PAR ID:
10231805
Author(s) / Creator(s):
; ;
Editor(s):
Fitch, T.; Lamm, C.; Leder, H.; Teßmar-Raible, K.
Date Published:
Journal Name:
Proceedings of the 43rd Annual Meeting of the Cognitive Science Society
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. While COVID-19 text misinformation has already been investigated by various scholars, fewer research efforts have been devoted to characterizing and understanding COVID-19 misinformation that is carried out through visuals like photographs and memes. In this paper, we present a mixed-method analysis of image-based COVID-19 misinformation in 2020 on Twitter. We deploy a computational pipeline to identify COVID-19 related tweets, download the images contained in them, and group together visually similar images. We then develop a codebook to characterize COVID-19 misinformation and manually label images as misinformation or not. Finally, we perform a quantitative analysis of tweets containing COVID-19 misinformation images. We identify five types of COVID-19 misinformation, from a wrong understanding of the threat severity of COVID-19 to the promotion of fake cures and conspiracy theories. We also find that tweets containing COVID-19 misinformation images do not receive more interactions than baseline tweets with random images posted by the same set of users. As for temporal properties, COVID-19 misinformation images are shared for longer periods of time than non-misinformation ones, and have longer burst times. When looking at the users sharing COVID-19 misinformation images on Twitter from the perspective of their political leanings, we find that pro-Democrat and pro-Republican users share a similar amount of tweets containing misleading or false COVID-19 images. However, the types of images they share differ: pro-Democrat users focus on misleading claims about the Trump administration's response to the pandemic and often share manipulated images intended as satire, while pro-Republican users often promote hydroxychloroquine, an ineffective medicine against COVID-19, as well as conspiracy theories about the origin of the virus. Our analysis sets a basis for better understanding COVID-19 misinformation images on social media and the nuances of effectively moderating them.
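The abstract above does not specify how its pipeline groups visually similar images; one common approach is perceptual hashing. The following is a minimal sketch, not the authors' actual method: it assumes images are already reduced to 8x8 grayscale grids, computes an average hash (one bit per pixel, set if brighter than the mean), and greedily groups images whose hashes fall within a Hamming-distance threshold. The `threshold=10` value and the greedy first-fit grouping are illustrative choices, not drawn from the paper.

```python
def average_hash(pixels):
    """Compute a simple average hash from an 8x8 grayscale grid.
    Each bit is 1 if the pixel is brighter than the grid's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def group_similar(images, threshold=10):
    """Greedily assign each (name, pixels) image to the first group
    whose representative hash is within `threshold` bits."""
    groups = []  # list of (representative_hash, [member names])
    for name, pixels in images:
        h = average_hash(pixels)
        for rep, members in groups:
            if hamming(rep, h) <= threshold:
                members.append(name)
                break
        else:
            groups.append((h, [name]))
    return [members for _, members in groups]
```

In practice a library such as ImageHash (with Pillow) would handle the resizing and hashing; the sketch only illustrates why near-duplicate images, which differ by a few pixels after crops or recompression, end up in the same group while distinct images do not.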
  2. Budak, Ceren ; Cha, Meeyoung ; Quercia, Daniele ; Xie, Lexing (Ed.)
    Parler is an ``alternative'' social network that promotes itself as a service allowing users to ``speak freely and express yourself openly, without fear of being deplatformed for your views.'' Because of this promise, the platform became popular among users who were suspended from mainstream social networks for violating their terms of service, as well as among those fearing censorship. In particular, the service was endorsed by several conservative public figures, who encouraged people to migrate from traditional social networks. After the storming of the US Capitol on January 6, 2021, Parler was progressively deplatformed: its app was removed from the Apple and Google Play stores, and its website was taken down by its hosting provider. This paper presents a dataset of 183M Parler posts made by 4M users between August 2018 and January 2021, as well as metadata from 13.25M user profiles. We also present a basic characterization of the dataset, which shows that the platform witnessed large influxes of new users after being endorsed by popular figures, as well as in reaction to the 2020 US Presidential Election. We also show that discussion on the platform is dominated by conservative topics, President Trump, and conspiracy theories such as QAnon.
  3. Conspiracy theories and misinformation (CTM) became a salient feature of the Trump era. However, traditional explanations of political attitudes and behaviors inadequately account for beliefs in CTM or the deleterious behaviors they are associated with. Here, we integrate disparate literatures to explain beliefs in CTM regarding COVID-19, QAnon, and voter fraud. We aim to provide a more holistic accounting, and to determine which political, psychological, and social factors are most associated with such beliefs. Using a unique national survey, we find that anti-social personality traits, anti-establishment orientations, and support for Donald Trump are more strongly related to beliefs in CTM than traditional left-right orientations or other frequently posited factors, such as education, science literacy, and social media use. Our findings encourage researchers to move beyond the traditional correlates of political behavior when examining beliefs that express anti-social tendencies or a deep skepticism of social and political institutions.

     
  4. Online discussion platforms provide a forum to strengthen and propagate belief in misinformed conspiracy theories. Yet, they also offer avenues for conspiracy theorists to express their doubts and experiences of cognitive dissonance. Such expressions of dissonance may shed light on who abandons misguided beliefs and under what circumstances. This paper characterizes self-disclosures of dissonance about QAnon, a conspiracy theory initiated by a mysterious leader "Q" and popularized by their followers, "anons", in conspiratorial subreddits. To understand what dissonance and disbelief mean within conspiracy communities, we first characterize their social imaginaries, i.e., a broad understanding of how people collectively imagine their social existence. Focusing on 2K posts from two image boards, 4chan and 8chan, and 1.2M comments and posts from 12 subreddits dedicated to QAnon, we adopt a mixed-methods approach to uncover the symbolic language representing the movement, expectations, practices, heroes, and foes of the QAnon community. We use these social imaginaries to create a computational framework for distinguishing belief and dissonance from general discussion about QAnon in the 1.2M comments. We investigate the dissonant comments to characterize the dissonance expressed along QAnon social imaginaries. Further, analyzing user engagement with QAnon conspiracy subreddits, we find that self-disclosures of dissonance correlate with a significant decrease in user contributions and, ultimately, with departure from the community. Our work offers a systematic framework for uncovering the dimensions and coded language related to QAnon social imaginaries, and can serve as a toolbox for studying other conspiracy theories across different platforms. We also contribute a computational framework for identifying dissonance self-disclosures and measuring the changes in user engagement surrounding dissonance. Our work provides insights into designing dissonance-based interventions that can potentially dissuade conspiracists from engaging in online conspiracy discussion communities.
  5. The prevalence and spread of online misinformation during the 2020 US presidential election served to perpetuate a false belief in widespread election fraud. Though much research has focused on how social media platforms connected people to election-related rumors and conspiracy theories, less is known about the search engine pathways that linked users to news content with the potential to undermine trust in elections. In this paper, we present novel data related to the content of political headlines during the 2020 US election period. We scraped over 800,000 headlines from Google's search engine results pages (SERP) in response to 20 election-related keywords—10 general (e.g., "Ballots") and 10 conspiratorial (e.g., "Voter fraud")—when searched from 20 cities across 16 states. We present results from qualitative coding of 5,600 headlines focused on the prevalence of delegitimizing information. Our results reveal that videos (as compared to stories, search results, and advertisements) are the most problematic in terms of exposing users to delegitimizing headlines. We also illustrate how headline content varies when searching from a swing state, adopting a conspiratorial search keyword, or reading from media domains with higher political bias. We conclude with policy recommendations on data transparency that allow researchers to continue to monitor search engines during elections. 