Conspiracy theories and misinformation (CTM) became a salient feature of the Trump era. However, traditional explanations of political attitudes and behaviors inadequately account for beliefs in CTM or the deleterious behaviors they are associated with. Here, we integrate disparate literatures to explain beliefs in CTM regarding COVID-19, QAnon, and voter fraud. We aim to provide a more holistic accounting, and to determine which political, psychological, and social factors are most associated with such beliefs. Using a unique national survey, we find that anti-social personality traits, anti-establishment orientations, and support for Donald Trump are more strongly related to beliefs in CTM than traditional left-right orientations or other frequently posited factors, such as education, science literacy, and social media use. Our findings encourage researchers to move beyond the traditional correlates of political behavior when examining beliefs that express anti-social tendencies or a deep skepticism of social and political institutions.
This content will become publicly available on October 26, 2023
- Author: Jonason, Peter Karl
- Journal Name: PLOS ONE
- Sponsoring Org: National Science Foundation
More Like this
How Anti-Social Personality Traits and Anti-Establishment Views Promote Beliefs in Election Fraud, QAnon, and COVID-19 Conspiracy Theories and Misinformation
Widespread uptake of vaccines is necessary to achieve herd immunity. However, uptake rates have varied across U.S. states during the first six months of the COVID-19 vaccination program. Misbeliefs may play an important role in vaccine hesitancy, and there is a need to understand relationships between misinformation, beliefs, behaviors, and health outcomes. Here we investigate the extent to which COVID-19 vaccination rates and vaccine hesitancy are associated with levels of online misinformation about vaccines. We also look for evidence of directionality from online misinformation to vaccine hesitancy. We find a negative relationship between misinformation and vaccination uptake rates. Online misinformation is also correlated with vaccine hesitancy rates taken from survey data. Associations between vaccine outcomes and misinformation remain significant when accounting for political as well as demographic and socioeconomic factors. While vaccine hesitancy is strongly associated with Republican vote share, we observe that the effect of online misinformation on hesitancy is strongest across Democratic rather than Republican counties. Granger causality analysis shows evidence for a directional relationship from online misinformation to vaccine hesitancy. Our results support a need for interventions that address misbeliefs, allowing individuals to make better-informed health decisions.
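The Granger-causality step mentioned in the abstract above can be illustrated with a minimal sketch: regress a series on its own lags (restricted model), then add lags of the putative cause (unrestricted model), and compare the fits with an F-statistic. The `misinfo` and `hesitancy` series below are synthetic stand-ins for illustration only, not the study's data:

```python
import numpy as np

def lagged(series, lags):
    """Matrix whose k-th column is `series` lagged by k steps (k = 1..lags)."""
    T = len(series)
    return np.column_stack([series[lags - k:T - k] for k in range(1, lags + 1)])

def granger_f(cause, effect, lags=2):
    """F-statistic: do past values of `cause` improve prediction of
    `effect` beyond effect's own past? (Granger's criterion)"""
    y = effect[lags:]
    n = len(y)
    ones = np.ones((n, 1))
    X_r = np.hstack([ones, lagged(effect, lags)])   # restricted: own lags only
    X_u = np.hstack([X_r, lagged(cause, lags)])     # unrestricted: + cause lags
    rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    df = n - X_u.shape[1]
    return ((rss_r - rss_u) / lags) / (rss_u / df)

# Synthetic example: misinformation drives hesitancy with a one-step lag.
rng = np.random.default_rng(0)
T = 500
misinfo = rng.normal(size=T)
hesitancy = np.empty(T)
hesitancy[0] = 0.0
for t in range(1, T):
    hesitancy[t] = 0.3 * hesitancy[t - 1] + 0.8 * misinfo[t - 1] + 0.2 * rng.normal()

# Expect a large F for misinfo -> hesitancy and a small F for the reverse.
```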
COVID-19 risk perception and vaccine acceptance in individuals with self-reported chronic respiratory or autoimmune conditions
COVID-19 disproportionately affects those with preexisting conditions, but little research has determined whether those with chronic diseases view the pandemic itself differently, and whether such views differ across chronic diseases. We theorized that while individuals with respiratory disease or autoimmune disorders would perceive greater threat from COVID-19 and be more supportive of non-pharmaceutical interventions (NPIs), those with autoimmune disorders would be less likely to support vaccination-based interventions.
We conducted a two-wave online survey in February and November 2021, asking respondents about their COVID-19 risk perceptions, their adoption and support of interventions, their willingness to be vaccinated against COVID-19, and their reasons for vaccination. Regression analysis assessed the relationship between reporting a chronic disease and COVID-19 behaviors and attitudes, relative to healthy respondents, adjusting for demographic and political factors.
In the initial survey, individuals reporting a chronic disease had both stronger feelings of risk from COVID-19 and stronger preferences for NPIs than healthy controls. The only NPI still practiced significantly more than among healthy controls in the resample was limiting trips outside of the home. Support for community-level NPIs was higher among individuals reporting a chronic disease than among healthy controls and remained high.
It is not enough to recognize the importance of health in determining attitudes: nuanced differences between conditions must also be recognized.
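The adjusted comparison described in the methods above can be sketched as a regression of an attitude score on a chronic-condition indicator plus controls. Everything below, including the variable names (`chronic`, `age`, `party`) and the effect sizes, is synthetic and hypothetical, not the survey's actual model:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000
chronic = rng.integers(0, 2, n)   # 1 = self-reported chronic condition
age = rng.normal(50, 15, n)       # demographic control
party = rng.integers(0, 2, n)     # crude political control

# Synthetic outcome: chronic respondents perceive more risk (gap = 0.6).
risk = 2.0 + 0.6 * chronic + 0.01 * age + 0.3 * party + rng.normal(0, 1, n)

# Ordinary least squares with an intercept; beta[1] estimates the
# chronic-condition gap net of the controls.
X = np.column_stack([np.ones(n), chronic, age, party])
beta, *_ = np.linalg.lstsq(X, risk, rcond=None)
```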
Who's in the Crowd Matters: Cognitive Factors and Beliefs Predict Misinformation Assessment Accuracy
Misinformation runs rampant on social media and has been tied to adverse health behaviors such as vaccine hesitancy. Crowdsourcing can be a means to detect and impede the spread of misinformation online. However, past studies have not deeply examined the individual characteristics, such as cognitive factors and biases, that predict crowdworker accuracy at identifying misinformation. In our study (n = 265), Amazon Mechanical Turk (MTurk) workers and university students assessed the truthfulness and sentiment of COVID-19 related tweets and answered several surveys on personal characteristics. Results support the viability of crowdsourcing for assessing misinformation and content stance (i.e., sentiment) related to ongoing and politically charged topics like the COVID-19 pandemic; however, alignment with experts depends on who is in the crowd. Specifically, we find that respondents with high Cognitive Reflection Test (CRT) scores, conscientiousness, and trust in medical scientists are more aligned with experts, while respondents with high Need for Cognitive Closure (NFCC) and those who lean politically conservative are less aligned with experts. We also see differences between recruitment platforms: our data show that university students are on average more aligned with experts than MTurk workers, most likely due to overall differences in participant characteristics.
While COVID-19 text misinformation has already been investigated by various scholars, fewer research efforts have been devoted to characterizing and understanding COVID-19 misinformation that is carried out through visuals like photographs and memes. In this paper, we present a mixed-method analysis of image-based COVID-19 misinformation on Twitter in 2020. We deploy a computational pipeline to identify COVID-19 related tweets, download the images contained in them, and group together visually similar images. We then develop a codebook to characterize COVID-19 misinformation and manually label images as misinformation or not. Finally, we perform a quantitative analysis of tweets containing COVID-19 misinformation images. We identify five types of COVID-19 misinformation, from a wrong understanding of the threat severity of COVID-19 to the promotion of fake cures and conspiracy theories. We also find that tweets containing COVID-19 misinformation images do not receive more interactions than baseline tweets with random images posted by the same set of users. As for temporal properties, COVID-19 misinformation images are shared for longer periods of time than non-misinformation ones and have longer burst times.
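Grouping visually similar images, as in the pipeline above, is commonly done with perceptual hashes. The abstract does not specify the method used, so the following is a hypothetical sketch using a simple difference hash (dHash) over pre-resized grayscale pixel grids, with near-duplicates clustered greedily by Hamming distance:

```python
def dhash(pixels):
    """Difference hash: one bit per adjacent-pixel comparison.
    `pixels` is a 2D list of grayscale values, e.g. 8 rows x 9 cols -> 64 bits."""
    return [1 if row[i] < row[i + 1] else 0
            for row in pixels for i in range(len(row) - 1)]

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def group_similar(images, threshold=10):
    """Greedy clustering: join each image to the first group whose
    representative hash is within `threshold` bits; else start a group."""
    groups = []  # each entry: (representative hash, [image indices])
    for idx, img in enumerate(images):
        h = dhash(img)
        for rep, members in groups:
            if hamming(h, rep) <= threshold:
                members.append(idx)
                break
        else:
            groups.append((h, [idx]))
    return [members for _, members in groups]

# Illustration: a base image, a uniformly brightened near-duplicate
# (identical hash), and a mirrored, clearly different image.
base = [[(r * 7 + c * 13) % 256 for c in range(9)] for r in range(8)]
near = [[v + 5 for v in row] for row in base]
distinct = [row[::-1] for row in base]
```

Brightness shifts leave every left-versus-right comparison unchanged, which is why dHash tolerates re-encodes and filters that plague exact-hash deduplication.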