Title: Who knowingly shares false political information online?
Some people share misinformation accidentally, but others do so knowingly. To fully understand the spread of misinformation online, it is important to analyze those who purposely share it. Using a 2022 U.S. survey, we found that 14 percent of respondents reported knowingly sharing misinformation, and that these respondents were more likely to also report support for political violence, a desire to run for office, and warm feelings toward extremists. These respondents were also more likely to have elevated levels of a psychological need for chaos, dark tetrad traits, and paranoia. Our findings illuminate one vector through which misinformation is spread.
Award ID(s):
2123635
PAR ID:
10515006
Author(s) / Creator(s):
Publisher / Repository:
Harvard Kennedy School
Date Published:
Journal Name:
Harvard Kennedy School Misinformation Review
ISSN:
2766-1652
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Misinformation runs rampant on social media and has been tied to adverse health behaviors such as vaccine hesitancy. Crowdsourcing can be a means to detect and impede the spread of misinformation online. However, past studies have not deeply examined the individual characteristics, such as cognitive factors and biases, that predict crowdworker accuracy at identifying misinformation. In our study (n = 265), Amazon Mechanical Turk (MTurk) workers and university students assessed the truthfulness and sentiment of COVID-19 related tweets as well as answered several surveys on personal characteristics. Results support the viability of crowdsourcing for assessing misinformation and content stance (i.e., sentiment) related to ongoing and politically charged topics like the COVID-19 pandemic; however, alignment with experts depends on who is in the crowd. Specifically, we find that respondents with high Cognitive Reflection Test (CRT) scores, conscientiousness, and trust in medical scientists are more aligned with experts, while respondents with high Need for Cognitive Closure (NFCC) and those who lean politically conservative are less aligned with experts. We see differences between recruitment platforms as well: our data show university students are on average more aligned with experts than MTurk workers, most likely due to overall differences in participant characteristics on each platform. Results offer transparency into how crowd composition affects misinformation and stance assessment and have implications for future crowd recruitment and filtering practices.
  2. The spread of misinformation online is a global problem that requires global solutions. To that end, we conducted an experiment in 16 countries across 6 continents (N = 34,286; 676,605 observations) to investigate predictors of susceptibility to misinformation about COVID-19, and interventions to combat the spread of this misinformation. In every country, participants with a more analytic cognitive style and stronger accuracy-related motivations were better at discerning truth from falsehood; valuing democracy was also associated with greater truth discernment, whereas endorsement of individual responsibility over government support was negatively associated with truth discernment in most countries. Subtly prompting people to think about accuracy had a generally positive effect on the veracity of news that people were willing to share across countries, as did minimal digital literacy tips. Finally, aggregating the ratings of our non-expert participants was able to differentiate true from false headlines with high accuracy in all countries via the ‘wisdom of crowds’. The consistent patterns we observe suggest that the psychological factors underlying the misinformation challenge are similar across different regional settings, and that similar solutions may be broadly effective. 
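The "wisdom of crowds" aggregation described above can be sketched in a few lines: pool many non-expert accuracy ratings of a headline and compare the average against a threshold. The function name, ratings, and threshold below are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch of "wisdom of crowds" aggregation, assuming binary
# accuracy ratings (1 = rated true, 0 = rated false). All values here
# are hypothetical, not data from the study.
from statistics import mean

def crowd_verdict(ratings, threshold=0.5):
    """Label a headline 'true' if the mean rating exceeds the threshold."""
    return "true" if mean(ratings) > threshold else "false"

# Hypothetical example: 7 of 10 non-expert raters judge the headline accurate.
print(crowd_verdict([1, 1, 1, 0, 1, 0, 1, 1, 0, 1]))  # -> true
```

The intuition is that individual raters' errors partially cancel when averaged, which is why the aggregate can track expert judgments even when single raters do not.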
  3. Online misinformation is believed to have contributed to vaccine hesitancy during the Covid-19 pandemic, highlighting concerns about social media’s destabilizing role in public life. Previous research identified a link between political conservatism and sharing misinformation; however, it is not clear how partisanship affects how much misinformation people see online. As a result, we do not know whether partisanship drives exposure to misinformation or people selectively share misinformation despite being exposed to factual content. To address this question, we study Twitter discussions about the Covid-19 pandemic, classifying users along the political and factual spectrum based on the information sources they share. In addition, we quantify exposure through retweet interactions. We uncover partisan asymmetries in the exposure to misinformation: conservatives are more likely to see and share misinformation, and while users’ connections expose them to ideologically congruent content, the interactions between political and factual dimensions create conditions for the highly polarized users—hardline conservatives and liberals—to amplify misinformation. Overall, however, misinformation receives less attention than factual content and political moderates, the bulk of users in our sample, help filter out misinformation. Identifying the extent of polarization and how political ideology exacerbates misinformation can help public health experts and policy makers improve their messaging. 
  4. Guidi, Barbara (Ed.)
    The COVID-19 pandemic brought widespread attention to an “infodemic” of potential health misinformation. This claim has not been assessed based on evidence. We evaluated if health misinformation became more common during the pandemic. We gathered about 325 million posts sharing URLs from Twitter and Facebook during the beginning of the pandemic (March 8-May 1, 2020) compared to the same period in 2019. We relied on source credibility as an accepted proxy for misinformation across this database. Human annotators also coded a subsample of 3000 posts with URLs for misinformation. Posts about COVID-19 were 0.37 times as likely to link to “not credible” sources and 1.13 times more likely to link to “more credible” sources than prior to the pandemic. Posts linking to “not credible” sources were 3.67 times more likely to include misinformation compared to posts from “more credible” sources. Thus, during the earliest stages of the pandemic, when claims of an infodemic emerged, social media contained proportionally less misinformation than expected based on the prior year. Our results suggest that widespread health misinformation is not unique to COVID-19. Rather, it is a systemic feature of online health communication that can adversely impact public health behaviors and must therefore be addressed. 
  5. Importance: The COVID-19 pandemic has been notable for the widespread dissemination of misinformation regarding the virus and appropriate treatment. Objective: To quantify the prevalence of non–evidence-based treatment for COVID-19 in the US and the association between such treatment and endorsement of misinformation as well as lack of trust in physicians and scientists. Design, Setting, and Participants: This single-wave, population-based, nonprobability internet survey study was conducted between December 22, 2022, and January 16, 2023, in US residents 18 years or older who reported prior COVID-19 infection. Main Outcomes and Measures: Self-reported use of ivermectin or hydroxychloroquine, endorsement of false statements related to COVID-19 vaccination, self-reported trust in various institutions, conspiratorial thinking measured by the American Conspiracy Thinking Scale, and news sources. Results: A total of 13,438 individuals (mean [SD] age, 42.7 [16.1] years; 9,150 [68.1%] female and 4,288 [31.9%] male) who reported prior COVID-19 infection were included in this study. In this cohort, 799 (5.9%) reported prior use of hydroxychloroquine (527 [3.9%]) or ivermectin (440 [3.3%]). In regression models including sociodemographic features as well as political affiliation, those who endorsed at least 1 item of COVID-19 vaccine misinformation were more likely to receive non–evidence-based medication (adjusted odds ratio [OR], 2.86; 95% CI, 2.28-3.58). Those reporting trust in physicians and hospitals (adjusted OR, 0.74; 95% CI, 0.56-0.98) and in scientists (adjusted OR, 0.63; 95% CI, 0.51-0.79) were less likely to receive non–evidence-based medication. Respondents reporting trust in social media (adjusted OR, 2.39; 95% CI, 2.00-2.87) and in Donald Trump (adjusted OR, 2.97; 95% CI, 2.34-3.78) were more likely to have taken non–evidence-based medication. Individuals with greater scores on the American Conspiracy Thinking Scale were more likely to have received non–evidence-based medications (unadjusted OR, 1.09; 95% CI, 1.06-1.11; adjusted OR, 1.10; 95% CI, 1.07-1.13). Conclusions and Relevance: In this survey study of US adults, endorsement of misinformation about the COVID-19 pandemic, lack of trust in physicians or scientists, conspiracy-mindedness, and the nature of news sources were associated with receiving non–evidence-based treatment for COVID-19. These results suggest that the potential harms of misinformation may extend to the use of ineffective and potentially toxic treatments in addition to avoidance of health-promoting behaviors.
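As a reader's aid for the figures above, an unadjusted odds ratio can be computed directly from a 2x2 exposure-outcome table. The counts below are made up for illustration; the study's adjusted ORs instead come from regression models that control for covariates such as sociodemographics and political affiliation.

```python
# Illustrative arithmetic for an unadjusted odds ratio from a 2x2 table.
# All counts are hypothetical, not data from the study.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """OR = (a/b) / (c/d) for a 2x2 exposure-outcome table."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts: 60 of 1,000 misinformation endorsers vs. 25 of
# 1,000 non-endorsers took a non-evidence-based medication.
print(round(odds_ratio(60, 940, 25, 975), 2))  # -> 2.49
```

An OR above 1 means the outcome is more common in the exposed group; the CIs reported above quantify the uncertainty around each estimate.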