Title: Replication Data for Tiplines to Uncover Misinformation on Encrypted Platforms: A Case Study of the 2019 Indian General Election on WhatsApp
Metadata on the times at which text and image messages were submitted to a tipline and to public WhatsApp groups, along with similarity/clustering data that groups the messages. Please see the README file and the published paper for further details. Please cite the following publication if you use this data: Kazemi, A., Garimella, K., Shahi, G. K., Gaffney, D., & Hale, S. A. (2022). Research note: Tiplines to uncover misinformation on encrypted platforms: A case study of the 2019 Indian general election on WhatsApp. Harvard Kennedy School (HKS) Misinformation Review. https://doi.org/10.37016/mr-2020-91
Award ID(s):
2052335
PAR ID:
10340916
Author(s) / Creator(s):
Kazemi, A.; Garimella, K.; Shahi, G. K.; Gaffney, D.; Hale, S. A.
Publisher / Repository:
Harvard Dataverse
Date Published:
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
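
The dataset pairs message-submission timestamps with similarity/clustering identifiers. As a rough illustration of how such data might be explored, the Python sketch below loads a message-level table and compares when each similarity cluster first appears on the tipline versus in the public groups. The file name, column names, and source labels are placeholders, not the actual schema; consult the README on Harvard Dataverse before adapting it.

    # Minimal sketch; the file name, column names ("timestamp", "source",
    # "cluster_id"), and source labels ("tipline", "groups") are assumed
    # for illustration -- see the README for the real schema.
    import pandas as pd

    # Message-level metadata: one row per message submission.
    messages = pd.read_csv("messages.csv", parse_dates=["timestamp"])

    # Distinct similarity clusters observed per source.
    print(messages.groupby("source")["cluster_id"].nunique())

    # For clusters seen in both sources, when did each source first see them?
    first_seen = (
        messages.groupby(["cluster_id", "source"])["timestamp"]
        .min()
        .unstack("source")
        .dropna()
    )
    # Positive values mean the tipline saw the cluster before the public groups.
    lead_time = first_seen["groups"] - first_seen["tipline"]
    print(lead_time.describe())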
More Like this
  1. A framework is presented for understanding how misinformation shapes decision-making, which has cognitive representations of gist at its core. I discuss how the framework goes beyond prior work, and how it can be implemented so that valid scientific messages are more likely to be effective, remembered, and shared through social media, while misinformation is resisted. The distinction between mental representations of the rote facts of a message—its verbatim representation—and its gist explains several paradoxes, including the frequent disconnect between knowing facts and, yet, making decisions that seem contrary to those facts. Decision makers can falsely remember the gist as seen or heard even when they remember verbatim facts. Indeed, misinformation can be more compelling than information when it provides an interpretation of reality that makes better sense than the facts. Consequently, for many issues, scientific information and misinformation are in a battle for the gist. A fuzzy-processing preference for simple gist explains expectations for antibiotics, the spread of misinformation about vaccination, and responses to messages about global warming, nuclear proliferation, and natural disasters. The gist, which reflects knowledge and experience, induces emotions and brings to mind social values. However, changing mental representations is not sufficient by itself; gist representations must be connected to values. The policy choice is not simply between constraining behavior or persuasion—there is another option. Science communication needs to shift from an emphasis on disseminating rote facts to achieving insight, retaining its integrity but without shying away from emotions and values. 
  2. Jonason, Peter Karl (Ed.)
    At the time of writing, nearly one hundred published studies demonstrate that beliefs in COVID-19 conspiracy theories and misinformation are negatively associated with COVID-19 preventive behaviors. These correlational findings are often interpreted as evidence that beliefs in conspiracy theories and misinformation are exogenous factors that shape human behavior, such as forgoing vaccination. This interpretation has motivated researchers to develop methods for “prebunking,” “debunking,” or otherwise limiting the spread of conspiracy theories and misinformation online. However, the robust literatures on conspiracy theory beliefs, health behaviors, and media effects lead us to question whether beliefs in conspiracy theories and misinformation should be treated as exogenous to vaccine hesitancy and refusal. Employing U.S. survey data (n = 2,065) from July 2021, we show that beliefs in COVID-19 conspiracy theories and misinformation are not only related to COVID-19 vaccine hesitancy and refusal, but also strongly associated with the same psychological, social, and political motivations theorized to drive COVID-19 vaccine hesitancy and refusal. These findings suggest that beliefs in conspiracy theories and misinformation might not always be an exogenous cause, but rather a manifestation of the same factors that lead to vaccine hesitancy and refusal. We conclude by encouraging researchers to carefully consider modeling choices and imploring practitioners to refocus on the worldviews, personality traits, and political orientations that underlie both health-related behaviors and beliefs in conspiracy theories and misinformation. 
  3. The ongoing pandemic has heightened the need for developing tools to flag COVID-19-related misinformation on the internet, specifically on social media such as Twitter. However, due to novel language and the rapid change of information, existing misinformation detection datasets are not effective for evaluating systems designed to detect misinformation on this topic. Misinformation detection can be divided into two sub-tasks: (i) retrieval of misconceptions relevant to posts being checked for veracity, and (ii) stance detection to identify whether the posts Agree, Disagree, or express No Stance towards the retrieved misconceptions. To facilitate research on this task, we release COVIDLies (https://ucinlp.github.io/covid19), a dataset of 6761 expert-annotated tweets to evaluate the performance of misinformation detection systems on 86 different pieces of COVID-19-related misinformation. We evaluate existing NLP systems on this dataset, providing initial benchmarks and identifying key challenges for future models to improve upon. (A rough sketch of this retrieval-plus-stance setup appears after this list.)
  4. Given the pervasiveness and dangers of misinformation, there has been a surge of research dedicated to uncovering predictors of and interventions for misinformation receptivity. One promising individual differences variable is intellectual humility (IH), which reflects a willingness to acknowledge the limitations of one’s views. Research has found that IH is correlated with less belief in misinformation, greater intentions to engage in evidence-based behaviors (e.g., receive vaccinations), and more actual engagement in evidence-based behaviors (e.g., take COVID-19 precautions). We sought to synthesize this growing area of research in a multi-level meta-analytic review (k = 27, S = 54, ES = 469, N = 33,814) to provide an accurate estimate of the relations between IH and misinformation receptivity and clarify potential sources of heterogeneity. We found that IH was related to less misinformation receptivity for beliefs (r = -.15, 95% CI [-.19, -.12]) and greater intentions to move away from misinformation (r = .13, 95% CI [.06, .19]) and behaviors that move people away from misinformation (r = .30, 95% CI [.24, .36]). Effect sizes were generally small, and moderator analyses revealed that effects were stronger for comprehensive (as opposed to narrow) measures of IH. These findings suggest that IH is one path for understanding resilience against misinformation, and we leverage our results to highlight pressing areas for future research focused on boundary conditions, risk factors, and causal implications. 
  5. While COVID-19 text misinformation has already been investigated by various scholars, fewer research efforts have been devoted to characterizing and understanding COVID-19 misinformation that is carried out through visuals like photographs and memes. In this paper, we present a mixed-method analysis of image-based COVID-19 misinformation in 2020 on Twitter. We deploy a computational pipeline to identify COVID-19 related tweets, download the images contained in them, and group together visually similar images. We then develop a codebook to characterize COVID-19 misinformation and manually label images as misinformation or not. Finally, we perform a quantitative analysis of tweets containing COVID-19 misinformation images. We identify five types of COVID-19 misinformation, from a wrong understanding of the threat severity of COVID-19 to the promotion of fake cures and conspiracy theories. We also find that tweets containing COVID-19 misinformation images do not receive more interactions than baseline tweets with random images posted by the same set of users. As for temporal properties, COVID-19 misinformation images are shared for longer periods of time than non-misinformation ones and have longer burst times. When looking at the users sharing COVID-19 misinformation images on Twitter from the perspective of their political leanings, we find that pro-Democrat and pro-Republican users share a similar number of tweets containing misleading or false COVID-19 images. However, the types of images that they share are different: while pro-Democrat users focus on misleading claims about the Trump administration's response to the pandemic, as well as often sharing manipulated images intended as satire, pro-Republican users often promote hydroxychloroquine, an ineffective medicine against COVID-19, as well as conspiracy theories about the origin of the virus. Our analysis sets a basis for better understanding COVID-19 misinformation images on social media and the nuances involved in effectively moderating them. (A rough sketch of grouping visually similar images appears after this list.)
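
As a companion to item 3 above, here is a rough Python sketch of the two sub-tasks it describes: retrieving the misconception most similar to a tweet (here with simple TF-IDF cosine similarity) and then classifying the tweet's stance toward it with a generic off-the-shelf NLI model. This is an illustration only, not one of the systems benchmarked on COVIDLies; the example misconceptions, the TF-IDF retriever, and the roberta-large-mnli model are stand-ins.

    import torch
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    # Placeholder misconceptions; COVIDLies contains 86 expert-curated ones.
    misconceptions = [
        "5G networks spread the coronavirus.",
        "Drinking bleach cures COVID-19.",
    ]
    tweet = "There is no evidence that 5G has anything to do with the virus."

    # Sub-task (i): retrieve the most similar misconception (TF-IDF cosine similarity).
    vectorizer = TfidfVectorizer().fit(misconceptions + [tweet])
    sims = cosine_similarity(
        vectorizer.transform([tweet]), vectorizer.transform(misconceptions)
    )[0]
    retrieved = misconceptions[sims.argmax()]

    # Sub-task (ii): stance detection, treating (tweet, misconception) as an NLI pair.
    tok = AutoTokenizer.from_pretrained("roberta-large-mnli")
    nli = AutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")
    inputs = tok(tweet, retrieved, return_tensors="pt", truncation=True)
    with torch.no_grad():
        pred = nli(**inputs).logits.argmax(dim=-1).item()
    stance = {"ENTAILMENT": "Agree", "CONTRADICTION": "Disagree",
              "NEUTRAL": "No Stance"}[nli.config.id2label[pred]]
    print(f"{retrieved!r} -> {stance}")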
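
Item 5 above mentions grouping visually similar images as one stage of its pipeline. One common way to do this is perceptual hashing; the sketch below, which may well differ from the authors' actual implementation, hashes a directory of downloaded images with the ImageHash package and greedily merges images whose hashes are within a small Hamming distance. The directory name and distance threshold are arbitrary choices.

    from pathlib import Path

    import imagehash       # pip install ImageHash
    from PIL import Image  # pip install Pillow

    # Placeholder directory of images downloaded from tweets.
    paths = sorted(Path("downloaded_images").glob("*.jpg"))
    hashes = {p: imagehash.phash(Image.open(p)) for p in paths}

    # Greedy grouping: an image joins the first group whose representative
    # hash is within THRESHOLD bits (Hamming distance) of its own hash.
    THRESHOLD = 6
    groups = []
    for path, h in hashes.items():
        for group in groups:
            if h - hashes[group[0]] <= THRESHOLD:
                group.append(path)
                break
        else:
            groups.append([path])

    for i, group in enumerate(groups):
        print(f"group {i}: {len(group)} images")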