Title: Multiple-choice quizzes improve memory for misinformation debunks, but do not reduce belief in misinformation
Fact-checkers want people to both read and remember their misinformation debunks. Retrieval practice is one way to increase memory, thus multiple-choice quizzes may be a useful tool for fact-checkers. We tested whether exposure to quizzes improved people’s accuracy ratings for fact-checked claims and their memory for specific information within a fact check. Across three experiments, 1551 US-based online participants viewed fact checks (either health- or politics-related) with or without a quiz. Overall, the fact checks were effective, and participants were more accurate in rating the claims after exposure. In addition, quizzes improved participants’ memory for the details of the fact checks, even 1 week later. However, that increased memory did not lead to more accurate beliefs. Participants’ accuracy ratings were similar in the quiz and no-quiz conditions. Multiple-choice quizzes can be a useful tool for increasing memory, but there is a disconnect between memory and belief.
Award ID(s):
2122640
PAR ID:
10527663
Author(s) / Creator(s):
Publisher / Repository:
Springer
Date Published:
Journal Name:
Cognitive Research: Principles and Implications
Volume:
8
Issue:
1
ISSN:
2365-7464
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    Countering misinformation can reduce belief in the moment, but corrective messages quickly fade from memory. We tested whether the longer-term impact of fact-checks depends on when people receive them. In two experiments (total N = 2,683), participants read true and false headlines taken from social media. In the treatment conditions, “true” and “false” tags appeared before, during, or after participants read each headline. Participants in a control condition received no information about veracity. One week later, participants in all conditions rated the same headlines’ accuracy. Providing fact-checks after headlines (debunking) improved subsequent truth discernment more than providing the same information during (labeling) or before (prebunking) exposure. This finding informs the cognitive science of belief revision and has practical implications for social media platform designers.
  2. Students achieve functional knowledge retention through active, spaced repetition of concepts in homework, quizzes, and lectures. True knowledge retention is best achieved through proper comprehension of the concept. In the engineering curriculum, courses are sequenced into prerequisite chains of three to five courses per subfield, a design aimed at developing and reinforcing core concepts over time. Retention of these prerequisite concepts is important for success in the next course. In this project, concept review quizzes were used to identify gaps and deficiencies in students' prerequisite knowledge and to measure improvement after a concept review intervention. Two quizzes (pre-intervention and post-intervention) drew inspiration from standard concept inventories and covered fundamental concepts such as Free Body Diagrams, Contact and Reaction Forces, Equilibrium Equations, and Calculation of the Moment. Concept inventories are typically multiple-choice; in this evaluation the concept questions were open-ended, and a clear rubric was created to identify the missing prerequisite concepts in the students' knowledge. These quizzes were deployed in Mechanics of Materials, a second-level course in the engineering mechanics curriculum (the second in a sequence of four courses: Statics, Mechanics of Materials, Mechanical Design, and Kinematic Design). The pre-quiz was administered (unannounced) at the beginning of class. The class then actively participated in a 30-minute concept review, and a different post-quiz was administered in the same class period after the review. Quizzes were graded with a rubric to measure the effect of the concept review intervention on the students' knowledge demonstration and calculations. The study evaluated four major concepts: free body diagrams, boundary reaction forces (fixed, pin, and contact), equilibrium, and moment calculation.
Students showed improvements of up to 39% in drawing a free body diagram with a fixed boundary condition, but continued to struggle with free body diagrams involving contact forces. The study was performed at a large public institution in a class of 240 students; 224 students consented to the use of their data and attended class on the day of the intervention. The pre-quiz was used to determine the gaps (or deficiencies) in students' conceptual understanding. The post-quiz measured the response to the review and was used to determine which concept deficiencies were significantly improved by the concept review and which were not. This study presents a concept quiz and associated rubric for measuring student improvement resulting from an in-class intervention (concept review). It quantifies a significant improvement in students' retrieval of their prerequisite knowledge after a concept review session. This approach, therefore, has utility for improving knowledge retention in programs with a similar, sequenced course design.
  3. Past work has explored various ways for online platforms to leverage crowd wisdom for misinformation detection and moderation. Yet platforms often relegate governance to their communities, and limited research has been done from the perspective of these communities and their moderators. How is misinformation currently moderated in online communities that are heavily self-governed? What role does the crowd play in this process, and how can this process be improved? In this study, we answer these questions through semi-structured interviews with Reddit moderators, focusing on a case study of COVID-19 misinformation. First, our analysis identifies a general moderation workflow model encompassing the various processes participants use for handling COVID-19 misinformation. Further, we show that the moderation workflow revolves around three elements: content facticity, user intent, and perceived harm. Next, our interviews reveal that Reddit moderators rely on two types of crowd wisdom for misinformation detection. Almost all participants are heavily reliant on reports from crowds of ordinary users to identify potential misinformation. A second crowd, comprising participants' own moderation teams and expert moderators of other communities, provides support when participants encounter difficult, ambiguous cases. Finally, we use design probes to better understand how different types of crowd signals (from ordinary users and moderators) readily available on Reddit can assist moderators in identifying misinformation. We observe that nearly half of all participants preferred these cues over labels from expert fact-checkers because the cues can help them discern user intent. Additionally, a quarter of the participants distrust professional fact-checkers, raising important concerns about misinformation moderation.
  4. Verifying political claims is a challenging task, as politicians can use various tactics to subtly misrepresent the facts for their agenda. Existing automatic fact-checking systems fall short here, and their predictions like "half-true" are not very useful in isolation, since it is unclear which parts of a claim are true or false. In this work, we focus on decomposing a complex claim into a comprehensive set of yes-no subquestions whose answers influence the veracity of the claim. We present CLAIMDECOMP, a dataset of decompositions for over 1000 claims. Given a claim and its verification paragraph written by fact-checkers, our trained annotators write subquestions covering both explicit propositions of the original claim and its implicit facets, such as additional political context that changes our view of the claim's veracity. We study whether state-of-the-art pre-trained models can learn to generate such subquestions. Our experiments show that these models generate reasonable questions, but predicting implied subquestions based only on the claim (without consulting other evidence) remains challenging. Nevertheless, we show that predicted subquestions can help identify relevant evidence to fact-check the full claim and derive the veracity through their answers, suggesting that claim decomposition can be a useful piece of a fact-checking pipeline. 
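    The decomposition idea above can be sketched as a small data structure: a claim paired with yes-no subquestions whose answers are combined into a veracity estimate. The class names, fields, the example claim, and the fraction-of-supporting-answers scoring rule below are illustrative assumptions, not the CLAIMDECOMP format or the paper's actual veracity model:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class SubQuestion:
        # A yes-no question bearing on the claim, plus whether a "yes"
        # answer supports (True) or undermines (False) the claim.
        text: str
        yes_supports_claim: bool

    @dataclass
    class DecomposedClaim:
        claim: str
        subquestions: list = field(default_factory=list)

        def veracity_score(self, answers):
            # answers: one boolean per subquestion (True = "yes").
            # Returns the fraction of answers that support the claim,
            # so 1.0 roughly means "true" and 0.0 "false".
            if len(answers) != len(self.subquestions):
                raise ValueError("need one answer per subquestion")
            supporting = sum(
                1 for sq, ans in zip(self.subquestions, answers)
                if ans == sq.yes_supports_claim
            )
            return supporting / len(self.subquestions)

    # Hypothetical example: explicit propositions plus an implicit facet
    # (where the money went) that changes how the claim reads.
    claim = DecomposedClaim(
        claim="The city cut the police budget by 50% last year.",
        subquestions=[
            SubQuestion("Was the police budget reduced last year?", True),
            SubQuestion("Was the reduction approximately 50%?", True),
            SubQuestion("Was the money moved to another public-safety fund?", False),
        ],
    )
    score = claim.veracity_score([True, False, True])  # partial support
    ```

    A graded score like this mirrors why a bare "half-true" label is uninformative: the per-subquestion answers, not the aggregate, show which parts of the claim hold up.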
  5. Curiosity can be a powerful motivator to learn and retain new information. Evidence shows that high states of curiosity elicited by a specific source (i.e., a trivia question) can promote memory for incidental stimuli (non-target) presented close in time. The spreading effect of curiosity states on memory for other information has potential for educational applications. Specifically, it could provide techniques to improve learning for information that did not spark a sense of curiosity on its own. Here, we investigated how high states of curiosity induced through trivia questions affect memory performance for unrelated scholastic facts (e.g., scientific, English, or historical facts) presented in close temporal proximity to the trivia question. Across three task versions, participants viewed trivia questions closely followed in time by a scholastic fact unrelated to the trivia question, either just prior to or immediately following the answer to the trivia question. Participants then completed a surprise multiple-choice memory test (akin to a pop quiz) for the scholastic material. In all three task versions, memory performance was poorer for scholastic facts presented after trivia questions that had elicited high versus low levels of curiosity. These results contradict previous findings showing curiosity-enhanced memory for incidentally presented visual stimuli and suggest that target information that generates a high-curiosity state interferes with encoding complex and unrelated scholastic facts presented close in time.