The COVID-19 pandemic demonstrated the importance of social distancing practices to stem the spread of the virus. However, compliance with public health guidelines was mixed. Understanding what factors are associated with differences in compliance can improve public health messaging since messages could be targeted and tailored to different population segments. We utilize Twitter data on social mobility during COVID-19 to reveal which populations practiced social distancing and what factors correlated with this practice. We analyze correlations between demographic and political affiliation with reductions in physical mobility measured by public geolocation tweets. We find significant differences in mobility reduction between these groups in the United States. We observe that males, Asian and Latinx individuals, older individuals, Democrats, and people from higher population density states exhibited larger reductions in movement. Furthermore, our study also unveils meaningful insights into the interactions between different groups. We hope these findings will provide evidence to support public health policy-making.
Free, publicly-accessible full text available December 1, 2025.
-
Influence operations are large-scale efforts to manipulate public opinion. The rapid detection and disruption of these operations is critical for healthy public discourse. Emergent AI technologies may enable novel operations that evade detection and influence public discourse on social media with greater scale, reach, and specificity. New methods of detection with inductive learning capacity will be needed to identify novel operations before they indelibly alter public opinion and events. To this end, we develop an inductive learning framework that: (1) determines content- and graph-based indicators that are not specific to any operation; (2) uses graph learning to encode abstract signatures of coordinated manipulation; and (3) evaluates generalization capacity by training and testing models across operations originating from Russia, China, and Iran. We find that this framework enables strong cross-operation generalization while also revealing salient indicators, illustrating a generic approach which directly complements transductive methodologies, thereby enhancing detection coverage.
Free, publicly-accessible full text available December 1, 2024.
-
Background. Vaccine misinformation has been widely spread on social media, but attempts to combat it have not taken advantage of the attributes of social media platforms for health education. Methods. The objective was to test the efficacy of moderated social media discussions about COVID-19 vaccines in private Facebook groups. Unvaccinated U.S. adults were recruited using Amazon's Mechanical Turk and randomized. In the intervention group, moderators posted two informational posts per day for 4 weeks and engaged in relationship-building interactions with group members. In the control group, participants received a referral to Facebook's COVID-19 Information Center. Follow-up surveys with participants (N = 478) were conducted 6 weeks post-enrollment. Results. At the 6-week follow-up, no differences were found in vaccination rates. Intervention participants were more likely to show improvements in their COVID-19 vaccination intentions (vs. staying the same or declining) compared with control (p = .03). They also improved more in their intentions to encourage others to vaccinate for COVID-19. There were no differences in COVID-19 vaccine confidence or intentions between groups. General vaccine beliefs and perceived responsibility to vaccinate were higher in the intervention group compared with control. Most participants in the intervention group reported high levels of satisfaction. Participants engaged with content (e.g., commented, reacted) 11.8 times on average over the course of 4 weeks. Conclusions. Engaging with vaccine-hesitant individuals in private Facebook groups improved some COVID-19 vaccine-related beliefs and represents a promising strategy.
Free, publicly-accessible full text available February 1, 2025.
-
Background: Distrust and partisan identity are theorized to undermine health communications. We examined the influence of these factors on the efficacy of discussion groups intended to promote vaccine uptake. Method: We analyzed survey data from unvaccinated Facebook users (N = 371) living in the US between January and April 2022. Participants were randomly assigned to Facebook discussion groups (intervention) or referred to Facebook's COVID-19 Information Center (control). We used analysis of covariance to test whether the intervention was more effective at changing vaccination intentions and beliefs compared to the control in subgroups based on participants' partisan identity, political views, and information trust views. Results: We found a significant interaction between the intervention and trust in public health institutions (PHIs) for improving intentions to vaccinate (P = .04), intentions to encourage others to vaccinate (P = .03), and vaccine confidence beliefs (P = .01). Among participants who trusted PHIs, those in the intervention had higher posttest intentions to vaccinate (P = .008) and intentions to encourage others to vaccinate (P = .002) compared to the control. Among non-conservatives, participants in the intervention had higher posttest intentions to vaccinate (P = .048). The intervention was more effective at improving intentions to encourage others to vaccinate within the subgroups of Republicans (P = .03), conservatives (P = .02), and participants who distrusted government (P = .02). Conclusions: Facebook discussion groups were more effective for people who trusted PHIs and for non-conservatives. Health communicators may need to segment health messaging and develop strategies around trust views.
-
Online misinformation promotes distrust in science, undermines public health, and may drive civil unrest. During the coronavirus disease 2019 pandemic, Facebook—the world's largest social media company—began to remove vaccine misinformation as a matter of policy. We evaluated the efficacy of these policies using a comparative interrupted time-series design. We found that Facebook removed some antivaccine content, but we did not observe decreases in overall engagement with antivaccine content. Provaccine content was also removed, and antivaccine content became more misinformative, more politically polarized, and more likely to be seen in users' newsfeeds. We explain these findings as a consequence of Facebook's system architecture, which provides substantial flexibility to motivated users who wish to disseminate misinformation through multiple channels. Facebook's architecture may therefore afford antivaccine content producers several means to circumvent the intent of misinformation removal policies.