Title: What Did They Learn? Objective Assessment Tools Show Mixed Effects of Training on Science Communication Behaviors
There is widespread agreement about the need to assess the success of programs training scientists to communicate more effectively with non-professional audiences. However, there is little agreement about how that should be done. What do we mean when we talk about “effective communication”? What should we measure? How should we measure it? Evaluation of communication training programs often incorporates the views of students or trainers themselves, although this is widely understood to bias the assessment. We recently completed a 3-year experiment to use audiences of non-scientists to evaluate the effect of training on STEM (Science, Technology, Engineering and Math) graduate students’ communication ability. Overall, audiences rated STEM grad students’ communication performance no better after training than before, as we reported in Rubega et al. 2018. However, audience ratings do not reveal whether training changed specific trainee communication behaviors (e.g., jargon use, narrative techniques) even if too little to affect trainees’ overall success. Here we measure trainee communication behavior directly, using multiple textual analysis tools and analysis of trainees’ body language during videotaped talks. We found that student use of jargon declined after training but that use of narrative techniques did not increase. Flesch Reading Ease and Flesch-Kincaid Grade Level scores, used as indicators of complexity of sentences and word choice, were no different after instruction. Trainees’ movement of hands and hesitancy during talks were negatively correlated with audience ratings of credibility and clarity; smiling, on the other hand, was correlated with improvement in credibility, clarity and engagement scores given by audience members. We show that objective tools can be used to measure the success of communication training programs, that non-verbal cues are associated with audience judgments, and that an intensive communication course does change some, if not all, communication behaviors.
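The abstract names Flesch Reading Ease and Flesch-Kincaid Grade Level as indicators of sentence and word complexity. As a rough illustration of how those two scores are defined — not the authors’ actual analysis pipeline, which used established textual analysis tools — here is a minimal Python sketch that uses a crude vowel-group heuristic for syllable counting:

```python
# Illustrative sketch only: the standard Flesch formulas applied to a transcript string.
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels; dedicated tools
    # handle silent 'e', diphthongs, and exceptions far more carefully.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> tuple[float, float]:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / max(1, len(sentences))   # words per sentence
    spw = syllables / max(1, len(words))        # syllables per word
    flesch_reading_ease = 206.835 - 1.015 * wps - 84.6 * spw
    fk_grade_level = 0.39 * wps + 11.8 * spw - 15.59
    return flesch_reading_ease, fk_grade_level

ease, grade = readability("We measured how jargon use changed after training.")
print(f"Flesch Reading Ease: {ease:.1f}, Flesch-Kincaid Grade Level: {grade:.1f}")
```

Higher Reading Ease and lower Grade Level indicate simpler sentences and shorter words; the study found neither score changed after instruction.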
Award ID(s):
2022036
PAR ID:
10342400
Author(s) / Creator(s):
Date Published:
Journal Name:
Frontiers in Communication
Volume:
6
ISSN:
2297-900X
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    As the science community has recognized the vital role of communicating to the public, science communication training has proliferated. The development of rigorous, comparable approaches to assessment of training has not kept pace. We conducted a fully controlled experiment using a semester-long science communication course, and audience assessment of communicator performance. Evaluators scored the communication competence of trainees and their matched, untrained controls, before and after training. Bayesian analysis of the data showed very small gains in communication skills of trainees, and no difference from untrained controls. High variance in scores suggests little agreement on what constitutes “good” communication. 
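As an illustration only — hypothetical gain scores, not this study’s data or its actual model — the following minimal sketch shows one way a Bayesian comparison of pre/post gains for trainees versus untrained controls could be set up, assuming normal likelihoods, a flat prior, and plug-in standard errors:

```python
# Hypothetical sketch of a minimal Bayesian comparison of score gains
# (post minus pre) for trainees vs. untrained controls; made-up numbers.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
trainee_gain = rng.normal(0.15, 1.0, size=30)   # hypothetical gain scores
control_gain = rng.normal(0.10, 1.0, size=30)

# Under a flat prior and normal likelihoods with plug-in standard errors,
# the posterior for the difference in mean gains is approximately normal.
diff = trainee_gain.mean() - control_gain.mean()
se = np.sqrt(trainee_gain.var(ddof=1) / len(trainee_gain)
             + control_gain.var(ddof=1) / len(control_gain))
posterior = stats.norm(loc=diff, scale=se)

print(f"Posterior mean difference: {diff:.3f}")
print(f"95% credible interval: {posterior.ppf([0.025, 0.975]).round(3)}")
print(f"P(trainees gained more than controls) = {1 - posterior.cdf(0):.2f}")
```

A small posterior mean difference with a credible interval straddling zero would correspond to the “very small gains” and “no difference from untrained controls” reported.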
  2. Graduate students emerging from STEM programs face inequitable professional landscapes in which their ability to practice inclusive and effective science communication with interdisciplinary and public audiences is essential to their success. Yet these students are rarely offered the opportunity to learn and practice inclusive science communication in their graduate programs. Moreover, minoritized students rarely have the opportunity to validate their experiences among peers and develop professional sensibilities through research training. In this article, the authors offer the Science Communication (Sci/Comm) Scholar’s working group at The University of Texas at San Antonio as one model for training graduate students in human dimensions and inclusive science communication for effective public engagement in thesis projects and beyond. The faculty-facilitated peer-to-peer working group encouraged participation by women who often face inequities in STEM workplaces. Early results indicate that team-based training in both the science and art of public engagement provides critical exposure to help students understand the methodological care needed for human dimensions research, and to facilitate narrative-based citizen science engagements. The authors demonstrate this through several brief profiles of environmental science graduate students’ thesis projects. Each case emphasizes the importance of research design for public engagement via quantitative surveys and narrative-based science communication interventions. Through a faculty-facilitated peer-to-peer working group framework, research design and methodological care function as an integration point for social scientific and rhetorical training for inclusive science communication with diverse audiences.
  3.
    Concerns about the spread of misinformation online via news articles have led to the development of many tools and processes involving human annotation of their credibility. However, much is still unknown about how different people judge news credibility or the quality or reliability of news credibility ratings from populations of varying expertise. In this work, we consider credibility ratings from two “crowd” populations: 1) students within journalism or media programs, and 2) crowd workers on UpWork, and compare them with the ratings of two sets of experts: journalists and climate scientists, on a set of 50 climate-science articles. We find that both groups’ credibility ratings correlate more strongly with the journalism experts than with the science experts, with 10-15 raters needed to achieve convergence. We also find that raters’ gender and political leaning affect their ratings. Across article genres (news/opinion/analysis) and source leanings (left/center/right), crowd ratings were most similar to the experts’ for opinion articles and strongly left-leaning sources.
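To illustrate the kind of convergence described above (roughly 10-15 raters), here is a hypothetical simulation — not the study’s data — of how the correlation between averaged crowd credibility ratings and expert ratings changes as raters are aggregated:

```python
# Hypothetical sketch: correlation of averaged crowd ratings with expert
# ratings as the number of aggregated raters grows (simulated data).
import numpy as np

rng = np.random.default_rng(1)
n_articles, n_raters = 50, 25
expert = rng.normal(0, 1, n_articles)                                 # expert rating per article
crowd = expert[:, None] + rng.normal(0, 1.5, (n_articles, n_raters))  # noisy crowd ratings

for k in (1, 5, 10, 15, 20, 25):
    avg_of_k = crowd[:, :k].mean(axis=1)       # average the first k raters per article
    r = np.corrcoef(avg_of_k, expert)[0, 1]
    print(f"{k:2d} raters: correlation with experts = {r:.2f}")
```

In a plot of such a curve, the correlation typically rises steeply with the first few raters and then levels off, which is what “convergence” at 10-15 raters refers to.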
  4. Deniz, Elif Ulutaş (Ed.)
    Effective science communication and stakeholder engagement are crucial skills for climate scientists, yet formal training in these areas remains limited in graduate education. The National Science Foundation Research Traineeship (NRT) at Auburn University (AU) addresses this gap through an innovative program combining science communication training with co-production approaches to enhance climate resiliency of built, natural, and social systems within the Southeastern United States (US). This paper evaluates the effectiveness of two novel graduate-level courses: one focused on science communication for non-technical audiences and another combining co-production methods with practical internship experience. Our research employed a mixed-methods approach, including a comprehensive analysis of course catalogs from 146 research-intensive universities and qualitative assessment of student experiences through surveys and descriptive exemplars. Analysis revealed that AU’s NRT program is unique among peer institutions in offering both specialized science communication training and co-production internship opportunities to graduate students across departments. Survey data from 11 program participants and detailed case studies of three program graduates demonstrated significant professional development benefits. Key outcomes included enhanced stakeholder engagement capabilities, improved science communication skills, and better preparation for both academic and non-academic careers. These findings suggest that integrating structured science communication training with hands-on co-production experience provides valuable preparation for climate scientists. The success of AU’s program model indicates that similar curriculum structures could benefit graduate programs nationwide, particularly in preparing students to effectively communicate complex scientific concepts to diverse audiences and engage with stakeholders in climate resilience efforts. 