Title: Scientific Disciplines and the Admissibility of Expert Evidence in Courts
The authors examine how people interpret expert claims when they do not share experts’ technical understanding. The authors review sociological research on the cultural boundaries of science and expertise, which suggests that scientific disciplines are heuristics nonspecialists use to evaluate experts’ credibility. To test this idea, the authors examine judicial decisions about contested expert evidence in U.S. district courts (n = 575). Multinomial logistic regression results show that judges favor evidence from natural scientists compared with social scientists, even after adjusting for other differences among experts, judges, and court cases. Judges also favor evidence from medical and health experts compared with social scientists. These results help illustrate the assumptions held by judges about the credibility of some disciplines relative to others. They also suggest that judges may rely on tacit assumptions about the merits of different areas of science, which may reflect broadly shared cultural beliefs about expertise and credibility.
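The analysis summarized above can be illustrated with a minimal multinomial logit sketch. Everything below is simulated and hypothetical: the variable names, outcome coding, and probabilities are assumptions for illustration, not the authors' actual data or model specification.

```python
# Minimal sketch of a multinomial logit like the one described above.
# All data are simulated; codings are illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 575  # matches the sample size reported in the abstract

discipline = rng.choice(["social", "natural", "medical"], size=n)
# Simulated ruling: 0 = excluded, 1 = partly admitted, 2 = admitted,
# with natural/medical experts admitted somewhat more often.
p = {"social":  [0.4, 0.3, 0.3],
     "natural": [0.2, 0.3, 0.5],
     "medical": [0.2, 0.3, 0.5]}
ruling = np.array([rng.choice(3, p=p[d]) for d in discipline])

# Dummy-code discipline with social science as the reference group,
# mirroring the comparison reported in the abstract.
X = pd.get_dummies(pd.Series(discipline)).drop(columns=["social"])
X = sm.add_constant(X.astype(float))

model = sm.MNLogit(ruling, X).fit(disp=False)
print(model.summary())  # coefficients: natural/medical vs. social
```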
Concerns about the spread of misinformation online via news articles have led to the development of many tools and processes involving human annotation of their credibility. However, much is still unknown about how different people judge news credibility, or about the quality and reliability of credibility ratings from populations of varying expertise. In this work, we consider credibility ratings from two “crowd” populations: 1) students in journalism or media programs, and 2) crowd workers on UpWork, and compare them with the ratings of two sets of experts, journalists and climate scientists, on a set of 50 climate-science articles. We find that both crowd groups’ credibility ratings correlate more strongly with those of the journalism experts than with those of the science experts, with 10-15 raters needed for the aggregated ratings to converge. We also find that raters’ gender and political leaning affect their ratings. Across article genres (news, opinion, and analysis) and source leanings (left, center, and right), crowd ratings were most similar to the experts’ for opinion articles and for sources with a strong left leaning.
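One common way to examine a rater-convergence claim like the one above is to correlate the mean rating of k randomly sampled crowd raters with the expert consensus as k grows. The sketch below uses simulated ratings; the rating scale, rater counts, and noise level are assumptions, not the study's data.

```python
# Sketch of a rater-convergence check: correlate the mean rating of
# k randomly drawn crowd raters with the expert mean, for growing k.
# All values are simulated, not the study's data.
import numpy as np

rng = np.random.default_rng(1)
n_articles, n_crowd = 50, 40          # 50 articles, as in the study
expert_mean = rng.uniform(1, 5, size=n_articles)
# Crowd raters: expert signal plus individual noise.
crowd = expert_mean + rng.normal(0, 1.2, size=(n_crowd, n_articles))

for k in (1, 3, 5, 10, 15, 20):
    idx = rng.choice(n_crowd, size=k, replace=False)
    crowd_mean = crowd[idx].mean(axis=0)
    r = np.corrcoef(crowd_mean, expert_mean)[0, 1]
    print(f"k={k:2d} raters: r={r:.2f}")  # r plateaus around k = 10-15
```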
Wilson, Cristina G.; Shipley, Thomas F.; Davatzes, Alexandra K.
(Applied Cognitive Psychology)
Summary: Previous research demonstrates that domain experts, like ordinary participant populations, are vulnerable to decision bias. Here, we examine susceptibility to bias amongst expert field scientists. Field scientists operate in less predictable environments than other experts, and feedback on the consequences of their decisions is often unclear or delayed. Thus, field scientists are a population in which the findings of scientific research may be particularly vulnerable to bias. In this study, susceptibility to optimism, hindsight, and framing bias was evaluated in a group of expert field geologists using descriptive decision scenarios. Experts showed susceptibility to all three biases, and susceptibility was not influenced by years of science practice. We found no evidence that participants' vulnerability to one bias was related to their vulnerability to another. Our findings are broadly consistent with previous research on expertise and decision bias, demonstrating that no expert, regardless of their domain experience, is immune to bias.
This dataset includes information regarding 1,620 social science researchers who signed up for the Social Science Extreme Events Research (SSEER) network between July 8, 2018 and December 31, 2023. Researchers’ information is collected via an online survey that consists of 19 questions and takes approximately 7 minutes to complete. The envisioned audience for this data and other information includes those who are interested in learning more about the composition of the social science hazards and disaster research workforce, the events that they study, and their skills and expertise.

This project includes a survey instrument, data, and annual census reports from the National Science Foundation (NSF)-funded Social Science Extreme Events Research (SSEER) network, which is headquartered at the Natural Hazards Engineering Research Infrastructure (NHERI) CONVERGE facility at the Natural Hazards Center at the University of Colorado Boulder. The SSEER network, which was launched in 2018, was formed, in part, to respond to the need for more specific information about the status and expertise of the social science hazards and disaster research workforce. The mission of SSEER is to identify and map social scientists involved in hazards and disaster research in order to highlight their expertise and connect social science researchers to one another, to interdisciplinary teams, and to communities at risk to hazards and affected by disasters. Ultimately, the goals of SSEER are to amplify the contributions of social scientists and to advance the field through expanding the available social science evidence base. To see the SSEER map and to learn more about the SSEER initiative, please visit: https://converge.colorado.edu/research-networks/sseer. All social and behavioral scientists and those in allied disciplines who study the human, economic, policy, and health dimensions of disasters are invited to join this network via a short online survey.

This DesignSafe project includes: (1) the SSEER survey instrument; (2) de-identified data, which is updated annually as new researchers join the SSEER network and returning members update their information; and (3) SSEER annual census reports. These resources are available to all who are interested in learning more about the composition of the social science hazards and disaster workforce. SSEER is part of a larger ecosystem of NSF-funded extreme events research and reconnaissance networks designed to help coordinate disciplinary communities in engineering and the sciences, while also encouraging cross-disciplinary information sharing and interdisciplinary integration. To learn more about the networks and research ecosystem, please visit: https://converge.colorado.edu/research-networks/.
This dataset includes information regarding 648 social science researchers who signed up for the Social Science Extreme Events Research (SSEER) network between July 8, 2018 and December 31, 2018. Researchers’ information is collected via the same 19-question online survey; the project description, network mission, and DesignSafe contents are identical to those of the SSEER record above.
O’Brien, Timothy L., Hawkins, Stephen L., and Loesch, Adam. Scientific Disciplines and the Admissibility of Expert Evidence in Courts. Retrieved from https://par.nsf.gov/biblio/10377974. Socius: Sociological Research for a Dynamic World 8. Web. doi:10.1177/23780231221108044.
O’Brien, Timothy L., Hawkins, Stephen L., & Loesch, Adam. Scientific Disciplines and the Admissibility of Expert Evidence in Courts. Socius: Sociological Research for a Dynamic World, 8. Retrieved from https://par.nsf.gov/biblio/10377974. https://doi.org/10.1177/23780231221108044
@article{osti_10377974,
title = {Scientific Disciplines and the Admissibility of Expert Evidence in Courts},
url = {https://par.nsf.gov/biblio/10377974},
DOI = {10.1177/23780231221108044},
abstractNote = {The authors examine how people interpret expert claims when they do not share experts’ technical understanding. The authors review sociological research on the cultural boundaries of science and expertise, which suggests that scientific disciplines are heuristics nonspecialists use to evaluate experts’ credibility. To test this idea, the authors examine judicial decisions about contested expert evidence in U.S. district courts (n = 575). Multinomial logistic regression results show that judges favor evidence from natural scientists compared with social scientists, even after adjusting for other differences among experts, judges, and court cases. Judges also favor evidence from medical and health experts compared with social scientists. These results help illustrate the assumptions held by judges about the credibility of some disciplines relative to others. They also suggest that judges may rely on tacit assumptions about the merits of different areas of science, which may reflect broadly shared cultural beliefs about expertise and credibility.},
journal = {Socius: Sociological Research for a Dynamic World},
volume = {8},
author = {O’Brien, Timothy L. and Hawkins, Stephen L. and Loesch, Adam},
}