Search for: All records

Award ID contains: 1655426


  1. Abstract Rapid warming could drastically alter host–parasite relationships, which is especially important for fisheries crucial to human nutrition and economic livelihoods, yet we lack a synthetic understanding of how warming influences parasite‐induced mortality in these systems. We conducted a meta‐analysis using 266 effect sizes from 52 empirical papers on harvested aquatic species and determined the relationship between parasite‐induced host mortality and temperature and how this relationship was altered by host, parasite, and study design traits. Overall, higher temperatures increased parasite‐induced host mortality; however, the magnitude of this relationship varied. Hosts from the order Salmoniformes experienced a greater increase in parasite‐induced mortality with temperature than the average response to temperature across fish orders. Opportunistic parasites were associated with a greater increase in infected host mortality with temperature than the average across parasite strategies, while bacterial parasites were associated with lower infected host mortality as temperature increased than the average across parasite types. Thus, parasites will generally increase host mortality as the environment warms; however, this effect will vary among systems. 
    Free, publicly-accessible full text available July 1, 2026
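    Below is a minimal sketch of the kind of moderator analysis this abstract describes: effect sizes regressed on temperature with a host-order term via inverse-variance weighted least squares. The toy data, the column meanings, and the fixed-effect weighting (no between-study variance term) are illustrative assumptions, not the authors' actual model or data.

    # Illustrative only: weighted least-squares meta-regression of effect size
    # on temperature plus a hypothetical Salmoniformes indicator.
    import numpy as np

    def meta_regression(es, var, temp, is_salmoniform):
        """Inverse-variance weighted regression of effect sizes on temperature
        and a binary host-order moderator (assumed coding)."""
        X = np.column_stack([np.ones_like(temp), temp, is_salmoniform])
        W = np.diag(1.0 / var)                      # weight each effect size by 1/variance
        XtWX = X.T @ W @ X
        beta = np.linalg.solve(XtWX, X.T @ W @ es)  # weighted least-squares coefficients
        se = np.sqrt(np.diag(np.linalg.inv(XtWX)))  # coefficient standard errors
        return beta, se

    # Toy inputs: six effect sizes, their sampling variances, study temperatures
    # (degrees C), and an indicator for Salmoniformes hosts (all made up).
    es   = np.array([0.10, 0.25, 0.40, 0.15, 0.55, 0.70])
    var  = np.array([0.04, 0.03, 0.05, 0.02, 0.04, 0.06])
    temp = np.array([10.0, 14.0, 18.0, 12.0, 20.0, 24.0])
    salm = np.array([0, 0, 0, 1, 1, 1])

    beta, se = meta_regression(es, var, temp, salm)
    print("intercept, temperature slope, Salmoniformes shift:", beta)
    print("standard errors:", se)

    A mixed-effects version would add an estimated between-study variance to each sampling variance before weighting, as most ecological meta-analyses do.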
  2. Abstract In ecological meta‐analyses, nonindependence among observed effect sizes from the same source paper is common. If not accounted for, nonindependence can seriously undermine inferences. We compared the performance of four meta‐analysis methods that attempt to address such nonindependence and the standard random‐effects model that ignores nonindependence. We simulated data with various types of within‐paper nonindependence, and assessed the standard deviation of the estimated mean effect size and Type I error rate of each method. Although all four methods performed substantially better than the standard random‐effects model that assumes independence, there were differences in performance among the methods. A two‐step method that first summarizes the multiple observed effect sizes per paper using a weighted mean and then analyzes the reduced data in a standard random‐effects model, and a robust variance estimation method performed consistently well. A hierarchical model with both random paper and study effects gave precise estimates but had higher Type I error rates, possibly reflecting limitations of currently available meta‐analysis software. Overall, we advocate the use of the two‐step method with a weighted paper mean and the robust variance estimation method as reliable ways to handle within‐paper nonindependence in ecological meta‐analyses. 
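    The two‐step method recommended above can be sketched in a few lines of Python. This is an illustration under assumed toy data, not the paper's code, and the within‐paper weighted mean below treats effect sizes from the same paper as if their sampling errors were independent, which is a simplification.

    # Step 1: collapse multiple effect sizes per paper to one weighted mean.
    # Step 2: run a standard DerSimonian-Laird random-effects meta-analysis
    # on the reduced, now independent, per-paper data.
    import numpy as np

    def collapse_by_paper(es, var, paper_id):
        """Inverse-variance weighted mean (and its variance) for each paper."""
        es_p, var_p = [], []
        for p in np.unique(paper_id):
            m = paper_id == p
            w = 1.0 / var[m]
            es_p.append(np.sum(w * es[m]) / np.sum(w))
            var_p.append(1.0 / np.sum(w))            # variance of the weighted mean
        return np.array(es_p), np.array(var_p)

    def random_effects(es, var):
        """DerSimonian-Laird random-effects mean and its standard error."""
        w = 1.0 / var
        mu_fe = np.sum(w * es) / np.sum(w)
        Q = np.sum(w * (es - mu_fe) ** 2)
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (Q - (len(es) - 1)) / c)     # between-paper heterogeneity
        w_re = 1.0 / (var + tau2)
        mu = np.sum(w_re * es) / np.sum(w_re)
        return mu, np.sqrt(1.0 / np.sum(w_re))

    # Toy data: seven effect sizes from three papers (nonindependent within papers).
    es    = np.array([0.20, 0.30, 0.25, 0.60, 0.50, -0.10, 0.00])
    var   = np.array([0.05, 0.04, 0.05, 0.06, 0.05, 0.03, 0.04])
    paper = np.array([1, 1, 1, 2, 2, 3, 3])

    es_p, var_p = collapse_by_paper(es, var, paper)
    mu, se = random_effects(es_p, var_p)
    print(f"pooled mean = {mu:.3f}, SE = {se:.3f}")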
  3. Abstract Despite the wide application of meta‐analysis in ecology, some of the traditional methods used for meta‐analysis may not perform well given the type of data characteristic of ecological meta‐analyses. We reviewed published meta‐analyses on the ecological impacts of global climate change, evaluating the number of replicates used in the primary studies (n_i) and the number of studies or records (k) that were aggregated to calculate a mean effect size. We used the results of the review in a simulation experiment to assess the performance of conventional frequentist and Bayesian meta‐analysis methods for estimating a mean effect size and its uncertainty interval. Our literature review showed that n_i and k were highly variable, distributions were right‐skewed and were generally small (median n_i = 5, median k = 44). Our simulations show that the choice of method for calculating uncertainty intervals was critical for obtaining appropriate coverage (close to the nominal value of 0.95). When k was low (<40), 95% coverage was achieved by a confidence interval (CI) based on the t distribution that uses an adjusted standard error (the Hartung–Knapp–Sidik–Jonkman, HKSJ), or by a Bayesian credible interval, whereas bootstrap or z distribution CIs had lower coverage. Despite the importance of the method to calculate the uncertainty interval, 39% of the meta‐analyses reviewed did not report the method used, and of the 61% that did, 94% used a potentially problematic method, which may be a consequence of software defaults. In general, for a simple random‐effects meta‐analysis, the performance of the best frequentist and Bayesian methods was similar for the same combinations of factors (k and mean replication), though the Bayesian approach had higher than nominal (>95%) coverage for the mean effect when k was very low (k < 15). Our literature review suggests that many meta‐analyses that used z distribution or bootstrapping CIs may have overestimated the statistical significance of their results when the number of studies was low; more appropriate methods need to be adopted in ecological meta‐analyses. 
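    To make the contrast between interval methods concrete, the sketch below computes both a z-distribution Wald CI and an HKSJ t-based CI for the random-effects mean on assumed toy data. It uses standard textbook formulas, not the authors' simulation code, and omits the bootstrap and Bayesian variants discussed in the abstract.

    # Compare the usual z-based Wald CI with the Hartung-Knapp-Sidik-Jonkman
    # (HKSJ) CI for a DerSimonian-Laird random-effects mean.
    import numpy as np
    from scipy import stats

    def re_mean_with_cis(es, var, alpha=0.05):
        k = len(es)
        w_fe = 1.0 / var
        mu_fe = np.sum(w_fe * es) / np.sum(w_fe)
        Q = np.sum(w_fe * (es - mu_fe) ** 2)
        c = np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)
        tau2 = max(0.0, (Q - (k - 1)) / c)           # DerSimonian-Laird tau^2

        w = 1.0 / (var + tau2)
        mu = np.sum(w * es) / np.sum(w)

        # z-distribution Wald CI: normal quantile, SE = sqrt(1 / sum of weights).
        se_z = np.sqrt(1.0 / np.sum(w))
        z = stats.norm.ppf(1 - alpha / 2)
        ci_z = (mu - z * se_z, mu + z * se_z)

        # HKSJ CI: adjusted SE and a t quantile with k - 1 degrees of freedom.
        se_hksj = np.sqrt(np.sum(w * (es - mu) ** 2) / ((k - 1) * np.sum(w)))
        t = stats.t.ppf(1 - alpha / 2, df=k - 1)
        ci_hksj = (mu - t * se_hksj, mu + t * se_hksj)
        return mu, ci_z, ci_hksj

    # Toy data with small k, where the z-based CI is typically too narrow.
    es  = np.array([0.10, 0.40, -0.20, 0.30, 0.60, 0.00, 0.25, 0.50])
    var = np.array([0.05, 0.04, 0.06, 0.03, 0.05, 0.04, 0.06, 0.05])
    mu, ci_z, ci_hksj = re_mean_with_cis(es, var)
    print(f"mean = {mu:.3f}; z CI = {ci_z}; HKSJ CI = {ci_hksj}")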
  4. Silva, Daniel de (Ed.)
    Quantitatively summarizing results from a collection of primary studies with meta-analysis can help answer ecological questions and identify knowledge gaps. The accuracy of the answers depends on the quality of the meta-analysis. We reviewed the literature assessing the quality of ecological meta-analyses to evaluate current practices and highlight areas that need improvement. From each of the 18 review papers that evaluated the quality of meta-analyses, we calculated the percentage of meta-analyses that met criteria related to specific steps taken in the meta-analysis process (i.e., execution) and the clarity with which those steps were articulated (i.e., reporting). We also re-evaluated all the meta-analyses available from Pappalardo et al. [1] to extract new information on ten additional criteria and to assess how the meta-analyses recognized and addressed non-independence. In general, we observed better performance for criteria related to reporting than for criteria related to execution; however, there was wide variation among criteria and meta-analyses. Meta-analyses had low compliance with regard to correcting for phylogenetic non-independence, exploring temporal trends in effect sizes, and conducting a multifactorial analysis of moderators (i.e., explanatory variables). In addition, although most meta-analyses included multiple effect sizes per study, only 66% acknowledged some type of non-independence. The types of non-independence reported were more often related to the design of the original experiment (e.g., the use of a shared control) than to other sources (e.g., phylogeny). We suggest that providing specific training and encouraging authors to follow the PRISMA EcoEvo checklist recently developed by O’Dea et al. [2] can improve the quality of ecological meta-analyses. 