There has been a recent explosion of articles on minimum sample sizes needed for analyzing qualitative data. The purpose of this integrated review is to examine this literature for 10 types of qualitative data analysis (5 types of saturation and 5 common methods). Building on established reviews and expanding to new methods, we extract the following sample size guidelines: theme saturation (9 interviews; 4 focus groups), meaning saturation (24 interviews; 8 focus groups), theoretical saturation (20–30+ interviews), metatheme saturation (20–40 interviews per site), and saturation in salience (10 exhaustive free lists); two methods where power analysis determines sample size: classical content analysis (statistical power analysis) and qualitative content analysis (information power); and three methods with little or no sample size guidance: reflexive thematic analysis, schema analysis, and ethnography (current guidance indicates 50–81 data documents or 20–30 interviews may be adequate). Our review highlights areas in which the extant literature does not provide sufficient sample size guidance, not because it is epistemologically flawed, but because it is not yet comprehensive and nuanced enough. To address this, we conclude by proposing ways researchers can navigate and contribute to the complex literature on sample size estimates.
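Of the two power-analysis routes named above, the classical one is the most mechanical. A minimal sketch, assuming a simple two-group comparison of code prevalence and the statsmodels library (the proportions, alpha, and power targets are hypothetical, not taken from the review):

```python
# Minimal sketch: statistical power analysis for sample size, as used in
# classical content analysis. All numbers are hypothetical.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Suppose we expect a code to appear in 50% of documents in one group
# versus 30% in another, and we want 80% power at alpha = 0.05.
effect_size = proportion_effectsize(0.50, 0.30)

n_per_group = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,
    power=0.80,
    alternative="two-sided",
)
print(f"Documents needed per group: {n_per_group:.0f}")
```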
Improved methodology for the analysis of polydisperse engineered and natural colloids by single particle inductively coupled plasma mass spectrometry (spICP-MS)
Measuring the sample at multiple dilutions allows artifact-free analysis of different regions of the particle size distribution, and power-law modeling extends the size range over which the distribution can be analyzed.
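As a rough illustration of the power-law modeling step, a minimal sketch fitting n(d) = A·d^(-alpha) to binned counts by log-log regression. The data are synthetic, and the multi-dilution stitching described in the abstract is not reproduced:

```python
# Minimal sketch: fitting a power law n(d) = A * d**(-alpha) to a binned
# particle size distribution, as a stand-in for the power-law modeling
# step in spICP-MS data analysis. The counts below are synthetic.
import numpy as np

rng = np.random.default_rng(0)
diameters = np.geomspace(30, 300, 20)   # bin centers in nm (hypothetical)
counts = 1e6 * diameters**-2.5 * rng.lognormal(0.0, 0.1, 20)  # synthetic

# Linear regression in log-log space: log n = log A - alpha * log d
slope, intercept = np.polyfit(np.log(diameters), np.log(counts), 1)
alpha, prefactor = -slope, np.exp(intercept)
print(f"Fitted exponent alpha = {alpha:.2f}, prefactor A = {prefactor:.3g}")

# The fitted model can then describe parts of the distribution where raw
# counts from a single dilution are too sparse or artifact-prone.
```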
- PAR ID: 10555281
- Publisher / Repository: Royal Society of Chemistry
- Date Published:
- Journal Name: Environmental Science: Nano
- Volume: 10
- Issue: 11
- ISSN: 2051-8153
- Page Range / eLocation ID: 3136–3148
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- There is a critical need to generate environmentally relevant microplastics (MPs) and nanoplastics (NPs) to better investigate their behavior in laboratory settings. Environmental MPs are heterogeneous in size and shape, unlike the monodisperse, uniform microspheres commonly used as surrogates. Cryogenic grinding, or cryomilling, was successfully utilized to transform polystyrene (PS) bulk material into heterogeneous micro- and nanoscale fragments. Fourier-transform infrared (FTIR) spectroscopy confirmed that this approach did not alter polymer surface chemistry. The number of milling cycles (time of milling) and the frequency of grinding (intensity of milling) were varied to investigate the role cryomilling parameters had on the characteristics of the generated MPs. The resulting particle size distributions of cryomilled samples were measured and compared: a Coulter Counter and Nanoparticle Tracking Analysis (NTA) were used to measure the particle size distributions in the micro- and nanoparticle size ranges, respectively. Microspheres were used to determine which camera settings yielded more accurate sizing and to reduce bias in the NTA analysis. Increasing the number of milling cycles generally increased the number of smaller particles. The evolution of the measured size distributions indicated that small nanosized fragments broke off from larger MPs during cryomilling, steadily eroding the larger MP fragments. The number of milling cycles was observed to impact the size distributions of fragments more consistently than the frequency of milling. This study offers both an analysis of the cryomilling process and recommendations for generating more realistic PS MP/NPs for examining environmental fate and effects.
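To make the distribution comparison concrete, a minimal sketch in which synthetic lognormal data stand in for Coulter Counter/NTA measurements; all sizes and cycle counts are hypothetical:

```python
# Minimal sketch: comparing particle size distributions from two milling
# conditions. Synthetic lognormal data stand in for Coulter Counter /
# NTA measurements; all sizes and cycle counts are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
sizes_5_cycles = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=5000)   # um
sizes_20_cycles = rng.lognormal(mean=np.log(0.8), sigma=0.7, size=5000)  # um

# A shift toward smaller sizes with more cycles is the signature of
# nanosized fragments steadily eroding off larger particles.
for label, sizes in [("5 cycles", sizes_5_cycles), ("20 cycles", sizes_20_cycles)]:
    print(f"{label}: median {np.median(sizes):.2f} um, "
          f"{np.mean(sizes < 1.0):.0%} of particles below 1 um")
```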
- An appropriate heat conduction model is indispensable for experimental data analysis in nanothermometry, both to extract parameters of interest and to achieve a fundamental understanding of phonon-mediated heat transfer in nanostructures and across interfaces. Recently, nanoscale periodic metallic gratings have been used as a group of distributed heaters as well as transducers in nanothermometry. In this technique, however, the effective thermal conductivity (ETC) and the thermal interface resistivity are both hotspot-size dependent and coupled, which poses a challenge for experimental data analysis: Fourier's law fails to extract both the ETC and the thermal interface resistivity simultaneously. To overcome this challenge, a novel two-parameter nondiffusive heat conduction (TPHC) model, which has been successfully applied to data analysis in different types of pump-probe experiments, is applied to analyze laser-induced nondiffusive heat transfer in nanoscale metallic grating experiments. Since the hotspot-size-dependent ETC is automatically captured by the TPHC model, the hotspot-size-dependent interface resistivity becomes the only parameter to be determined from experiments through data fitting. Thus, the hotspot-size-dependent thermal interface resistivity can be determined from experiments without interference from the hotspot-size-dependent ETC. Currently, there is no criterion to predict when Fourier's law breaks down in nanoscale heat transfer. To fill this gap, a criterion based on the TPHC model is identified to predict the valid range of Fourier's law; the criterion is validated in both theoretical analyses and nanoscale metallic grating experiments.
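The abstract does not give the TPHC equations, so the following is only a shape-of-the-workflow sketch: a one-parameter least-squares fit in which a placeholder function `tphc_signal` stands in for the model prediction and the interface resistivity is the single free parameter:

```python
# Minimal sketch of the fitting step only: with the hotspot-size
# dependence captured by the model, the interface resistivity is the
# single free parameter. `tphc_signal` is a hypothetical placeholder,
# NOT the actual TPHC model, which the abstract does not specify.
import numpy as np
from scipy.optimize import curve_fit

def tphc_signal(hotspot_size_nm, r_interface):
    # Placeholder prediction; a real implementation would solve the
    # two-parameter nondiffusive heat conduction (TPHC) model.
    return 1.0 / (hotspot_size_nm + 50.0 * r_interface)

hotspot_sizes = np.array([50.0, 100.0, 200.0, 400.0])  # nm, hypothetical
rng = np.random.default_rng(2)
measured = tphc_signal(hotspot_sizes, 0.8) * (1.0 + 0.02 * rng.standard_normal(4))

(r_fit,), _ = curve_fit(tphc_signal, hotspot_sizes, measured, p0=[1.0])
print(f"Fitted interface resistivity (arbitrary units): {r_fit:.2f}")
```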
- Many-analysts studies explore how well an empirical claim withstands plausible alternative analyses of the same dataset by multiple, independent analysis teams. Conclusions from these studies typically rely on a single outcome metric (e.g., effect size) provided by each analysis team. Although informative about the range of plausible effects in a dataset, a single effect size from each team does not provide a complete, nuanced understanding of how analysis choices are related to the outcome. We used the Delphi consensus technique with input from 37 experts to develop an 18-item subjective evidence evaluation survey (SEES) to evaluate how each analysis team views the methodological appropriateness of the research design and the strength of evidence for the hypothesis. We illustrate the usefulness of the SEES in providing richer evidence assessment with pilot data from a previous many-analysts study.
- This study indicates the most effective combinations of scaffolding features within computer science and technology education settings. It addresses the research question, "What combinations of scaffolding characteristics, contexts of use, and assessment levels lead to medium and large effect sizes among college- and graduate-level engineering and technology learners?" To do so, studies in which scaffolding led to a medium or large effect size within the context of technology and engineering education were identified within a scaffolding meta-analysis data set. Next, two-step cluster analysis in SPSS 24 was used to identify distinct groups of scaffolding attributes tailored to learning computer science at the undergraduate and graduate levels. Input variables included the different scaffolding characteristics, the context of use, the education level, and the effect size. There was an eight-cluster solution: five clusters were associated with a large effect size, two with a medium effect size, and one with both medium and large effect sizes. The three most important predictors were the context in which scaffolding was used, whether and how scaffolding was customized over time, and the decision rules that governed scaffolding change. Notably, highly effective scaffolding clusters were associated with most levels of each predictor.
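SPSS's two-step cluster procedure has no direct Python equivalent; the sketch below only approximates the idea with k-means on one-hot-encoded attributes. All variable names, categories, and the cluster count are hypothetical:

```python
# Rough analogue of the clustering step: SPSS's two-step cluster
# procedure is not reproduced here; k-means on one-hot-encoded
# attributes approximates the idea. All variables are hypothetical.
import pandas as pd
from sklearn.cluster import KMeans

studies = pd.DataFrame({
    "context":       ["CS", "engineering", "CS", "technology", "CS", "engineering"],
    "customization": ["none", "fading", "adding", "none", "fading", "adding"],
    "effect_size":   ["large", "medium", "large", "medium", "large", "large"],
})

X = pd.get_dummies(studies)  # one-hot encode the categorical attributes
studies["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(studies)
```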