The increasing use of machine learning and Large Language Models (LLMs) opens up opportunities to apply these algorithms in novel ways. This article proposes a methodology that uses LLMs to support traditional deductive coding in qualitative research. We began our analysis with three sample texts taken from existing interviews. Next, we created a codebook and inputted the sample text and codebook into an LLM. We asked the LLM to determine whether the codes were present in the provided sample text and requested evidence to support its coding. The sample texts were inputted 160 times to record changes between iterations of the LLM response; each iteration was analogous to a new coder deductively analyzing the text with the codebook. In our results, we present the outputs of these recursive analyses, along with a comparison of the LLM coding to evaluations made by human coders using traditional coding methods. We argue that LLM analysis can aid qualitative researchers by deductively coding transcripts, providing a systematic and reliable platform for code identification, and offering a means of avoiding analysis misalignment. Implications of using LLMs in research praxis are discussed, along with current limitations.
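The iteration loop the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: `ask_llm` is a hypothetical callable standing in for whatever LLM API is used, and the codebook entries are invented for the example.

```python
def build_prompt(codebook, text):
    """Assemble a deductive-coding prompt from a codebook and a sample text."""
    codes = "\n".join(f"- {code}: {definition}" for code, definition in codebook.items())
    return (
        "Codebook:\n" + codes +
        "\n\nFor each code, state whether it is present in the text below "
        "and quote the passage that supports your decision.\n\nText:\n" + text
    )

def tally_codes(ask_llm, codebook, text, iterations=160):
    """Treat each LLM call as an independent coder and count how often
    each code is judged present across the iterations."""
    counts = {code: 0 for code in codebook}
    prompt = build_prompt(codebook, text)
    for _ in range(iterations):
        present = ask_llm(prompt)  # assumed to return the set of codes judged present
        for code in present:
            counts[code] += 1
    return counts
```

Comparing the per-code counts across the 160 iterations is what makes the inter-iteration (in)consistency of the model visible, analogous to inter-rater agreement among human coders.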
X-RAY TOMOGRAPHIC MICROSCOPY AS A MEANS TO SYSTEMATICALLY TRACK EXPERIMENTAL DECAY AND FOSSILIZATION
ABSTRACT Laboratory-based decay experiments have become commonly used to supplement our understanding of how organisms enter the fossil record. Differences in how these experiments are designed and evaluated, however, including dissimilarities in qualitative decay-scoring indices superimposed on variability in model organisms, render any comparison between studies unreliable. Here, we introduce the utility of X-ray tomographic microscopy (μCT) as a means for reliable and repeatable analysis of soft-tissue decay experiment products. As proof of concept, we used a relatively simple experimental design with classic studies as comparators, and present our analytical protocol using μCT for capturing the entire volume of the decay subject. Segmentation software then allows for 3D volume analysis and high-resolution internal and external character identification. We describe the workflow from sample preparation, contrast staining, and data collection to processing and analysis of the resulting data, using peppermint shrimp (Lysmata wurdemanni) as model organisms, and compare our results to previous taphonomic studies. These methods allow for improved visualization and quantification of decay and internal volume analysis with minimal handling compared to traditional qualitative scoring methods. Using the same scoring criteria as previous studies, we found similar decay results for certain features, while additionally detecting other feature loss or alteration earlier—importantly, without the need for potentially distortive sample handling. We conclude that μCT is a more effective, straightforward, and exact means of extracting quantitative data on the progression of decay and should be adopted in future studies, where available, to streamline and standardize comparisons.
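The kind of quantification a segmented μCT labelmap enables can be sketched briefly. This is an illustrative example only, assuming an isotropic voxel size and a single integer label per tissue; neither reflects the study's actual segmentation parameters.

```python
import numpy as np

def tissue_volume_mm3(labels, voxel_size_mm, tissue_label=1):
    """Volume of one segmented tissue in a uCT labelmap:
    voxel count times the volume of a single (isotropic) voxel."""
    return int(np.count_nonzero(labels == tissue_label)) * voxel_size_mm ** 3

def percent_remaining(labels_day_n, labels_day_0, voxel_size_mm, tissue_label=1):
    """Soft-tissue volume at day n as a percentage of the initial volume."""
    v0 = tissue_volume_mm3(labels_day_0, voxel_size_mm, tissue_label)
    vn = tissue_volume_mm3(labels_day_n, voxel_size_mm, tissue_label)
    return 100.0 * vn / v0
```

Because the measurement is a voxel count, it is repeatable across scans and requires no handling of the specimen beyond the scan itself.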
- Award ID(s): 1652351
- PAR ID: 10326940
- Journal Name: PALAIOS
- Volume: 36
- Issue: 6
- ISSN: 0883-1351
- Page Range / eLocation ID: 216 to 224
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
This study proposes and demonstrates how computer‐aided methods can be used to extend qualitative data analysis by quantifying qualitative data, and then through exploration, categorization, grouping, and validation. Computer‐aided approaches to inquiry have gained important ground in educational research, mostly through data analytics and large data set processing. We argue that qualitative data analysis methods can also be supported and extended by computer‐aided methods. In particular, we posit that computing capacities rationally applied can expand the innate human ability to recognize patterns and group qualitative information based on similarities. We propose a principled approach to using machine learning in qualitative education research based on the three interrelated elements of the assessment triangle: cognition, observation, and interpretation. Through the lens of the assessment triangle, the study presents three examples of qualitative studies in engineering education that have used computer‐aided methods for visualization and grouping. The first study focuses on characterizing students' written explanations of programming code, using tile plots and hierarchical clustering with binary distances to identify the different approaches that students used to self‐explain. The second study looks into students' modeling and simulation process and elicits the types of knowledge that they used in each step through a think‐aloud protocol. For this purpose, we used a bubble plot and a k‐means clustering algorithm. The third and final study explores engineering faculty's conceptions of teaching, using data from semi‐structured interviews. We grouped these conceptions based on coding similarities, using Jaccard's similarity coefficient, and visualized them using a treemap. We conclude this manuscript by discussing some implications for engineering education qualitative research.
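The coding-similarity grouping in the third study can be illustrated with a small sketch: Jaccard's similarity coefficient computed over sets of applied codes, followed by a simple greedy single-link grouping. This is a deliberate simplification of the clustering the study actually used, and the interview names and codes below are hypothetical.

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of applied codes: |A & B| / |A | B|."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def group_by_similarity(coded, threshold=0.5):
    """Greedy single-link grouping: an interview joins the first existing group
    containing any member whose Jaccard similarity meets the threshold."""
    groups = []
    for name, codes in coded.items():
        for group in groups:
            if any(jaccard(codes, coded[member]) >= threshold for member in group):
                group.append(name)
                break
        else:
            groups.append([name])
    return groups
```

With binary code vectors like these, the same pairwise similarity matrix can feed hierarchical clustering or a treemap layout, as in the studies described above.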
In this paper, we investigate ultra-high-molecular-weight polyethylene (UHMWPE) doped with conductive carbon black (CCB) nanoparticles. This nanocomposite is considered a candidate for biomedical applications such as orthopedics. Micro-computed tomography (μCT) and scanning electron microscopy studies show that the composite has a complex microstructure consisting of larger particles of UHMWPE surrounded by a thin layer containing a high concentration of CCB nanoinclusions. The overall mechanical properties of these composites depend on the volume fraction of CCB and the manufacturing procedure, e.g., compression molding or equal channel angular extrusion. To predict the effective elastic properties of the CCB/UHMWPE nanocomposite, we propose a multiscale modeling framework based on a combined analytical-numerical approach. μCT images are processed to extract the size, shape, and orientation distributions of UHMWPE particles as well as the volume fraction and spatial distribution of the CCB-containing layer. These distributions are used to develop multiscale numerical models of the composite, including finite element analysis of representative volume elements on the mesoscale and micromechanical predictions of the CCB-containing layer on the microscale. The predictive ability of the models is confirmed by comparison with experimental measurements obtained by dynamic mechanical analysis.
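As a rough illustration of the micromechanical side, the classical Voigt and Reuss bounds give first-order estimates of a two-phase composite's effective modulus from the constituent moduli and filler volume fraction. This is a textbook baseline, not the paper's FEA/RVE framework, and the input values in the usage below are hypothetical.

```python
def voigt_reuss_hill(e_matrix, e_filler, vf):
    """Voigt (upper) and Reuss (lower) bounds on the effective modulus of a
    two-phase composite, plus their Hill average; vf is the filler volume fraction."""
    voigt = vf * e_filler + (1.0 - vf) * e_matrix          # isostrain (rule of mixtures)
    reuss = 1.0 / (vf / e_filler + (1.0 - vf) / e_matrix)  # isostress (inverse rule)
    return voigt, reuss, 0.5 * (voigt + reuss)
```

For a stiff filler in a compliant matrix the two bounds are far apart, which is precisely why microstructure-informed models like the one in this paper are needed to pin down the effective response.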
There has been a recent explosion of articles on minimum sample sizes needed for analyzing qualitative data. The purpose of this integrated review is to examine this literature for 10 types of qualitative data analysis (5 types of saturation and 5 common methods). Building on established reviews and expanding to new methods, our findings extract the following sample size guidelines: theme saturation (9 interviews; 4 focus groups), meaning saturation (24 interviews; 8 focus groups), theoretical saturation (20–30+ interviews), metatheme saturation (20–40 interviews per site), and saturation in salience (10 exhaustive free lists); two methods where power analysis determines sample size: classical content analysis (statistical power analysis) and qualitative content analysis (information power); and three methods with little or no sample size guidance: reflexive thematic analysis, schema analysis, and ethnography (current guidance indicates 50–81 data documents or 20–30 interviews may be adequate). Our review highlights areas in which the extant literature does not provide sufficient sample size guidance—not because it is epistemologically flawed, but because it is not yet comprehensive and nuanced enough. To address this, we conclude by proposing ways researchers can navigate and contribute to the complex literature on sample size estimates.
We present a complete open-hardware and software materials acceleration platform (MAP) for sonochemical synthesis of nanocrystals using a versatile tool-changing platform (Jubilee) configured for automated ultrasound application, a liquid-handling robot (Opentrons OT2) and a well-plate spectrometer. An automated high-throughput protocol was developed demonstrating the synthesis of CdSe nanocrystals using sonochemistry and different combinations of sample conditions, including precursor and ligand compositions and concentrations. Cavitation caused by ultrasound fields causes local and transient increases in temperature and pressure sufficient to drive the decomposition of organometallic precursors to drive the chemical reaction leading to nanocrystal formation. A total of 625 unique sample conditions were prepared and analyzed in triplicate with an individual sample volume of as little as 0.5 mL, which drastically reduced chemical waste and experimental times. The rapid onset of cavitation and quick dissipation of energy result in fast nucleation with little nanocrystal growth leading to the formation of small nanocrystals or magic-size clusters (MSCs) depending on composition. Using the effective mass approximation, the calculated QD diameters obtained under all our experimental conditions ranged between 1.3 and 2.1 nm, which was also validated with small angle X-ray scattering (SAXS). Polydispersity, QD shape and optical properties largely varied depending on the concentration of ligands present in solution. Statistical analysis of the spectroscopic data corroborates the qualitative relationships observed from the optical characterization of the samples with the model-agnostic SHAP analysis. The complete workflow relies on relatively low-cost and open-source systems. Automation and the reduced volumes also allow for cost-efficient experimentation, increasing the accessibility of this MAP. 
The high-throughput capabilities of the automated sonication platform, the extensible nature of the Jubilee system, and the modular nature of the protocol make the workflow adaptable to a variety of future studies, including other nanocrystal design spaces, emulsification processes, and nanoparticle re-dispersion or exfoliation.
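The effective mass approximation mentioned above can be sketched as a particle-in-a-sphere estimate relating the first absorption peak to dot diameter. This simplified form neglects the Coulomb correction of the full Brus equation, and the default CdSe parameters (bulk gap ~1.74 eV, electron mass 0.13 m0, hole mass 0.45 m0) are typical literature values, not those fitted in the study.

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M0 = 9.1093837015e-31    # electron rest mass, kg
EV = 1.602176634e-19     # J per eV

def ema_diameter_nm(peak_ev, eg_ev=1.74, me_rel=0.13, mh_rel=0.45):
    """Particle-in-a-sphere effective-mass estimate of quantum-dot diameter (nm)
    from the first absorption peak (eV): the confinement energy above the bulk
    gap scales as 1/R^2, so R follows from inverting that relation."""
    confinement_j = (peak_ev - eg_ev) * EV
    if confinement_j <= 0:
        raise ValueError("peak energy must exceed the bulk band gap")
    mu_inv = 1.0 / (me_rel * M0) + 1.0 / (mh_rel * M0)
    radius_m = math.sqrt((HBAR ** 2 * math.pi ** 2 / 2.0) * mu_inv / confinement_j)
    return 2.0 * radius_m * 1e9
```

The 1/R^2 scaling means bluer absorption peaks map to smaller dots, which is why peak position alone gives a useful size estimate that can then be cross-checked against SAXS, as in the study.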