Title: TXM-Sandbox: an open-source software for transmission X-ray microscopy data analysis
A transmission X-ray microscope (TXM) can probe the morphology and chemistry of specimens tens to hundreds of micrometres thick at a length scale of tens to hundreds of nanometres. It has broad applications in materials science and battery research. TXM data processing comprises multiple steps. A workflow software package has been developed that integrates all the tools required for general TXM data processing and visualization. The software is written in Python and has a graphical user interface in Jupyter Notebook. Users have access to the intermediate analysis results within Jupyter Notebook and can insert extra data processing steps in addition to those integrated in the software. The software seamlessly integrates ImageJ as its primary image viewer, providing rich image visualization and processing routines. As a guide for users, several TXM-specific data analysis issues and examples are also presented.
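As a rough illustration of the usage pattern described above (inspecting an intermediate result in a notebook cell and splicing in an extra processing step before handing data back to the workflow), here is a minimal, generic sketch. It does not use the TXM-Sandbox API; the array, the median-filter step, and all names are illustrative assumptions.

```python
# Illustrative only: a generic pattern for inspecting an intermediate
# result in a Jupyter cell and inserting an extra processing step before
# continuing a workflow. Names are hypothetical and do not reflect the
# TXM-Sandbox API.
import numpy as np
from scipy import ndimage

# Pretend this 3D array is an intermediate result exposed by a workflow
# step (e.g. a reconstructed volume).
recon = np.random.rand(64, 256, 256).astype(np.float32)

# Extra user-inserted step: suppress salt-and-pepper noise slice by slice.
filtered = ndimage.median_filter(recon, size=(1, 3, 3))

# Quick sanity check in the notebook before passing the volume to the
# next workflow stage (or to ImageJ for visual inspection).
print(filtered.shape, filtered.dtype, float(filtered.mean()))
```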
Award ID(s):
2045570
NSF-PAR ID:
10318709
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
Journal of Synchrotron Radiation
Volume:
29
Issue:
1
ISSN:
1600-5775
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. iCn3D was initially developed as a web-based 3D molecular viewer. It then evolved from visualization into a full-featured interactive structural analysis software. It became a collaborative research instrument through the sharing of permanent, shortened URLs that encapsulate not only annotated visual molecular scenes, but also all underlying data and analysis scripts in a FAIR manner. More recently, with the growth of structural databases, the need to analyze large structural datasets systematically led us to use Python scripts and convert the code to be used in Node.js scripts. We showed a few examples of Python scripts at https://github.com/ncbi/icn3d/tree/master/icn3dpython to export secondary structures or PNG images from iCn3D. Users just need to replace the URL in the Python scripts to export other annotations from iCn3D. Furthermore, any interactive iCn3D feature can be converted into a Node.js script to be run in batch mode, enabling an interactive analysis performed on one or a handful of protein complexes to be scaled up to the analysis of large ensembles of structures. Examples of currently available Node.js analysis scripts are available at https://github.com/ncbi/icn3d/tree/master/icn3dnode. This development will enable ensemble analyses on growing structural databases such as AlphaFold or RoseTTAFold on one hand and electron microscopy on the other. In this paper, we also review new features such as DelPhi electrostatic potential, 3D view of mutations, alignment of multiple chains, assembly of multiple structures by realignment, dynamic symmetry calculation, 2D cartoons at different levels, interactive contact maps, and use of iCn3D in Jupyter Notebook as described at https://pypi.org/project/icn3dpy.
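As a loose illustration of the URL-driven batch export pattern these scripts follow, here is a minimal sketch. The URL and its parameters are placeholders, not the documented iCn3D interface; the actual scripts live at https://github.com/ncbi/icn3d/tree/master/icn3dpython.

```python
# Illustrative only: the general shape of a URL-driven batch export, in
# the spirit of the icn3dpython scripts described above. The URL and
# parameters below are placeholders, not the documented iCn3D API.
import requests

url = "https://example.org/icn3d-export"   # placeholder URL to replace
params = {"pdbid": "1TUP", "type": "png"}  # placeholder parameters

response = requests.get(url, params=params, timeout=60)
response.raise_for_status()

# Save whatever the export endpoint returns (here, a PNG image).
with open("1TUP_export.png", "wb") as handle:
    handle.write(response.content)
```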
  2. Neuroscientists are increasingly relying on parallel and distributed computing resources for analysis and visualization of their neuron simulations. This requires expert knowledge of programming and cyberinfrastructure configuration, which is beyond the repertoire of most neuroscience programs. This paper presents early experiences from a one-credit graduate research training course titled ECE 8001 “Software and Cyber Automation in Neuroscience” at the University of Missouri for engendering multi-disciplinary collaborations between computational neuroscience and cyberinfrastructure students and faculty. Specifically, we discuss the course organization and exemplar outcomes involving a next-generation science gateway for training novice users on exemplar neuroscience use cases that involve using tools such as NEURON and MATLAB on local as well as Neuroscience Gateway resources. We also discuss our vision towards a course sequence curriculum for graduate/undergraduate students from biological/psychological sciences and computer science/engineering to jointly build “self-service” training modules using Jupyter Notebook platforms. Thus, our efforts show how we can create scalable and sustainable cyber and software automation for fulfilling a broad set of neuroscience research and education use cases.
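For readers unfamiliar with the tools named above, the following is a minimal single-compartment NEURON simulation in Python, the kind of exercise a Jupyter-based training module might contain. It is a generic Hodgkin-Huxley example and is not taken from the ECE 8001 course materials.

```python
# A minimal single-compartment Hodgkin-Huxley simulation using NEURON's
# Python interface. Generic example, not course material.
from neuron import h
h.load_file("stdrun.hoc")            # load NEURON's standard run system

soma = h.Section(name="soma")
soma.L = soma.diam = 20              # soma geometry in micrometres
soma.insert("hh")                    # Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))           # current clamp at the midpoint
stim.delay, stim.dur, stim.amp = 5, 1, 0.3   # ms, ms, nA

v = h.Vector().record(soma(0.5)._ref_v)      # record membrane potential
t = h.Vector().record(h._ref_t)              # record time

h.finitialize(-65)                   # initialise to -65 mV
h.continuerun(40)                    # simulate 40 ms
print(f"peak membrane potential: {v.max():.1f} mV")
```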
  3. CompuCell3D (CC3D) is an open-source software framework for building and executing multi-cell biological virtual-tissue models. It represents cells using the Glazier–Graner–Hogeweg model, also known as the Cellular Potts model. The primary CC3D application consists of two separate tools, a smart model editor (Twedit++) and a tool for model execution, visualization and steering (Player). The CompuCell3D version 4.x release introduces support for Jupyter Notebooks, an interactive computational environment, which brings the benefits of reproducibility, portability, and self-documentation. Since model specifications in CC3D are written in Python and CC3DML and Jupyter supports Python and other languages, Jupyter can naturally act as an integrated development environment (IDE) for CC3D users as well as a live document with embedded text and simulations. This update follows the trend in software to move away from monolithic freestanding applications to the distribution of methodologies in the form of libraries that can be used in conjunction with other libraries and packages. With these benefits, CC3D deployed in Jupyter Notebook is a more natural and efficient platform for scientific publishing and education using CC3D.
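To make the underlying cell representation concrete, here is a bare-bones NumPy sketch of a single Glazier–Graner–Hogeweg (Cellular Potts) copy attempt. It is independent of the CC3D API, omits the volume constraint of a full model, and its temperature and contact-energy values are arbitrary illustrative choices.

```python
# A bare-bones Cellular Potts (Glazier-Graner-Hogeweg) copy attempt in
# NumPy, to illustrate the mechanism CC3D implements. Not CC3D code;
# only contact energy is modelled, and J and T are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
lattice = np.zeros((50, 50), dtype=int)   # 0 = medium
lattice[10:25, 10:25] = 1                 # cell 1
lattice[25:40, 25:40] = 2                 # cell 2
J, T = 2.0, 10.0                          # energy per unlike neighbour pair, temperature

def contact_energy(lat):
    """Total boundary energy over nearest-neighbour pairs with unlike ids."""
    horiz = np.sum(lat[:, 1:] != lat[:, :-1])
    vert = np.sum(lat[1:, :] != lat[:-1, :])
    return J * (horiz + vert)

def copy_attempt(lat):
    """Pick a random interior site and try to copy a neighbour's id into it."""
    x, y = rng.integers(1, lat.shape[0] - 1, size=2)
    dx, dy = [(-1, 0), (1, 0), (0, -1), (0, 1)][rng.integers(4)]
    source_id = lat[x + dx, y + dy]
    if source_id == lat[x, y]:
        return lat                        # copying the same id changes nothing
    trial = lat.copy()
    trial[x, y] = source_id
    dE = contact_energy(trial) - contact_energy(lat)
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        return trial                      # accept the copy (Metropolis rule)
    return lat                            # reject

for _ in range(1000):
    lattice = copy_attempt(lattice)
print("cell 1 volume after 1000 attempts:", int(np.sum(lattice == 1)))
```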
  4. PmagPy Online: Jupyter Notebooks, the PmagPy Software Package and the Magnetics Information Consortium (MagIC) Database. Lisa Tauxe$^1$, Rupert Minnett$^2$, Nick Jarboe$^1$, Catherine Constable$^1$, Anthony Koppers$^2$, Lori Jonestrask$^1$, Nick Swanson-Hysell$^3$. $^1$Scripps Institution of Oceanography, United States of America; $^2$Oregon State University; $^3$University of California, Berkeley; ltauxe@ucsd.edu.
     The Magnetics Information Consortium (MagIC), hosted at http://earthref.org/MagIC, is a database that serves as a Findable, Accessible, Interoperable, Reusable (FAIR) archive for paleomagnetic and rock magnetic data. It has a flexible, comprehensive data model that can accommodate most kinds of paleomagnetic data. The PmagPy software package is a cross-platform, open-source set of tools written in Python for the analysis of paleomagnetic data that serves as one interface to MagIC, accommodating various levels of user expertise; it is available through github.com/PmagPy. Because PmagPy requires installation of Python, several non-standard Python modules, and the PmagPy software package itself, there is a speed bump for many practitioners beginning to use the software. In order to make the software and MagIC more accessible to the broad spectrum of scientists interested in paleo and rock magnetism, we have prepared a set of Jupyter notebooks, hosted on jupyterhub.earthref.org, which serve several purposes: 1) a complete course in Python for Earth Scientists; 2) a set of notebooks that introduce PmagPy (pulling the software package from the GitHub repository) and illustrate how it can be used to create data products and figures for typical papers; and 3) notebooks that show how to prepare data from the laboratory for upload into the MagIC database. The latter will satisfy expectations from NSF for data archiving and, for example, the AGU publication data archiving requirements.
     Getting started: To use the PmagPy notebooks online, go to https://jupyterhub.earthref.org/. Create an EarthRef account using your ORCID and log on. (This allows you to keep files in a private workspace.) Open the PmagPy Online - Setup notebook and execute the two cells, then click on File => Open and click on the PmagPy_Online folder. Open the PmagPy_online notebook and work through the examples. There are other notebooks that are useful for the working paleomagnetist. Alternatively, you can install Python and the PmagPy software package on your computer (see https://earthref.org/PmagPy/cookbook for instructions) and follow the instructions for "Full PmagPy install and update" through section 1.4 (Quickstart with PmagPy notebooks); this notebook is in the collection of PmagPy notebooks.
     Overview of MagIC: The MagIC data model is fully described at https://www2.earthref.org/MagIC/data-models/3.0. Each contribution is associated with a publication via its DOI. There are nine data tables:
     contribution: metadata of the associated publication.
     locations: metadata for locations, which are groups of sites (e.g., stratigraphic section, region, etc.).
     sites: metadata and derived data at the site level (units with a common expectation).
     samples: metadata and derived data at the sample level.
     specimens: metadata and derived data at the specimen level.
     measurements: the measurement data from which the specimen-level data are derived.
     criteria: criteria by which data are deemed acceptable.
     ages: ages and metadata for sites/samples/specimens.
     images: associated images and plots.
     Overview of PmagPy: The functionality of PmagPy is demonstrated within notebooks in the PmagPy repository. PmagPy_online.ipynb serves as an introduction to PmagPy and MagIC (this conference); it highlights the link between PmagPy and the Findable, Accessible, Interoperable, Reusable (FAIR) database maintained by MagIC at https://earthref.org/MagIC. Other notebooks of interest are:
     PmagPy_calculations.ipynb: demonstrates many of the PmagPy calculation functions, such as those that rotate directions, return statistical parameters, and simulate data from specified distributions.
     PmagPy_plots_analysis.ipynb: demonstrates PmagPy functions that can be used to visualize data as well as those that conduct statistical tests that have associated visualizations.
     PmagPy_MagIC.ipynb: demonstrates how PmagPy can be used to read and write data to and from the MagIC database format, including conversion from many individual lab measurement file formats.
     Please see also our YouTube channel with more presentations from the 2020 MagIC workshop: https://www.youtube.com/playlist?list=PLirL2unikKCgUkHQ3m8nT29tMCJNBj4kj
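To give a flavour of the calculation functions that PmagPy_calculations.ipynb demonstrates, here is a minimal sketch that simulates Fisher-distributed directions and computes their mean. It assumes the ipmag.fishrot and ipmag.fisher_mean functions as documented in the PmagPy cookbook; check the installed version for the exact signatures.

```python
# Minimal sketch of PmagPy calculation functions in a notebook:
# simulate Fisher-distributed directions and compute their Fisher mean.
# Assumes pmagpy is installed (pip install pmagpy); verify function
# signatures against your installed version and the cookbook.
import pmagpy.ipmag as ipmag

# Draw 20 directions from a Fisher distribution with kappa = 30,
# centred on declination 0, inclination 60.
directions = ipmag.fishrot(k=30, n=20, dec=0, inc=60, di_block=True)

# Fisher mean of the simulated directions (dec, inc, alpha95, k, ...).
mean = ipmag.fisher_mean(di_block=directions)
print(mean["dec"], mean["inc"], mean["alpha95"])
```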
  5. Summary

    In recent years, geospatial data have exploded in volume and diversity, causing serious usability issues for researchers in various scientific areas. This paper describes a cyberGIS community data service framework to facilitate geospatial big data access, processing, and sharing based on a hybrid supercomputer architecture. Specifically, the framework aims to enhance the usability of the national elevation dataset released by the U.S. Geological Survey for the contiguous United States at arc‐second-scale resolution. A community data service, namely TopoLens, is created to demonstrate the workflow integration of the national elevation dataset and the associated computation and analysis. Two user‐friendly environments, including a publicly available web application and a private workspace based on the Jupyter notebook, are provided for users to access both precomputed and on‐demand computed high‐resolution elevation data. The system architecture of TopoLens is implemented by exploiting the ROGER supercomputer, the first cyberGIS supercomputer dedicated to geospatial problem‐solving. The usability of TopoLens has been acknowledged in a topographic user community evaluation.
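As a rough illustration of the kind of on-demand terrain computation such a notebook workspace might run, here is a small hillshade calculation with rasterio and NumPy. This is not TopoLens code, and the input file path is a placeholder.

```python
# Illustrative only: computing a hillshade from an elevation raster, the
# sort of on-demand derived product a notebook workspace might generate.
# Not TopoLens code; the input path is a placeholder.
import numpy as np
import rasterio

with rasterio.open("elevation_tile.tif") as src:   # placeholder DEM file
    dem = src.read(1).astype(float)
    xres, yres = src.res

# Surface gradients and a standard hillshade (sun at 315 deg azimuth, 45 deg altitude).
dz_dy, dz_dx = np.gradient(dem, yres, xres)
slope = np.arctan(np.hypot(dz_dx, dz_dy))
aspect = np.arctan2(-dz_dx, dz_dy)
azimuth, altitude = np.radians(315.0), np.radians(45.0)
hillshade = (np.sin(altitude) * np.cos(slope)
             + np.cos(altitude) * np.sin(slope) * np.cos(azimuth - aspect))
print("hillshade range:", hillshade.min(), hillshade.max())
```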

     