Atomic force microscopy (AFM) image raw data, force spectroscopy raw data, data analysis/data plotting, and force modeling.

File Formats
The raw files of the AFM imaging scans of the colloidal probe surface are provided in NT-MDT's proprietary .mdt file format, which can be opened using the Gwyddion software package. Gwyddion is released under the GNU General Public License (GPLv3) and can be downloaded free of charge at http://gwyddion.net/. The processed image files are included in Gwyddion's .gwy file format. Force spectroscopy raw files are also provided in the .mdt file format, which can be opened using NT-MDT's NOVA Px software (we used 3.2.5 rev. 10881). All force data were converted to ASCII files (*.txt) using the NOVA Px software so that this data set also provides them in human-readable form. The MATLAB codes used for force curve processing and data analysis are given as *.m files and can be opened in MATLAB (https://www.mathworks.com/products/matlab) or in a text editor. The raw and processed force curve data, along with other values used for data processing, are stored in binary form in *.mat MATLAB data files, which can be opened in MATLAB. Organized by figure, all the raw and processed force curve data are given in Excel worksheets (*.xlsx), one per probe/substrate combination.

Data (Folder Structure)
The data in the dataverse is best viewed in Tree mode.

Codes for Force Curve Processing
The three MATLAB codes used for force curve processing are contained in this folder. The text file Read me.txt provides all the instructions needed to process raw force data using these three MATLAB codes.

Figure 3B, 3C – AFM Images
The raw (.mdt) and processed (.gwy) AFM images of the colloidal probe before and after coating with graphene oxide (GO) are contained in this folder.
Figure 4 – Force Curve GO
The raw data of the force curve shown in Figure 4 and the substrate force curve data (used to find the inverse optical lever sensitivity) are given as .mdt files and were exported as ASCII files given in the same folder. The raw and processed force curve data are also given in the variables_GO_Tip 18.mat and GO_Tip 18.xlsx files. The force curve processing codes and instructions can be found in the Codes for Force Curve Processing folder, as mentioned above.

Figure 5A – Force–Displacement Curves GO, rGO1, rGO10
All the raw data of the force curves (GO, rGO1, rGO10) shown in Figure 5A and the corresponding substrate force curve data (used to find the inverse optical lever sensitivity) are given as .mdt files and were exported as ASCII files given in the same folder. The raw and processed force curve data are also given in *.mat and *.xlsx files.

Figure 5B, 5C – Averages of Force and Displacement for Snap-On and Pull-Off Events
All the raw data of the force curves (GO, rGO1, rGO10) for all the probes and the corresponding substrate force curve data are given as .mdt files and were exported as ASCII files given in this folder. The raw and processed force curve data are also provided in *.mat and *.xlsx files. The snap-on force, snap-on displacement, and pull-off displacement values were obtained from each force curve and averaged as in Code_Figure5B_5C.m. The same code was used for plotting the average values.

Figure 6A – Force–Distance Curves GO, rGO1, rGO10
The raw data provided in the Figure 5A – Force Displacement Curves GO, rGO1, rGO10 folder were processed into force-vs-distance curves. The raw and processed force curve data are also given in *.mat and *.xlsx files.

Figure 6B – Average Snap-On and Pull-Off Distances
The same raw data provided in the Figure 5B, 5C – Average Snap on Force, Displacement, Pull off Displacement folder were processed into force-vs-distance curves.
The raw and processed force curve data of GO, rGO1, and rGO10 for all the probes are also given in *.mat and *.xlsx files. The snap-on distance and pull-off distance values were obtained from each force curve and averaged as in Code_Figure6B.m. The code used for plotting is also given in the same file.

Figure 6C – Contact Angles
Advancing and receding contact angles were calculated from each processed force-vs-distance curve and averaged according to the reduction time. The obtained values and the code used to plot them are given in Code_Figure6C.m.

Figure 9A – Force Curve Repetition
The raw data of all five force curves and the substrate force curve data are given as .mdt files and were exported as ASCII files given in the same folder. The raw and processed force curve data are also given in *.mat and *.xlsx files.

Figure 9B – Repulsive Force Comparison
The data of the zoomed-in region of Figure 9A were plotted as the Experimental curve. Initial baseline correction was done using the MATLAB code bc.m; the procedure is given in the Read Me.txt text file. All the raw and processed data are given in the rGO10_Tip19_Trial1.xlsx and variables_rGO10_Tip 19.mat files. The MATLAB code used to model the other forces and plot all the curves in Figure 9B is given in Exp_vdW_EDL.m.
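The snap-on and pull-off extraction described above (performed in MATLAB by Code_Figure5B_5C.m and Code_Figure6B.m) can be sketched as follows. This is a minimal illustration, not the archived code: the function name, the synthetic curve, and the assumption that the event corresponds to the most attractive (most negative) force on each branch are ours.

```python
import numpy as np

def extract_events(z_approach, f_approach, z_retract, f_retract):
    """Estimate snap-on and pull-off events from one force curve.

    Hypothetical simplification: the snap-on (pull-off) event is taken
    as the most negative force on the approach (retract) branch.
    """
    i_snap = int(np.argmin(f_approach))
    i_pull = int(np.argmin(f_retract))
    return {
        "snap_on_force": f_approach[i_snap],
        "snap_on_displacement": z_approach[i_snap],
        "pull_off_force": f_retract[i_pull],
        "pull_off_displacement": z_retract[i_pull],
    }

# Synthetic curve for illustration only (displacement in nm, force in nN)
z = np.linspace(0.0, 100.0, 501)
f_app = np.where(z < 20.0, -5.0 * np.exp(-(20.0 - z)), 0.0)       # shallow jump-in dip
f_ret = np.where(z < 30.0, -12.0 * np.exp(-(30.0 - z) / 5.0), 0.0)  # deeper adhesion dip
events = extract_events(z, f_app, z, f_ret)
print(events["snap_on_force"], events["pull_off_force"])
```

Averaging these per-curve values over all repetitions for each probe then gives the quantities plotted in Figures 5B, 5C, and 6B.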
Archiving of Experimental Data for LEAP-UCD-2017
This paper describes how the LEAP-UCD-2017 data is organized in DesignSafe; it is intended to help users, archivers, and curators find or organize data of interest. Several key files, folders, and documents included in the archive are discussed: (1) an Excel format data template used to document much of the data and metadata for each model (sensor data, cone penetrometer data, and surface marker data as reported by the experimenters), (2) processed sensor data files with time offsets and zero and calibration corrections that facilitate comparison of consistently formatted data from various model tests, (3) plots of processed data for quick overview and comparison of results among experiments, and (4) photographs taken during construction and testing.
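The corrections applied to produce the processed sensor data files (item 2 above) can be sketched in a few lines. This is an illustrative sketch, not the archive's code: the function and variable names are ours, and the synthetic record stands in for a real sensor channel.

```python
import numpy as np

def process_record(t, raw, t_offset, zero, cal):
    """Apply the corrections described for the processed files:
    a time offset (to align records across model tests), a zero
    (baseline) correction, and a calibration factor converting raw
    sensor output to engineering units. Names are illustrative."""
    return t - t_offset, (raw - zero) * cal

t = np.arange(0.0, 1.0, 0.1)
raw = 512.0 + 10.0 * np.sin(2 * np.pi * t)   # synthetic raw sensor counts
t_corr, signal = process_record(t, raw, t_offset=0.2, zero=512.0, cal=0.05)
print(signal[0])  # baseline sample maps to 0.0 after zero correction
```

Formatting every test's channels this way is what makes results from the various LEAP-UCD-2017 models directly comparable.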
- Award ID(s):
- 1635307
- PAR ID:
- 10197823
- Date Published:
- Journal Name:
- Model Tests and Numerical Simulations of Liquefaction and Lateral Spreading: LEAP-UCD-2017
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
This data set consists of 3,244 gridded, daily averaged temperature, practical salinity, potential density, and dissolved oxygen profiles. These profiles were collected from October 2014 to May 2025 by the NSF Ocean Observatories Initiative Washington Offshore Profiler Mooring (CE09OSPM) located at 46.8517°N, 124.982°W between approximately 35 and 510 meters water depth using a McLane® Moored Profiler (MMP). The MMP was equipped with a Sea-Bird Scientific 52-MP (SBE 52-MP) CTD instrument and an associated Sea-Bird Scientific (SBE 43F) dissolved oxygen sensor. Raw binary data files [C*.DAT (CTD data); E*.DAT (engineering data plus auxiliary sensor data); and A*.DAT (current meter data)] were converted to ASCII text files using the McLane® Research Laboratories, Inc. Profile Unpacker v3.10 application. Dissolved oxygen calibration files for each of the twenty deployments were downloaded from the Ocean Observatories Initiative asset-management GitHub® repository. The unpacked C*.TXT (CTD data), E*.TXT (engineering data plus auxiliary sensors), and A*.TXT (current meter data) ASCII data files associated with each deployment were processed using a MATLAB® toolbox (mmp_toolbox) that was specifically created to process OOI MMP data. The toolbox imports MMP A*.TXT, C*.TXT, and E*.TXT data files and applies the necessary calibration coefficients and data corrections, including adjustments for thermal-lag, flow, and sensor time constant effects. mmp_toolbox calculates dissolved oxygen concentration using the methods described in Owens and Millard (1985) and Garcia and Gordon (1992). Practical salinity and potential density are derived using the Gibbs-SeaWater Oceanographic Toolbox. After the corrections and calculations for each profile are complete, the data are binned in space to create a final, 0.5-dbar binned data set.
The more than 24,000 individual temperature, practical salinity, pressure, potential density, and dissolved oxygen profiles were temporally averaged to form the final, daily averaged data set presented here. Using the methods described in Risien et al. (2023), daily temperature, practical salinity, potential density, and dissolved oxygen climatologies were calculated for each 0.5-dbar depth bin using a three-harmonic fit (1, 2, and 3 cycles per year) based on the 10-year period January 2015 to December 2024.
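A three-harmonic fit of the kind described above (mean plus 1, 2, and 3 cycles per year) can be written as an ordinary least-squares problem. This is a sketch in the spirit of the Risien et al. (2023) method, not the published code; the synthetic temperature series and the 365.25-day period are illustrative assumptions.

```python
import numpy as np

def harmonic_climatology(day_of_year, values, n_harmonics=3):
    """Least-squares fit of a mean plus harmonics at 1, 2, and 3
    cycles per year; returns the fitted climatological values."""
    t = 2 * np.pi * np.asarray(day_of_year) / 365.25
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        cols.append(np.cos(k * t))
        cols.append(np.sin(k * t))
    A = np.column_stack(cols)                     # design matrix, 2n+1 columns
    coeffs, *_ = np.linalg.lstsq(A, values, rcond=None)
    return A @ coeffs

# Synthetic seasonal cycle standing in for one 0.5-dbar depth bin
days = np.arange(1, 366)
temp = 10.0 + 3.0 * np.cos(2 * np.pi * (days - 40) / 365.25)
fit = harmonic_climatology(days, temp)
```

In the data set itself, such a fit is computed independently for each 0.5-dbar depth bin and each variable.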
In December 2021, we installed four groundwater monitoring wells in Imperial Beach, California, to study the effects of sea level variability and implications for flood risks. We collected time series of groundwater table elevation (GWT) relative to a fixed vertical datum and local land surface elevation from 8 December 2021 through 14 May 2024. In each groundwater monitoring well, a single unvented pressure sensor (RBR Solo) was attached to Kevlar line and submerged ~1 m below the GWT. From 8 December 2021 through 21 November 2023, total pressure was recorded at 1 Hz; from 22 November 2023 through 14 May 2024, sampling occurred at 0.1 Hz. Gaps in sampling are a result of battery failures leading to data loss. To estimate hydrostatic pressure from total pressure measurements we subtracted atmospheric pressure measurements that were collected every 6 min from NOAA's National Data Buoy Center (NDBC) station SDBC1-9410170 at the San Diego airport and linearly interpolated to match sensor samples. Hydrostatic pressure is converted to sensor depth below the water table. We determined an average well water density, ρ, using intermittent vertical profiles of conductivity-temperature-depth (CTD) and the TEOS-10 conversion (Roquet et al. 2015). This object includes MATLAB (.mat) and HDF5 (.h5) files that contain the raw total pressure measurements from unvented RBR solos. The original Ruskin files (.rsk) are not included and have been converted to MATLAB files without loss of fidelity. Intermittent CTD profiles used to estimate well water density structure are included as CSV files. GWT that have been processed using atmospheric pressure and vertical datum measurements are included as HDF5 files. The object has five main directories, one for each of the four groundwater wells and one for data downloaded from other sources for archival and reproducibility purposes. 
Code for generating these files may be found on the GitHub repository (https://github.com/aubarnes/ImperialBeachGroundwater) or on Zenodo (https://doi.org/10.5281/zenodo.14969632). Code was run with Python v3.12.7 and the packages Pastas v1.5.0, UTide v0.3.0, GSW v3.6.19, NumPy v1.26.4, Pandas v2.1.4, Matplotlib v3.9.2, SciPy v1.13.1, requests v2.32.3, and intake v0.7.0, plus the standard-library modules datetime, pickle, and os.
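The hydrostatic conversion described above — subtracting interpolated atmospheric pressure from the unvented sensor's total pressure, then dividing by an average well water density — can be sketched as follows. The density value and function name are illustrative assumptions, not the archived estimates.

```python
RHO = 1025.0   # kg m^-3, example average well water density (assumed, not the archived value)
G = 9.81       # m s^-2, gravitational acceleration

def sensor_depth(p_total_dbar, p_atm_dbar):
    """Depth (m) of the pressure sensor below the groundwater table,
    from total and atmospheric pressure in decibars (1 dbar = 1e4 Pa)."""
    p_hydro_pa = (p_total_dbar - p_atm_dbar) * 1.0e4
    return p_hydro_pa / (RHO * G)

print(round(sensor_depth(11.1, 10.1), 3))  # roughly 1 m of water above the sensor
```

Adding this depth to the surveyed sensor elevation (from the fixed vertical datum) yields the groundwater table elevation time series in the HDF5 files.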
Using the horizontal-to-vertical spectral-ratio (HVSR) method, we infer regolith thickness (i.e., depth to bedrock) throughout the Farmington River Watershed, CT, USA. Between Nov. 2019 and Nov. 2020, MOHO Tromino Model TEP-3C (MOHO, S.R.L.) three-component seismometers collected passive seismic recordings along the Farmington River and the upstream West Branch of Salmon Brook. From these recordings, we derived resonance frequencies using the GRILLA software (MOHO, S.R.L.) and then inferred potential regolith thicknesses based on likely shear wave velocities, Vs, intrinsic to the underlying sediment. Three potential shear wave velocities (Vs = 300 m/s, 337 m/s, 362 m/s) were considered for Farmington River watershed sediments, providing a range of potential depth estimates along the Farmington. This release contains raw passive seismic recording data, processed resonance frequency data, and the resulting inferred depth estimates in both tabular and vector form. The dataset currently contains three zipped files: (1) 'Processed.zip', a zipped directory containing .asc text files of processed passive seismic data, individual processed reports, tabulated results, and an associated summary text file, 'readme_Processed.txt'; (2) 'Raw.zip', which contains .saf text files of passive seismic recordings and an associated 'readme_Raw.txt'; and (3) 'XYLegacyN_HVSR.zip', which contains an ESRI shapefile of HVSR point locations with attribute data and a map image offering a visualization of the depth results (where Vs = 300 m/s). Additionally, the main folder contains LegacyN_HVSR_readme.txt, which describes these sub-directories in further detail.
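Depth estimates of this kind typically come from the quarter-wavelength relation between a site's fundamental resonance frequency and the sediment shear wave velocity. The sketch below assumes that standard relation (h = Vs / 4f0) and an illustrative 2 Hz resonance; the dataset's own workflow in GRILLA may differ in detail.

```python
def hvsr_depth(f0_hz, vs_ms):
    """Regolith thickness (m) from the quarter-wavelength relation
    h = Vs / (4 * f0), given a resonance frequency f0 (Hz) and an
    assumed sediment shear wave velocity Vs (m/s)."""
    return vs_ms / (4.0 * f0_hz)

# Range of depth estimates for one hypothetical 2 Hz resonance, using
# the three candidate velocities considered for this watershed
for vs in (300.0, 337.0, 362.0):
    print(vs, hvsr_depth(2.0, vs))
```

Evaluating each measured resonance frequency at the three candidate velocities is what produces the range of depth estimates included in the tabular and shapefile products.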
This dataset is a compressed archive that includes 2 data files in binary format, 4 files in CSV format, and 2 metadata files (PDF documents) that provide information on how to interpret the data. The data were collected from instruments deployed on two stratospheric balloons, launched a day apart in late June 2021 from central Oregon. They flew on upper-level winds to the west, out over the northeastern Pacific Ocean. The measurement objective was a multi-day set of vertical electric field and polar conductivity measurements at roughly a 10-minute cadence, from widely separated locations in the stratosphere. The binary-format data are comprehensive, including everything that was measured. The CSV files have been processed from the raw data files into calibrated, timed, and time-ordered ASCII files containing the primary science measurements and some essential auxiliary measurements such as measurement location and time. These data were collected in an effort to (1) compare the fair-weather return current density at different geographic locations and (2) compare the fair-weather current density with global thunderstorm activity.