Title: Response of water isotopes in precipitation to a collapse of the West Antarctic Ice Sheet in high-resolution simulations with the Weather Research and Forecasting Model
{"Abstract":["This archive includes data and ipython notebooks to create the figures for the manuscript "Response of water isotopes in precipitation to a collapse of the West Antarctic Ice Sheet in high-resolution simulations with the Weather Research and Forecasting Model" submitted to Journal of Climate in August 2022.<\/p>\n\nModel output from WRFwiso and iCAM is in data.zip (saved as monthly means)<\/p>\n\nNotebooks and python modules are in scripts.zip<\/p>\n\nRequired python packages (all included in environment.yml):<\/p>\n\nnumpy<\/li>matplotlib<\/li>netcdf4<\/li>basemap<\/li>scipy<\/li>wrf-python<\/li>windspharm<\/li>metpy<\/li>intergrid<\/li>cmocean<\/li><\/ul>"]}  more » « less
Award ID(s):
1841844 1602435 1744649
PAR ID:
10429409
Author(s) / Creator(s):
Publisher / Repository:
Zenodo
Date Published:
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. {"Abstract":["MCMC chains for the GWB analyses performed in the paper "The NANOGrav 15 yr Data Set: Search for Signals from New Physics<\/em>". <\/p>\n\nThe data is provided in pickle format. Each file contains a NumPy array with the MCMC chain (with burn-in already removed), and a dictionary with the model parameters' names as keys and their priors as values. You can load them as<\/p>\n\nwith open ('path/to/file.pkl', 'rb') as pick:\n temp = pickle.load(pick)\n\n params = temp[0]\n chain = temp[1]<\/code>\n\nThe naming convention for the files is the following:<\/p>\n\nigw<\/strong>: inflationary Gravitational Waves (GWs)<\/li>sigw: scalar-induced GWs\n\tsigw_box<\/strong>: assumes a box-like feature in the primordial power spectrum.<\/li>sigw_delta<\/strong>: assumes a delta-like feature in the primordial power spectrum.<\/li>sigw_gauss<\/strong>: assumes a Gaussian peak feature in the primordial power spectrum.<\/li><\/ul>\n\t<\/li>pt: cosmological phase transitions\n\tpt_bubble<\/strong>: assumes that the dominant contribution to the GW productions comes from bubble collisions.<\/li>pt_sound<\/strong>: assumes that the dominant contribution to the GW productions comes from sound waves.<\/li><\/ul>\n\t<\/li>stable: stable cosmic strings\n\tstable-c<\/strong>: stable strings emitting GWs only in the form of GW bursts from cusps on closed loops.<\/li>stable-k<\/strong>: stable strings emitting GWs only in the form of GW bursts from kinks on closed loops.<\/li>stable<\/strong>-m<\/strong>: stable strings emitting monochromatic GW at the fundamental frequency.<\/li>stable-n<\/strong>: stable strings described by numerical simulations including GWs from cusps and kinks.<\/li><\/ul>\n\t<\/li>meta: metastable cosmic strings\n\tmeta<\/strong>-l<\/strong>: metastable strings with GW emission from loops only.<\/li>meta-ls<\/strong> metastable strings with GW emission from loops and segments.<\/li><\/ul>\n\t<\/li>super<\/strong>: cosmic superstrings.<\/li>dw: domain walls\n\tdw-sm<\/strong>: domain walls decaying into Standard Model particles.<\/li>dw-dr<\/strong>: domain walls decaying into dark radiation.<\/li><\/ul>\n\t<\/li><\/ul>\n\nFor each model, we provide four files. One for the run where the new-physics signal is assumed to be the only GWB source. One for the run where the new-physics signal is superimposed to the signal from Supermassive Black Hole Binaries (SMBHB), for these files "_bhb" will be appended to the model name. Then, for both these scenarios, in the "compare" folder we provide the files for the hypermodel runs that were used to derive the Bayes' factors.<\/p>\n\nIn addition to chains for the stochastic models, we also provide data for the two deterministic models considered in the paper (ULDM and DM substructures). 
     For the ULDM model, the naming convention of the files is the following (all the ULDM signals are superimposed on the SMBHB signal; see the discussion in the paper for more details):
     - uldm_e: ULDM Earth signal.
     - uldm_p: ULDM pulsar signal
       - uldm_p_cor: correlated limit
       - uldm_p_unc: uncorrelated limit
     - uldm_c: ULDM combined Earth + pulsar signal, direct coupling
       - uldm_c_cor: correlated limit
       - uldm_c_unc: uncorrelated limit
     - uldm_vecB: vector ULDM coupled to the baryon number
       - uldm_vecB_cor: correlated limit
       - uldm_vecB_unc: uncorrelated limit
     - uldm_vecBL: vector ULDM coupled to B-L
       - uldm_vecBL_cor: correlated limit
       - uldm_vecBL_unc: uncorrelated limit
     - uldm_c_grav: ULDM combined Earth + pulsar signal for gravitational-only coupling
       - uldm_c_grav_cor: correlated limit
         - uldm_c_cor_grav_low: low mass region
         - uldm_c_cor_grav_mon: monopole region
         - uldm_c_cor_grav_high: high mass region
       - uldm_c_grav_unc: uncorrelated limit
         - uldm_c_unc_grav_low: low mass region
         - uldm_c_unc_grav_mon: monopole region
         - uldm_c_unc_grav_high: high mass region

     For the substructure (static) model, we provide the chain for the marginalized distribution (as for the ULDM signal, the substructure signal is always superimposed on the SMBHB signal).
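For illustration, here is a minimal sketch of loading one of these chains and summarizing a single parameter; the file name "igw_bhb.pkl" is only an assumed example constructed from the naming convention above, and the choice of the first parameter is arbitrary.

```python
# Minimal sketch (assumed file name): load a chain following the convention above
# and print a quick posterior summary for the first parameter.
import pickle
import numpy as np

with open("igw_bhb.pkl", "rb") as pick:
    params, chain = pickle.load(pick)   # priors dict, (n_samples, n_params) array

name = list(params.keys())[0]
samples = np.asarray(chain)[:, 0]
print(f"{name}: median = {np.median(samples):.3f}, "
      f"68% interval = ({np.percentile(samples, 16):.3f}, {np.percentile(samples, 84):.3f})")
```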
  2. {"Abstract":["Data files were used in support of the research paper titled \u201cMitigating RF Jamming Attacks at the Physical Layer with Machine Learning<\/em>" which has been submitted to the IET Communications journal.<\/p>\n\n---------------------------------------------------------------------------------------------<\/p>\n\nAll data was collected using the SDR implementation shown here: https://github.com/mainland/dragonradio/tree/iet-paper. Particularly for antenna state selection, the files developed for this paper are located in 'dragonradio/scripts/:'<\/p>\n\n'ModeSelect.py': class used to defined the antenna state selection algorithm<\/li>'standalone-radio.py': SDR implementation for normal radio operation with reconfigurable antenna<\/li>'standalone-radio-tuning.py': SDR implementation for hyperparameter tunning<\/li>'standalone-radio-onmi.py': SDR implementation for omnidirectional mode only<\/li><\/ul>\n\n---------------------------------------------------------------------------------------------<\/p>\n\nAuthors: Marko Jacovic, Xaime Rivas Rey, Geoffrey Mainland, Kapil R. Dandekar\nContact: krd26@drexel.edu<\/p>\n\n---------------------------------------------------------------------------------------------<\/p>\n\nTop-level directories and content will be described below. Detailed descriptions of experiments performed are provided in the paper.<\/p>\n\n---------------------------------------------------------------------------------------------<\/p>\n\nclassifier_training: files used for training classifiers that are integrated into SDR platform<\/p>\n\n'logs-8-18' directory contains OTA SDR collected log files for each jammer type and under normal operation (including congested and weaklink states)<\/li>'classTrain.py' is the main parser for training the classifiers<\/li>'trainedClassifiers' contains the output classifiers generated by 'classTrain.py'<\/li><\/ul>\n\npost_processing_classifier: contains logs of online classifier outputs and processing script<\/p>\n\n'class' directory contains .csv logs of each RTE and OTA experiment for each jamming and operation scenario<\/li>'classProcess.py' parses the log files and provides classification report and confusion matrix for each multi-class and binary classifiers for each observed scenario - found in 'results->classifier_performance'<\/li><\/ul>\n\npost_processing_mgen: contains MGEN receiver logs and parser<\/p>\n\n'configs' contains JSON files to be used with parser for each experiment<\/li>'mgenLogs' contains MGEN receiver logs for each OTA and RTE experiment described. Within each experiment logs are separated by 'mit' for mitigation used, 'nj' for no jammer, and 'noMit' for no mitigation technique used. File names take the form *_cj_* for constant jammer, *_pj_* for periodic jammer, *_rj_* for reactive jammer, and *_nj_* for no jammer. Performance figures are found in 'results->mitigation_performance'<\/li><\/ul>\n\nray_tracing_emulation: contains files related to Drexel area, Art Museum, and UAV Drexel area validation RTE studies.<\/p>\n\nDirectory contains detailed 'readme.txt' for understanding.<\/li>Please note: the processing files and data logs present in 'validation' folder were developed by Wolfe et al. and should be cited as such, unless explicitly stated differently. \n\tS. Wolfe, S. Begashaw, Y. Liu and K. R. Dandekar, "Adaptive Link Optimization for 802.11 UAV Uplink Using a Reconfigurable Antenna," MILCOM 2018 - 2018 IEEE Military Communications Conference (MILCOM), 2018, pp. 
1-6, doi: 10.1109/MILCOM.2018.8599696.<\/li><\/ul>\n\t<\/li><\/ul>\n\nresults: contains results obtained from study<\/p>\n\n'classifier_performance' contains .txt files summarizing binary and multi-class performance of online SDR system. Files obtained using 'post_processing_classifier.'<\/li>'mitigation_performance' contains figures generated by 'post_processing_mgen.'<\/li>'validation' contains RTE and OTA performance comparison obtained by 'ray_tracing_emulation->validation->matlab->outdoor_hover_plots.m'<\/li><\/ul>\n\ntuning_parameter_study: contains the OTA log files for antenna state selection hyperparameter study<\/p>\n\n'dataCollect' contains a folder for each jammer considered in the study, and inside each folder there is a CSV file corresponding to a different configuration of the learning parameters of the reconfigurable antenna. The configuration selected was the one that performed the best across all these experiments and is described in the paper.<\/li>'data_summary.txt'this file contains the summaries from all the CSV files for convenience.<\/li><\/ul>"]} 
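As a rough illustration of the kind of post-processing that 'classProcess.py' performs on the .csv logs, here is a minimal sketch; it is not the script itself, and the column names "true_label" and "pred_label" as well as the file path are placeholder assumptions to be replaced with the actual log headers.

```python
# Minimal sketch (assumed column and file names): build a classification report
# and confusion matrix from one per-experiment classifier log.
import pandas as pd
from sklearn.metrics import classification_report, confusion_matrix

log = pd.read_csv("class/example_experiment.csv")   # placeholder path
print(classification_report(log["true_label"], log["pred_label"]))
print(confusion_matrix(log["true_label"], log["pred_label"]))
```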
  3. {"Abstract":["This dataset contains machine learning and volunteer classifications from the Gravity Spy project. It includes glitches from observing runs O1, O2, O3a and O3b that received at least one classification from a registered volunteer in the project. It also indicates glitches that are nominally retired from the project using our default set of retirement parameters, which are described below. See more details in the Gravity Spy Methods paper. <\/p>\n\nWhen a particular subject in a citizen science project (in this case, glitches from the LIGO datastream) is deemed to be classified sufficiently it is "retired" from the project. For the Gravity Spy project, retirement depends on a combination of both volunteer and machine learning classifications, and a number of parameterizations affect how quickly glitches get retired. For this dataset, we use a default set of retirement parameters, the most important of which are: <\/p>\n\nA glitches must be classified by at least 2 registered volunteers<\/li>Based on both the initial machine learning classification and volunteer classifications, the glitch has more than a 90% probability of residing in a particular class<\/li>Each volunteer classification (weighted by that volunteer's confusion matrix) contains a weight equal to the initial machine learning score when determining the final probability<\/li><\/ol>\n\nThe choice of these and other parameterization will affect the accuracy of the retired dataset as well as the number of glitches that are retired, and will be explored in detail in an upcoming publication (Zevin et al. in prep). <\/p>\n\nThe dataset can be read in using e.g. Pandas: \n```\nimport pandas as pd\ndataset = pd.read_hdf('retired_fulldata_min2_max50_ret0p9.hdf5', key='image_db')\n```\nEach row in the dataframe contains information about a particular glitch in the Gravity Spy dataset. 
     Description of the series in the dataframe:
     - ['1080Lines', '1400Ripples', 'Air_Compressor', 'Blip', 'Chirp', 'Extremely_Loud', 'Helix', 'Koi_Fish', 'Light_Modulation', 'Low_Frequency_Burst', 'Low_Frequency_Lines', 'No_Glitch', 'None_of_the_Above', 'Paired_Doves', 'Power_Line', 'Repeating_Blips', 'Scattered_Light', 'Scratchy', 'Tomte', 'Violin_Mode', 'Wandering_Line', 'Whistle']:
       machine learning scores for each glitch class in the trained model, which for a particular glitch sum to unity
     - ['ml_confidence', 'ml_label']:
       highest machine learning confidence score across all classes for a particular glitch, and the class associated with this score
     - ['gravityspy_id', 'id']:
       unique identifiers for each glitch on the Zooniverse platform ('gravityspy_id') and in the Gravity Spy project ('id'), which can be used to link a particular glitch to the full Gravity Spy dataset (which contains GPS times among many other descriptors)
     - ['retired']:
       marks whether the glitch is retired using our default set of retirement parameters (1=retired, 0=not retired)
     - ['Nclassifications']:
       the total number of classifications performed by registered volunteers on this glitch
     - ['final_score', 'final_label']:
       the final score (weighted combination of machine learning and volunteer classifications) and the most probable type of glitch
     - ['tracks']:
       array of classification weights that were added to each glitch category due to each volunteer's classification

     For machine learning classifications on all glitches in O1, O2, O3a, and O3b, please see Gravity Spy Machine Learning Classifications on Zenodo.

     For the most recently uploaded training set used in Gravity Spy machine learning algorithms, please see Gravity Spy Training Set on Zenodo.

     For detailed information on the training set used for the original Gravity Spy machine learning paper, please see Machine learning for Gravity Spy: Glitch classification and dataset on Zenodo.
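Building on the read-in example above, here is a short sketch that uses only the series documented in this description to count retired glitches by their final label:

```python
# Minimal sketch: tally retired glitches by final label using the documented
# 'retired' and 'final_label' series.
import pandas as pd

dataset = pd.read_hdf('retired_fulldata_min2_max50_ret0p9.hdf5', key='image_db')
retired = dataset[dataset['retired'] == 1]
print(retired['final_label'].value_counts())
```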
  4. This dataset consists of 1,000 coordinate files (in the CHARMM psf/cor format) for the QM/MM minimum energy pathways of the deacylation reactions between a Class A beta-lactamase (GES-5) and the imipenem antibiotic molecule.

     All pathway conformations were optimized at the DFTB3/3OB-f/CHARMM36 level with 36 replicas.

     All single point calculations and charge population analysis were done at the B3LYP-D3/6-31+G(d,p)/CHARMM36 level.

     - 0.paths_ges_imi_d1.tar.gz: 500 pathway conformations for GES-5/IPM-Delta1 deacylation reactions.
     - 0.paths_ges_imi_d2.tar.gz: 500 pathway conformations for GES-5/IPM-Delta2 deacylation reactions.
     - 1.eners.zip: the single point replica energies along all GES-5/IPM pathways.
     - 1.chrgs.zip: the NBO charges of the QM region of all replica conformations along all GES-5/IPM pathways.
     - 2.datasets.zip: the Python codes to postprocess the molecular data and the featurized NumPy arrays.
     - 3.gnn.zip: the Python codes that implement the edge-conditioned graph convolutional NN to predict the deacylation barriers.
     - 5.representative_conf.zip: the pathway conformations of all cluster centroids and an energetically representative pathway (pathway id 22). Note: this file also serves as a peephole into what the pathway conformations from Reaction Path with Holonomic Constraints calculations look like.
     - 6.benchmark.zip: the benchmark calculations that validate the DFTB3/3OB-f/CHARMM36 level against the DFTB3/3OB/CHARMM36 and B3LYP/6-31G(d,p)/CHARMM36 levels of theory on the energetically representative (pathway id 22) pathway conformations.
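For orientation, here is a minimal sketch of one way to inspect a psf/cor pair after unpacking one of the tarballs; MDAnalysis is not part of this dataset and the file names are placeholders, so this is only an illustrative assumption rather than the dataset's own workflow.

```python
# Minimal sketch (assumed reader and file names): load a CHARMM psf/cor pair
# with MDAnalysis and report the system size.
import MDAnalysis as mda

u = mda.Universe("replica_01.psf", "replica_01.cor", format="CRD")  # placeholder names
print(u.atoms.n_atoms, "atoms")
print(u.atoms.positions[:3])  # first few Cartesian coordinates (Angstrom)
```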
  5. <h1 id="summary">Summary</h1> <p>Title: Data Release for A search for extremely-high-energy neutrinos and first constraints on the ultra-high-energy cosmic-ray proton fraction with IceCube</p> <p>The IceCube observatory analyzed 12.6 years of data in search of extremely-high-energy (EHE) neutrinos above 5 PeV. The resultant limit of the search (Fig 1), and the effective area of the event selection (Fig 7), are provided in this data release.</p> <h1 id="contents">Contents</h1> <ul> <li><p>README file: this file</p> </li> <li><p><code>differential_limit_and_sensitivity.csv</code>: a comma separated value file, giving the observed experimental differential limit, and sensitivity, of the search as a function of neutrino energy. This is the content of Fig 1 in the paper. The first column is the neutrino energy in GeV. The second column is the limit in units of GeV/cm2/s/sr. The third column is the sensitivity in units of GeV/cm2/s/sr.</p> </li> <li><p><code>effective_area.csv</code>: a comma separated value file, giving the effective area of the search as a function of energy. This is the content of Fig 7 in the paper. The first column is the neutrino energy in GeV. The second column is the total effective area of the search, summed across neutrino flavors, and averaged across neutrinos and antineutrinos, in meters-squared. The third column is the effective area of the search for the average of electron neutrino and electron antineutrinos in units of meters-squared. The fourth column is the same as the third, but for muon-flavor neutrinos. The fifth column is the same as the third and fourth, but for tau-flavor neutrinos.</p> </li> <li><p><code>demo.py</code>: a short python script to demonstrate how to read the files. Run like <code>python demo.py</code>. A standard base python installation is sufficient, as the only dependencies are numpy and matplotlib.</p> </li> </ul> <h1 id="contacts">Contacts</h1> <p>For any questions about this data release, please write to analysis@icecube.wisc.edu</p> 