Title: Data and analysis script for channel measurement campaign at POWDER-RENEW using Iris SDRs
This repository contains our raw datasets from channel measurements performed on the University of Utah campus. It also includes a document that explains the setup and methodology used to collect the data, along with a brief discussion of results.

File organization:
* documentation/ - A .docx describing the setup and evaluation.
* data/ - HDF5 files containing both the metadata and the raw IQ samples for each location at which data was collected.

We collected data at 14 different client locations (see the map in the attached .docx; locations 12 and 16 were skipped) and deployed 5 receivers on 5 different rooftops. Due to resource constraints, one set of files contains data from four locations, while another set contains data from the single remaining location.

We have developed a set of Python scripts to parse and analyze the data. Although not included here, they can be found in our public repository, which also contains the top-level script: https://github.com/renew-wireless/RENEWLab

For more information on the POWDER-RENEW project, please visit the POWDER website. The RENEW part of the project focuses on the deployment of an open-source massive MIMO system; please visit our website for more information.
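For a quick first look at the captures before turning to the RENEWLab parsers, a minimal inspection sketch with h5py is shown below. The file name is a placeholder and the internal group/dataset layout is not documented in this record, so the script only enumerates what each file contains; the authoritative parsing logic lives in the RENEWLab scripts linked above.

    import h5py

    def describe(path):
        """Print the metadata attributes and group/dataset hierarchy of one capture file."""
        with h5py.File(path, "r") as f:
            # Top-level metadata attributes (capture settings, etc.), if present.
            for key, value in f.attrs.items():
                print(f"attr    {key}: {value}")

            # Walk the full hierarchy; the raw IQ samples sit in one of these datasets.
            def visitor(name, obj):
                if isinstance(obj, h5py.Dataset):
                    print(f"dataset {name}: shape={obj.shape}, dtype={obj.dtype}")
                else:
                    print(f"group   {name}")

            f.visititems(visitor)

    if __name__ == "__main__":
        describe("data/example_capture.hdf5")  # placeholder path; substitute a file from data/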
Award ID(s):
1827940
PAR ID:
10320280
Author(s) / Creator(s):
; ;
Publisher / Repository:
Zenodo
Date Published:
Edition / Version:
0.2
Subject(s) / Keyword(s):
channel measurements; wireless; POWDER; RENEW
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Data from: Stone and Wessinger 2023, "Ecological diversification in an adaptive radiation of plants: the role of de novo mutation and introgression". DOI: 10.1101/2023.11.01.565185. The code used to conduct the analyses in this study can be found at https://github.com/benstemon/MBE-23-0936. The raw sequencing reads generated from this study have been deposited in the SRA under project number PRJNA1057825. This repository contains a README.md file with information on all included files.
  2. This dataset contains meteorology and snow observation data collected at sites in the southwestern Colorado Rocky Mountains during water years 2019-2021. Data collection had an emphasis on paired open-forest sites and included three forested elevations. In total, we present 270 snow pit observations, 4,019 snow depth measurements, and three years of meteorological forcing from two weather stations (one in a meadow, the other in an adjacent forest). The dataset is described in a forthcoming publication of the same name: A meteorology and snow dataset from adjacent forested and meadow sites at Crested Butte, CO, USA (Bonner et al., 2022). All snow observation and meteorological forcing data are available as both .nc and .mat files. Additionally, original digitized copies of the snow pit observations are provided as .gsheet/.xlsx files. This dataset will continue to be updated, via this repository, as additional years of data are collected.
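     The record above lists the forcing data in .nc and .mat formats; as a rough starting point, a minimal sketch for inspecting one of the NetCDF files with xarray follows. The file name is a placeholder and the actual variable names are not documented in this record, so the sketch only enumerates what the file contains.

       import xarray as xr

       # Open one forcing file and list its contents (requires a NetCDF backend such as netCDF4).
       ds = xr.open_dataset("met_forcing_meadow.nc")   # placeholder file name
       print(ds)                                       # dimensions, coordinates, data variables
       for name, var in ds.data_vars.items():
           print(name, var.dims, var.attrs.get("units", "no units attribute"))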
  3. This dataset contains raw data, processed data, and the codes used for data processing in our manuscript from our Fourier-transform infrared (FTIR) spectroscopy, nuclear magnetic resonance (NMR), Raman spectroscopy, and X-ray diffraction (XRD) experiments. The data and codes for the fits of our unpolarized Raman spectra to polypeptide spectra are also included. The following explains the folder structure of the data provided in this dataset, which is also explained in the file ReadMe.txt. Browsing the data in Tree view is recommended.
     Folder contents:
     * Codes
       - Raman Data Processing: The MATLAB script RamanDecomposition.m contains the code to decompose the sub-peaks across the different polarized Raman spectra (XX, XZ, ZX, ZZ, and YY), considering a set of pre-determined restrictions. The helper functions used in RamanDecomposition.m are included in the Helpers folder. RamanDecomposition.pdf is a PDF printout of the MATLAB code and output.
       - P Value Simulation: 31_helix.ipynb and a_helix.ipynb are Jupyter notebooks containing the intrinsic P value simulation for the 31-helix and alpha-helix structures; the simulation results were used to prepare Supplementary Table 4. Vector.py, Atom.py, Amino.py, and Helpers.py contain the class definitions used in these notebooks. See the comments in each file for more details.
     * FTIR
       - FTIR Raw Transmission.opj: Origin data file with the raw transmission data measured on a single silk strand and used for the FTIR spectra analysis.
       - FTIR Deconvoluted Oscillators.opj: Origin data file generated from the data in the previous file using W-VASE software from J. A. Woollam, Inc.
       - FTIR Unpolarized MultiStrand Raw Transmission.opj: Origin data file with the raw transmission data measured on multiple silk strands.
       The datasets in the first two files were used to plot Figure 2a-b, the FTIR data points in Figure 4a, and Supplementary Figure 6; the dataset in the third file was used to plot Supplementary Figure 3a.
     * NMR
       - Raw data files of the 13C MAS NMR spectra: ascii-spec_CP.txt (cross-polarized spectrum) and ascii-spec_DP.txt (direct-polarized spectrum). Data are in ASCII format (comma-separated values) with the following columns: data point number, intensity, frequency [Hz], frequency [ppm]. A minimal loading sketch in Python is given after this item.
     * Polypeptide Spectrum Fits
       - MATLAB scripts and Helpers: Raman_Fitting_Process_Part_1.m and Raman_Fitting_Process_Part_2.m contain the step-by-step instructions to fit our calculated unpolarized Raman spectrum using digitized model polypeptide Raman spectra. The Helper folder contains two helper functions used by these scripts. See the scripts for further instructions and information.
       - Data: aPA.csv, bPA.csv, GlyI.csv, and GlyII.csv contain the digitized Raman spectra of poly-alanine, beta-alanine, poly-glycine-I, and poly-glycine-II. Raman_Exp_Data.mat contains the processed, polarized Raman spectra obtained from our experiments: the variable freq holds the wavenumber axis of each collected spectrum, and the variables xx, yy, zz, xz, and zx hold the polarized Raman spectra; these variables are used to calculate the unpolarized Raman spectrum in Raman_Fitting_Process_Part_2.m.
     * Raman
       - Raman Raw Data.mat: MATLAB data file with all the raw data used for the Raman spectra analysis. All variables are MATLAB structures with fields Freq and Raw, where Freq contains the wavenumber axis of the measured spectra and Raw contains 5 measured Raman signal strengths. The variables XX, XZ, ZX, ZZ, and YY were used for plotting and sub-peak analysis in Figure 2c-d, the Raman data points in Figure 4a, Figure 5b, Supplementary Figure 2, and Supplementary Figure 7. The variable WideRange was used to plot and identify the peaks for Supplementary Figure 3b.
     * X-Ray
       - X-Ray.mat: MATLAB data file with the raw X-ray data used for the diffraction analysis in Supplementary Figure 5.
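     For the 13C MAS NMR ASCII spectra described under NMR above, a minimal loading sketch follows. It assumes the four comma-separated columns listed in the record and no header lines (add a skiprows= argument if the files carry headers); the file name is taken from the item text.

       import numpy as np
       import matplotlib.pyplot as plt

       # Columns per the record: point number, intensity, frequency [Hz], frequency [ppm].
       point, intensity, freq_hz, freq_ppm = np.loadtxt(
           "ascii-spec_CP.txt", delimiter=",", unpack=True
       )

       plt.plot(freq_ppm, intensity)
       plt.gca().invert_xaxis()          # NMR convention: ppm axis decreases to the right
       plt.xlabel("Frequency (ppm)")
       plt.ylabel("Intensity")
       plt.show()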
  4. Context: This research was conducted within the NSF-SEEKCommons Project, a research initiative dedicated to supporting Open Science and Open Access in disciplinary research. The project has a special interest in understanding the role that critical infrastructure plays in supporting open initiatives. Open Journal Systems (OJS) has long served as a fundamental piece of Open Access throughout the globe; hence, it provides valuable information about experiences developing, deploying, and maintaining open technologies.
     Methods: We used mixed methods for our research, triangulating repository data, installation data, interviews, and documentary analysis. We collected repository data using a report generator (Kopp [2018] 2024) that uses repository metadata to present general statistics about a Git project. The resulting information was manually curated, disambiguated, and annotated to obtain a homogeneous set of developers with information about their institutional affiliation and country. Names are normalized based on the information in qualitative interviews and by browsing the full set of commits in the GitHub repository. Other sources were the institutional materials (available in current and archived versions of the PKP website), meeting minutes, the user forum, and further project documentation available online. GitHub handles are standardized to their most complete version. For institutional and country affiliation, we relied on GitHub profiles, PKP documentation and forums, institutional domains available in emails, and researchers' ORCID IDs.
     Available files:
     * Information about the codebase (number of files, lines of code, and timestamp) organized by month, quarter, and semester. See file: OJS_GitStats_04-24.csv
     * Information about the historical evolution of the codebase (number of files, lines of code, and timestamp), including a description of the top committers for each month. Committers are described by their institutional affiliation and country of origin. See file: OJS_DevStats_Institution-Country_1.tsv
     * Information about the historical evolution of the codebase focusing on top committers, along with their institution and country. This file is formatted to map the co-occurrence of developers and attributes by month between 2004 and 2024. See file: OJS_DevStats_Institution-Country_2.tsv
     * Selected fields describing working and regularly maintained plugins for OJS as of October 2024, including the name of the plugin, homepage, description, maintainer, and institutional affiliation. See file: OJS_Plugins_2024_Processed.tsv
     * Details of the aggregated information included in Table 5 of the article. See file: OJS_Plugins_2024_Table5.tsv
     * Snapshot of the XML information of the OJS plugin gallery (October 21) retrieved from the PKP website (Smecher 2024). See file: OJS_Plugins_2024.csv
     Funding: The SEEKCommons Project is funded by the U.S. National Science Foundation (NSF), grant #2226425.
  5. This dataset is a compressed archive that includes 2 data files in binary format, 4 files in csv format, and 2 metadata files (pdf documents) that provide information on how to interpret the data. This data was collected from instruments deployed on two stratospheric balloons, launched a day apart in late June 2021 from central Oregon. They flew on upper level winds to the west, out over the northeastern Pacific Ocean. The measurement objective was a multi-day set of vertical electric field and polar conductivity measurements at roughly a 10 minute cadence, and from widely separated locations in the stratosphere. The binary format data is comprehensive, including everything that was measured. The csv files have been processed from the raw data files into calibrated, timed, and time-ordered ASCII files containing the primary science measurements and some essential auxiliary measurements such as measurement location and time. This data was collected in an effort to (1) compare the fair-weather return current density at different geographic locations and (2) to compare the fair-weather current density with global thunderstorm activity. 