ALERT Doctoral School 2022: Data for photoelasticity lesson
Abstract: Data used for the two-hour photoelasticity lesson on September 29, 2022 at the 2022 ALERT Doctoral School in Aussois, France.

In the Data directory, you will find the PEGS-master, PhotoelasticDisks, and Results subdirectories, along with the Jupyter notebook ALERTPhotoelasticity_220929_v1.ipynb.

Photoelasticity data is in the PhotoelasticDisks subdirectory. N_Image and P_Image contain a sequence of images of 511 bidisperse birefringent disks in simple shear, viewed with unpolarized and polarized light, respectively. The Positions subdirectory contains the position and radius of the disks in each image. The G2images and radii_highlighted subdirectories contain, respectively: (1) images of each particle colored by G^2, computed from the photoelasticity images via methods described in Daniels et al., Review of Scientific Instruments 88, 051808 (2017); (2) images of the deforming particles with the outline of each particle highlighted. Computations are performed in the accompanying ALERTPhotoelasticity_220929_v1.ipynb Jupyter notebook, which may be opened on any computer supporting Jupyter notebooks or through Google Colab.

Within PEGS-master, you can open PeGSDiskSolve.m to solve for inter-particle forces using methods described in Sec. V of Daniels et al., Review of Scientific Instruments 88, 051808 (2017) and in the thesis of James Puckett ("State Variables in Granular Materials: an Investigation of Volume and Stress Fluctuations", North Carolina State University, 2012). You can also find a script titled "PlotExpVsSynth.m" that compares results from G^2 calculations; results are written to the Results subdirectory.

Paths may need to be changed in all scripts.

Related content from the doctoral school can be found here: https://github.com/alert-geomaterials/2022-doctoral-school.
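The G^2 quantity referenced above is the mean squared intensity gradient of the photoelastic image inside each particle, a coarse proxy for local stress (Daniels et al., 2017). A minimal Python sketch of that computation, assuming a grayscale image array and known disk centers/radii (the function names here are illustrative, not the notebook's actual code):

```python
import numpy as np

def g_squared(image):
    """Mean squared intensity gradient (G^2) over an image patch."""
    gy, gx = np.gradient(image.astype(float))
    return np.mean(gx**2 + gy**2)

def disk_g2(image, cx, cy, r):
    """G^2 averaged over the pixels of one disk at (cx, cy) with radius r."""
    yy, xx = np.indices(image.shape)
    mask = (xx - cx)**2 + (yy - cy)**2 <= r**2
    gy, gx = np.gradient(image.astype(float))
    return np.mean((gx**2 + gy**2)[mask])
```

In practice one would loop `disk_g2` over the per-image positions and radii stored in the Positions subdirectory.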
- Award ID(s):
- 1942096
- PAR ID:
- 10407708
- Publisher / Repository:
- Zenodo
- Date Published:
- Subject(s) / Keyword(s):
- photoelasticity, granular materials
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
This software repository provides the Python functions and a Jupyter notebook that implement the latent-variable, bias-mitigating inference methods for the Tully-Fisher Relation described in Fu (2025), "Mitigating Malmquist and Eddington Biases in Latent-Inclination Regression of the Tully-Fisher Relation". Repository DOI: https://doi.org/10.5281/zenodo.16378199
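For orientation, the Tully-Fisher Relation is a linear relation between absolute magnitude and the log of rotation velocity. A deliberately naive least-squares fit of that relation is sketched below; it suffers from exactly the Malmquist and Eddington biases that the repository's latent-inclination methods are designed to mitigate, so it serves only as a baseline illustration (function name is hypothetical):

```python
import numpy as np

def naive_tf_fit(log_v, mag):
    """Plain OLS fit of M = a + b * log10(v_rot).

    Ignores selection (Malmquist) and noise (Eddington) biases and treats
    inclination as perfectly known -- the repository's methods do not.
    """
    b, a = np.polyfit(log_v, mag, 1)
    return a, b
```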
-
Abstract: The PoseASL dataset consists of color and depth videos collected from ASL signers at the Linguistic and Assistive Technologies Laboratory under the direction of Matt Huenerfauth, as part of a collaborative research project with researchers at the Rochester Institute of Technology, Boston University, and the University of Pennsylvania.

Access: After becoming an authorized user of Databrary, please contact Matt Huenerfauth if you have difficulty accessing this volume.

We have collected a new dataset consisting of color and depth videos of fluent American Sign Language signers performing sequences of ASL signs and sentences. Given interest among sign-recognition and other computer-vision researchers in red-green-blue-depth (RGBD) video, we release this dataset for use by the research community. In addition to the video files, we share depth data files from a Kinect v2 sensor, as well as additional motion-tracking files produced through post-processing of this data.

Organization of the Dataset: The dataset is organized into sub-folders with codenames such as "P01" or "P16". These codenames refer to the specific human signers recorded in this dataset. Please note that there is no participant P11 nor P14; those numbers were accidentally skipped during the process of making appointments to collect video stimuli.

Task: During the recording session, the participant was met by a member of our research team who was a native ASL signer. No other individuals were present during the data collection session. After signing the informed consent and video release document, participants responded to a demographic questionnaire. Next, the data-collection session consisted of English word stimuli and cartoon videos. The recording session began by showing participants slides that displayed English words and photos of items, and participants were asked to produce the sign for each (PDF included in the materials subfolder). Next, participants viewed three short animated cartoons, which they were asked to recount in ASL:
- Canary Row, Warner Brothers Merrie Melodies 1950 (the 7-minute video divided into seven parts)
- Mr. Koumal Flies Like a Bird, Studio Animovaneho Filmu 1969
- Mr. Koumal Battles his Conscience, Studio Animovaneho Filmu 1971
The word list and cartoons were selected because they are identical to the stimuli used in the collection of the Nicaraguan Sign Language video corpora; see: Senghas, A. (1995). Children's Contribution to the Birth of Nicaraguan Sign Language. Doctoral dissertation, Department of Brain and Cognitive Sciences, MIT.

Demographics: All 14 of our participants were fluent ASL signers. As screening, we asked our participants: Did you use ASL at home growing up, or did you attend a school as a very young child where you used ASL? All the participants responded affirmatively to this question. A total of 14 DHH participants were recruited on the Rochester Institute of Technology campus. Participants included 7 men and 7 women, aged 21 to 35 (median = 23.5). All of our participants reported that they began using ASL when they were 5 years old or younger, with 8 reporting ASL use since birth and 3 others reporting ASL use since age 18 months.

Filetypes:

*.avi, *_dep.bin: The PoseASL dataset was captured using a Kinect 2.0 RGBD camera. The output of this camera system includes multiple channels: RGB, depth, skeleton joints (25 joints for every video frame), and HD face (1,347 points). The video resolution is 1920 x 1080 pixels for the RGB channel and 512 x 424 pixels for the depth channel. Due to limitations in the acceptable filetypes for sharing on Databrary, we were not permitted to share the binary *_dep.bin files produced by the Kinect v2 camera system on the Databrary platform. If your research requires the original binary *_dep.bin files, please contact Matt Huenerfauth.

*_face.txt, *_HDface.txt, *_skl.txt: To make it easier for future researchers to make use of this dataset, we have also performed some post-processing of the Kinect data. To extract the skeleton coordinates from the RGB videos, we used the Openpose system, which is capable of detecting body, hand, facial, and foot keypoints of multiple people in single images in real time. The output of Openpose includes estimates of 70 keypoints for the face, including eyes, eyebrows, nose, mouth, and face contour. The software also estimates 21 keypoints for each of the hands (Simon et al., 2017), including 3 keypoints for each finger, as shown in Figure 2. Additionally, 25 keypoints are estimated for the body pose and feet (Cao et al., 2017; Wei et al., 2016).

Reporting Bugs or Errors: Please contact Matt Huenerfauth to report any bugs or errors that you identify in the corpus. We appreciate your help in improving the quality of the corpus over time.

Acknowledgement: This material is based upon work supported by the National Science Foundation under award 1749376: "Collaborative Research: Multimethod Investigation of Articulatory and Perceptual Constraints on Natural Language Evolution."
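Openpose conventionally serializes each detected point as an (x, y, confidence) triple, so a frame's keypoints arrive as one flat numeric array. A small Python helper for reshaping such an array is sketched below; the exact layout of this dataset's *_skl.txt / *_face.txt files is an assumption here, so verify against the corpus documentation before relying on it:

```python
import numpy as np

def split_keypoints(flat):
    """Reshape a flat [x, y, confidence, x, y, confidence, ...] sequence
    into an (N, 3) array of keypoints, as Openpose conventionally emits."""
    arr = np.asarray(flat, dtype=float)
    if arr.size % 3 != 0:
        raise ValueError("expected x, y, confidence triples")
    return arr.reshape(-1, 3)
```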
-
Abstract: Results of the study titled "Disentangling the role of phylogeny and climate on joint leaf trait distributions across Eastern United States". The archive contains three csv files with estimates of N%, C%, LMA, Carotenoids%, Chlorophyll A, Chlorophyll B, Lignin, and Cellulose for ~1.2 million individual trees sampled by the Forest Inventory and Analysis program from 2015 to 2019. Each estimate was produced by sampling the posterior distribution of the three models saved as .rds files, and carries the associated uncertainty (estimate error and 95% prediction intervals).
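The per-tree uncertainties described above (estimate error and 95% prediction intervals) are the standard summaries of posterior draws. A generic Python sketch of how such summaries are computed from an array of draws (the array here is a stand-in, not the structure of the archived .rds files):

```python
import numpy as np

def summarize_posterior(draws):
    """Point estimate, error, and 95% interval from posterior draws."""
    draws = np.asarray(draws, dtype=float)
    lo, hi = np.percentile(draws, [2.5, 97.5])
    return {"estimate": draws.mean(),
            "error": draws.std(ddof=1),
            "pi_2.5": lo,
            "pi_97.5": hi}
```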
-
This dataset adds satellite parameters and dynamic pressure calculations to the event list at https://osf.io/7rjs4/. You can reference this data set as follows: Plaschke, F., Hietala, H., & LaMoury, A. T. (2020, October 27). THEMIS magnetosheath jet data set 2012-2018. Retrieved from osf.io/7rjs4. The description of the identification process for the THEMIS data sets is published in: Plaschke, F., Hietala, H., and Angelopoulos, V.: Anti-sunward high-speed jets in the subsolar magnetosheath, Ann. Geophys., 31, 1877–1889, https://doi.org/10.5194/angeo-31-1877-2013, 2013. The original THEMIS list should be downloaded locally from the link above before running the Python code in the Jupyter notebook.

The columns contain:
- column 1: jet number
- column 2: observing spacecraft (A: THEMIS-A, ..., E: THEMIS-E)
- column 3: start of identified jet interval in UT
- column 4: time of maximum dynamic pressure ratio in UT
- column 5: end of identified jet interval in UT
In the final version of the list, the list is indexed by "Max" (column 4 in the original), and the other columns are named 'Jet Number', 'Ref Spacecraft', 'Start', and 'End'.

From CDAWeb, we add the following columns:
- SM_LAT, SM_LON: latitude and longitude in GSM coordinates
- SM_X, SM_Y, SM_Z: Cartesian position in GSM coordinates
- GEO_X_1, GEO_Y_1, GEO_Z_1: Cartesian position in geographic coordinates
- DIST_FROM_P93_BOW_SHOCK: distance from the P93 bow shock
- DIST_FROM_MAGNETOPAUSE: distance from the RS93 magnetopause
- DIST_FROM_T95_NS: distance to the Tsyganenko 1995 model neutral sheet
- L_VALUE: dipole L value
- INVAR_LAT: dipole invariant latitude
- MAGNETIC_STRENGTH: magnetic field strength
- Dynamic Pressure (nPa): peak dynamic pressure of the jet
- Electron density
- Vx, Vy, Vz: Cartesian components of ion velocity, used in computing dynamic pressure
These are pulled from orbit parameters and on-board moment data, the first using ai.cdas and the second using pyspedas, respectively.
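The dynamic pressure added to the list is the standard ion ram pressure rho*v^2. A minimal Python sketch of that calculation from the density and velocity columns, assuming density in cm^-3 and velocity components in km/s (this is a generic formula, not the notebook's exact code):

```python
import numpy as np

M_P = 1.6726219e-27  # proton mass, kg

def dynamic_pressure_npa(n_cm3, vx, vy, vz):
    """Ion dynamic pressure rho*v^2 in nPa.

    n_cm3: ion number density (cm^-3); vx, vy, vz: velocity (km/s).
    """
    v2 = vx**2 + vy**2 + vz**2        # |v|^2 in (km/s)^2
    rho = M_P * n_cm3 * 1e6           # mass density in kg m^-3
    # (v in km/s)^2 -> (m/s)^2 is *1e6; Pa -> nPa is *1e9
    return rho * v2 * 1e6 * 1e9
```

For typical solar-wind values (n = 5 cm^-3, v = 400 km/s) this gives roughly 1.3 nPa, a useful sanity check.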
