

This content will become publicly available on December 17, 2025

Title: Design and testing of photon-based hardware random number generator
Hardware random number generators (HRNGs) are widely used in computing for security purposes, as well as in science as a source of high-quality randomness for models and simulations. Existing HRNGs are either costly or very slow and of questionable quality. This work proposes a simple HRNG design based on low-number photon absorption by a detector (a photomultiplier tube or a silicon-based one, i.e., SiPM, MPPC, etc.) that can provide a large volume of high-quality random numbers. The prototype design, different processing options, and quality testing of the generator output are presented.
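The abstract does not spell out the processing steps, but a common recipe for photon-counting generators is to take one raw bit per detection window (e.g., the parity of the photon count) and then debias the stream. A minimal Python sketch under that assumption, with a simulated Poisson photon stream standing in for the SiPM/MPPC readout (all names and parameters here are illustrative, not the paper's):

```python
import random

def photon_counts(n_windows, mean_rate=4.0, seed=0):
    """Simulate per-window photon counts from a Poisson process.
    (Stand-in for real SiPM/MPPC detector readings.)"""
    rng = random.Random(seed)
    counts = []
    for _ in range(n_windows):
        # Knuth's method for Poisson sampling
        L, k, p = pow(2.718281828459045, -mean_rate), 0, 1.0
        while p > L:
            k += 1
            p *= rng.random()
        counts.append(k - 1)
    return counts

def von_neumann_debias(bits):
    """Map bit pairs 01 -> 0, 10 -> 1; discard 00 and 11.
    Removes bias from the raw parity stream (assuming i.i.d. input)."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw_bits = [c & 1 for c in photon_counts(10000)]   # parity of each count
random_bits = von_neumann_debias(raw_bits)
```

Von Neumann debiasing trades throughput for provable unbiasedness: for a fair i.i.d. source it emits on average one output bit per four raw bits.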
Award ID(s):
2316097
PAR ID:
10631988
Publisher / Repository:
Sissa Medialab
Page Range / eLocation ID:
1215
Format(s):
Medium: X
Location:
Prague, Czech Republic
Sponsoring Org:
National Science Foundation
More Like this
  1. True random number generators (TRNGs) play a vital role in a variety of security applications and protocols. The security and privacy of an asset rely on encryption, which in turn depends on the quality of random numbers. Memory chips are widely used for generating random numbers because of their prevalence in modern electronic systems. Unfortunately, existing Dynamic Random-Access Memory (DRAM)-based TRNGs produce random numbers with either limited entropy or poor throughput. In this paper, we propose a DRAM-latency-based TRNG that generates high-quality random numbers. Silicon results from Samsung and Micron DDR3 DRAM modules show that our proposed DRAM-latency-based TRNG is robust (against different operating conditions and environmental variations) and acceptably fast.
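The latency mechanism itself requires access to DRAM timing parameters in hardware; purely to illustrate the kind of post-processing such a design needs, the sketch below substitutes simulated latency readings (Gaussian jitter around a nominal value) and extracts whitened bits from the jitter. All function names and parameters are hypothetical:

```python
import random

def simulated_access_latencies(n, nominal_ns=13.5, jitter_ns=0.2, seed=1):
    """Stand-in for measured DRAM access latencies under reduced timing
    parameters; a real design reads these from the memory controller."""
    rng = random.Random(seed)
    return [rng.gauss(nominal_ns, jitter_ns) for _ in range(n)]

def latencies_to_bits(latencies, resolution_ns=0.01):
    """Quantize each latency and keep its least-significant bit,
    where most of the random jitter lives."""
    return [int(t / resolution_ns) & 1 for t in latencies]

def xor_whiten(bits, stride=8):
    """Simple post-processing: XOR-fold groups of bits to reduce bias."""
    out = []
    for i in range(0, len(bits) - stride + 1, stride):
        b = 0
        for x in bits[i:i + stride]:
            b ^= x
        out.append(b)
    return out

bits = xor_whiten(latencies_to_bits(simulated_access_latencies(8000)))
```

XOR-folding costs throughput (here 8:1) but exponentially suppresses residual bias in the raw stream.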
  2. We propose a novel framework for the systematic design of lensless imaging systems based on the hyperuniform random field solutions of nonlinear reaction-diffusion equations from pattern formation theory. Specifically, we introduce a new class of imaging point-spread functions (PSFs) with enhanced isotropic behavior and controllable sparsity. We investigate PSFs and modulated transfer functions for a number of nonlinear models and demonstrate that two-phase isotropic random fields with hyperuniform disorder are ideally suited to construct imaging PSFs with improved performance compared to PSFs based on Perlin noise. Additionally, we introduce a phase retrieval algorithm based on non-paraxial Rayleigh–Sommerfeld diffraction theory and present diffractive phase plates with PSFs designed from hyperuniform random fields, called hyperuniform phase plates (HPPs). Finally, using high-fidelity object reconstruction, we demonstrate improved image quality using engineered HPPs across the visible range. The proposed framework is suitable for high-performance lensless imaging systems for on-chip microscopy and spectroscopy applications.
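The paper's Rayleigh–Sommerfeld phase retrieval is beyond a short sketch, but the underlying lensless-imaging forward model (scene convolved with the PSF) and a baseline Wiener reconstruction can be shown compactly with NumPy. The sparse random PSF below is only a crude stand-in for a hyperuniform one:

```python
import numpy as np

def convolve_fft(img, psf):
    """Forward model of a lensless imager: the sensor reading is the
    scene circularly convolved (for simplicity) with the PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))

def wiener_deconvolve(meas, psf, eps=1e-3):
    """Reconstruct the scene with a Wiener filter; eps regularizes
    frequencies where the PSF's transfer function is weak."""
    H = np.fft.fft2(psf)
    G = np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(meas) * G))

rng = np.random.default_rng(0)
n = 64
# Sparse random spikes as a crude stand-in for a phase-plate PSF.
psf = (rng.random((n, n)) < 0.01).astype(float)
psf /= psf.sum()
scene = np.zeros((n, n))
scene[20:40, 20:40] = 1.0
recon = wiener_deconvolve(convolve_fft(scene, psf), psf, eps=1e-6)
```

In the noiseless case a tiny `eps` suffices; with sensor noise, `eps` trades noise amplification against resolution, which is where well-conditioned (e.g., hyperuniform) PSFs pay off.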
  3. Signal maps are essential for the planning and operation of cellular networks. However, the measurements needed to create such maps are expensive, often biased, do not always reflect the performance metrics of interest, and pose privacy risks. In this paper, we develop a unified framework for predicting cellular performance maps from limited available measurements. Our framework builds on a state-of-the-art random-forest predictor, or any other base predictor. We propose and combine three mechanisms that deal with the fact that not all measurements are equally important for a particular prediction task. First, we design quality-of-service functions (Q), including signal strength (RSRP) but also other metrics of interest to operators, such as number of bars, coverage (improving recall by 76%-92%) and call drop probability (reducing error by as much as 32%). By implicitly altering the loss function employed in learning, quality functions can also improve prediction for RSRP itself where it matters (e.g., MSE reduction up to 27% in the low signal strength regime, where high accuracy is critical). Second, we introduce weight functions (W) to specify the relative importance of prediction at different locations and other parts of the feature space. We propose re-weighting based on importance sampling to obtain unbiased estimators when the sampling and target distributions are different. This yields improvements up to 20% for targets based on spatially uniform loss or losses based on user population density. Third, we apply the Data Shapley framework for the first time in this context: to assign values (ϕ) to individual measurement points, which capture the importance of their contribution to the prediction task. This can improve prediction (e.g., from 64% to 94% in recall for coverage loss) by removing points with negative values and storing only the remaining data points (i.e., as low as 30%), which also has the side-benefit of helping privacy.
We evaluate our methods and demonstrate significant improvement in prediction performance, using several real-world datasets. 
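The importance-sampling reweighting of mechanism (W) can be illustrated with a toy one-dimensional example: measurements drawn from a known non-uniform sampling density are reweighted by w(x) = p_target(x)/p_sample(x) so that a spatially uniform average is estimated without bias. The RSRP model and densities below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D "road": RSRP degrades linearly with position x in [0, 1].
def rsrp(x):
    return -70.0 - 40.0 * x            # dBm, purely illustrative

# Measurements arrive with density p(x) = 0.5 + x (oversampling large x),
# drawn by inverting the CDF F(x) = x/2 + x**2/2.
u = rng.random(50000)
samples = (-1.0 + np.sqrt(1.0 + 8.0 * u)) / 2.0

# Target is the spatially uniform mean, so w(x) = 1 / p(x).
weights = 1.0 / (0.5 + samples)

naive = rsrp(samples).mean()                      # biased toward large x
reweighted = np.average(rsrp(samples), weights=weights)
# Uniform-target ground truth: -70 - 40 * 0.5 = -90 dBm.
```

The naive mean over-represents the densely sampled region, while the reweighted estimator recovers the uniform-target value; the same weights can be plugged into a weighted loss when fitting the base predictor.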
  4. Experimental design techniques such as active search and Bayesian optimization are widely used in the natural sciences for data collection and discovery. However, existing techniques tend to favor exploitation over exploration of the search space, which causes them to get stuck in local optima. This collapse problem prevents experimental design algorithms from yielding diverse high-quality data. In this paper, we extend the Vendi scores—a family of interpretable similarity-based diversity metrics—to account for quality. We then leverage these quality-weighted Vendi scores to tackle experimental design problems across various applications, including drug discovery, materials discovery, and reinforcement learning. We found that quality-weighted Vendi scores allow us to construct policies for experimental design that flexibly balance quality and diversity, and ultimately assemble rich and diverse sets of high-performing data points. Our algorithms led to a 70%–170% increase in the number of effective discoveries compared to baselines. 
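The (unweighted) Vendi score is the exponential of the Shannon entropy of the eigenvalues of the normalized similarity matrix, so it behaves like an "effective number of distinct items." One simple way to fold in quality, assumed here for illustration, is to scale that score by the mean quality of the set; the paper's exact weighting may differ:

```python
import numpy as np

def vendi_score(K):
    """Vendi score: exponential of the Shannon entropy of the
    eigenvalues of the normalized similarity matrix K / n."""
    n = K.shape[0]
    lam = np.linalg.eigvalsh(K / n)
    lam = lam[lam > 1e-12]             # drop numerical zeros
    return float(np.exp(-np.sum(lam * np.log(lam))))

def quality_weighted_vendi(K, quality):
    """Illustrative quality weighting: mean quality times the Vendi
    score (an assumption, not necessarily the paper's formulation)."""
    return float(np.mean(quality)) * vendi_score(K)

# Three identical items vs. three fully distinct items.
K_same = np.ones((3, 3))
K_diverse = np.eye(3)
q = np.array([1.0, 1.0, 1.0])
```

For three identical items the score is 1; for three mutually dissimilar items it is 3, matching the intuitive count of effective discoveries.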
  5. Computational modeling and simulation of real-world problems, e.g., various applications in the automotive, aerospace, and biomedical industries, often involve geometric objects which are bounded by curved surfaces. The geometric modeling of such objects can be performed via high-order meshes. Such a mesh, when paired with a high-order partial differential equation (PDE) solver, can achieve more accurate solutions with a decreased number of mesh elements (in comparison to a low-order mesh). There are several types of high-order mesh generation approaches, such as direct methods, a posteriori methods, and isogeometric analysis (IGA)-based spline modeling approaches. In this paper, we propose a direct, high-order, curvilinear tetrahedral mesh generation method using an advancing front technique. After generating the mesh, we apply mesh optimization to improve the quality and to take advantage of the degrees of freedom available in the initially straight-sided quadratic elements. Our method aims to generate high-quality tetrahedral mesh elements from various types of boundary representations, including cases where no computer-aided design files are available. Such a method is essential, for example, for generating meshes for various biomedical models where the boundary representation is obtained from medical images instead of CAD files. We present several numerical examples of second-order tetrahedral meshes generated using our method based on input triangular surface meshes.
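Mesh optimization of the kind described typically maximizes a per-element shape quality. As an illustrative (not paper-specific) choice, the mean-ratio metric for a linear tetrahedron is 1 for the regular tetrahedron and tends to 0 for degenerate or inverted elements:

```python
import numpy as np

def tet_jacobian(v0, v1, v2, v3):
    """Jacobian of the linear map from the reference tetrahedron."""
    return np.column_stack((v1 - v0, v2 - v0, v3 - v0))

def tet_volume(v0, v1, v2, v3):
    """Signed volume; positive for correctly oriented elements."""
    return float(np.linalg.det(tet_jacobian(v0, v1, v2, v3))) / 6.0

def mean_ratio_quality(v0, v1, v2, v3):
    """Mean-ratio shape quality in (0, 1]; equals 1 for the regular
    tetrahedron. A common mesh-optimization objective (illustrative)."""
    J = tet_jacobian(v0, v1, v2, v3)
    if np.linalg.det(J) <= 0:
        return 0.0                     # inverted element
    # W maps the reference tet onto a regular tet; S measures the
    # remaining anisotropy of the element.
    W = np.array([[1.0, 0.5,            0.5],
                  [0.0, np.sqrt(3) / 2, np.sqrt(3) / 6],
                  [0.0, 0.0,            np.sqrt(6) / 3]])
    S = J @ np.linalg.inv(W)
    return float(3.0 * np.linalg.det(S) ** (2.0 / 3.0) / np.trace(S.T @ S))

# Regular tetrahedron: quality should be exactly 1.
v0 = np.array([0.0, 0.0, 0.0])
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.5, np.sqrt(3) / 2, 0.0])
v3 = np.array([0.5, np.sqrt(3) / 6, np.sqrt(6) / 3])
q_regular = mean_ratio_quality(v0, v1, v2, v3)
```

For quadratic (second-order) elements, as in the paper, the same idea is applied pointwise to the Jacobian sampled over the element, since it is no longer constant.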