Title: NIST's Software Un-Standards
The National Institute of Standards and Technology (NIST) has become a beacon of hope for those who trust in federal standards for software and AI safety. Moreover, lawmakers and commentators have indicated that compliance with NIST standards ought to shield entities from liability. With more than a century of expertise in scientific research and standard-setting, NIST would seem uniquely qualified to develop such standards. But as I argue in this paper, that faith is misplaced. NIST's latest forays into risk management frameworks disavow concrete metrics or outcomes and solicit voluntary participation instead of providing stable mandates. That open-ended approach can be attributed to the reversal of NIST's prior efforts to promulgate federal software standards during the 1970s and 1980s. The failure of those federal regulatory efforts highlights fundamental challenges inherent in software development that persist today. Policymakers should draw upon the lessons of NIST's experience and recognize that federal standards are unlikely to be a silver bullet. Instead, they should heed NIST's admonition that the practice of software development remains deeply fragmented for other intrinsic reasons. Any effort to establish a universal standard of care must grapple with the need to accommodate the broad heterogeneity of accepted practices in the field.
Award ID(s):
2131531
PAR ID:
10543920
Author(s) / Creator(s):
Publisher / Repository:
Lawfare
Date Published:
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The pursuit of software safety standards has stalled. In response, commentators and policymakers have looked increasingly to federal agencies to deliver new hope. Some place their faith in existing agencies while others propose a new super agency to oversee software-specific issues. This turn reflects both optimism in the agency model as well as pessimism in other institutions such as the judiciary or private markets. This Essay argues that the agency model is not a silver bullet. Applying a comparative institutional choice lens, this Essay explains that the characteristic strengths of the agency model—expertise, uniformity, and efficiency—offer less advantage than one might expect in the software domain. Because software complexity exceeds the capacity of software expertise, software experts have been unable to devise standards that meaningfully assure safety. That root limitation is unlikely to change by amassing more software experts in a central agency. This Essay argues further that the institutional choice literature should embrace an information-centered approach, rather than a participation-centered approach, when confronting an area of scientific impotence. While participation is a useful proxy when each stakeholder has relevant information to contribute, it loses its efficacy when the complexity of the problem escapes the ability of the participants. Instead, the focus should shift to constructing an empirical body of knowledge regarding the norms and customary practices in the field. 
  2. We study the security of CTR-DRBG, one of NIST's recommended Pseudorandom Number Generator (PRNG) designs. Recently, Woodage and Shumow (Eurocrypt '19), and then Cohney et al. (S&P '20), pointed out potential vulnerabilities in both the NIST specification and common implementations of CTR-DRBG. While these researchers do suggest countermeasures, the security of the patched CTR-DRBG is still questionable. Our work fills this gap, proving that CTR-DRBG satisfies the robustness notion of Dodis et al. (CCS '13), the standard security goal for PRNGs.
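As background on the construction studied above: CTR-DRBG keeps an internal key K and counter V, emits output blocks by running a block cipher in counter mode, and refreshes (K, V) after every request. A minimal Python sketch of that generate/update loop follows, substituting a SHA-256-based toy "cipher" for AES so it runs on the standard library alone; the class name and structure are illustrative, not the NIST-specified algorithm:

```python
import hashlib

BLOCKLEN = 16  # AES block size in bytes

def _block_cipher(key: bytes, block: bytes) -> bytes:
    # Stand-in for AES-128: a hash-based toy cipher so the sketch runs
    # with only the standard library. NOT secure; illustrative only.
    return hashlib.sha256(key + block).digest()[:BLOCKLEN]

def _inc(v: bytes) -> bytes:
    # Increment the counter V modulo 2^128.
    return ((int.from_bytes(v, "big") + 1) % (1 << 128)).to_bytes(BLOCKLEN, "big")

class CtrDrbg:
    def __init__(self, seed: bytes):
        self.key = bytes(BLOCKLEN)
        self.v = bytes(BLOCKLEN)
        self._update(seed)

    def _update(self, data: bytes) -> None:
        # Derive a fresh (K, V) from cipher outputs XORed with caller data.
        temp = b""
        while len(temp) < 2 * BLOCKLEN:
            self.v = _inc(self.v)
            temp += _block_cipher(self.key, self.v)
        temp = bytes(a ^ b for a, b in zip(temp, data.ljust(2 * BLOCKLEN, b"\0")))
        self.key, self.v = temp[:BLOCKLEN], temp[BLOCKLEN:2 * BLOCKLEN]

    def generate(self, nbytes: int) -> bytes:
        # Counter-mode output, then a state update for backtracking resistance.
        out = b""
        while len(out) < nbytes:
            self.v = _inc(self.v)
            out += _block_cipher(self.key, self.v)
        self._update(bytes(2 * BLOCKLEN))
        return out[:nbytes]

    def reseed(self, entropy: bytes) -> None:
        # Fold fresh entropy into the state (the operation whose handling
        # the vulnerability analyses scrutinize).
        self._update(entropy)
```

A real implementation would use AES-128 and the full NIST SP 800-90A derivation-function and additional-input handling, which this sketch omits.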
  3. As the field of zirconium (Zr) stable isotopes is rapidly expanding from the study of mass-independent to that of mass-dependent isotope effects, a variety of Zr standards have appeared in the literature. While several of these standards have been proposed as the ideal isotope reference material (iRM) against which all data should be reported, none of them have been shown to meet the compositional and/or conflict-of-interest-free distribution requirements put forth by the community. To remedy this situation, we report on a community-led effort to develop and calibrate a scale-defining iRM for Zr isotopes: NIST RM 8299. Developed in partnership with the National Institute of Standards and Technology (NIST) from the widely used SRM 3169 Zirconium Standard Solution (certified for mass fraction), the candidate RM 8299 was calibrated through an inter-laboratory study involving three laboratories. Our data show that candidate RM 8299 meets all requirements of an ideal iRM. It is an isotopically homogeneous, high-purity reference material that is free of isotope anomalies and whose composition is identical to that of a major geological reservoir (Ocean Island Basalts). Furthermore, RM 8299 will be curated and distributed by NIST, a neutral, conflict-of-interest-free organization, and was produced in sufficient quantities to last multiple decades. We recommend that all Zr isotope data be reported against RM 8299. Our results also show that SRM 3169 lots #130920 and #071226 have indistinguishable composition compared to candidate RM 8299. Therefore, using RM 8299 as the scale-defining iRM will enable direct comparison of all future data with the vast majority of the existing literature data, both for mass-independent and mass-dependent isotope effects. To facilitate conversion of δ94/90Zr values reported against other Zr standards, we provide high-precision conversion factors to the RM 8299 scale obtained using the double-spike method.
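The scale conversion mentioned above follows from the multiplicative definition of delta notation: isotope ratios compose as (1 + δ/1000) factors, so a per-mil value reported against an old standard converts exactly to a new scale. A short sketch of that arithmetic (the function is generic; the 0.05 ‰ conversion factor in the usage comment is a made-up placeholder, not one of the paper's reported values):

```python
def convert_delta(delta_vs_old: float, delta_old_vs_new: float) -> float:
    """Convert a delta value (in per mil, e.g. d94/90Zr) reported against an
    old standard to a new reference scale, using the exact relation:
        (1 + d_new/1000) = (1 + d_old/1000) * (1 + d_conv/1000)
    where d_conv is the delta of the old standard measured on the new scale.
    """
    return delta_vs_old + delta_old_vs_new + delta_vs_old * delta_old_vs_new / 1000.0

# Hypothetical example: a sample at +1.20 permil vs. an old standard that
# itself sits at +0.05 permil on the new scale:
# convert_delta(1.20, 0.05)  ->  approximately 1.25006 permil
```

Note the cross term δ·δ′/1000: for small per-mil values the conversion is nearly additive, which is why the exact factors matter only at high precision.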
  4. Jacobson v. Massachusetts has long stood for the proposition that courts should generally uphold the government’s public health policies even when they incidentally infringe constitutional rights protections. But the COVID-19 pandemic disrupted this traditional understanding, as many federal courts struck down or enjoined state and local pandemic-response policies, downplaying the applicability of Jacobson. Meanwhile, prominent legal scholars argued that judicial deference premised on Jacobson should be completely abandoned. This article argues that Jacobson must be reconsidered in light of COVID-19, but its posture of deference should not be abandoned. Instead, this article proposes a new theory of “Public Health Deference,” which is the deference that courts should afford to the government’s pandemic-response policies. This article argues that Public Health Deference should be premised on the quality of the processes by which the government creates and implements public health policies, even during an emergency. Courts should not blindly defer to the government’s pandemic response; instead, they should evaluate the government’s decision-making processes to ensure that they meet standards of transparency, accountability, public justification, and community engagement. 
  5. When quantum computers become scalable and reliable, they are likely to break all public-key cryptography standards, such as RSA and Elliptic Curve Cryptography. The projected threat of quantum computers has led the U.S. National Institute of Standards and Technology (NIST) to undertake an effort aimed at replacing existing public-key cryptography standards with new quantum-resistant alternatives. In December 2017, 69 candidates were accepted by NIST to Round 1 of the NIST Post-Quantum Cryptography (PQC) standardization process. NTRUEncrypt is one of the most well-known PQC algorithms that has withstood cryptanalysis. The speed of NTRUEncrypt in software, especially on embedded software platforms, is limited by the long execution time of its primary operation, polynomial multiplication. In this paper, we investigate speeding up NTRUEncrypt using software/hardware codesign on a Xilinx Zynq UltraScale+ multiprocessor system-on-chip (MPSoC). Polynomial multiplication is implemented in the Programmable Logic (PL) of Zynq using two approaches: traditional Register-Transfer Level (RTL) and High-Level Synthesis (HLS). The remaining operations of NTRUEncrypt are executed in software on the Processing System (PS) of Zynq, using the bare-metal mode. The speed-up of our software/hardware codesigns vs. purely software implementations is determined experimentally and analyzed in the paper. The results are reported for the RTL-based and HLS-based hardware accelerators, and compared to the best available software implementation, included in the NIST submission package. The speed-ups for encryption were 2.4 and 3.9, depending on the selected parameter set. For decryption, the corresponding speed-ups were 4.0 and 6.8. In addition, for the polynomial multiplication operation itself, the speed-up was in excess of 75. Our code for the NTRUEncrypt polynomial multiplier accelerator is being made open-source for further evaluation on multiple software/hardware platforms.
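The "primary operation" the paper accelerates, polynomial multiplication, is a cyclic convolution in the ring Z_q[x]/(x^N − 1): coefficient exponents wrap around modulo N. A plain-Python schoolbook sketch of the reference behavior (the parameter values in the comments are illustrative, not an actual NTRU parameter set):

```python
def ring_multiply(a, b, N, q):
    """Schoolbook cyclic convolution of polynomials a and b (coefficient
    lists of length N, index i holding the coefficient of x^i) in the ring
    Z_q[x]/(x^N - 1): products of x^i and x^j land at x^((i+j) mod N)."""
    c = [0] * N
    for i in range(N):
        if a[i] == 0:  # NTRU operands are sparse; skip zero coefficients
            continue
        for j in range(N):
            c[(i + j) % N] = (c[(i + j) % N] + a[i] * b[j]) % q
    return c

# Wraparound example: x * x^2 = x^3 = 1 in Z_q[x]/(x^3 - 1).
# ring_multiply([0, 1, 0], [0, 0, 1], 3, 32)  ->  [1, 0, 0]
```

Hardware accelerators of the kind described in the paper exploit this rotational structure and the sparsity of NTRU operands; the O(N²) double loop above is only the functional specification they must match.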