Creators/Authors contains: "McHenry, Kenton"

  2. Abstract

    Significant investments to upgrade and construct large-scale scientific facilities demand commensurate investments in R&D to design the algorithms and computing approaches that will enable scientific and engineering breakthroughs in the big data era. Innovative Artificial Intelligence (AI) applications have powered transformational solutions for big data challenges in industry and technology that now drive a multi-billion-dollar industry and that play an ever-increasing role in shaping human social patterns. As AI continues to evolve into a computing paradigm endowed with statistical and mathematical rigor, it has become apparent that single-GPU solutions for training, validation, and testing are no longer sufficient for the computational grand challenges brought about by scientific facilities that produce data at a rate and volume that outstrip the computing capabilities of available cyberinfrastructure platforms. This realization has been driving the confluence of AI and high performance computing (HPC) to reduce time-to-insight, and to enable a systematic study of domain-inspired AI architectures and optimization schemes for data-driven discovery. In this article we present a summary of recent developments in this field, and describe specific advances that the authors are spearheading to accelerate and streamline the use of HPC platforms to design and apply accelerated AI algorithms in academia and industry.

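The single-GPU bottleneck described in the abstract above is commonly addressed with synchronous data-parallel training: each worker computes gradients on its shard of the batch, and the gradients are averaged (all-reduced) before a single shared update. The following is a minimal NumPy sketch of that pattern on a toy linear model; the model, worker count, and learning rate are illustrative assumptions, not details from the article:

```python
import numpy as np

def worker_gradient(w, X_shard, y_shard):
    """Gradient of mean-squared error computed on one worker's data shard."""
    residual = X_shard @ w - y_shard
    return 2.0 * X_shard.T @ residual / len(y_shard)

def data_parallel_step(w, X, y, n_workers=4, lr=0.1):
    """One synchronous SGD step: shard the batch across workers,
    compute per-worker gradients, then average them (the "all-reduce")
    before applying a single weight update."""
    X_shards = np.array_split(X, n_workers)
    y_shards = np.array_split(y, n_workers)
    grads = [worker_gradient(w, Xs, ys) for Xs, ys in zip(X_shards, y_shards)]
    avg_grad = np.mean(grads, axis=0)  # all-reduce: average across workers
    return w - lr * avg_grad

# Toy problem: recover known weights from noiseless linear data.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w
w = np.zeros(3)
for _ in range(200):
    w = data_parallel_step(w, X, y)
```

With equal-sized shards, the averaged shard gradients equal the full-batch gradient, so the distributed step matches serial SGD exactly; real HPC frameworks implement the same averaging with communication primitives rather than a Python loop.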
  3. Abstract

    In Fall 2020, universities saw extensive transmission of SARS-CoV-2 among their populations, threatening the health of the university and surrounding communities and the viability of in-person instruction. Here we report a case study at the University of Illinois at Urbana-Champaign, where a multimodal “SHIELD: Target, Test, and Tell” program, together with other non-pharmaceutical interventions, was employed to keep classrooms and laboratories open. The program included epidemiological modeling and surveillance; fast, frequent testing using covidSHIELD, a novel low-cost and scalable saliva-based RT-qPCR assay for SARS-CoV-2 that bypasses RNA extraction; and digital tools for communication and compliance. In Fall 2020, we performed >1,000,000 covidSHIELD tests, positivity rates remained low, we had zero COVID-19-related hospitalizations or deaths among our university community, and mortality in the surrounding Champaign County was reduced more than 4-fold relative to expected levels. This case study shows that fast, frequent testing and other interventions mitigated transmission of SARS-CoV-2 at a large public university.
    Free, publicly-accessible full text available December 1, 2023
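The effect of frequent testing described in this abstract can be illustrated with a deliberately simplified SIR-style model in which detected infectious individuals are moved to isolation and stop transmitting. This is a hedged sketch with made-up rates, not the epidemiological model actually used by the SHIELD program:

```python
def sir_with_testing(beta, gamma, test_rate, days, dt=0.1,
                     s0=0.99, i0=0.01):
    """Minimal SIR variant: infectious individuals are detected at
    `test_rate` per day and moved to a removed (isolated, non-transmitting)
    compartment, on top of natural recovery at rate `gamma`.
    Forward-Euler integration; fractions of the population sum to 1."""
    s, i, r = s0, i0, 0.0
    for _ in range(int(days / dt)):
        new_inf = beta * s * i * dt            # new infections
        removed = (gamma + test_rate) * i * dt  # recovery + isolation
        s -= new_inf
        i += new_inf - removed
        r += removed
    return s, i, r

# Hypothetical comparison: no testing vs. frequent testing.
no_test = sir_with_testing(beta=0.4, gamma=0.1, test_rate=0.0, days=120)
frequent = sir_with_testing(beta=0.4, gamma=0.1, test_rate=0.3, days=120)
```

In this toy setting, adding the isolation flow raises the effective removal rate from `gamma` to `gamma + test_rate`, cutting the effective reproduction number and leaving far more of the population uninfected, which is the qualitative mechanism behind fast, frequent testing.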
  4. Recent advances in cyberinfrastructure have enabled digital data sharing and ubiquitous network connectivity between scientific instruments and cloud-based storage infrastructure for uploading, storing, curating, and correlating large amounts of materials and semiconductor fabrication data and metadata. However, a significant number of scientific instruments running on old operating systems remain offline and cannot connect to the cloud infrastructure because of security and network performance concerns. In this paper, we propose BRACELET, an edge-cloud infrastructure that augments the existing cloud-based infrastructure with edge devices and helps tackle the unique performance and security challenges that scientific instruments face when they are connected to the cloud through a public network. With BRACELET, we place a networked edge device, called a cloudlet, between the scientific instruments and the cloud as the middle tier of a three-tier hierarchy. The cloudlet shapes and protects the data traffic from scientific instruments to the cloud, and plays a foundational role in keeping an instrument connected throughout its lifetime, continuously providing the performance and security features the instrument would otherwise lack as its operating system ages.
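One concrete way a middle-tier cloudlet can "shape" instrument-to-cloud traffic is a token-bucket rate limiter, which caps the sustained upload rate while allowing short bursts. The sketch below is purely illustrative; the class, rates, and sizes are assumptions for this example, not BRACELET's actual implementation:

```python
class TokenBucket:
    """Token-bucket shaper: a cloudlet-style middle tier could use this
    to cap the rate at which instrument data is forwarded to the cloud."""

    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s   # sustained forwarding rate
        self.capacity = burst_bytes    # maximum burst size
        self.tokens = burst_bytes      # bucket starts full
        self.last = 0.0                # timestamp of the previous check

    def allow(self, size_bytes, now):
        """Return True if an upload of `size_bytes` may be forwarded at
        time `now` (seconds); otherwise the caller should queue it."""
        elapsed = now - self.last
        self.last = now
        # Refill tokens for the elapsed interval, capped at the burst size.
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        if self.tokens >= size_bytes:
            self.tokens -= size_bytes
            return True
        return False

# 1 kB/s sustained, 2 kB burst: a burst passes, a back-to-back upload
# is deferred, and a later upload passes once tokens have refilled.
bucket = TokenBucket(rate_bytes_per_s=1000, burst_bytes=2000)
decisions = [bucket.allow(1500, t) for t in (0.0, 0.1, 2.0)]
```

A real cloudlet would combine shaping like this with queueing, retransmission, and security filtering (e.g., protocol validation) so that an instrument on an aging operating system never faces the public network directly.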