
Search for: All records

Award ID contains: 1843025


  1. Free, publicly-accessible full text available June 1, 2024
  2. Biomaterials and biomedical implants have revolutionized the way medicine is practiced. Technologies such as 3D printing and electrospinning are currently employed to create novel biomaterials. Most synthesis techniques, however, are ad hoc, time-consuming, and expensive. These shortcomings can be largely overcome by employing computational techniques. In this paper we consider the problem of bone tissue engineering as an example and show the potential of machine learning approaches in biomaterial construction, in which different models were built to predict the elastic modulus of the scaffold given an arbitrary material composition. Likewise, the methodology was extended to cell-material interaction and prediction at arbitrary process parameters.

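The modeling step described in the abstract above, predicting a scaffold's elastic modulus from an arbitrary material composition, can be sketched as a simple regression. The data, compositions, and moduli below are entirely hypothetical (the abstract does not specify the models or features used); this is a minimal least-squares sketch, not the paper's method.

```python
import numpy as np

# Hypothetical training data: each row is a scaffold's material
# composition (fractions of two components), y is the measured
# elastic modulus in MPa. Values are illustrative only.
X = np.array([
    [0.9, 0.1],
    [0.7, 0.3],
    [0.5, 0.5],
    [0.3, 0.7],
    [0.1, 0.9],
])
y = np.array([120.0, 100.0, 80.0, 60.0, 40.0])

# Fit a linear model E = w . x + b by ordinary least squares.
A = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_modulus(composition):
    """Predict elastic modulus (MPa) for an arbitrary composition."""
    return float(np.dot(np.append(composition, 1.0), coef))

print(round(predict_modulus([0.6, 0.4]), 1))  # → 90.0
```

In practice a nonlinear model (e.g. a random forest or neural network) would likely replace the linear fit, but the interface is the same: composition in, predicted modulus out.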
  3. The large model size, high computational cost, and vulnerability to membership inference attacks (MIA) have impeded the popularity of deep learning and deep neural networks (DNNs), especially on mobile devices. To address this challenge, we envision that the weight pruning technique can help defend DNNs against MIA while reducing model storage and computational operations. In this work, we propose a pruning algorithm and show that it can find a subnetwork that prevents privacy leakage from MIA while achieving accuracy competitive with the original DNN. We also verify our theoretical insights with experiments. Our experimental results illustrate that the attack accuracy under model compression is up to 13.6% and 10% lower than that of the baseline and the Min-Max game, respectively.

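The weight pruning mentioned in the abstract above can be illustrated with the simplest variant, one-shot magnitude pruning, which zeroes out the smallest-magnitude weights. This is a generic sketch only: the paper's actual algorithm, which additionally targets MIA resistance, is not specified here, and the sparsity level and weight matrix below are made up.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the fraction `sparsity` of smallest-magnitude weights.

    A minimal sketch of one-shot magnitude-based pruning; real
    pipelines typically prune iteratively and fine-tune in between.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))          # a hypothetical weight matrix
Wp = magnitude_prune(W, 0.75)        # keep only the largest 25%
print(float(np.mean(Wp == 0)))       # fraction of zeroed weights
```

The surviving nonzero weights define the subnetwork; the abstract's claim is that a suitably chosen subnetwork both reduces storage/computation and lowers MIA attack accuracy.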