
Search results: all records where Creators/Authors contains "Yan, Feng"


  1. Free, publicly-accessible full text available August 1, 2024
  2. Free, publicly-accessible full text available July 1, 2024
  3. High entropy oxide nanoparticles (HEO NPs) containing multiple component elements possess improved stability and serve many functional applications, including catalysis, data memory, and energy storage. However, synthesizing homogeneous HEO NPs that combine five or more immiscible elements in a single-phase structure remains a great challenge because of the strict synthetic conditions required; in particular, several existing synthesis routes demand extremely high temperatures. In this study, we demonstrate a low-cost, facile, and effective method to synthesize three- to eight-element HEO nanoparticles by combining electrospinning with low-temperature ambient annealing. HEO NPs were generated by annealing electrospun nanofibers at 330 °C for 30 minutes in air. The average size of the HEO nanoparticles was ∼30 nm, and a homogeneous element distribution was obtained from the post-electrospinning thermal decomposition. The synthesized HEO NPs exhibited magnetic properties, with the highest saturation magnetization (9.588 emu g−1) and the highest coercivity (147.175 Oe) observed for HEO NPs containing four magnetic elements; incorporating more nonmagnetic elements suppressed the magnetic response. This electrospinning and low-temperature annealing method provides easy and flexible control over nanoparticle composition and an economical processing pathway, offering cost- and energy-effective, high-throughput entropy nanoparticle synthesis at large scale.
    Free, publicly-accessible full text available May 30, 2024
  4. Free, publicly-accessible full text available May 1, 2024
  5. Free, publicly-accessible full text available May 1, 2024
  6. Free, publicly-accessible full text available May 1, 2024
  7. The rise of deep neural networks offers new opportunities for optimizing recommender systems. However, optimizing recommender systems with deep neural networks requires delicate architecture fabrication. We propose NASRec, a paradigm that trains a single supernet and efficiently produces abundant models/sub-architectures by weight sharing. To overcome the data multi-modality and architecture heterogeneity challenges in the recommendation domain, NASRec establishes a large supernet (i.e., search space) to search full architectures. The supernet incorporates a versatile choice of operators and dense connectivity to minimize the human effort needed to find good priors. The scale and heterogeneity of NASRec impose several challenges, such as training inefficiency, operator imbalance, and degraded rank correlation. We tackle these challenges by proposing single-operator any-connection sampling, operator-balancing interaction modules, and post-training fine-tuning. Our crafted models, NASRecNet, show promising results on three Click-Through Rate (CTR) prediction benchmarks, indicating that NASRec outperforms both manually designed models and existing NAS methods with state-of-the-art performance. Our work is publicly available here.
    Free, publicly-accessible full text available April 30, 2024
  8. Chen, Wei R. (Ed.)
    Free, publicly-accessible full text available March 14, 2024
  9. Chen, Wei R. (Ed.)
    Free, publicly-accessible full text available March 14, 2024
  10. Free, publicly-accessible full text available February 1, 2024
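The weight-sharing supernet described in record 7 can be illustrated with a minimal sketch: each layer holds several candidate operators that share the supernet's storage, and a sub-architecture is one operator choice per layer, drawn by single-operator sampling. The names below (`Supernet`, `sample_architecture`, the toy operators) are illustrative assumptions, not NASRec's actual API.

```python
import random

class Supernet:
    """Toy weight-sharing supernet: each layer stores several candidate
    operators; a sampled sub-architecture picks exactly one per layer."""

    def __init__(self, ops_per_layer):
        # ops_per_layer: list of dicts mapping operator name -> callable
        self.ops_per_layer = ops_per_layer

    def sample_architecture(self):
        # Single-operator sampling: choose one operator per layer, so a
        # training step would update only that path's shared weights.
        return [random.choice(list(ops)) for ops in self.ops_per_layer]

    def forward(self, x, arch):
        # Run the input through the chosen operator at each layer.
        for layer_ops, choice in zip(self.ops_per_layer, arch):
            x = layer_ops[choice](x)
        return x

# Two layers, each with two toy candidate operators.
ops = [
    {"identity": lambda x: x, "double": lambda x: 2 * x},
    {"inc": lambda x: x + 1, "square": lambda x: x * x},
]
net = Supernet(ops)
arch = net.sample_architecture()
print(arch, net.forward(3, arch))
```

In the real setting the operators would be neural modules (MLPs, embedding interactions, etc.) with shared parameters, and sampled sub-architectures would be ranked by validation performance; this sketch only shows the sampling and dispatch structure.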