Abstract: Metamaterials are architected cellular networks whose solid struts, plates, or shells form the edges and faces of the building cells. Certain metamaterial designs can balance light weight with high stiffness, requirements that are otherwise mutually exclusive in bulk materials. Existing studies of these materials typically focus on their mechanical response under uniaxial compression, and it is unclear whether a strut-based metastructure design with high compressive stiffness can simultaneously exhibit high torsional stiffness. Designing lightweight metastructures with both high compressive and torsional stiffness could save time and cost in future material development. To explore the effects of unit cell design, unit cell number, and density distribution on both compressive and torsional stiffness, a computational design space was presented. Seven different unit cells, built from three basic building blocks (body-centered cubic (BCC), face-centered cubic (FCC), and simple cubic (SC)), were analyzed; all samples had a relative density of approximately 7%. It was found that high compressive stiffness required a high concentration of struts along the loading direction, while high torsional stiffness required diagonal struts distributed on the outer faces. Increasing the unit cell count from 1 to 64 affected stiffness by changing the global stress distribution. Non-uniform metastructure designs with strengthened vertical and diagonal struts toward the outer surface exhibited higher stiffness under either compressive or torsional loading. This study provides guidelines for designing and manufacturing metamaterials for complex mechanical environments.
-
Free, publicly-accessible full text available July 13, 2026
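To make the ~7% relative density used for all samples concrete, the strut radius of a single BCC cell can be estimated by treating its eight corner-to-center struts as cylinders. This is a first-order sketch under simplifying assumptions (cylindrical struts, node overlap at the cell center ignored); it is not the paper's method, and the function names are invented for illustration:

```python
import math

def bcc_relative_density(strut_radius, cell_size):
    """First-order estimate: eight corner-to-center struts of length
    sqrt(3)/2 * cell_size, modeled as cylinders; node overlap ignored."""
    strut_len = math.sqrt(3) / 2 * cell_size
    strut_vol = 8 * math.pi * strut_radius**2 * strut_len
    return strut_vol / cell_size**3

def radius_for_density(target, cell_size=1.0):
    """Bisect for the strut radius that gives the target relative density."""
    lo, hi = 0.0, 0.5 * cell_size
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if bcc_relative_density(mid, cell_size) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

r = radius_for_density(0.07)  # strut radius for ~7% relative density
print(r)
```

At moderate densities the node-overlap error is small, but a real design tool would subtract the shared volume at the central node.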
-
Freezing layers in deep neural networks has been shown to enhance generalization and accelerate training, yet the underlying mechanisms remain unclear. This paper investigates the impact of frozen layers from the perspective of linear separability, examining how untrained, randomly initialized layers influence feature representations and model performance. Using multilayer perceptrons trained on MNIST, CIFAR-10, and CIFAR-100, we systematically analyze the effects of freezing layers and of network architecture. While prior work attributes the benefits of frozen layers to Cover's theorem, which suggests that nonlinear transformations improve linear separability, we find that this explanation is insufficient. Instead, our results indicate that the observed improvements in generalization and convergence stem from other mechanisms. We hypothesize that freezing may act similarly to other regularization techniques and may smooth the loss landscape to facilitate training. Furthermore, we identify key architectural factors, such as network overparameterization and the use of skip connections, that modulate the effectiveness of frozen layers. These findings offer new insights into the conditions under which freezing layers can optimize deep learning performance, informing future work on neural architecture search.
-
Free, publicly-accessible full text available June 3, 2026
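Layer freezing as described above amounts to keeping some layers at their random initialization while training the rest; a training step simply skips the parameter update for frozen layers. A minimal pure-Python sketch (the toy layer sizes, learning rate, and dummy gradients are invented for this example, not taken from the paper):

```python
import random

random.seed(0)

def make_layer(n_in, n_out):
    # Randomly initialized weights, like the paper's untrained frozen layers.
    return [[random.gauss(0.0, 1.0) for _ in range(n_in)] for _ in range(n_out)]

def sgd_step(layers, grads, frozen, lr=0.1):
    """Apply an SGD update to every layer except those whose index is in `frozen`."""
    for i, (weights, grad) in enumerate(zip(layers, grads)):
        if i in frozen:
            continue  # frozen layer: weights stay at their random initialization
        for r, row in enumerate(weights):
            for c in range(len(row)):
                row[c] -= lr * grad[r][c]

layers = [make_layer(4, 3), make_layer(3, 2)]   # a toy two-layer MLP
snapshot = [row[:] for row in layers[0]]        # copy of the first layer's weights
grads = [[[1.0] * 4 for _ in range(3)], [[1.0] * 3 for _ in range(2)]]

sgd_step(layers, grads, frozen={0})
print(layers[0] == snapshot)  # True: the frozen first layer is untouched
```

In a framework such as PyTorch the same effect is usually achieved by setting `requires_grad = False` on the frozen parameters or excluding them from the optimizer.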
-
Giacobini, M; Xue, B; Manzoni, L (Ed.)