

Title: Steering self-organisation through confinement
Self-organisation is the spontaneous emergence of spatio-temporal structures and patterns from the interaction of smaller individual units. Examples are found across many scales in very different systems and scientific disciplines, from physics, materials science and robotics to biology, geophysics and astronomy. Recent research has highlighted how self-organisation can be both mediated and controlled by confinement. Confinement is an action on a system that limits its units’ translational and rotational degrees of freedom, thereby also influencing the system's phase-space probability density; it can function as either a catalyst or an inhibitor of self-organisation. Confinement can thus become a means to actively steer the emergence or suppression of collective phenomena in space and time. Here, to provide a common framework and perspective for future research, we examine the role of confinement in the self-organisation of soft-matter systems and identify overarching scientific challenges that need to be addressed to harness its full scientific and technological potential in soft matter and related fields. By drawing analogies with other disciplines, this framework will accelerate a common deeper understanding of self-organisation and trigger the development of innovative strategies to steer it using confinement, with impact on, e.g., the design of smarter materials, tissue engineering for biomedicine and the guidance of active matter.
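The abstract describes confinement as limiting the particles' translational and rotational degrees of freedom and thereby reshaping the system's phase-space density. As a minimal, hedged illustration of that idea (a generic active-matter toy model, not taken from the paper), the sketch below integrates overdamped active Brownian particles in two dimensions inside a soft circular wall; in such models confinement alone is enough to make particles accumulate at the boundary. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper).
N, R = 200, 20.0               # number of particles, radius of the circular confinement
v0, Dr, dt = 1.0, 0.1, 1e-2    # self-propulsion speed, rotational diffusivity, time step
k_wall = 50.0                  # stiffness of the soft confining wall

pos = (rng.random((N, 2)) - 0.5) * R      # initial positions inside the box
theta = rng.uniform(0.0, 2.0 * np.pi, N)  # initial heading angles

for _ in range(10_000):
    # Self-propulsion along each particle's heading direction.
    vel = v0 * np.column_stack((np.cos(theta), np.sin(theta)))
    # Harmonic restoring force from the wall, acting only on particles outside radius R.
    r = np.linalg.norm(pos, axis=1)
    overlap = np.clip(r - R, 0.0, None)
    force = -k_wall * overlap[:, None] * pos / np.maximum(r, 1e-12)[:, None]
    # Overdamped position update plus rotational diffusion of the heading.
    pos += (vel + force) * dt
    theta += np.sqrt(2.0 * Dr * dt) * rng.standard_normal(N)

# Confinement-induced accumulation: fraction of particles within one unit of the wall.
print("boundary fraction:", np.mean(np.linalg.norm(pos, axis=1) > R - 1.0))
```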
Award ID(s): 1955210
NSF-PAR ID: 10436426
Author(s) / Creator(s):
Date Published:
Journal Name: Soft Matter
Volume: 19
Issue: 9
ISSN: 1744-683X
Page Range / eLocation ID: 1695 to 1704
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract

    Emerging interest in synthesizing active, engineered matter suggests a future where smart material systems and structures operate autonomously around people, serving diverse roles in engineering, medical, and scientific applications. As in biological organisms, realizing active, engineered matter requires functionality that emerges from a combination of sensory and control mechanisms in a versatile material frame. Recently, metamaterial platforms with integrated sensing and control have been exploited, so that outstanding non‐natural material behaviors are empowered by synergistic microstructures and controlled by smart materials and systems. This emerging body of science around active mechanical metamaterials offers a first glimpse of future foundations for autonomous engineered systems, referred to here as soft, smart matter. Drawing on natural inspiration and synergy across disciplines, and exploiting multiple length scales as well as multiple physics, researchers are devising compelling exemplars of actively controlled metamaterials, inspiring concepts for autonomous engineered matter. While scientific breakthroughs multiply in these fields, technical challenges remain to be overcome to fulfill the vision of soft, smart matter. This Review surveys the intrinsically multidisciplinary body of science targeted at realizing soft, smart matter via innovations in active mechanical metamaterials and proposes ongoing research targets that may bring the promise of autonomous, engineered matter to full fruition.

     
  2. BACKGROUND Optical sensing devices measure the rich physical properties of an incident light beam, such as its power, polarization state, spectrum, and intensity distribution. Most conventional sensors, such as power meters, polarimeters, spectrometers, and cameras, are monofunctional and bulky. For example, classical Fourier-transform infrared spectrometers and polarimeters, which characterize the optical spectrum in the infrared and the polarization state of light, respectively, can occupy a considerable portion of an optical table. Over the past decade, the development of integrated sensing solutions by using miniaturized devices together with advanced machine-learning algorithms has accelerated rapidly, and optical sensing research has evolved into a highly interdisciplinary field that encompasses devices and materials engineering, condensed matter physics, and machine learning. To this end, future optical sensing technologies will benefit from innovations in device architecture, discoveries of new quantum materials, demonstrations of previously uncharacterized optical and optoelectronic phenomena, and rapid advances in the development of tailored machine-learning algorithms.
    ADVANCES Recently, a number of sensing and imaging demonstrations have emerged that differ substantially from conventional sensing schemes in the way that optical information is detected. A typical example is computational spectroscopy. In this new paradigm, a compact spectrometer first collectively captures the comprehensive spectral information of an incident light beam using multiple elements or a single element under different operational states and generates a high-dimensional photoresponse vector. An advanced algorithm then interprets the vector to achieve reconstruction of the spectrum. This scheme shifts the physical complexity of conventional grating- or interference-based spectrometers to computation. Moreover, many of the recent developments go well beyond optical spectroscopy, and we discuss them within a common framework, dubbed “geometric deep optical sensing.” The term “geometric” is intended to emphasize that in this sensing scheme, the physical properties of an unknown light beam and the corresponding photoresponses can be regarded as points in two respective high-dimensional vector spaces and that the sensing process can be considered to be a mapping from one vector space to the other. The mapping can be linear, nonlinear, or highly entangled; for the latter two cases, deep artificial neural networks represent a natural choice for the encoding and/or decoding processes, from which the term “deep” is derived. In addition to this classical geometric view, the quantum geometry of Bloch electrons in Hilbert space, such as Berry curvature and quantum metrics, is essential for the determination of the polarization-dependent photoresponses in some optical sensors. In this Review, we first present a general perspective of this sensing scheme from the viewpoint of information theory, in which the photoresponse measurement and the extraction of light properties are deemed as information-encoding and -decoding processes, respectively. We then discuss demonstrations in which a reconfigurable sensor (or an array thereof), enabled by device reconfigurability and the implementation of neural networks, can detect the power, polarization state, wavelength, and spatial features of an incident light beam.
    OUTLOOK As increasingly more computing resources become available, optical sensing is becoming more computational, with device reconfigurability playing a key role. On the one hand, advanced algorithms, including deep neural networks, will enable effective decoding of high-dimensional photoresponse vectors, which reduces the physical complexity of sensors. Therefore, it will be important to integrate memory cells near or within sensors to enable efficient processing and interpretation of a large amount of photoresponse data. On the other hand, analog computation based on neural networks can be performed with an array of reconfigurable devices, which enables direct multiplexing of sensing and computing functions. We anticipate that these two directions will become the engineering frontier of future deep sensing research. On the scientific frontier, exploring quantum geometric and topological properties of new quantum materials in both linear and nonlinear light-matter interactions will enrich the information-encoding pathways for deep optical sensing. In addition, deep sensing schemes will continue to benefit from the latest developments in machine learning. Future highly compact, multifunctional, reconfigurable, and intelligent sensors and imagers will find applications in medical imaging, environmental monitoring, infrared astronomy, and many other areas of our daily lives, especially in the mobile domain and the internet of things.
    Schematic of deep optical sensing: the n-dimensional unknown information (w) is encoded into an m-dimensional photoresponse vector (x) by a reconfigurable sensor (or an array thereof), from which w′ is reconstructed by a trained neural network (n′ = n and w′ ≈ w). Alternatively, x may be directly deciphered to capture certain properties of w. Here, w, x, and w′ can be regarded as points in their respective high-dimensional vector spaces ℛ^n, ℛ^m, and ℛ^n′.
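To make the encoding/decoding picture above concrete, here is a minimal sketch, not the authors' implementation, of the linear case of computational spectroscopy: a hypothetical m×n sensor response matrix A maps an unknown spectrum w to a photoresponse vector x = Aw + noise, and a ridge-regularised least-squares step plays the role of the decoder that reconstructs w′ ≈ w. In the nonlinear or entangled cases described in the abstract, a trained neural network would replace this linear decoder; all sizes and values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n-dimensional unknown spectrum w, m-dimensional photoresponse x.
n, m = 64, 16

# Hypothetical sensor response matrix: each row is the spectral responsivity of one
# detector element (or one operational state of a reconfigurable element).
A = rng.random((m, n))

# Synthetic "unknown" spectrum: a smooth sum of two Gaussian peaks.
idx = np.arange(n)
w = np.exp(-0.5 * ((idx - 20) / 4.0) ** 2) + 0.6 * np.exp(-0.5 * ((idx - 45) / 6.0) ** 2)

# Encoding: the sensor maps w (a point in R^n) to a photoresponse x (a point in R^m),
# here with additive measurement noise.
x = A @ w + 0.01 * rng.standard_normal(m)

# Decoding, linear case: ridge-regularised least squares recovers w' ~ w. Because m < n,
# recovery leans on the regularisation and on w being smooth; in the nonlinear or
# entangled cases a trained neural network would perform this step instead.
lam = 1e-2
w_prime = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ x)

print("relative reconstruction error:", np.linalg.norm(w_prime - w) / np.linalg.norm(w))
```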
  3.
    Active colloidal fluids, biological and synthetic, often demonstrate complex self-organization and the emergence of collective behavior. Spontaneous formation of multiple vortices has recently been observed in a variety of active matter systems; however, the generation and tunability of active vortices not controlled by geometrical confinement remain challenging. Here, we exploit the persistence length of individual particles in ensembles of active rollers to tune the formation of vortices and to orchestrate their characteristic sizes. We use two systems and employ two different approaches, exploiting either the shape anisotropy or the polarization memory of individual units to control the persistence length. We characterize the dynamics of emergent multi-vortex states and reveal a direct link between the behavior of the persistence length and the properties of the emergent vortices. We further demonstrate common features between the two systems, including anti-ferromagnetic ordering of neighboring vortices and active turbulent behavior with a characteristic energy cascade in the energy spectra of the particles' velocity field. Our findings provide insights into the onset of spatiotemporal coherence in active roller systems and suggest a control knob for the manipulation of dynamic self-assembly in active colloidal ensembles.
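The "characteristic energy cascade in the energy spectra of the velocity field" mentioned above is typically diagnosed by binning Fourier-space kinetic energy into isotropic shells of wavenumber. The sketch below is a generic version of that diagnostic (not the authors' analysis code), assuming the particle velocities have already been interpolated onto a square grid; the random field in the usage example merely exercises the function and does not itself exhibit a cascade.

```python
import numpy as np

def energy_spectrum_2d(ux, uy, dx=1.0):
    """Isotropic kinetic-energy spectrum E(k) of a 2D velocity field on a square n x n grid.

    Generic diagnostic: particle velocities are assumed to have been interpolated or
    binned onto the grid beforehand.
    """
    n = ux.shape[0]
    uk, vk = np.fft.fft2(ux), np.fft.fft2(uy)
    # Per-mode spectral energy, normalised so its sum equals the mean kinetic energy.
    e2d = 0.5 * (np.abs(uk) ** 2 + np.abs(vk) ** 2) / n**4
    k1d = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    kmag = np.sqrt(k1d[:, None] ** 2 + k1d[None, :] ** 2)
    dk = 2 * np.pi / (n * dx)
    shell = np.rint(kmag / dk).astype(int)          # isotropic shell index of each mode
    E = np.bincount(shell.ravel(), weights=e2d.ravel()) / dk
    k = np.arange(E.size) * dk
    return k[1:n // 2], E[1:n // 2]                 # drop the k = 0 mode, stop near Nyquist

# Usage with a synthetic random field (a white-noise field does not show a cascade;
# it only demonstrates the call).
rng = np.random.default_rng(1)
ux, uy = rng.standard_normal((2, 128, 128))
k, E = energy_spectrum_2d(ux, uy)
print(k[:3], E[:3])
```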
  4. Abstract

    Spatial confinement of matter in functional nanostructures has propelled these systems to the forefront of nanoscience, both as a playground for exotic physics and quantum phenomena and in multiple applications including plasmonics, optoelectronics, and sensing. In parallel, the emergence of monochromated electron energy loss spectroscopy (EELS) has enabled exploration of local nanoplasmonic functionalities within single nanoparticles and the collective response of nanoparticle assemblies, providing deep insight into the associated mechanisms. However, modern synthesis processes for plasmonic nanostructures are often limited in the types of accessible geometries and materials, and to spatial precisions on the order of tens of nanometres, precluding direct exploration of critical aspects of the structure‐property relationships. Here, the atomic‐sized probe of the scanning transmission electron microscope is used to perform precise sculpting and to design nanoparticle configurations. Using low‐loss EELS, dynamic analyses of the evolution of the plasmonic response are provided. It is shown that within self‐assembled systems of nanoparticles, individual nanoparticles can be selectively removed, reshaped, or patterned with nanometer‐level resolution, effectively modifying the plasmonic response in both space and energy. This process significantly broadens the scope of design possibilities and presents opportunities for unique structure development, which are ultimately key for nanophotonic design.

     
  5. Background: Text recycling (hereafter TR)—the reuse of one’s own textual materials from one document in a new document—is a common but hotly debated and unsettled practice in many academic disciplines, especially in the context of peer-reviewed journal articles. Although several analytic systems have been used to determine replication of text—for example, for purposes of identifying plagiarism—they do not offer an optimal way to compare documents to determine the nature and extent of TR in order to study and theorize this as a practice in different disciplines. In this article, we first describe TR as a common phenomenon in academic publishing, then explore the challenges associated with trying to study the nature and extent of TR within STEM disciplines. We then describe in detail the complex processes we used to create a system for identifying TR across large corpora of texts, and the sentence-level string-distance lexical methods used to refine and test the system (White & Joy, 2004). The purpose of creating such a system is to identify legitimate cases of TR across large corpora of academic texts in different fields of study, allowing meaningful cross-disciplinary comparisons in future analyses of published work. The findings from such investigations will extend and refine our understanding of discourse practices in academic and scientific settings.
    Literature Review: Text-analytic methods have been widely developed and implemented to identify reused textual materials for detecting plagiarism, and there is considerable literature on such methods. (Instead of taking up space detailing this literature, we point readers to several recent reviews: Gupta, 2016; Hiremath & Otari, 2014; and Meuschke & Gipp, 2013.) Such methods include fingerprinting, term occurrence analysis, citation analysis (identifying similarity in references and citations), and stylometry (statistically comparing authors’ writing styles; see Meuschke & Gipp, 2013).
    Although TR occurs in a wide range of situations, recent debate has focused on recycling from one published research paper to another—particularly in STEM fields (see, for example, Andreescu, 2013; Bouville, 2008; Bretag & Mahmud, 2009; Roig, 2008; Scanlon, 2007). An important step in better understanding the practice is seeing how authors actually recycle material in their published work. Standard methods for detecting plagiarism are not directly suitable for this task, as the objective is not to determine the presence or absence of reuse itself, but to study the types and patterns of reuse, including materials that are syntactically but not substantively distinct—such as “patchwriting” (Howard, 1999).
    In the present account of our efforts to create a text-analytic system for determining TR, we take a conventional alphabetic approach to text, in part because we did not aim at this stage of our project to analyze non-discursive text such as images or other media. However, although the project adheres to conventional definitions of text, with a focus on lexical replication, we also subscribe to context-sensitive approaches to text production. The results of applying the system to large corpora of published texts can potentially reveal varieties in the practice of TR as a function of different discourse communities and disciplines. Writers’ decisions within what appear to be canonical genres are contingent, based on adherence to or deviation from existing rules and procedures if and when these actually exist.
Our goal is to create a system for analyzing TR in groups of texts produced by the same authors in order to determine the nature and extent of TR, especially across disciplinary areas, without judgment of scholars’ use of the practice. 
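As a minimal, hypothetical sketch of the kind of sentence-level, string-distance comparison described above (not the system reported in the article, nor the method of White & Joy, 2004), the following uses Python's standard-library difflib similarity ratio as the lexical measure and an arbitrary threshold to flag candidate recycled sentence pairs.

```python
import difflib
import re

def sentences(text):
    """Naive sentence splitter; a production system would use a proper tokenizer."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]

def recycled_pairs(doc_a, doc_b, threshold=0.8):
    """Flag sentence pairs whose lexical similarity meets a (hypothetical) threshold.

    difflib's character-level ratio stands in for the string-distance measures
    discussed in the article; it is purely illustrative.
    """
    pairs = []
    for sa in sentences(doc_a):
        for sb in sentences(doc_b):
            score = difflib.SequenceMatcher(None, sa.lower(), sb.lower()).ratio()
            if score >= threshold:
                pairs.append((score, sa, sb))
    return sorted(pairs, reverse=True)

# Toy example with two short invented passages.
a = "We measured the optical response of the samples. Results were consistent with theory."
b = "We measured the optical response of all samples. The theory is described elsewhere."
for score, sa, sb in recycled_pairs(a, b):
    print(f"{score:.2f}  {sa!r}  <->  {sb!r}")
```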