
Search for: All records

Creators/Authors contains: "Wang, Yifei"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full text articles may not yet be available without a charge during the embargo (administrative interval).


  1. Free, publicly-accessible full text available July 12, 2023
  2. Knowledge graph (KG) representation learning aims to encode entities and relations into dense continuous vector spaces so that the knowledge contained in a dataset is represented consistently. Dense embeddings trained on KG datasets benefit a variety of downstream tasks such as KG completion and link prediction. However, existing KG embedding methods fall short of providing a systematic solution for the global consistency of knowledge representation. We developed a mathematical language for KGs based on an observation of their inherent algebraic structure, which we term Knowledgebra. By analyzing five distinct algebraic properties, we proved that the semigroup is the most reasonable algebraic structure for relation embedding in a general knowledge graph. We implemented an instantiation model, SemE, using simple matrix semigroups, which exhibits state-of-the-art performance on standard datasets. Moreover, we proposed a regularization-based method to integrate chain-like logic rules derived from human knowledge into embedding training, further demonstrating the power of the developed language. To the best of our knowledge, this work develops the first formal language for general knowledge graphs by applying abstract algebra in statistical learning, and it also sheds light on the problem of neural-symbolic integration from an algebraic perspective.
    Free, publicly-accessible full text available June 1, 2023
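    The semigroup idea in the abstract above is easy to see concretely: square matrices are closed and associative under multiplication, so embedding each relation as a matrix lets chains of relations compose by matrix products. The toy sketch below is an illustration under stated assumptions, not the paper's actual SemE implementation; all names, dimensions, and the distance-based score are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup (names and dimensions are illustrative, not from the
# paper): entities live in R^dim, and each relation is a dim x dim matrix.
# Square matrices are closed and associative under multiplication, so the
# relation matrices form a semigroup under composition.
dim = 4
entities = {name: rng.normal(size=dim) for name in ["alice", "bob", "carol"]}
relations = {name: rng.normal(size=(dim, dim)) for name in ["parent_of", "sibling_of"]}

def score(head, rel_matrices, tail):
    """Lower is better: distance between the relation chain applied to head, and tail."""
    x = entities[head]
    for M in rel_matrices:
        x = M @ x
    return float(np.linalg.norm(x - entities[tail]))

# Applying relations one by one equals applying their matrix product: this
# associativity is what lets chain-like logic rules be expressed (and
# regularized) as constraints on products of relation matrices.
chained = relations["sibling_of"] @ (relations["parent_of"] @ entities["alice"])
composed = relations["sibling_of"] @ relations["parent_of"]
print(np.allclose(chained, composed @ entities["alice"]))  # True
print(score("alice", [relations["parent_of"], relations["sibling_of"]], "carol"))
```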
  3. Abstract

    Pattern formation in plasma–solid interaction represents a major research challenge in applications ranging from plasma etching to surface treatment, whereby plasma attachments on electrodes (arc roots) are constricted to self-organized spots. Gliding arc discharge in a Jacob's Ladder, exhibiting hopping dynamics, provides a unique window to probe the nature of pattern formation in plasma–surface interactions. In this work, we find that the existence of negative differential resistance (NDR) across the sheath is responsible for the observed hopping pattern. Due to NDR, the current density and potential drop behave as activator and inhibitor, whose dynamic interaction governs the redistribution of surface current density and the formation of structured spots. In gliding arc discharges, new arc roots can form separately in front of the existing root(s), which happens periodically to constitute the stepwise hopping. Instability phase-diagram analysis explains why arc attachments tend to constrict spontaneously in the NDR regime. Furthermore, we demonstrate via a comprehensive magnetohydrodynamics (MHD) computation that the existence of a sheath NDR can successfully reproduce the arc hopping observed in experiments. This work therefore uncovers the essential role of sheath NDR in plasma–solid surface pattern formation and opens up a hitherto unexplored area of research for manipulating plasma–solid interactions.

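    The activator-inhibitor mechanism described above has the same structure as classic reaction-diffusion models of spot formation. The toy 1-D iteration below is a hedged illustration, not the paper's MHD computation: the cubic local kinetics and every parameter are assumptions chosen only to exhibit how a fast-diffusing inhibitor paired with self-amplifying (NDR-like) activator dynamics constricts a nearly uniform profile into discrete spots.

```python
import numpy as np

# Toy 1-D activator-inhibitor iteration (illustrative only). The current
# density j plays the activator; the sheath potential drop u plays the
# inhibitor. The term (j - j**3) gives locally self-amplifying, S-shaped
# kinetics standing in for negative differential resistance.
n, dt, dx = 200, 0.01, 1.0
Dj, Du = 1.0, 10.0            # the inhibitor must diffuse faster for spots
j = 0.01 * np.random.default_rng(1).normal(size=n)
u = np.zeros(n)

def lap(f):
    """Periodic 1-D Laplacian via finite differences."""
    return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

for _ in range(5000):
    j += dt * (j - j**3 - u + Dj * lap(j))          # fast, self-amplifying activator
    u += dt * (0.1 * (j - 0.5 * u) + Du * lap(u))   # slow, spreading inhibitor

# Count rising crossings of a threshold as a crude number of current spots.
print("current spots:", int(np.sum((j[1:] > 0.5) & (j[:-1] <= 0.5))))
```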
  4. We propose a randomized algorithm with a quadratic convergence rate for convex optimization problems with a self-concordant, composite, strongly convex objective function. Our method is based on performing an approximate Newton step using a random projection of the Hessian. Our first contribution is to show that, at each iteration, the embedding dimension (or sketch size) can be as small as the effective dimension of the Hessian matrix. Leveraging this novel fundamental result, we design an algorithm with a sketch size proportional to the effective dimension that exhibits a quadratic rate of convergence. This result dramatically improves on the classical linear-quadratic convergence rates of state-of-the-art sub-sampled Newton methods. However, in most practical cases the effective dimension is not known beforehand, which raises the question of how to pick a sketch size as small as the effective dimension while preserving a quadratic convergence rate. Our second and main contribution is thus an adaptive sketch-size algorithm with a quadratic convergence rate that does not require prior knowledge or estimation of the effective dimension: at each iteration, it starts with a small sketch size and increases it until quadratic progress is achieved. Importantly, we show that the embedding dimension remains proportional to the effective dimension throughout the entire path, and that our method achieves state-of-the-art computational complexity for solving convex optimization programs with a strongly convex component. We discuss and illustrate applications to linear and quadratic programming, as well as logistic regression and other generalized linear models.
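    To make the sketched-Newton idea above concrete, here is a minimal sketch for l2-regularized logistic regression. It is a hedged illustration, not the paper's algorithm: the Gaussian embedding, the doubling rule (a stand-in for the paper's adaptive sketch-size criterion), and all names and parameters are assumptions. The point is that the Hessian square root is sketched down to m rows, so each Newton step works with an m x d matrix instead of an n x d one.

```python
import numpy as np

# Synthetic logistic-regression problem (all sizes illustrative).
rng = np.random.default_rng(0)
n, d, mu = 2000, 50, 1e-3
A = rng.normal(size=(n, d))
y = (rng.random(n) < 1 / (1 + np.exp(-A @ rng.normal(size=d)))).astype(float)

def grad(w):
    p = 1 / (1 + np.exp(-A @ w))
    return A.T @ (p - y) / n + mu * w

def sketched_newton_step(w, m):
    """Approximate Newton step from an m x n Gaussian sketch of the Hessian square root."""
    p = 1 / (1 + np.exp(-A @ w))
    sqrtD = np.sqrt(p * (1 - p) / n)[:, None]
    S = rng.normal(size=(m, n)) / np.sqrt(m)
    SA = S @ (sqrtD * A)                      # sketch the Hessian square root
    H = SA.T @ SA + mu * np.eye(d)            # d x d system built from m rows
    return np.linalg.solve(H, grad(w))

# Adaptive loop: start with a small sketch and double it whenever the step
# fails to shrink the gradient quickly (an assumed, simplified criterion).
w, m = np.zeros(d), 5
for _ in range(20):
    step = sketched_newton_step(w, m)
    if np.linalg.norm(grad(w - step)) > 0.5 * np.linalg.norm(grad(w)):
        m = min(2 * m, n)
    w -= step
print("final gradient norm:", np.linalg.norm(grad(w)))
```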