

This content will become publicly available on August 1, 2024

Title: Nucleus I: Adjunction spectra in recommender systems and descent
Recommender systems build user profiles using concept analysis of usage matrices. The concepts are mined as spectra and form Galois connections. Descent is a general method for spectral decomposition in algebraic geometry and topology which also leads to generalized Galois connections. Both recommender systems and descent theory are vast research areas, separated by a technical gap so large that trying to establish a link would seem foolish. Yet a formal link emerged, all on its own, bottom-up, against the authors' intentions and better judgment. Familiar problems of data analysis led to a novel solution in category theory. The present paper arose from a series of earlier efforts to provide a top-down account of these developments.
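The Galois connections mined from usage matrices can be made concrete. Below is a minimal illustration (not the paper's construction; all names and the toy data are illustrative) of the connection that formal concept analysis derives from a binary users-by-items relation: the two polar maps and the induced closure operator, whose fixed points are the mined concepts.

```python
# Minimal sketch of the Galois connection induced by a usage matrix,
# as in formal concept analysis. Names (users, items, rel) are illustrative.

def common_items(rel, items, users_subset):
    """Items used by every user in the subset (one polar map)."""
    return {i for i in items if all((u, i) in rel for u in users_subset)}

def common_users(rel, users, items_subset):
    """Users who use every item in the subset (the other polar map)."""
    return {u for u in users if all((u, i) in rel for i in items_subset)}

users = {"ann", "bob", "cay"}
items = {"x", "y", "z"}
rel = {("ann", "x"), ("ann", "y"), ("bob", "y"), ("cay", "y"), ("cay", "z")}

U = {"ann", "bob"}
I = common_items(rel, items, U)            # items shared by ann and bob
closure = common_users(rel, users, I)      # closure of U: a concept extent

# The polars form a Galois connection, so the closure is idempotent:
assert common_users(rel, users, common_items(rel, items, closure)) == closure
```

Each fixed point of the closure, paired with its polar, is a formal concept; the concepts ordered by inclusion form the spectrum that the abstract refers to.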
Award ID(s):
1662487
NSF-PAR ID:
10483795
Author(s) / Creator(s):
;
Publisher / Repository:
Springer
Date Published:
Journal Name:
Logical Methods in Computer Science
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    In this paper we develop methods for classifying Baker, Richter, and Szymik's Azumaya algebras over a commutative ring spectrum, especially in the largely inaccessible case where the ring is nonconnective. We give obstruction-theoretic tools, constructing and classifying these algebras and their automorphisms with Goerss–Hopkins obstruction theory, and give descent-theoretic tools, applying Lurie's work on $\infty$-categories to show that a finite Galois extension of rings in the sense of Rognes becomes a homotopy fixed-point equivalence on Brauer spaces. For even-periodic ring spectra $E$, we find that the ‘algebraic’ Azumaya algebras whose coefficient ring is projective are governed by the Brauer–Wall group of $\pi_0(E)$, recovering a result of Baker, Richter, and Szymik. This allows us to calculate many examples. For example, we find that the algebraic Azumaya algebras over Lubin–Tate spectra have either four or two Morita equivalence classes, depending on whether the prime is odd or even, that all algebraic Azumaya algebras over the complex K-theory spectrum $KU$ are Morita trivial, and that the group of the Morita classes of algebraic Azumaya algebras over the localization $KU[1/2]$ is $\mathbb{Z}/8 \times \mathbb{Z}/2$. Using our descent results and an obstruction theory spectral sequence, we also study Azumaya algebras over the real K-theory spectrum $KO$ which become Morita-trivial $KU$-algebras. We show that there exist exactly two Morita equivalence classes of these. The nontrivial Morita equivalence class is realized by an ‘exotic’ $KO$-algebra with the same coefficient ring as $\mathrm{End}_{KO}(KU)$. This requires a careful analysis of what happens in the homotopy fixed-point spectral sequence for the Picard space of $KU$, previously studied by Mathew and Stojanoska.
  2. Network embedding has gained increasing attention in recent years. It has been shown that the learned low-dimensional node vector representations can advance a myriad of graph mining tasks such as node classification, community detection, and link prediction. The vast majority of existing efforts are devoted to single-layered networks or homogeneous networks with a single type of nodes and node interactions. However, in many real-world applications, a variety of networks can be abstracted and presented in a multi-layered fashion. Typical multi-layered networks include critical infrastructure systems, collaboration platforms, and social recommender systems, to name a few. Despite the widespread use of multi-layered networks, it remains a daunting task to learn vector representations of different types of nodes due to the bewildering combination of both within-layer connections and cross-layer network dependencies. In this paper, we study the novel problem of multi-layered network embedding. In particular, we propose a principled framework, MANE, to model both within-layer connections and cross-layer network dependencies simultaneously in a unified optimization framework for embedding representation learning. Experiments on real-world multi-layered networks corroborate the effectiveness of the proposed framework.
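To make the shape of such a unified objective concrete, here is a generic sketch: one reconstruction term per within-layer network plus a term coupling embeddings across layers. This is an illustration only, not the MANE objective; the loss form, the toy matrices, and all names are assumptions.

```python
# Generic sketch (NOT the actual MANE formulation) of a multi-layered
# embedding objective: one reconstruction term per within-layer network
# plus a cross-layer dependency term. All names and data are illustrative.

def matmul_t(X, Y):
    """X @ Y^T for lists of row vectors."""
    return [[sum(xi * yi for xi, yi in zip(x, y)) for y in Y] for x in X]

def sq_frobenius(A, B):
    """Squared Frobenius distance ||A - B||^2."""
    return sum((a - b) ** 2 for ra, rb in zip(A, B) for a, b in zip(ra, rb))

A1 = [[0.0, 1.0], [1.0, 0.0]]   # within-layer adjacency, layer 1
A2 = [[0.0, 1.0], [1.0, 0.0]]   # within-layer adjacency, layer 2
C  = [[1.0, 0.0], [0.0, 1.0]]   # cross-layer dependency matrix

X1 = [[0.2, 0.1], [0.1, 0.3]]   # toy node embeddings, layer 1
X2 = [[0.3, 0.0], [0.0, 0.2]]   # toy node embeddings, layer 2

def joint_loss(X1, X2, lam=0.5):
    """Within-layer reconstruction plus weighted cross-layer fit."""
    within = (sq_frobenius(A1, matmul_t(X1, X1))
              + sq_frobenius(A2, matmul_t(X2, X2)))
    cross = sq_frobenius(C, matmul_t(X1, X2))  # cross-layer dependency fit
    return within + lam * cross

loss = joint_loss(X1, X2)
assert loss > 0.0
```

Minimizing a joint objective of this shape over all layers' embeddings at once is what distinguishes multi-layered embedding from running a single-network method on each layer separately.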
  3.
    An adjunction is a pair of functors related by a pair of natural transformations, and relating a pair of categories. It displays how a structure, or a concept, projects from each category to the other, and back. Adjunctions are the common denominator of Galois connections, representation theories, spectra, and generalized quantifiers. We call an adjunction nuclear when its categories determine each other. We show that every adjunction can be resolved into a nuclear adjunction. The resolution is idempotent in a strict sense. The resulting nucleus displays the concept that was implicit in the original adjunction, just as the singular value decomposition of an adjoint pair of linear operators displays their canonical bases. In his seminal early work, Ross Street described an adjunction between monads and comonads in 2-categories. Lifting the nucleus construction, we show that the resulting Street monad on monads is strictly idempotent, and extracts the nucleus of a monad. A dual treatment achieves the same for comonads. This uncovers remarkably concrete applications behind a notable fragment of pure 2-category theory. The other way around, driven by the tasks and methods of machine learning and data analysis, the nucleus construction also seems to uncover remarkably pure and general mathematical content lurking behind the daily practices of network computation and data analysis. 
  4. An adjunction is a pair of functors related by a pair of natural transformations, and relating a pair of categories. It displays how a structure, or a concept, projects from each category to the other, and back. Adjunctions are the common denominator of Galois connections, representation theories, spectra, and generalized quantifiers. We call an adjunction nuclear when its categories determine each other. We show that every adjunction can be resolved into a nuclear adjunction. This resolution is idempotent in a strong sense. The nucleus of an adjunction displays its conceptual core, just as the singular value decomposition of an adjoint pair of linear operators displays their canonical bases. The two composites of an adjoint pair of functors induce a monad and a comonad. Monads and comonads generalize the closure and the interior operators from topology, or modalities from logic, while providing a saturated view of algebraic structures and compositions on one side, and of coalgebraic dynamics and decompositions on the other. They are resolved back into adjunctions over the induced categories of algebras and of coalgebras. The nucleus of an adjunction is an adjunction between the induced categories of algebras and coalgebras. It provides new presentations for both, revealing the meaning of constructing algebras for a comonad and coalgebras for a monad. In his seminal early work, Ross Street described an adjunction between monads and comonads in 2-categories. Lifting the nucleus construction, we show that the resulting Street monad on monads is strongly idempotent, and extracts the nucleus of a monad. A dual treatment achieves the same for comonads. Applying a notable fragment of pure 2-category theory to an acute practical problem of data analysis thus led to a new theoretical result.
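In the degenerate posetal case the construction is easy to see concretely. The following sketch (a simplification for illustration, not the paper's 2-categorical construction) takes the adjunction direct image ⊣ preimage along a function: the induced monad is a closure operator, the comonad an interior operator, and the two adjoints restrict to inverse bijections between their fixed points; that restricted isomorphism is the nucleus here.

```python
# Posetal illustration of an adjunction nucleus (a simplification, not the
# paper's 2-categorical construction). For f ⊣ g with f = direct image and
# g = preimage along h, the monad g∘f is a closure operator on subsets of A,
# the comonad f∘g is an interior operator on subsets of B, and f, g restrict
# to inverse bijections between their fixed points: the nucleus.

A, B = {0, 1, 2, 3}, {"p", "q", "r"}
h = {0: "p", 1: "p", 2: "q", 3: "q"}          # note: "r" is outside the image

f = lambda S: frozenset(h[a] for a in S)               # left adjoint: direct image
g = lambda T: frozenset(a for a in A if h[a] in T)     # right adjoint: preimage

monad = lambda S: g(f(S))       # closure: saturate S with whole fibers of h
comonad = lambda T: f(g(T))     # interior: discard points outside the image

S, T = frozenset({0, 2}), frozenset({"p", "r"})
assert monad(monad(S)) == monad(S)            # the closure is idempotent
assert comonad(comonad(T)) == comonad(T)      # the interior is idempotent

# On fixed points, f and g are mutually inverse (the nucleus):
closed = monad(S)               # a monad-fixed (fiber-saturated) subset of A
assert g(f(closed)) == closed and comonad(f(closed)) == f(closed)
```

The closed subsets of A (unions of fibers) and the open subsets of B (subsets of the image) determine each other through f and g, which is exactly the "categories determine each other" condition in this one-dimensional case.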
  5. Deep learning models are often trained on datasets that contain sensitive information such as individuals' shopping transactions, personal contacts, and medical records. An increasingly important line of work therefore has sought to train neural networks subject to privacy constraints that are specified by differential privacy or its divergence-based relaxations. These privacy definitions, however, have weaknesses in handling certain important primitives (composition and subsampling), thereby giving loose or complicated privacy analyses of training neural networks. In this paper, we consider a recently proposed privacy definition termed f-differential privacy [18] for a refined privacy analysis of training neural networks. Leveraging the appealing properties of f-differential privacy in handling composition and subsampling, this paper derives analytically tractable expressions for the privacy guarantees of both stochastic gradient descent and Adam used in training deep neural networks, without needing to develop the sophisticated techniques of [3]. Our results demonstrate that the f-differential privacy framework allows for a new privacy analysis that improves on the prior analysis [3], which in turn suggests tuning certain parameters of neural networks for better prediction accuracy without violating the privacy budget. These theoretically derived improvements are confirmed by our experiments in a range of tasks in image classification, text classification, and recommender systems. Python code to calculate the privacy cost for these experiments is publicly available in the TensorFlow Privacy library.
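As a rough illustration of the kind of analytically tractable accounting this framework enables, the sketch below computes a central-limit-theorem approximation of the Gaussian-DP parameter μ for noisy SGD with subsampling. The formula μ ≈ p·√(T·(e^(1/σ²) − 1)) is stated here as an assumption drawn from the Gaussian-DP line of work; consult the paper and the TensorFlow Privacy library for the exact statement and its conditions.

```python
# Hedged sketch of CLT-style Gaussian-DP accounting for noisy SGD
# (the formula is taken as an assumption; see the paper for exact conditions).
# p: subsampling probability, sigma: noise multiplier, steps: SGD iterations.
import math

def approx_mu_gdp(p, sigma, steps):
    """CLT approximation: mu ≈ p * sqrt(steps * (exp(1/sigma^2) - 1))."""
    return p * math.sqrt(steps * (math.exp(1.0 / sigma ** 2) - 1.0))

# E.g. batches of 256 from 60000 examples, noise multiplier 1.1, 15 epochs
# (illustrative hyperparameters, not the paper's experimental settings):
p = 256 / 60000
steps = 15 * (60000 // 256)
mu = approx_mu_gdp(p, sigma=1.1, steps=steps)

# Sanity checks: privacy loss grows with steps and shrinks with more noise.
assert approx_mu_gdp(p, 1.1, 2 * steps) > mu
assert approx_mu_gdp(p, 2.0, steps) < mu
```

A closed form like this, in place of a numerically composed moments accountant, is what makes it cheap to explore the trade-off between the noise multiplier, batch size, and epochs that the abstract alludes to.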