Search Results
Search for: All records

Total Resources: 2
 Author / Contributor


Frady, E. Paxon (2)

Kleyko, Denis (2)

Kymn, Christopher J. (2)

Olshausen, Bruno A. (2)

Sommer, Friedrich T. (2)

Bybee, Connor (1)

Huang, PingChen (1)


We investigate the task of retrieving information from compositional distributed representations formed by hyperdimensional computing/vector symbolic architectures and present novel techniques that achieve new information rate bounds. First, we provide an overview of the decoding techniques that can be used to approach the retrieval task. The techniques are categorized into four groups. We then evaluate the considered techniques in several settings that involve, for example, inclusion of external noise and storage elements with reduced precision. In particular, we find that the decoding techniques from the sparse coding and compressed sensing literature (rarely used for hyperdimensional computing/vector symbolic architectures) are also well suited for decoding information from the compositional distributed representations. Combining these decoding techniques with interference cancellation ideas from communications improves previously reported bounds (Hersche et al., 2021) of the information rate of the distributed representations from 1.20 to 1.40 bits per dimension for smaller codebooks and from 0.60 to 1.26 bits per dimension for larger codebooks.
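The combination of matched-filter decoding with interference cancellation mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's exact setup: the dimensionality, codebook size, number of superimposed symbols, and the bipolar (+/-1) codebook are all illustrative assumptions. Each decoded symbol's vector is subtracted from the residual so it no longer interferes with later decoding steps.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1000   # hypervector dimensionality (illustrative)
M = 64     # codebook size (illustrative)
K = 5      # number of symbols superimposed into one trace

# Random bipolar codebook: each symbol is a +/-1 hypervector.
codebook = rng.choice([-1.0, 1.0], size=(M, D))

# Compose a trace by superimposing K distinct codebook vectors.
true_ids = rng.choice(M, size=K, replace=False)
trace = codebook[true_ids].sum(axis=0)

# Sequential decoding with interference cancellation: pick the
# best-matching symbol, then subtract ("explain away") its vector
# from the residual before decoding the next symbol.
residual = trace.copy()
decoded = []
for _ in range(K):
    scores = codebook @ residual
    for i in decoded:
        scores[i] = -np.inf   # do not pick an already-decoded symbol
    best = int(np.argmax(scores))
    decoded.append(best)
    residual -= codebook[best]

print(sorted(decoded) == sorted(true_ids.tolist()))
```

With D much larger than K, the cross-talk between random hypervectors is small relative to the matched-filter signal, so all K symbols are typically recovered; cancellation removes the self-interference that otherwise limits how many symbols can be packed into one trace.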

Frady, E. Paxon; Kleyko, Denis; Kymn, Christopher J.; Olshausen, Bruno A.; Sommer, Friedrich T. (arXiv.org)

Vector space models for symbolic processing that encode symbols by random vectors have been proposed in cognitive science and connectionist communities under the names Vector Symbolic Architecture (VSA) and, synonymously, Hyperdimensional (HD) computing. In this paper, we generalize VSAs to function spaces by mapping continuous-valued data into a vector space such that the inner product between the representations of any two data points represents a similarity kernel. By analogy to VSA, we call this new function encoding and computing framework Vector Function Architecture (VFA). In VFAs, vectors can represent individual data points as well as elements of a function space (a reproducing kernel Hilbert space). The algebraic vector operations, inherited from VSA, correspond to well-defined operations in function space. Furthermore, we study a previously proposed method for encoding continuous data, fractional power encoding (FPE), which uses exponentiation of a random base vector to produce randomized representations of data points and fulfills the kernel properties for inducing a VFA. We show that the distribution from which elements of the base vector are sampled determines the shape of the FPE kernel, which in turn induces a VFA for computing with band-limited functions. In particular, VFAs provide an algebraic framework for implementing large-scale kernel machines with random features, extending Rahimi and Recht, 2007. Finally, we demonstrate several applications of VFA models to problems in image recognition, density estimation, and nonlinear regression. Our analyses and results suggest that VFAs constitute a powerful new framework for representing and manipulating functions in distributed neural systems, with myriad applications in artificial intelligence.
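Fractional power encoding, as described above, can be illustrated with complex phasor vectors. This is a minimal sketch under illustrative assumptions: a base vector of i.i.d. random phases, exponentiated elementwise to encode a scalar. The phase distribution determines the kernel; for phases drawn uniformly from (-pi, pi), the expected similarity between the encodings of x and y is the sinc kernel sin(pi*(x-y)) / (pi*(x-y)).

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # dimensionality; larger D gives a better kernel approximation

# Base vector z = exp(i * phases), with i.i.d. random phases.
# Uniform(-pi, pi) phases induce a sinc-shaped similarity kernel.
phases = rng.uniform(-np.pi, np.pi, size=D)

def fpe(x):
    """Fractional power encoding: elementwise exponentiation z**x."""
    return np.exp(1j * phases * x)

def sim(x, y):
    """Normalized inner product between the encodings of x and y."""
    return float(np.real(np.vdot(fpe(x), fpe(y))) / D)

# The empirical similarity approximates sinc(x - y).
for d in [0.0, 0.5, 1.0, 2.5]:
    print(d, round(sim(0.0, d), 3), round(float(np.sinc(d)), 3))
```

Note that encoding is compositional in the VSA sense: elementwise (Hadamard) products of encodings implement addition of the encoded scalars, fpe(x) * fpe(y) == fpe(x + y), which is what makes the algebraic operations inherited from VSA meaningful on the encoded function space.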