Title: On the Fundamental Limit of Coded Caching Systems with a Single Demand Type
Caching is a technique to reduce the communication load in peak hours by prefetching contents during off-peak hours. Recently, Maddah-Ali and Niesen introduced an information-theoretic framework for coded caching and showed that significant improvement can be obtained over uncoded caching. Considerable effort has been devoted to identifying the precise information-theoretic fundamental limit of such systems; however, the difficulty of this task has also become clear. One reason for this difficulty is that the original coded caching setting allows multiple demand types during delivery, which introduces tension in the coding strategy to accommodate all of them. In this paper, we seek to develop a better understanding of the fundamental limit of coded caching by investigating single-demand-type systems. We first show that in the canonical three-user, three-file system, such single-demand-type systems already provide important insights. Motivated by these findings, we focus on systems where the number of users equals the number of files, and the demand type is the one in which all files are requested. A novel coding scheme is proposed, which provides several optimal memory-transmission operating points. Outer bounds for this class of systems are also considered, and their relation to existing bounds is discussed.
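As a reference point for the memory-transmission tradeoff discussed in the abstract, the corner points of the well-known Maddah-Ali–Niesen achievable scheme are easy to enumerate. The minimal sketch below prints them for the canonical N = K = 3 system; it shows only the standard centralized baseline, not the single-demand-type scheme proposed in the paper.

```python
# Minimal sketch: corner points of the standard Maddah-Ali--Niesen (MAN)
# centralized coded caching scheme, R(M) = (K - t) / (t + 1) at cache size
# M = t * N / K for integer t. This is the usual baseline, not the scheme
# proposed in the paper above.

def man_corner_points(K: int, N: int):
    """Yield (M, R) pairs of the MAN achievable memory-rate tradeoff."""
    for t in range(K + 1):
        M = t * N / K            # per-user cache size, in units of files
        R = (K - t) / (t + 1)    # worst-case delivery load, in units of files
        yield M, R

if __name__ == "__main__":
    for M, R in man_corner_points(K=3, N=3):   # the three-user, three-file case
        print(f"M = {M:.2f}  ->  R = {R:.2f}")
```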
Award ID(s):
1816546
PAR ID:
10121852
Author(s) / Creator(s):
Date Published:
Journal Name:
2019 IEEE Information Theory Workshop (ITW)
Page Range / eLocation ID:
1 to 5
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Coded Caching, proposed by Maddah-Ali and Niesen (MAN), has the potential to reduce network traffic by pre-storing content in the users' local memories when the network is underutilized and by transmitting coded multicast messages that benefit many users at once during peak-hour times. This paper considers the linear function retrieval version of the original coded caching setting, where users are interested in retrieving a number of linear combinations of the data points stored at the server, as opposed to a single file. This extends the scope of the authors' past work, which only considered the class of linear functions that operate element-wise over the files. On observing that the existing cache-aided scalar linear function retrieval scheme does not work in the proposed setting, this paper designs a novel coded caching scheme that outperforms uncoded caching schemes which either use unicast transmissions or let each user recover all files in the library. A toy illustration of a linear-combination demand appears after this list.
  2. Coded caching is a technique for reducing congestion in communication networks by prefetching content during idle periods and exploiting multicasting opportunities during periods of heavy traffic. Most existing research in this area has focused on minimizing the worst-case (i.e., peak) rate in a broadcast link with multiple identically distributed user requests. However, modern content delivery networks are investing heavily in profiling their users and predicting their preferences. The minimal achievable rate of a coded caching scheme with heterogeneous user profiles is still unknown in general. This paper presents the first steps towards solving that problem by analyzing the case of two users with distinct but overlapping demand sets. Specifically, it provides a complete characterization of the uniform-average-rate capacity when the sets overlap in just one file and shows that this capacity can be achieved with selfish and uncoded prefetching. It then characterizes the same capacity under selfish and uncoded prefetching when the demand sets overlap in two or more files. The paper also provides explicit prefetching schemes that achieve those capacities. All of the results allow for arbitrary (not necessarily identical) user cache sizes and numbers of files in each demand set.
  3. In this paper, we show that caching can aid in achieving secure communications by considering a wiretap scenario where the transmitter and legitimate receiver share access to a secure cache, and an eavesdropper is able to tap transmissions over a binary erasure wiretap channel during the delivery phase of a caching protocol. The scenario under consideration gives rise to a new channel model for wiretap coding that allows the transmitter to effectively choose a subset of bits to erase at the eavesdropper by caching those bits ahead of time. In the general case, the eavesdropper observes the remainder of the coded bits through the wiretap channel. In the wiretap type-II scenario, the eavesdropper is able to choose a set of revealed bits only from the subset of bits not cached. We present a coding approach that allows efficient use of the cache to realize a caching gain in the network, and we show how to use the cache to optimize the information-theoretic security in the choice of a finite-blocklength code and the choice of the cached bit set. To our knowledge, this is the first work on explicit algorithms for secrecy coding in any type of caching network.
  4. In the coded caching literature, the notion of privacy has been considered only with respect to demands. Motivated by the fact that multi-round transmissions appear almost everywhere in real communication systems, this paper formulates the coded caching problem with private demands and caches. Only one existing private caching scheme, based on introducing virtual users, can preserve the privacy of demands and caches simultaneously, but at the cost of an extremely large subpacketization that is exponential in the product of the number of users (K) and the number of files (N) in the system. In order to reduce the subpacketization while satisfying the privacy constraints, we propose a novel approach that constructs private coded caching schemes through private information retrieval (PIR). Based on this approach, we propose novel schemes with private demands and caches whose subpacketization level is exponential in K, rather than in NK as in the virtual-user scheme. As a by-product, for the coded caching problem with private demands only, the proposed approach yields a private coded caching scheme that generally improves the memory-load tradeoff of the private coded caching scheme by Yan and Tuninetti.
  5. Coded caching is a new approach to decrease the communication load during the peak hours of the network. It provides a significant gain that is maximized in the centralized setting, where the server controls the placement. In many situations, however, each user fills its cache without any information about the placement of other users. We show that subspace precoding for placement improves the delivery load of a decentralized caching system compared to uncoded placement. Surprisingly, the proposed scheme achieves the delivery load of centralized placement for K = 3 users over the entire range of cache sizes. The standard centralized and decentralized baselines being compared are sketched after this list.
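To make the linear function retrieval demand in item 1 concrete, the toy sketch below shows a user requesting a GF(2) linear combination (a bitwise XOR) of the library files rather than a single file index. The file contents are made up for illustration, and this shows only the retrieval target, not the coded caching scheme from that paper.

```python
# Toy illustration of a linear function retrieval demand (not the paper's scheme):
# a user asks for a GF(2) linear combination of files, i.e., an XOR of a subset
# of the library, instead of a single file index.
from functools import reduce

library = {          # hypothetical N = 3 files, each represented as 4 bits
    0: 0b1010,
    1: 0b0111,
    2: 0b1100,
}

def requested_function(demand):
    """Return sum_i demand[i] * F_i over GF(2), i.e., the XOR of the selected files."""
    selected = [f for i, f in library.items() if demand[i] == 1]
    return reduce(lambda a, b: a ^ b, selected, 0)

# A user whose demand vector is (1, 0, 1) wants F_0 + F_2 over GF(2):
print(bin(requested_function([1, 0, 1])))   # 0b1010 ^ 0b1100 = 0b110
```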
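For item 5, the relevant reference curves are the standard centralized Maddah-Ali–Niesen load and the decentralized load under random uncoded placement. The sketch below evaluates both well-known baselines for K = N = 3; the subspace-precoding placement itself is not reproduced here.

```python
# Standard baselines only (not the subspace-precoding scheme from item 5):
# centralized MAN delivery load vs. the decentralized load achieved with
# random uncoded placement, evaluated for K = 3 users and N = 3 files.

def centralized_rate(K: int, N: int, M: float) -> float:
    """MAN centralized load at a corner point, i.e., when t = K*M/N is an integer."""
    t = K * M / N
    return (K - t) / (t + 1)

def decentralized_rate(K: int, N: int, M: float) -> float:
    """Decentralized load with random uncoded placement (Maddah-Ali and Niesen)."""
    if M == 0:
        return float(min(N, K))          # no caching: send every requested file
    return (N / M - 1.0) * (1.0 - (1.0 - M / N) ** K)

K = N = 3
for M in (1.0, 2.0, 3.0):
    print(f"M = {M:.1f}:  centralized R = {centralized_rate(K, N, M):.3f},"
          f"  decentralized R = {decentralized_rate(K, N, M):.3f}")
```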