In point cloud compression, exploiting temporal redundancy for inter predictive coding is challenging because of the irregular geometry. This paper proposes an efficient block-based inter-coding scheme for color attribute compression, combining integer-precision motion estimation with adaptive graph-based in-loop filtering for improved attribute prediction. The proposed block-based motion estimation consists of an initial motion search that exploits both geometry and color attributes, followed by a motion refinement that minimizes only the color prediction error. To further improve color prediction, we propose a vertex-domain low-pass graph filtering scheme that adaptively removes noise from predictors computed from motion estimation of varying accuracy. Our experiments demonstrate significant coding gains over state-of-the-art coding methods.
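As a rough illustration of vertex-domain low-pass graph filtering on point cloud attributes, the sketch below smooths per-point colors with a Tikhonov-style filter, solving (I + λL)x̂ = x on a Gaussian-weighted geometric graph. The Gaussian weights and the fixed λ are generic assumptions for illustration; the abstract's adaptive in-loop filter additionally tunes the smoothing to the motion-estimation accuracy, which this sketch does not model.

```python
import numpy as np

def lowpass_graph_filter(points, colors, sigma=1.0, lam=0.5):
    """Tikhonov-style low-pass graph filter: solve (I + lam*L) x_hat = x.

    Generic vertex-domain smoothing sketch; the weights (Gaussian over
    geometric distance) and fixed lam are illustrative assumptions.
    """
    n = len(points)
    # Gaussian-weighted adjacency from pairwise geometric distances.
    d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(1)) - W          # combinatorial graph Laplacian
    return np.linalg.solve(np.eye(n) + lam * L, colors)

# Four coplanar points; the last color value is an outlier (noisy predictor).
points = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]])
colors = np.array([[100.0], [110.0], [90.0], [250.0]])
smoothed = lowpass_graph_filter(points, colors)
```

Because the constant vector lies in the Laplacian's null space, the filter preserves the total (and mean) attribute value while pulling the outlier toward its geometric neighbors.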
A Systematic Approach to Incremental Redundancy over Erasure Channels
As sensing and instrumentation play an increasingly important role in systems controlled over wired and wireless networks, the need to better understand delay-sensitive communication becomes a prime issue. Along these lines, this article studies the operation of data links that employ incremental redundancy as a practical means to protect information from the effects of unreliable channels. Specifically, this work extends a powerful methodology termed sequential differential optimization to choose near-optimal block sizes for hybrid ARQ over erasure channels. Furthermore, results show that the impact of the coding strategy adopted and the propensity of the channel to erase symbols naturally decouple when analyzing throughput. Overall, block size selection is motivated by normal approximations on the probability of decoding success at every stage of the incremental transmission process. This novel perspective, which rigorously bridges hybrid ARQ and coding, offers a pragmatic means to select code rates and blocklengths for incremental redundancy.
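The block-size selection idea can be sketched with a toy model: over a channel that erases each symbol independently with probability ε, an idealized message of k information symbols decodes once at least k symbols survive, and a normal approximation gives the probability of decoding success after n transmitted symbols. The code below picks cumulative lengths that hit target success probabilities at each HARQ stage. The idealized decoder and the target-driven linear search are illustrative assumptions, not the article's sequential differential optimization procedure.

```python
import math

def success_prob(n, k, eps):
    """Normal approximation to P(decoding succeeds) after n symbols are
    sent over a channel erasing each symbol independently w.p. eps, for
    an idealized code that decodes once k symbols arrive unerased."""
    mu = n * (1 - eps)
    sd = math.sqrt(n * eps * (1 - eps))
    # Gaussian approximation to P(Binomial(n, 1-eps) >= k)
    return 0.5 * math.erfc((k - mu) / (sd * math.sqrt(2)))

def block_sizes(k, eps, targets):
    """Choose cumulative transmission lengths so the m-th decoding
    attempt succeeds with (approximately) probability targets[m]."""
    sizes, n = [], k
    for t in targets:
        while success_prob(n, k, eps) < t:
            n += 1
        sizes.append(n)
    return sizes

# e.g. three incremental-redundancy stages targeting 90%, 99%, 99.9%
lengths = block_sizes(k=128, eps=0.1, targets=[0.90, 0.99, 0.999])
```

Each stage's increment is the gap between consecutive cumulative lengths, which is how incremental redundancy would be scheduled from these values.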
- Award ID(s): 1642983
- PAR ID: 10076575
- Date Published:
- Journal Name: 2018 IEEE International Symposium on Information Theory (ISIT)
- Page Range / eLocation ID: 1176 to 1180
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Low-capacity scenarios have become increasingly important in the technology of the Internet of Things (IoT) and the next generation of wireless networks. Such scenarios require efficient and reliable transmission over channels with an extremely small capacity. Within these constraints, the state-of-the-art coding techniques may not be directly applicable. Moreover, the prior work on the finite-length analysis of optimal channel coding provides inaccurate predictions of the limits in the low-capacity regime. In this paper, we study channel coding at low capacity from two perspectives: fundamental limits at finite length and code constructions. We first specify what a low-capacity regime means. We then characterize finite-length fundamental limits of channel coding in the low-capacity regime for various types of channels, including binary erasure channels (BECs), binary symmetric channels (BSCs), and additive white Gaussian noise (AWGN) channels. From the code construction perspective, we characterize the optimal number of repetitions for transmission over binary memoryless symmetric (BMS) channels, in terms of the code blocklength and the underlying channel capacity, such that the capacity loss due to the repetition is negligible. Furthermore, it is shown that capacity-achieving polar codes naturally adopt the aforementioned optimal number of repetitions.
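The finite-length limits discussed above are commonly estimated with the classical normal approximation; for a BEC(p) it reads log₂M ≈ n(1−p) − √(n·p(1−p))·Q⁻¹(ε), with capacity C = 1−p and dispersion V = p(1−p). The sketch below evaluates this standard formula and shows how it degenerates in the low-capacity regime (the predicted payload approaches zero), which is the kind of inaccuracy the abstract refers to; the numeric examples are illustrative assumptions.

```python
import math

def qfunc_inv(p):
    """Inverse Gaussian tail function Q^{-1}(p), via bisection."""
    lo, hi = -10.0, 10.0
    for _ in range(100):
        mid = (lo + hi) / 2
        if 0.5 * math.erfc(mid / math.sqrt(2)) > p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def bec_normal_approx_bits(n, p, err):
    """Classical normal approximation to the maximum number of
    information bits over a BEC(p) at blocklength n and target error
    probability err: n*C - sqrt(n*V)*Q^{-1}(err)."""
    C, V = 1 - p, p * (1 - p)
    return n * C - math.sqrt(n * V) * qfunc_inv(err)

k_mild = bec_normal_approx_bits(1000, 0.5, 1e-3)   # moderate capacity
k_low = bec_normal_approx_bits(1000, 0.99, 1e-3)   # low-capacity regime
```

At p = 0.5 the approximation predicts a sensible payload of roughly 450 bits, while at p = 0.99 it predicts almost nothing transmittable at n = 1000, illustrating why low capacity demands a dedicated finite-length analysis.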
-
A status updating system is considered in which a variable-length code is used to transmit messages to a receiver over a noisy channel. The goal is to optimize the codeword lengths such that successfully-decoded messages are timely, i.e., such that the age-of-information (AoI) at the receiver is minimized. A hybrid ARQ (HARQ) scheme is employed, in which variable-length incremental redundancy (IR) bits are added to the originally-transmitted codeword until decoding is successful. With each decoding attempt, a non-zero processing delay is incurred. The optimal codeword lengths are analytically derived utilizing a sequential differential optimization (SDO) framework. The framework is general in that it only requires knowledge of an analytical expression of the positive feedback (ACK) probability as a function of the codeword length.
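A brute-force version of this codeword-length optimization can illustrate the setting: given an analytical ACK probability as a function of cumulative length, compute the expected delivery time of one HARQ round with a per-attempt processing delay, then search candidate length schedules. The erasure-style ACK model, delay value, and grid search below are assumptions for illustration; the paper derives the lengths analytically via SDO and minimizes AoI rather than this simpler expected-delivery-time proxy.

```python
import itertools
import math

def ack_prob(n, k=32, eps=0.1):
    """Illustrative ACK probability: decoding succeeds once enough
    unerased symbols arrive (normal approximation, erasure-style
    channel). SDO only needs some such analytical expression."""
    mu, sd = n * (1 - eps), math.sqrt(n * eps * (1 - eps))
    return 0.5 * math.erfc((k - mu) / (sd * math.sqrt(2)))

def expected_time(lengths, delta=5.0):
    """Expected time (symbols + processing) to deliver one message when
    decoding is attempted after each cumulative length in `lengths`,
    with processing delay delta per attempt; a full failure restarts
    the round (geometric renewal)."""
    t, fail = 0.0, 1.0
    for i, n in enumerate(lengths):
        p = ack_prob(n)
        t += fail * p * (n + (i + 1) * delta)
        fail *= 1 - p
    round_time = lengths[-1] + len(lengths) * delta
    return (t + fail * round_time) / (1 - fail)

# Exhaustive search over three-stage schedules on a coarse grid.
best = min(itertools.combinations(range(34, 70, 4), 3), key=expected_time)
```

The search trades off early, short attempts (low delay when the channel is good) against the processing cost of extra decoding attempts, which is the same tension the analytical SDO solution resolves.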
-
In this paper, we investigate the necessity of finite blocklength codes in distributed transmission of independent message sets over channels with feedback. We provide two examples of three-user interference channels with feedback where codes with asymptotically large effective lengths are sub-optimal. As a result, we conclude that coded transmission using finite effective length codes is necessary to achieve optimality. We argue that the sub-optimal performance of large effective length codes is due to their inefficiency in preserving the correlation between the inputs to the distributed terminals in the communication system. This correlation is made available by the presence of feedback at the terminals and is used as a means for coordination between them when using finite effective length coding strategies.
-
In this paper, we introduce Jenga, a new scheme for protecting 3D DRAM, specifically high bandwidth memory (HBM), from failures in bits, rows, banks, channels, dies, and TSVs. By providing redundancy at the granularity of a cache block, rather than across blocks as in the current state of the art, Jenga achieves greater error-free performance and lower error recovery latency. We show that Jenga's runtime is on average only 1.03× the runtime of our Baseline across a range of benchmarks. Additionally, for memory-intensive benchmarks, Jenga is on average 1.11× faster than prior work.
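A toy sketch of redundancy at cache-block granularity: each block is striped into chunks that would map to different HBM channels, plus one XOR parity chunk, so a failed channel's chunk can be rebuilt from the survivors of that same block without touching other blocks. The chunk count, XOR parity, and byte-level layout are illustrative assumptions, not Jenga's actual layout or error-correction code.

```python
from functools import reduce

def encode_block(block, nchunks=4):
    """Split one cache block into nchunks data chunks plus one XOR
    parity chunk; each chunk would be placed on a different channel.
    Toy illustration of block-granularity redundancy only."""
    size = len(block) // nchunks
    chunks = [block[i * size:(i + 1) * size] for i in range(nchunks)]
    parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*chunks))
    return chunks + [parity]

def recover(chunks, lost):
    """Rebuild the chunk from failed channel `lost` by XOR-ing the
    survivors; only this one block is accessed, keeping error-recovery
    latency low."""
    survivors = [c for i, c in enumerate(chunks) if i != lost]
    return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))

blk = bytes(range(32))          # one 32-byte "cache block"
enc = encode_block(blk)
rebuilt = recover(enc, lost=1)  # channel 1 failed
```

Because the parity lives inside the block, a read that hits a failed channel needs no second block fetch, which is the intuition behind the lower recovery latency claimed above.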