Exploiting (near-)optimal MIMO signal processing algorithms in next-generation (NextG) cellular systems holds great promise for significant wireless performance gains, such as higher spectral efficiency and device connectivity. However, enabling optimal processing methods in these systems is extremely difficult, since the required computation grows exponentially with more users and higher data rates, while the available processing time is strictly limited. In this regard, quantum signal processing has recently been identified as a promising enabler of (near-)optimal algorithms in such systems, since quantum computing could dramatically speed up the computation through non-classical effects rooted in quantum mechanics. Given the decoherence and noise present on current quantum hardware, parallel quantum optimization could accelerate processing even further at the expense of more qubit usage. In this paper, we discuss the parallelization of quantum MIMO processing and investigate a spin-level preprocessing method that yields a finer-grained decomposition and supports more flexible parallel quantum signal processing than the recently reported symbol-level decomposition method. We evaluate the method on the state-of-the-art analog D-Wave Advantage quantum processor.
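As background for this abstract, maximum-likelihood MIMO detection can be cast as an Ising/QUBO problem of the kind a quantum annealer such as the D-Wave Advantage accepts. The sketch below is a minimal NumPy illustration assuming BPSK symbols, a small toy channel, and an exhaustive spin search standing in for the annealer; the variable names and sizes are illustrative assumptions, and the paper's spin-level decomposition and parallelization are not implemented here.

```python
import itertools
import numpy as np

# Toy 4x4 MIMO system with BPSK symbols s_i in {-1, +1}.
# ML detection minimizes ||y - H s||^2, which expands (up to a constant) to an
# Ising objective: s^T (H^T H) s - 2 (H^T y)^T s,
# so the couplings are J = H^T H (off-diagonal) and the fields are h = -2 H^T y.
rng = np.random.default_rng(0)
n = 4
H = rng.normal(size=(n, n))               # channel matrix (assumed known)
s_true = rng.choice([-1.0, 1.0], size=n)  # transmitted BPSK symbols
y = H @ s_true + 0.1 * rng.normal(size=n) # noisy received vector

G = H.T @ H
J = G - np.diag(np.diag(G))               # diagonal terms are constant since s_i^2 = 1
h = -2.0 * H.T @ y

def ising_energy(s):
    """Ising objective equivalent (up to a constant) to ||y - H s||^2."""
    return s @ J @ s + h @ s

# Brute-force search over all 2^n spin configurations; on real hardware this
# minimization would be delegated to the quantum annealer instead.
best = min((np.array(s) for s in itertools.product([-1.0, 1.0], repeat=n)),
           key=ising_energy)
print("detected:", best, " transmitted:", s_true)
```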
This content will become publicly available on June 29, 2026
Quantum Image Processing: A Comparative Study of NEQR and FRQI Encoding Schemes with Hybrid Processing
Quantum image processing (QIP) is an emerging field that integrates image processing with the principles of quantum computing (QC). As quantum technologies advance, researchers face new opportunities and challenges in developing efficient QIP techniques. This paper provides an overview of quantum image representations, focusing on two prominent encoding schemes: the Novel Enhanced Quantum Representation (NEQR) and the Flexible Representation of Quantum Images (FRQI). We compare their performance in noisy quantum environments by evaluating qubit requirements, image quality, and computational efficiency. The study further analyzes the impact of quantum gate errors and qubit limitations on image reconstruction fidelity. We also compare GPU and QPU performance to highlight their respective strengths and weaknesses. Our findings stress the importance of error mitigation, improvements in quantum hardware, and the development of quantum-classical hybrid systems to drive future progress in QIP.
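To make the qubit-requirement comparison concrete, the sketch below builds the FRQI statevector for a tiny grayscale image and counts the qubits each encoding would need under the standard definitions (FRQI: one color qubit carrying intensity as a rotation angle; NEQR: q basis-encoded intensity qubits). It is an illustrative NumPy construction, not the evaluation pipeline used in the paper, and the example image and bit depth are assumptions.

```python
import numpy as np

def frqi_state(image, bit_depth=8):
    """Build the FRQI statevector for a square grayscale image with a power-of-two pixel count.

    Each pixel intensity g maps to an angle theta = (pi/2) * g / (2**bit_depth - 1),
    and the state is (1/sqrt(N)) * sum_i (cos(theta_i)|0> + sin(theta_i)|1>) |i>.
    """
    pixels = np.asarray(image, dtype=float).ravel()
    num_pixels = pixels.size
    pos_qubits = int(np.log2(num_pixels))      # position register size
    theta = (np.pi / 2) * pixels / (2 ** bit_depth - 1)
    # Color qubit is the most significant qubit: amplitudes for |0>|i>, then |1>|i>.
    state = np.concatenate([np.cos(theta), np.sin(theta)]) / np.sqrt(num_pixels)
    return state, pos_qubits + 1               # statevector and total FRQI qubit count

# Tiny 2x2 example image (8-bit grayscale values).
img = [[0, 64], [128, 255]]
state, frqi_qubits = frqi_state(img)
neqr_qubits = 2 + 8                            # 2 position qubits + 8 intensity qubits
print("FRQI qubits:", frqi_qubits, " NEQR qubits:", neqr_qubits)
print("FRQI statevector norm:", np.linalg.norm(state))
```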
- Award ID(s): 2339701
- PAR ID: 10656866
- Publisher / Repository: ACM
- Date Published:
- Page Range / eLocation ID: 575 to 580
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Quantum networks providing shared entanglement over a mesh of quantum nodes will revolutionize the field of quantum information science by offering novel applications in quantum computation, enhanced precision in networks of sensors and clocks, and efficient quantum communication over large distances. Recent experimental progress with individual neutral atoms demonstrates a high potential for implementing the crucial components of such networks. We highlight the latest developments and near-term prospects for how arrays of individually controlled neutral atoms are suited for both efficient remote entanglement generation and large-scale quantum information processing, thereby providing the necessary features for sharing high-fidelity and error-corrected multi-qubit entangled states between the nodes. We describe both the functionality requirements and several examples for advanced, large-scale quantum networks composed of neutral atom processing nodes.
- The ability to make high-fidelity qubit measurements with minimal collateral disruption to the system is not only relevant to initialization and final read-out; it is also essential to achieving quantum error correction in universal quantum computation. Qubit state measurements in a neutral atom array are achieved by probing the array with light detuned from a cycling transition and capturing the resulting fluorescence with a high-quantum-efficiency imaging device, producing a greyscale image of the neutral atom array. Conventionally, achieving a fidelity above 99% requires a probing period of several milliseconds, a significant delay given that the longest gate operation takes only a few microseconds. In this poster, we demonstrate qubit state measurements assisted by a supervised convolutional neural network (CNN) in a neutral atom quantum processor. We present two CNN architectures for analyzing neutral atom qubit readout data: a compact 5-layer single-qubit architecture and a 6-layer multi-qubit architecture. We benchmark both against a conventional Gaussian-threshold analysis method and demonstrate up to a 56% reduction in measurement infidelity using the CNN. This work presents a proof of concept for a CNN implemented as a real-time readout processing method on a neutral atom quantum computer, enabling faster readout and improved fidelity. (An illustrative readout-classifier sketch follows this list.)
- Entanglement is essential for quantum information processing, but it is limited by noise. We address this by developing high-yield entanglement distillation protocols with several advancements. (1) We extend the 2-to-1 recurrence entanglement distillation protocol to higher-rate n-to-(n−1) protocols that can correct any single-qubit error. These protocols are evaluated through numerical simulations focusing on fidelity and yield. We also outline a method to adapt any classical error-correcting code for entanglement distillation, where the code can correct both bit-flip and phase-flip errors by incorporating Hadamard gates. (2) We propose a constant-depth decoder for stabilizer codes that transforms logical states into physical ones using single-qubit measurements. This decoder is applied to entanglement distillation protocols, reducing circuit depth and enabling protocols derived from high-performance quantum error-correcting codes. We demonstrate this by evaluating the circuit complexity of entanglement distillation protocols based on surface codes and quantum convolutional codes. (3) Our stabilizer entanglement distillation techniques advance quantum computing. We propose a fault-tolerant protocol for constant-depth encoding and decoding of arbitrary states in surface codes, with potential extensions to more general quantum low-density parity-check codes. This protocol is feasible with state-of-the-art reconfigurable atom arrays and surpasses the limits of conventional logarithmic-depth encoders. Overall, our study integrates the stabilizer formalism, measurement-based quantum computing, and entanglement distillation, advancing both quantum communication and computing. (A sketch of the baseline 2-to-1 recurrence fidelity map follows this list.)
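For the CNN-assisted readout item above, the following PyTorch sketch shows the general shape of a compact convolutional classifier that maps a small fluorescence-image crop around one trap site to a dark/bright (qubit 0/1) decision. The layer sizes, crop dimensions, and input format are assumptions for illustration; the poster's actual 5-layer single-qubit and 6-layer multi-qubit architectures are not reproduced here.

```python
import torch
import torch.nn as nn

class SingleQubitReadoutCNN(nn.Module):
    """Compact CNN that classifies one atom's fluorescence crop as dark (0) or bright (1).

    Layer count and sizes are illustrative; the input is assumed to be a
    1-channel 16x16 pixel crop centered on a single trap site.
    """
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 16x16 -> 8x8
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                     # 8x8 -> 4x4
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, 32), nn.ReLU(),
            nn.Linear(32, 2),                    # logits for dark / bright
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Example: classify a batch of 4 stand-in crops (replace with camera data).
model = SingleQubitReadoutCNN()
crops = torch.rand(4, 1, 16, 16)                 # simulated fluorescence images
states = model(crops).argmax(dim=1)              # 0 = dark, 1 = bright
print(states)
```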
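For the entanglement-distillation item, the sketch below iterates the textbook 2-to-1 recurrence fidelity map for Werner-state inputs, which is the baseline that the n-to-(n−1) protocols generalize. The closed-form map and success probability are the standard Werner-state expressions and the starting fidelity is an assumed example value; this is orientation only, not the fidelity/yield simulations reported in the paper.

```python
def recurrence_step(F):
    """One round of the standard 2-to-1 recurrence protocol on two Werner pairs
    of fidelity F: returns (output fidelity, success probability)."""
    p_success = F**2 + (2 / 3) * F * (1 - F) + (5 / 9) * (1 - F)**2
    F_out = (F**2 + (1 / 9) * (1 - F)**2) / p_success
    return F_out, p_success

# Distill starting from noisy pairs of fidelity 0.7, tracking the expected
# yield (each round consumes two pairs and succeeds only probabilistically).
F, yield_per_input = 0.7, 1.0
for r in range(1, 5):
    F, p = recurrence_step(F)
    yield_per_input *= p / 2      # two input pairs -> at most one output pair
    print(f"round {r}: F = {F:.4f}, cumulative yield = {yield_per_input:.4f}")
```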