Title: Generating Cryptographically-Strong Random Lattice Bases and Recognizing Rotations of Z^n
Lattice-based cryptography relies on generating random bases that are difficult to fully reduce. Given a lattice basis (such as the private basis of a cryptosystem), every other basis is related to it by multiplication by a matrix in GL(n,Z). We compare the strengths of various methods of sampling random elements of GL(n,Z), finding that some are stronger than others with respect to the problem of recognizing rotations of the Z^n lattice. In particular, the standard algorithm of multiplying unipotent generators together (as implemented in Magma's RandomSLnZ command) generates instances of this problem which can be efficiently broken, even in dimensions nearing 1,500. Likewise, we find that the random basis generation method in one of the NIST Post-Quantum Cryptography competition submissions (DRS) generates instances which can be efficiently broken, even at its 256-bit security settings. Other random basis generation algorithms (some older, some newer) are described which appear to be much stronger.
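The "product of unipotent generators" sampler that the abstract critiques can be sketched as follows. This is a hedged illustration only (the dimension n = 6 and the count of 20 factors are arbitrary choices for the demo), not Magma's actual RandomSLnZ implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_unimodular(n, factors=20):
    """Sample a matrix in SL(n, Z) by multiplying random elementary
    (unipotent) matrices E_ij(+-1).  Each factor has determinant 1,
    so the product does too."""
    U = np.eye(n, dtype=np.int64)
    for _ in range(factors):
        i, j = rng.choice(n, size=2, replace=False)  # distinct row/column
        E = np.eye(n, dtype=np.int64)
        E[i, j] = rng.choice([-1, 1])                # unipotent shear
        U = U @ E
    return U

U = random_unimodular(6)
assert round(np.linalg.det(U)) == 1  # unimodular: still a basis of Z^6
```

Bases produced this way tend to have small, structured entries, which is consistent with the paper's finding that lattice reduction can often undo such products efficiently.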
Award ID(s):
1815562
PAR ID:
10278845
Editor(s):
Cheon, Jung Hee; Tillich, Jean-Pierre
Journal Name:
Lecture notes in computer science
Volume:
12841
ISSN:
0302-9743
Page Range / eLocation ID:
319-338
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We consider the high-dimensional linear regression problem, where the algorithmic goal is to efficiently infer an unknown feature vector $$\beta^*\in\mathbb{R}^p$$ from its linear measurements, using a small number $$n$$ of samples. Unlike most of the literature, we make no sparsity assumption on $$\beta^*$$, but instead adopt a different regularization: in the noiseless setting, we assume $$\beta^*$$ consists of entries which are either rational numbers with a common denominator $$Q\in\mathbb{Z}^+$$ (referred to as $$Q$$-rationality), or irrational numbers taking values in a rationally independent set of bounded cardinality known to the learner; collectively, this is called the mixed-range assumption. Using a novel combination of the PSLQ integer relation detection algorithm and the Lenstra-Lenstra-Lovász (LLL) lattice basis reduction algorithm, we propose a polynomial-time algorithm which provably recovers a $$\beta^*\in\mathbb{R}^p$$ enjoying the mixed-range assumption from its linear measurements $$Y=X\beta^*\in\mathbb{R}^n$$ for a large class of distributions for the random entries of $$X$$, even with a single measurement ($$n=1$$). In the noisy setting, we propose a polynomial-time, lattice-based algorithm which recovers a $$\beta^*\in\mathbb{R}^p$$ enjoying the $$Q$$-rationality property from its noisy measurements $$Y=X\beta^*+W\in\mathbb{R}^n$$, even from a single sample ($$n=1$$). We further establish that for large $$Q$$ and normal noise, this algorithm tolerates an information-theoretically optimal level of noise. We then apply these ideas to develop a polynomial-time, single-sample algorithm for the phase retrieval problem. Our methods address the single-sample ($$n=1$$) regime, where sparsity-based methods such as the Least Absolute Shrinkage and Selection Operator (LASSO) and Basis Pursuit are known to fail.
Furthermore, our results also reveal algorithmic connections between the high-dimensional linear regression problem, and the integer relation detection, randomized subset-sum, and shortest vector problems. 
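As a toy illustration of why a single generic measurement can pin down a $$Q$$-rational vector, the sketch below recovers $$\beta^*$$ by brute force over all candidates with denominator $$Q$$. All parameters here are hypothetical demo sizes; the search is exponential, whereas the paper's contribution is doing this recovery in polynomial time via PSLQ and LLL:

```python
import itertools, random

random.seed(1)
Q, B, p = 7, 5, 3          # toy sizes: common denominator, numerator bound, dimension
beta = [random.randint(-B, B) / Q for _ in range(p)]   # Q-rational ground truth
x = [random.random() for _ in range(p)]                # one measurement row (n = 1)
y = sum(xi * bi for xi, bi in zip(x, beta))            # noiseless y = <x, beta>

# Exhaustive search over all (2B+1)^p Q-rational candidates: for generic
# real x, only the true numerator vector makes the residual (near) zero.
best = min(itertools.product(range(-B, B + 1), repeat=p),
           key=lambda cand: abs(y - sum(xi * (ci / Q) for xi, ci in zip(x, cand))))
recovered = [ci / Q for ci in best]
```

The uniqueness of the minimizer for generic $$x$$ is exactly what makes the problem well-posed at $$n=1$$; the lattice-based algorithms replace the exponential enumeration.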
  2. Cryptography is largely based on unproven assumptions which, while believable, might fail. Notably, if P = NP, or if we live in Pessiland, then all current cryptographic assumptions will be broken. A compelling question is whether any interesting cryptography might exist in Pessiland. A natural approach to this question is to base cryptography on an assumption from fine-grained complexity. Ball, Rosen, Sabin, and Vasudevan [BRSV'17] attempted this, starting from popular hardness assumptions such as the Orthogonal Vectors (OV) Conjecture. They obtained problems that are hard on average, assuming that OV and other problems are hard in the worst case. They obtained proofs of work, and hoped to use their average-case hard problems to build a fine-grained one-way function. Unfortunately, they proved that constructing one using their approach would violate a popular hardness hypothesis. This motivates the search for other fine-grained average-case hard problems. The main goal of this paper is to identify sufficient properties for a fine-grained average-case assumption that imply cryptographic primitives such as fine-grained public-key cryptography (PKC). Our main contribution is a novel construction of a cryptographic key exchange, together with the definition of a small number of relatively weak structural properties, such that if a computational problem satisfies them, our key exchange has provable fine-grained security guarantees based on the hardness of this problem. We then show that a natural and plausible average-case assumption for the key problem Zero-k-Clique from fine-grained complexity satisfies our properties. We also develop fine-grained one-way functions and hardcore bits even under these weaker assumptions. Where previous works had to assume random oracles or the existence of strong one-way functions to get a key exchange computable in O(n) time and secure against O(n^2)-time adversaries (see [Merkle'78] and [BGI'08]), our assumptions seem much weaker.
Our key exchange has a similar gap between the computation of the honest party and the adversary as prior work, while being non-interactive, implying fine-grained PKC. 
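The O(n)-versus-O(n^2) gap the abstract attributes to [Merkle'78] can be illustrated with classic Merkle puzzles. The sketch below is a toy (the XOR "cipher", the MAGIC tag, and the puzzle count are all illustrative stand-ins), not the paper's Zero-k-Clique construction:

```python
import hashlib, random

random.seed(0)
N = 256             # number of puzzles: honest parties do O(N) work in total,
MAGIC = 0xA5A5A5A5  # while an eavesdropper must open ~N/2 puzzles, i.e. O(N^2)

def pad(secret):
    """Weak 'encryption' pad derived from a secret drawn from a size-N space."""
    return int.from_bytes(hashlib.sha256(secret.to_bytes(4, "big")).digest()[:12], "big")

# Alice publishes N puzzles; puzzle pid opens to (MAGIC, pid, session key).
alice_keys = [random.getrandbits(32) for _ in range(N)]
puzzles = [((MAGIC << 64) | (pid << 32) | alice_keys[pid]) ^ pad(random.randrange(N))
           for pid in range(N)]

# Bob solves ONE random puzzle by brute force over the N possible secrets.
target = random.choice(puzzles)
for s in range(N):
    m = target ^ pad(s)
    if m >> 64 == MAGIC:                 # recognizable plaintext found
        bob_pid = (m >> 32) & 0xFFFFFFFF
        bob_key = m & 0xFFFFFFFF
        break

# Bob announces bob_pid in the clear; Alice looks up the matching key.
assert alice_keys[bob_pid] == bob_key
```

The quadratic honest/adversary gap here needs no assumption beyond the weak cipher; the paper's point is obtaining a comparable gap non-interactively from a structured average-case assumption.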
  3. Syndrome decoding (SD), and equivalently Learning Parity with Noise (LPN), is a fundamental problem in cryptography, which states that for a field F, a compressing public matrix G ∈ F^(k×n), and a secret sparse vector e ∈ F^n sampled from some noise distribution, G e is indistinguishable from uniform. Recently, SD has gained significant interest due to its use in pseudorandom correlation generators (PCGs). In pursuit of better efficiency, we propose a new assumption called Stationary Syndrome Decoding (SSD). In SSD, we consider q correlated noise vectors e_1, ..., e_q ∈ F^n and associated instances G_1 e_1, ..., G_q e_q, where the noise vectors are restricted to having their non-zeros in the same small subset of t positions L ⊂ [n]. That is, for all i ∈ L, e_(j,i) is uniformly random, while for all other i, e_(j,i) = 0. Although naively reusing a noise vector renders SD and LPN insecure via simple Gaussian elimination, we observe that known attacks do not extend to our correlated noise. We show SSD is unconditionally secure against so-called linear attacks, e.g., advanced information set decoding and representation techniques (Esser and Santini, Crypto 2024). We further adapt the state-of-the-art nonlinear attack (Briaud and Øygarden, Eurocrypt 2023) to SSD and demonstrate, both theoretically and experimentally, resistance to the attack. We apply SSD to PCGs to amortize the cost of the noise generation protocol. For OT and VOLE generation, each instance requires O(t) communication instead of O(t log n). For suggested parameters, we observe a 1.5× improvement in running time or between a 6× and 18× reduction in communication. For Beaver triple generation using Ring LPN, our techniques have the potential for substantial amortization due to the high concrete overhead of Ring LPN noise generation.
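The stationary-noise structure is easy to picture concretely. The sketch below samples SSD-style instances over F_2, with all parameter values chosen arbitrarily for the demo (the paper's suggested parameters are much larger):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, t, q = 64, 32, 4, 3   # toy sizes: length, syndrome length, weight, instances

# Stationary noise: all q sparse vectors share ONE secret t-subset L of [n].
L = rng.choice(n, size=t, replace=False)
G = [rng.integers(0, 2, size=(k, n)) for _ in range(q)]  # public matrices over F_2
noise, syndromes = [], []
for j in range(q):
    e = np.zeros(n, dtype=np.int64)
    e[L] = rng.integers(0, 2, size=t)   # uniform on the shared support, 0 elsewhere
    noise.append(e)
    syndromes.append(G[j] @ e % 2)      # published instance G_j e_j
```

Because each instance uses a fresh matrix G_j, the simple Gaussian-elimination attack that breaks literal noise reuse does not directly apply; the shared support is what the adapted attacks in the abstract target.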
  4. We consider a variant of the vehicle routing problem (VRP) where each customer has a unit demand and the goal is to minimize the total cost of routing a fleet of capacitated vehicles from one or multiple depots to visit all customers. We propose two parallel algorithms to efficiently solve the column-generation-based linear-programming relaxation for this VRP. Specifically, we focus on algorithms for the "pricing problem," which corresponds to the resource-constrained elementary shortest path problem. The first algorithm extends the pulse algorithm, for which we derive a new bounding scheme on the maximum load of any route. The second algorithm is based on random coloring from parameterized complexity, which can also be combined with other techniques in the literature for solving VRPs, including cutting planes and column enumeration. We conduct numerical studies using VRP benchmarks (with 50–957 nodes) and instances of a medical home care delivery problem using census data in Wayne County, Michigan. Using parallel computing, both pulse and random coloring can significantly improve column generation for solving the linear-programming relaxations, and we can obtain heuristic integer solutions with small optimality gaps. Combining random coloring with column enumeration, we can obtain improved integer solutions having less than 2% optimality gaps for most VRP benchmark instances and less than 1% optimality gaps for the medical home care delivery instances, both under a 30-minute computational time limit. The use of cutting planes (e.g., robust cuts) can further reduce optimality gaps on some hard instances, without much increase in the run time. Summary of Contribution: The vehicle routing problem (VRP) is a fundamental combinatorial problem, and its variants have been studied extensively in the literature of operations research and computer science.
In this paper, we consider general-purpose algorithms for solving VRPs, including the column-generation approach for the linear-programming relaxations of the integer programs of VRPs and the column-enumeration approach for seeking improved integer solutions. We revise the pulse algorithm and also propose a random-coloring algorithm that can be used to solve the elementary shortest path problem which constitutes the pricing problem in the column-generation approach. We show that the parallel implementation of both algorithms can significantly improve the performance of column generation, and that the random-coloring algorithm can improve the solution time and quality of the VRP integer solutions produced by the column-enumeration approach. We focus on algorithmic design for VRPs and conduct extensive computational tests to demonstrate the performance of various approaches.
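The random-coloring idea for the elementary shortest path pricing problem can be sketched in the classic color-coding style: if a path's nodes all receive distinct random colors, the path is automatically simple, so a dynamic program over color sets replaces explicit no-revisit bookkeeping. This is a hedged, simplified sketch (paths of exactly k nodes, no capacity or dual-price resources, toy instance), not the paper's parallel implementation:

```python
import random

def colorful_min_path(adj, src, dst, k, trials=300, seed=0):
    """Cheapest simple path from src to dst visiting exactly k nodes,
    found via repeated random k-colorings.  Each trial succeeds with
    probability at least k!/k^k, so enough trials make failure unlikely."""
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(trials):
        color = {v: rng.randrange(k) for v in adj}
        # dp maps (end node, frozenset of colors used) -> cheapest cost so far
        dp = {(src, frozenset([color[src]])): 0.0}
        for _ in range(k - 1):                   # extend every partial path by one arc
            nxt = {}
            for (v, used), cost in dp.items():
                for w, c_vw in adj[v]:
                    if color[w] not in used:     # colorful => no node repeats
                        key = (w, used | {color[w]})
                        if cost + c_vw < nxt.get(key, float("inf")):
                            nxt[key] = cost + c_vw
            dp = nxt
        best = min([best] + [c for (v, _), c in dp.items() if v == dst])
    return best

# Tiny instance: the cheapest simple 3-node path from 0 to 3 is 0 -> 1 -> 3.
adj = {0: [(1, 1.0), (2, 5.0)], 1: [(2, 1.0), (3, 1.0)], 2: [(3, 1.0)], 3: []}
assert colorful_min_path(adj, 0, 3, 3) == 2.0
```

Independent trials parallelize trivially, which is in the spirit of the paper's parallel pricing algorithms.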
  5. Abstract: An $$n \times n$$ matrix with $$\pm 1$$ entries that acts on $${\mathbb {R}}^{n}$$ as a scaled isometry is called Hadamard. Such matrices exist in some, but not all, dimensions. Combining number-theoretic and probabilistic tools, we construct matrices with $$\pm 1$$ entries that act as approximate scaled isometries on $${\mathbb {R}}^{n}$$ for all $$n \in {\mathbb {N}}$$. More precisely, the matrices we construct have condition numbers bounded by a constant independent of $$n$$. Using this construction, we establish a phase transition for the probability that a random frame contains a Riesz basis. Namely, we show that a random frame in $${\mathbb {R}}^{n}$$ formed by $$N$$ vectors with independent identically distributed coordinates having a nondegenerate symmetric distribution contains many Riesz bases with high probability, provided that $$N \ge \exp (Cn)$$. On the other hand, we prove that if the entries are sub-Gaussian, then a random frame fails to contain a Riesz basis with probability close to $$1$$ whenever $$N \le \exp (cn)$$, where $$c<C$$ are constants depending on the distribution of the entries.
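For the dimensions where exact Hadamard matrices do exist, the scaled-isometry property is easy to verify. Below is a sketch using Sylvester's classical construction for $$n = 2^k$$ (not the paper's construction, which covers all $$n$$ with a constant condition-number bound):

```python
import numpy as np

def sylvester(k):
    """Sylvester's construction of a 2^k x 2^k Hadamard matrix."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])   # doubling step preserves H H^T = n I
    return H

n = 8
H = sylvester(3)
# H acts as a scaled isometry: H H^T = n I, so H / sqrt(n) is orthogonal
assert np.array_equal(H @ H.T, n * np.eye(n))
# Hence the condition number is exactly 1; the paper achieves cond = O(1)
# with +-1 entries for EVERY n, where exact Hadamard matrices may not exist.
assert abs(np.linalg.cond(H) - 1.0) < 1e-9
```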