Quantum Period is Compression Robust

In summary, the paper uses a universal hash function to reduce the number of qubits needed for the modular-exponentiation register in Shor's algorithm. This brings the last register down to a constant number of qubits, although it introduces a large number of collisions. A semi-classical version of Shor's algorithm can also achieve such a reduction, but at the cost of intermediate measurements. The discussion suggests that if a universal homomorphic hash function exists that can be composed with g(x) = b^x mod N, then integer factorization and the discrete logarithm problem could be solved in polynomial time. There is also a question about the accuracy of the paper; the answer given is that it is likely accurate, but that a workable hash function computable with only constant workspace is unlikely to exist.
  • #1
Botsina
TL;DR Summary
https://arxiv.org/abs/1905.10074
Has anyone read the above preprint by Alexander May and Lars Schlieper? What do you think about the prospect of Integer Factorization and the Discrete Logarithm problem being in P if an appropriate homomorphic universal hash function can be found?
https://arxiv.org/abs/1905.10074

The paper finds that one can reduce to a constant (a single qubit suffices) the number of qubits used in the last, modular-exponentiation register of the variants of Shor's algorithm used to factor integers and find discrete logarithms, by applying a universal hash function to the modular exponentiation.
(A universal hash function (https://en.wikipedia.org/wiki/Universal_hashing) maps a large number of states to a handful of states, such that for any two distinct inputs the probability of a collision (i.e. the two producing the same output), over the random choice of a member of the hash family, is at most 1/m, where m is the number of outputs. A simple example of such a family is h_{a,b}(x) = ((ax + b) mod p) mod m, where p is prime.) Despite causing a huge number of collisions in the last register, the period can still be recovered with only a constant factor more runs of the circuit, and this constant is at most two if one qubit is used and drops quickly towards one as more qubits are used.
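To see concretely why the collisions don't destroy the period, here is a minimal classical simulation (my own sketch, not code from the paper) of the QFT-function-QFT circuit on a toy instance, with the last register hashed to a single bit using a hash of the same ((ax + b) mod p) mod m shape as above. The values N = 21, b = 2 (period r = 6) and the hash parameters are arbitrary choices for illustration; the most likely measurement outcomes still land at roundings of the multiples of M/r.

```python
import numpy as np

# Toy classical simulation of compressed period finding (illustration only,
# not the paper's construction). We prepare sum_x |x>|h(g(x))>, trace out the
# hashed one-qubit register, apply the QFT to the first register, and look at
# the resulting measurement distribution. (Tracing out the hashed register
# gives the same first-register distribution as measuring it first.)

N, b = 21, 2                  # toy modulus; 2 has multiplicative order r = 6 mod 21
r = 6
m_qubits = 9
M = 2 ** m_qubits             # size of the first (period) register

xs = np.arange(M)
g = np.array([pow(b, int(x), N) for x in xs])   # g(x) = b^x mod N

# One-bit hash h(z) = ((a*z + c) mod p) mod 2 with arbitrary small parameters.
p, a, c = 23, 7, 11
h = ((a * g + c) % p) % 2

# First-register distribution after the final QFT, hashed register traced out:
#   P(k) = sum_y | (1/M) * sum_{x : h(g(x)) = y} exp(2*pi*i*k*x/M) |^2
omega = np.exp(2j * np.pi * np.outer(np.arange(M), xs) / M)
P = np.zeros(M)
for y in (0, 1):
    amps = omega[:, h == y].sum(axis=1) / M
    P += np.abs(amps) ** 2

top = sorted(int(k) for k in np.argsort(P)[-r:])
print("most likely outcomes:", top)
print("multiples of M/r    :", sorted(round(j * M / r) for j in range(r)))
# For these parameters both lists come out as [0, 85, 171, 256, 341, 427]:
# the Fourier peaks survive the compression to a single (hashed) qubit, so
# the period can still be recovered from a few runs.
```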

Hashing reduces the number of qubits needed for the modular-exponentiation register to a constant, so if the period-containing qubits (the first one or two registers) could also be shrunk to a similarly small number, one could presumably simulate the whole circuit classically and bring period finding into BPP, i.e. randomized polynomial time. Using a semi-classical version of Shor's algorithm due to Mosca and Ekert from the 1990s, the number of period qubits can indeed be reduced to a constant, but at the price of introducing intermediate measurements. Since the compression-robustness results rely on the standard QFT-function-QFT circuit, they don't apply directly to the Mosca-Ekert algorithm; to recover them, the universal hash function would need to be homomorphic (so that the hash of a product can be computed from the hashes of its factors) for the qubit savings to carry over. If a universal homomorphic hash function using few enough qubits existed, then integer factorization and the discrete logarithm problem would be in polynomial time.
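To make the homomorphic requirement concrete, here is a small sketch of the property such an f would have to satisfy (my own illustration; the checker function and the toy modulus are hypothetical, not from the paper). Since g(x + y) = g(x) · g(y) mod N, the hash f must allow f(u · v mod N) to be computed from f(u) and f(v) alone. The identity map below passes the check trivially but compresses nothing; a compressing f with both this property and the universal-hash collision bound is exactly what is not known, as the reply below also notes.

```python
import random

def is_multiplicatively_homomorphic(f, combine, N, trials=1000):
    """Check f(u*v mod N) == combine(f(u), f(v)) on random pairs.

    This is the property a hash f would need so that h(x) = f(b^x mod N)
    can be built up from already-hashed pieces, because
    g(x + y) = g(x) * g(y) mod N for g(x) = b^x mod N.
    """
    for _ in range(trials):
        u, v = random.randrange(1, N), random.randrange(1, N)
        if f(u * v % N) != combine(f(u), f(v)):
            return False
    return True

N = 3 * 7 * 11 * 13   # toy modulus

# Stand-in candidate: the identity map. It is trivially homomorphic
# (combine is just multiplication mod N) but it compresses nothing, so it
# saves no qubits. What would be needed is an f mapping into O(1) bits that
# keeps both this property and the universal-hash collision bound.
f = lambda z: z
combine = lambda u, v: u * v % N
print(is_multiplicatively_homomorphic(f, combine, N))   # prints True
```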

Can anyone vouch for the accuracy of this paper?
 
  • #2
I think the paper is accurate, but it's unlikely there exists a workable function f that can be composed with g(x) = b^x mod N to form an h(x) = f(g(x)) that can be computed using only constant workspace.
 

FAQ: Quantum Period is Compression Robust

What is "Quantum Period is Compression Robust"?

"Quantum Period is Compression Robust" refers to a quantum algorithm that can efficiently find the period of a periodic function, even when the function is compressed or has missing data. This algorithm has applications in fields such as cryptography and signal processing.

How does the "Quantum Period is Compression Robust" algorithm work?

The circuit is the standard QFT-function-QFT period-finding circuit, except that a universal hash function is applied to the function's output before it is written into the last register. Although the hashing causes many collisions, the Fourier peaks at multiples of the inverse period survive, so the period can still be recovered with only a constant-factor increase in the number of runs of the circuit.

What makes the "Quantum Period is Compression Robust" algorithm unique?

The result shows that the last register of Shor's algorithm can be compressed to a constant number of qubits (a single qubit suffices) without destroying the period information. This reduces the quantum space needed for period finding while keeping its exponential speedup over the best known classical algorithms.

What are some potential applications of the "Quantum Period is Compression Robust" algorithm?

The main potential application is reducing the number of qubits needed to run Shor's algorithm against public-key cryptosystems such as RSA and schemes based on the discrete logarithm. Period finding also appears in signal processing, where robustness to compression of the function values could be similarly useful.

What are the limitations of the "Quantum Period is Compression Robust" algorithm?

The approach is still limited by the physical constraints of quantum computers, such as the number of qubits and the error rates of operations; the hashing compresses only the last register, so the period-containing registers remain large. As in standard Shor's algorithm, classical post-processing is still needed to extract the period from the measurement outcomes.
