'Pseudocubality' - fast tests for large possible-cubes

  • Thread starter CRGreathouse
In summary, the conversation revolves around finding a fast way to test whether a large number is a cube. The numbers being tested are of the form a^b + c^d, with a and c less than a million and b and d under a hundred. The suggested method involves checking a+c mod 8 and the sum of the powers mod 9, 13, and 19. A cubic reciprocity test is also suggested, but implementing it quickly for large numbers proves difficult. Another suggestion is to use a lookup table to save on modular exponentiation. The conversation also discusses the usefulness of checking modulo primes of the form 3k+1 and the potential of finding conditions that eliminate most false positives.
  • #1
CRGreathouse

I'm looking for a way to quickly show that a huge number is not a cube. The numbers are of the form a^b + c^d where a and c are less than a million and b and d are under a hundred (give or take on both). a and c are also known to be odd, and the exponents are at least 3.

At the moment I'm checking a+c mod 8 (it must be 0) and the sum of the powers, calculated with fast modular exponentiation, mod 9, 13, and 19. (I'm also checking mod 7, but this doesn't seem to help -- any ideas on why that is the case?)
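A minimal Python sketch of these residue tests (the moduli 8, 9, 13, and 19 are the ones above; the helper names are illustrative, not from any particular implementation):

```python
def cube_residues(m):
    """The set of cubic residues mod m; a cube must land in this set."""
    return {pow(x, 3, m) for x in range(m)}

def passes_tests(a, b, c, d, moduli=(9, 13, 19)):
    """Return False if a^b + c^d is provably not a cube."""
    # a and c are odd and b, d >= 3, so a^b + c^d is even;
    # an even cube must be divisible by 8.
    if (pow(a, b, 8) + pow(c, d, 8)) % 8 != 0:
        return False
    for m in moduli:
        if (pow(a, b, m) + pow(c, d, m)) % m not in cube_residues(m):
            return False
    return True  # only a "pseudocube": false positives can survive
```

For example, 3^3 + 5^3 = 152 survives the mod-8 and mod-9 checks but is caught mod 13, since 152 ≡ 9 (mod 13) and 9 is not a cubic residue mod 13.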

What I want are more tests, since far too many non-cubes slip through these tests. Since most of the inputs will be handled by one of these, the test doesn't have to be blazingly fast -- although faster is better, to be sure.

Is there a cubic reciprocity test that can practically handle numbers of this size? Can it be done (like modular exponentiation) without expanding the numbers?
 
  • #3
Is it not enough to simply check modulo more primes of the form 3k+1?

I suspect you could do the computation faster if you didn't check 7, 9, 13, and 19 separately... but instead computed it once modulo 7*9*13*19, and used that to compute the residue modulo your other moduli. (Or even just stored an array indicating whether an element modulo 7*9*13*19 is a cube)
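One way to realize the combined-modulus suggestion as a sketch (the precomputed array and function names are illustrative):

```python
M = 7 * 9 * 13 * 19  # = 15561; the factors are pairwise coprime

# Precompute, once, which residues mod M are cubic residues mod every
# factor. By the CRT, x can be a cube mod M only if it is a cubic
# residue mod each of 7, 9, 13, and 19.
is_cube_residue = [True] * M
for m in (7, 9, 13, 19):
    residues = {pow(x, 3, m) for x in range(m)}
    for r in range(M):
        if r % m not in residues:
            is_cube_residue[r] = False

def maybe_cube(a, b, c, d):
    """One modular exponentiation pair replaces four separate tests."""
    return is_cube_residue[(pow(a, b, M) + pow(c, d, M)) % M]
```

By the CRT the array has 3 * 3 * 5 * 7 = 315 surviving residues out of 15561, so a single lookup already rejects about 98% of uniformly random inputs.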
 
  • #4
To save on modular exponentiation, observe that

(a^b) mod 13 = a^(b mod 12) mod 13

by Fermat's little theorem (provided 13 doesn't divide a). Even better, you could avoid doing any multiplies at all -- just store a 13 by 12 lookup table of the values of a^b mod 13. (Note that, since you have a fixed modulus, there are clever ways to compute the remainder without doing any division)

Even better -- why use a clever way to compute the remainder when you can just use a lookup table? :biggrin: You could store an array of the value of x mod 13 for every x under 2^20, and x mod 12 for x under 2^10. It's unclear to me which would win.
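The 13-by-12 table might look like this sketch (the exponents here are at least 1, matching the problem's b, d >= 3):

```python
# 13 x 12 lookup table: TABLE_13[a][e] == a^e mod 13.
TABLE_13 = [[pow(a, e, 13) for e in range(12)] for a in range(13)]

def pow_mod13(a, b):
    """a^b mod 13 for b >= 1, with no modular exponentiation.

    Fermat: a^12 == 1 (mod 13) when 13 does not divide a, so the
    exponent reduces mod 12. Entry [a % 13][0] holds 1, which is
    correct for a^b whenever b is a positive multiple of 12.
    """
    if a % 13 == 0:
        return 0  # Fermat reduction doesn't apply; the power is 0
    return TABLE_13[a % 13][b % 12]
```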
 
  • #5
Hurkyl said:
Is it not enough to simply check modulo more primes of the form 3k+1?

Not really; the higher the primes get the less useful they are, and the more modulus operations are needed.

Hurkyl said:
I suspect you could do the computation faster if you didn't check 7, 9, 13, and 19 separately... but instead computed it once modulo 7*9*13*19, and used that to compute the residue modulo your other moduli. (Or even just stored an array indicating whether an element modulo 7*9*13*19 is a cube)

I'm already doing this, actually. It seemed like an obvious speedup, so I took it. The array hasn't proved useful yet, since the later tests don't come up much.
 
  • #6
Hurkyl said:

I've read that and a number of other sources on cubic reciprocity (although I'll blush to admit I didn't look it up in my Ireland & Rosen), but I'm actually having trouble figuring out how to implement that test quickly for large numbers.

I'm still working on it.
 
  • #7
CRGreathouse said:
Not really; the higher the primes get the less useful they are, and the more modulus operations are needed.
This doesn't sound right -- a random number modulo a prime p = 3k+1 has precisely a one in three chance of being a cube, so I would expect each additional prime you try to kill off another 66% of the false positives. If either exponent you're using is relatively prime to 3k and the corresponding base is uniformly distributed mod p, then your final number should be uniformly distributed mod p!

So, I'm surprised to hear that larger primes are less useful. Nor can I explain why 7 isn't helping. :frown: Admittedly, I'm more inclined to suspect a bug in the program than number theoretic reasons... I'm going to have to code this up later, so that I can see it for myself!
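The one-in-three claim is easy to check numerically (a small sketch; p = 31 is just an example prime of the form 3k+1):

```python
# For a prime p == 1 (mod 3), cubing is 3-to-1 on the nonzero
# residues, so there are (p - 1)/3 nonzero cubic residues, plus 0.
p = 31
cubes = {pow(x, 3, p) for x in range(p)}
assert len(cubes) == (p - 1) // 3 + 1  # 11 of the 31 residues

# For a prime q == 2 (mod 3), cubing is a bijection, so every residue
# is a cube and the test filters nothing -- which is why only primes
# of the form 3k+1 (and the prime-power modulus 9) are useful here.
q = 5
assert len({pow(x, 3, q) for x in range(q)}) == q
```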
 
  • #8
Hurkyl said:
This doesn't sound right -- a random number modulo a prime p = 3k+1 has precisely a one in three chance of being a cube, so I would expect each additional prime you try to kill off another 66% of the false positives.

Yes, but the larger the primes the more tests are needed to check the residue classes.

What I'd really like is to find some conditions that knock out 'most' in some sense of the false positives, then a test that will remove 'almost all' of the false positives. The whole program is designed to search a very large space for cubes -- although this is of itself just a multi-day preprocessing step for a larger search for powers.
 
  • #9
CRGreathouse said:
Yes, but the larger the primes the more tests are needed to check the residue classes.
I'm not sure what you mean -- as long as you stick to small primes, once you have the residue modulo the prime, it's just a single table lookup to tell if you have a cube or not. It would take a small miracle to pass the first 100 tests, so it doesn't really matter if you take a millisecond to test the next few thousand primes. (Or whatever method you intend to use to decide if it really is a cube)
 

Related to 'Pseudocubality' - fast tests for large possible-cubes

What is 'Pseudocubality'?

'Pseudocubality' refers to a set of fast tests that can be used to quickly and efficiently determine whether a large number could potentially be a perfect cube.

Why is it important to have fast tests for large possible cubes?

One of the main reasons for having fast tests for large possible cubes is to save time and resources. These tests can help scientists and researchers quickly sift through large amounts of data to identify potential perfect cubes, rather than having to manually check each number.

How do these tests work?

These tests use mathematical algorithms and principles to determine whether a given number could be a perfect cube. They involve techniques such as factoring, modular arithmetic, and number patterns to quickly identify potential cubes.

What are the benefits of using 'Pseudocubality'?

The main benefit of using 'Pseudocubality' is the time and resource efficiency it offers. These tests can help scientists and researchers quickly identify potential perfect cubes, which can then be further investigated and analyzed in more detail.

Can these tests be applied to other types of numbers?

Yes, these tests can also be applied to other types of numbers, such as perfect squares or prime numbers. With some modifications, they can be adapted to quickly identify potential numbers in different mathematical patterns or sequences.
