Reducing Number of Multiplications in a Loop

In summary, reducing the number of multiplications in a loop can greatly improve the efficiency and speed of a program. This can be achieved with techniques such as precomputing loop-invariant values, strength reduction, loop unrolling, and loop fusion; some of these reduce the multiplication count directly, while others reduce the surrounding loop overhead. Choosing different data structures and algorithms can also cut the number of multiplications a loop needs. Overall, this is a worthwhile optimization when the multiplications dominate the loop's runtime.
  • #1
rwinston
Hi

Following on from my palindromic number question, if I am calculating a set of palindromic numbers generated as the product of two three-digit numbers, I could generate the entire product set and then test each number for the palindrome property.

The naive approach would be a loop like:

for (i in 999:100)
for (j in 999:100)
p = i * j

However, we do many redundant multiplications this way (every product i * j is also computed as j * i), so the obvious improvement is:

for (i in 999:100)
for (j in i:100)
p = i * j

However, I still get a lot of palindromic numbers that are duplicated. For instance, the palindrome 444444 is generated by:

962 * 462 = 444444
924 * 481 = 444444
858 * 518 = 444444
814 * 546 = 444444
777 * 572 = 444444
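For concreteness, the two loops above can be written as runnable code (Python used here as a stand-in for the pseudocode), counting multiplications to show the saving:

```python
# Runnable comparison of the two loops above, counting multiplications.

def naive_count():
    count = 0
    for i in range(999, 99, -1):
        for j in range(999, 99, -1):
            p = i * j   # every unordered pair is computed twice
            count += 1
    return count

def triangular_count():
    count = 0
    for i in range(999, 99, -1):
        for j in range(i, 99, -1):   # j runs from i down to 100
            p = i * j                # each unordered pair computed once
            count += 1
    return count

print(naive_count())       # 900 * 900 = 810000
print(triangular_count())  # 900 * 901 / 2 = 405450
```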

Is there any way to avoid or predict these multiplications of two 3-digit numbers that will produce duplicate palindromes?
 
  • #2
rwinston said:
Is there any way to avoid or predict these multiplications of two 3-digit numbers that will produce duplicate palindromes?

You asked a very tough question, and I doubt it has a clean answer. It would be much simpler to sort your results automatically and remove the duplicates. I would be interested in determining how many of the palindromes between the smallest and largest in your list are unaccounted for by your program, and how many of those are prime. Each of those questions will have a different answer.
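A minimal sketch of that sort-and-deduplicate idea (in Python, using a set rather than an explicit sort, with a simple string-reversal palindrome test):

```python
def is_palindrome(n):
    s = str(n)
    return s == s[::-1]

# Collect all palindromic products of two 3-digit numbers; the set
# removes duplicates such as 962*462 = 924*481 = 444444 automatically.
palindromes = set()
for i in range(100, 1000):
    for j in range(i, 1000):   # j >= i skips the mirrored (j, i) pairs
        p = i * j
        if is_palindrome(p):
            palindromes.add(p)

print(max(palindromes))   # 906609 = 913 * 993
```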
 
  • #3
Thanks Ramsey. I didn't think there would be a straightforward answer to this one.
 
  • #4
There are two reasonably efficient ways of doing this that immediately spring to mind. You could loop through the three-digit numbers, letting i <= j (405,450 loops), possibly skipping ij = 0 mod 10 (reducing the possibilities slightly at the cost of overhead). Alternately, you could loop through the 5- and 6-digit palindromes (1800 loops), factoring at each step. You have to decide which is faster -- in essence, is one factorization harder or easier than about 225 multiplications?
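A sketch of the second approach (Python; the helper names are my own, not from the thread), enumerating the 5- and 6-digit palindromes directly and testing each for a 3-digit factor pair by trial division:

```python
# Enumerate the 5- and 6-digit palindromes directly (900 of each),
# then factor: for each palindrome p, look for a 3-digit divisor d
# whose cofactor p // d is also 3 digits.

def palindromes_5_and_6():
    for a in range(1, 10):
        for b in range(10):
            for c in range(10):
                yield int(f"{a}{b}{c}{b}{a}")    # 5-digit a b c b a
                yield int(f"{a}{b}{c}{c}{b}{a}")  # 6-digit a b c c b a

def has_3digit_factor_pair(p):
    for d in range(100, 1000):   # trial division, at most 900 steps
        if p % d == 0 and 100 <= p // d <= 999:
            return True
    return False

hits = [p for p in palindromes_5_and_6() if has_3digit_factor_pair(p)]
```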
 
  • #5
If you skip ij = 0 mod 10, then you would miss 012210 = 110 * 111, or are leading zeros prohibited?
 
  • #6

Typically multiples of b aren't considered base-b palindromes. For example, Sloane's http://www.research.att.com/~njas/sequences/A002113 doesn't include 10 = 010 or 100 = 00100. But if you wanted to consider them, you'd of course need to include the multiples of 10.
 

FAQ: Reducing Number of Multiplications in a Loop

How can I reduce the number of multiplications in a loop?

One way to reduce the number of multiplications in a loop is to use precomputed values. This means calculating the values outside of the loop and storing them in an array or variable, so that the loop can simply access and use them instead of performing the multiplication each time.
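A minimal illustration in Python, assuming the multiplied values repeat across iterations:

```python
# Precompute a table of products once; the hot loop then does lookups
# instead of repeating the multiplications.
n = 10
squares = [i * i for i in range(n)]   # n multiplications, done once

total = 0
for _ in range(1000):        # the hot loop reuses the table
    for i in range(n):
        total += squares[i]  # table lookup instead of i * i
```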

What is loop unrolling and how does it reduce multiplications?

Loop unrolling is a technique where the loop is rewritten so that each iteration performs several copies of the loop body. By itself it does not remove multiplications; rather, it reduces loop-control overhead (counter updates and branch tests) and gives the compiler more opportunity to combine or strength-reduce operations across the unrolled bodies.
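As an illustration (Python; the unroll factor of 4 is an arbitrary choice):

```python
# Dot product with the loop unrolled by a factor of 4. The number of
# multiplications is unchanged; what drops is the per-iteration
# loop-control overhead (one counter update and branch test per four
# products instead of per one).
def dot(xs, ys):
    n = len(xs)
    total = 0
    i = 0
    while i + 4 <= n:   # unrolled main loop
        total += (xs[i] * ys[i] + xs[i + 1] * ys[i + 1]
                  + xs[i + 2] * ys[i + 2] + xs[i + 3] * ys[i + 3])
        i += 4
    while i < n:        # remainder loop for the last n % 4 elements
        total += xs[i] * ys[i]
        i += 1
    return total
```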

Can using bitwise operations reduce the number of multiplications in a loop?

Yes, in some cases. For example, instead of multiplying by 2 in each iteration, you can use a bitwise left shift, which is typically cheaper on most hardware. Note that optimizing compilers usually perform this substitution (and related strength reductions) automatically for constant powers of two.
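A small Python sketch of the shift, together with the related strength-reduction trick of replacing a multiplication by the loop index with a running sum:

```python
# A running sum (strength reduction) replaces multiplication by the
# loop index; a left shift replaces multiplication by 2.
k = 7
multiples = []
acc = 0
for i in range(10):
    acc += k              # acc == (i + 1) * k, without multiplying
    multiples.append(acc)

doubled = [v << 1 for v in multiples]   # v << 1 == v * 2 for ints
```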

What is loop fusion and how does it help reduce multiplications?

Loop fusion is a technique where multiple loops that traverse the same data are combined into a single loop. It does not change the number of multiplications by itself, but it reduces loop overhead and improves cache locality; if the fused bodies share common subexpressions, the compiler may then be able to eliminate redundant multiplications.
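A Python sketch of fusing two loops over the same data:

```python
xs = list(range(100))

# Unfused: two separate passes over xs.
squares = [x * x for x in xs]
tripled = [x * 3 for x in xs]

# Fused: one pass computes both results.
squares_f, tripled_f = [], []
for x in xs:
    squares_f.append(x * x)
    tripled_f.append(x * 3)
```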

Is it always necessary to reduce the number of multiplications in a loop?

No, it is not always necessary to reduce the number of multiplications in a loop. It depends on the specific problem and the efficiency of the code. Sometimes, the cost of performing the multiplication is minimal and the benefits of reducing it may not outweigh the effort and complexity of implementing optimization techniques.
