- #1
matineesuxxx
Last week, for my computer science assignment, I had to write a division algorithm using only addition and subtraction. My first attempt was simple, naive repeated subtraction, but I quickly discovered it was nowhere near efficient enough for how I needed to use it, so I developed a quicker one. (I have attached a PDF of the algorithm, and the C code, for anyone who is interested in helping.)
The problem is that I want to find the runtime, but I cannot predict when the division will restart with a new numerator (the case where the doubled divisor overshoots), so I have no idea what the worst case looks like.
Can anybody give me some advice on how to estimate the runtime of such algorithms?
Code:
[edit] I don't know if there is a way to properly post code here, so the whitespace at the beginning of each line of code doesn't show up.
int div_help(int a, int b, int bconst, int quotient, int acc){
    if(a-b < 0){return acc;}           /* remainder is now smaller than the divisor */
    if(b == a){return quotient + acc;} /* a divides exactly */
    /* Double b (and the quotient with it) until doubling would overshoot a,
       then subtract that chunk and restart from the original divisor. */
    return b+b > a ? div_help(a-b, bconst, bconst, 1, acc+quotient)
                   : div_help(a, b+b, bconst, quotient+quotient, acc);
}

int div(int a, int b){
    return div_help(a, b, b, 1, 0);
}