Consider this sequence:
1, 2, 3, 4, 5, ...
We can calculate the n'th term of the sequence with the function t(n) = n. We could define s(n), the sum to n terms, recursively as s(1) = 1 and s(n) = n + s(n-1). This procedure runs in O(n) time, but it isn't efficient because there is an O(1) closed formula: s(n) = n(n+1)/2.
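As a quick sketch (Python used here for illustration; the function names are mine), the recursion and the closed formula agree:

```python
def s_recursive(n):
    # s(1) = 1; s(n) = n + s(n-1): O(n) recursive calls
    if n == 1:
        return 1
    return n + s_recursive(n - 1)

def s_closed(n):
    # O(1) closed formula: n(n+1)/2
    return n * (n + 1) // 2

# the two definitions agree on every n we check
assert all(s_recursive(k) == s_closed(k) for k in range(1, 50))
```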
Now consider this sequence:
1, 3=(1+2), 6=(1+2+3), 10, 15, 21, ...
Here the n'th term of the sequence is given by s(n) = n(n+1)/2, the same function as above. We can define u(n), the sum to n terms of this sequence, recursively as u(1) = 1 and u(n) = u(n-1) + s(n). This procedure has a time bound of O(n).
Can we do better than this? Simple variations keep the time linear. For example, for n > 2 we could use u(n) = u(n-2) + 2.s(n-1) + n. This nearly doubles the speed on average, but the time taken is still proportional to n.
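To make the linear approaches concrete, here is a short Python sketch (function names are mine) of both the plain running sum and the skip-by-two variant:

```python
def s(n):
    # closed form for 1 + 2 + ... + n
    return n * (n + 1) // 2

def u_linear(n):
    # u(1) = 1; u(n) = u(n-1) + s(n): one s() per step, O(n) overall
    total = 0
    for k in range(1, n + 1):
        total += s(k)
    return total

def u_skip2(n):
    # variant for n > 2: u(n) = u(n-2) + 2*s(n-1) + n,
    # using s(n-1) + s(n) = 2*s(n-1) + n; about half as many steps
    if n == 1:
        return 1
    if n == 2:
        return 4
    return u_skip2(n - 2) + 2 * s(n - 1) + n
```

Both still take time proportional to n; the variant only changes the constant factor.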
Is there a method to calculate this sum in time less than proportional to n? As a matter of fact, we can do it in O(log n) time.
s(n) = n(n+1)/2.
Where n = 1: u(1) = 1
Where n is even: u(n) = 2.u(n/2) + n.s(n/2)
Where n is odd: u(n) = 2.u((n-1)/2) + ((n+1)/2) * [s((n-1)/2) + s((n+1)/2)]
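A minimal Python sketch of this halving recursion (function names are mine):

```python
def s(n):
    # closed form for 1 + 2 + ... + n
    return n * (n + 1) // 2

def u(n):
    # u(n) = s(1) + s(2) + ... + s(n) in O(log n) time:
    # n roughly halves on every recursive call
    if n == 1:
        return 1
    if n % 2 == 0:
        h = n // 2
        return 2 * u(h) + n * s(h)
    h = (n - 1) // 2          # so (n + 1)/2 == h + 1
    return 2 * u(h) + (h + 1) * (s(h) + s(h + 1))
```

Checking against the direct linear sum confirms, for example, u(4) = 20.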
In case you are wondering how I came up with these formulas, notice that (1) below corresponds to u(4). It can be split up into (2), which corresponds to 'u(2) + 2.s(2) + 3 + (3+4)'. In (3) you can see that '3 + (3+4)' can itself be split into u(2) + 2.s(2).
1: 1 + (1+2) + (1+2+3) + (1+2+3+4)
2: 1 + (1+2)
+ (1+2) + (1+2)
+ 3 + (3+4)
3: 3 + (3+4)
= 1 + (1+2)
+ 2 + (2+2)
Putting it together:
u(4) = u(2) + 2.s(2) + u(2) + 2.s(2)
= 2.u(2) + 2.(2.s(2))
This corresponds to:
u(n) = 2.u(n/2) + (n/2)(2.s(n/2))
= 2.u(n/2) + n.s(n/2)
I derived the formula for when n is odd the same way. I will now generalise this method to work for all arithmetic series:
t(n) = a+(n-1)d
s(n) = (n/2)[2a+(n-1)d]
f(n) = n(n+1)/2
Where n = 1: g(1) = 1
Where n is even: g(n) = 2.g(n/2) + n.f(n/2)
Where n is odd: g(n) = 2.g((n-1)/2) + ((n+1)/2) * [f((n-1)/2) + f((n+1)/2)]
s2(n) = a.f(n) + d.g(n-1), taking g(0) = 0
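Here is a Python sketch of the generalised version (names are mine; I take g(0) = 0 as the natural base case, since g(n-1) is needed when n = 1 and an empty sum is zero):

```python
def f(n):
    # f(n) = n(n+1)/2
    return n * (n + 1) // 2

def g(n):
    # same halving recursion as for the 1, 3, 6, 10, ... case; O(log n)
    if n == 0:
        return 0              # assumed base case: empty sum
    if n == 1:
        return 1
    if n % 2 == 0:
        h = n // 2
        return 2 * g(h) + n * f(h)
    h = (n - 1) // 2
    return 2 * g(h) + (h + 1) * (f(h) + f(h + 1))

def s2(n, a, d):
    # second sum of the arithmetic series t(k) = a + (k-1)d
    return a * f(n) + d * g(n - 1)

def s2_direct(n, a, d):
    # sanity check: the same quantity as a direct double sum, O(n^2)
    return sum(sum(a + (j - 1) * d for j in range(1, k + 1))
               for k in range(1, n + 1))
```

For instance, with a = 2 and d = 3, s2(5, 2, 3) and s2_direct(5, 2, 3) agree.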
Let's call this the 'second sum'. I wonder whether it is possible to find a non-recursive formula for the second sum. I don't think the 'second sum' as I have defined it has any application; I can't envision anywhere it might be useful. Can it be applied somewhere? What about the 3rd or 4th sum?
Mathematics is only useful where it can be applied, or so I think. Does mathematics attain value only when you can apply it? That question is more philosophical than mathematical. Is it only worthwhile to find formulas for abstract things like the 'second sum' when we see a need for them?
The point of this post is that I don't see the value of such abstract things unless they can be applied. Is this fair?