Big O question: f(n) = n^(1.01) + n(log(n))^5, g(n) = n^(1.01)

  • Thread starter s3a
In summary: no, there is no exponent w > 0 small enough that x^w is eventually bounded above by log_2(x) as x --> inf; any positive power of x eventually outgrows log_2(x).
  • #1
s3a

Homework Statement


Given that:

f(n) = n^(1.01) + n(log(n))^5
g(n) = n^(1.01)

Show that:
a) f(n) ∈ O(g(n))
b) f(n) ∈ Ω(g(n))
c) f(n) ∈ Θ(g(n))


Homework Equations


Any correct method is acceptable. It doesn't have to use the definition of big O notation directly; limits or any other valid approach is fine.


The Attempt at a Solution


I can see why (b) is true. Also, since I know the answer to (a) is true because the "solution" my teacher gave states it, I can automatically conclude that (c) is true, since if both (a) and (b) hold, then (c) must hold as well.

I tried plugging in n = 1,000,000, and n(log(n))^5 comes out larger than n^(1.01), which makes intuitive sense to me but seems to contradict (a).

Any help in showing that a is in fact true would be greatly appreciated!
Thanks in advance!
 
  • #2
Try taking the limit of the ratio [tex]\frac{f(n)}{g(n)}[/tex] as n goes to infinity. If the limit converges, you can use its value to get a relationship between the two functions.
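(Worked out as a sketch, using the fact, argued later in this thread, that any fixed power of log(n) grows more slowly than any positive power of n:
[tex]\frac{f(n)}{g(n)}=\frac{n^{1.01}+n(\log n)^{5}}{n^{1.01}}=1+\frac{(\log n)^{5}}{n^{0.01}}\;\longrightarrow\;1\quad\text{as }n\to\infty,[/tex]
and a finite, nonzero limit of the ratio gives f(n) ∈ Θ(g(n)), which settles (a), (b), and (c) at once.)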
 
  • #3
Actually, I already knew that; I had just missed a logical step in the limit procedure, which I now get. Thankfully the problem only asks for the answer, because I can foresee the outcome of the many applications of l'Hopital's rule without actually having to differentiate that many times and hurt my poor brain and hand! :D

Now I get it algebraically thanks to the limit, but conceptually I still can't see it! The n[log_2(n)]^5 term seems to win out against n^(1.01), not to mention that in the numerator it also gets an extra n^(1.01) to "help it win the fight". Could you please help me understand this fundamentally, without directly relying on a mathematical theorem?
 
  • #4
If f(n) ∈ Ω(g(n)), then you already know that the log term does not in fact win against the power term in growth. It may feel counterintuitive because g(n) has the power term as well, but if the log term is insignificant in comparison, then you're really only comparing the powers.
 
  • #5
1) f(n) ∈ Ω(g(n)) is not given in the question but rather in the solution that the teacher gave. In other words, you're not supposed to know that when you attempt the question.


2) I get what you're saying: if the log term were insignificant, I'd only have to compare the powers. But I can't accept that it is insignificant. Let n = 10^9:

http://www.wolframalpha.com/input/?i=(10^9*(log_2(10^9))^5)/(10^9)^1.01

The result from Wolfram Alpha is clearly greater than 1 (implying that the log term wins).


What am I doing wrong?
 
  • #6
My mistake, I meant to say f(n) ∈ O(g(n)) in my previous post.

Your n isn't large enough; try comparing 10^100 vs. 10^1000.
 
  • #7
I think I typed the question in a confusing way initially, but the point is that none of the three things to be shown is given in the question (I have the answers my teacher gave; that's why I know them all).

Could you comment on the Wolfram Alpha part of my previous post specifically please?
 
  • #8
s3a said:
I think I typed the question in a confusing way initially, but the point is that none of the three things to be shown is given in the question (I have the answers my teacher gave; that's why I know them all).

Could you comment on the Wolfram Alpha part of my previous post specifically please?

You didn't choose n large enough. The power n^(0.01) does win against log_2(n), but it takes much longer. If you solve for their intersections, the larger intersection, beyond which the power term stays ahead of log_2(n), is around 7*10^(299):

http://www.wolframalpha.com/input/?i=n^(.01)=log_2+n

Be careful when choosing values to check growth; even though 10^9 seems like a large number for everyday use, it's still a small finite number compared to infinity.
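(As a sanity check, here is a small Python sketch, not from the thread, that evaluates the ratio n*(log_2 n)^5 / n^1.01 in log10 form so that astronomically large n don't overflow; on this estimate the ratio only drops below 1 somewhere around n ≈ 10^1900, which is why n = 10^9, and even 10^999, still make the log term look like it's winning.)

import math

# Sketch: at n = 10**k the ratio n*(log2 n)**5 / n**1.01 equals (log2 n)**5 / n**0.01,
# and taking log10 keeps the numbers manageable:
#   log10(ratio) = 5*log10(k*log2(10)) - 0.01*k
def log10_ratio(k):
    return 5 * math.log10(k * math.log2(10)) - 0.01 * k

for k in (9, 100, 999, 1900, 9999):
    r = log10_ratio(k)
    status = "log term still ahead" if r > 0 else "power term ahead"
    print(f"n = 10^{k}: log10(ratio) = {r:8.2f}  ({status})")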
 
  • #9
My god!

I had to use 10^9999! Even 10^999 wasn't enough to "prove" the point:
http://www.wolframalpha.com/input/?i=(10^9999*(log_2(10^9999))^5)/(10^9999)^1.01


Okay so, basically, any positive power of n (no matter how small the exponent) always eventually wins against a logarithm raised to any power (no matter how large that power is)?

(For the question I just asked, assume the exponents are positive constants and that the only variable is n, appearing both as the base of the power and as the argument of the logarithm.)


Is there an exponent w small enough that x^w is eventually bounded above by log_2(x) as x --> inf? If so, what is it? (If the answer to my question above is yes, then this second question, technically two questions close together, can automatically be ignored.)
 
  • #10
You can try to prove the first claim with the same method as before:
[tex]\lim_{n\to\infty}\frac{\log(n)}{n^w}[/tex] for some w>0. The base of the log doesn't matter, so working with the natural log will make it easier.
 
  • #11
Is what I just did (in the attachment) nonsense? If not, what do I do next? If it is, then what's the first mistake I made and how can I correct it?
 

Attachments

  • MyWork.jpg
  • #12
You did the limit incorrectly; here's a proof by contradiction of a similar claim to help you get started:

If we start by assuming (for contradiction) that [tex]n^w\in O(\ln(n))[/tex] for some w>0, then there exist c,N>0 such that [tex]c\ln(n)\geq n^w[/tex] for all n>N, which means [tex]\frac{\ln(n)}{n^w}\geq \frac{1}{c}>0[/tex] for all n>N. However, by l'Hopital's rule,
[tex]\lim_{n\to\infty}\frac{\ln(n)}{n^w}=\lim_{n\to\infty}\frac{\frac{1}{n}}{wn^{w-1}}=\lim_{n\to\infty}\frac{1}{wn^w}=0,[/tex]
which contradicts that lower bound, so no such c and N exist and [tex]n^w\notin O(\ln(n))[/tex] for any w>0.
 
  • #13
OK, so what you just said means that w can be a super small (positive, non-zero) number and the monomial would still win out against the logarithm, no matter which base the log has or what power it's raised to, like I said earlier, right?

Edit: I realize it seems like I'm repeating what you implied, but I need explicit confirmation just to be sure, because I don't want to assume the above incorrectly.
 
  • #14
There's a subtle difference here. What I proved was that no positive power of n is in O(log n), i.e., that log n does not provide an asymptotic upper bound for power functions. That does not immediately imply the converse, that any power function gives an upper bound for log(n).
 
  • #15
Okay, and saying that n^w ∈ O(ln(n)) is false means that n^w <= c * ln(n) is false, which in turn means that n^w > c * ln(n) is true, which indirectly means (via the proof by contradiction) that w can be a super small (positive, non-zero) number and the monomial still wins out against the logarithm no matter which base it has or what power it's raised to, right?
 
  • #16
That's not necessarily true either. The statement n^w ∈ O(ln(n)) is equivalent to "∃ c,N > 0 such that ∀ n > N, n^w <= c*ln(n)". Since this is false, we can only assume its negation, which is "∀ c,N > 0, ∃ n > N such that n^w > c*ln(n)". This is not the same as saying that n^w always outgrows ln(n); it only says that c*ln(n) cannot eventually stay above n^w. You'll have to do a separate proof (in the same pattern) of your specific claim.
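(Stated symbolically, in my phrasing rather than the thread's, that separate claim would be
[tex]\forall\, w>0:\quad \lim_{n\to\infty}\frac{\ln(n)}{n^{w}}=0,[/tex]
i.e. ln(n) ∈ o(n^w), the "any positive power eventually beats the log" statement, which is strictly stronger than merely negating n^w ∈ O(ln(n)).)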
 
  • #17
How about this proof (that I just attached)? Is it good?

Edit: I forgot to mention that w is a real number, so imagine that I stated that in the proof.
 

Attachments

  • MyWork.jpg
  • #18
That's the gist of it. If you really wanted to be rigorous, l'Hopital's rule can't be applied to functions of integers (it's a calculus result), so define functions [tex]f,g:\mathbb{R}\to\mathbb{R}\quad\text{as}\quad f(x)=x^w,\ g(x)=\ln(x)[/tex] and then apply l'Hopital's rule to those. Since the integers are a subset of the reals, the result holds for the integer sequence whenever it holds in the continuous analog. You also don't want to assume the theorem you're proving at the outset, since that ends up as circular logic.
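(Spelled out as a sketch of that route: for w > 0,
[tex]\lim_{x\to\infty}\frac{\ln(x)}{x^{w}}=\lim_{x\to\infty}\frac{1/x}{w\,x^{w-1}}=\lim_{x\to\infty}\frac{1}{w\,x^{w}}=0[/tex]
by l'Hopital's rule for real x, and since a limit of a real function at infinity forces the same limit along the integer sequence n = 1, 2, 3, ..., the integer version follows.)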
 
  • #19
Thanks for adding that, since I care about rigour. I understand the integer/real point you made, but as for the circular logic, I don't see how what I did is circular reasoning, because I started with an inequality that did not isolate the constant and then: 1) took the limit of both sides, 2) isolated the constant, 3) applied l'Hopital's rule. So what, specifically, is circular?
 
  • #20
The circular reasoning lies in the fact that you assumed your proposition is true before proceeding to prove it is true.
 
  • #21
Sorry, double posted.
 
  • #22
So which part of my work should I change mechanically though?

Is it the "Let's assume [. . .]" part? If so, should I change that to "If [. . .]" instead?
 
  • #23
I know it seems like a redundant question, but I really need to know the exact details so I can fully comprehend the solution to the problem, so please tell me.
 
  • #24
Oh, wait! I think the wording was just confusing. When I said "Let's assume [. . .]", I was saying to assume that the big omega statement is equivalent to the inequality, not that the big omega statement is true.

Having said that, is my statement fully correct?
 

FAQ: Big O question: f(n) = n^(1.01) + n(log(n))^5, g(n) = n^(1.01)

1. What is the time complexity of f(n) and g(n)?

Both f(n) and g(n) are O(n^(1.01)) (in fact Θ(n^(1.01))), since the n^(1.01) term dominates the other term and therefore determines the overall growth rate.

2. Are f(n) and g(n) considered to have the same time complexity?

Yes. Since both functions are dominated by the same n^(1.01) term, they have the same asymptotic growth rate; in particular, f(n) ∈ Θ(g(n)).

3. How does the logarithmic term in f(n) affect its time complexity?

It doesn't change the asymptotic order. Even raised to the fifth power, log(n) grows more slowly than n^(0.01), so the term n(log(n))^5 is eventually dominated by n^(1.01).

4. How does the exponent 1.01 in both functions affect their growth rate?

The exponent 1.01 sets the polynomial growth rate n^(1.01). Because it is strictly greater than 1, the extra factor of n^(0.01) is what eventually overtakes (log(n))^5, even though it does so only for extremely large n.

5. Can these functions be optimized to have a lower time complexity?

Not as posed: f(n) and g(n) are given functions, not algorithms, so there is nothing to optimize. The comparison only shows that they share the same asymptotic growth rate, Θ(n^(1.01)); in an algorithmic setting, achieving lower complexity would require a different algorithm, not a rewriting of these expressions.
