Prove that log(n!) ∈ Θ( n log(n) )

  • Thread starter s3a
In summary, the goal is to replace each term under a curly brace with a smaller one so that the resulting sum is still less than or equal to the original log(n!). For n >= 4, 1/2 log(n) >= log(2), so the bound log(n!) >= 1/4 n log(n) follows; the last line of the posted solution should read n log(n) <= 4 log(n!).
  • #1
s3a

Homework Statement


The question and solution are in the attachment.


Homework Equations


Big O, Ω, Θ definitions. Also, I think logarithmic identities.


The Attempt at a Solution


It's the part shown in red that I'm stuck on. I tried expanding log(n!) as log(n) + log(n-1) + ... + log(1), but I can't see how to get log(n) + log(n-1) + log(n-2) + ... + log(n/2 + 1) + log(n/2) + ... + log(2) + log(1) = n/2 * log(n/2) + n/2 * log(1) out of that.

Any help getting past this step would be greatly appreciated!
Thanks in advance!
 

Attachments

  • P5.jpg (46 KB)
  • #2
s3a said:
I can't see how to get log(n) + log(n-1) + log(n-2) + ... + log(n/2 + 1) + log(n/2) + ... + log(2) + log(1) = n/2 * log(n/2) + n/2 * log(1) out of that.
The book isn't stating that those are equal; it's stating that the left-hand side is greater than or equal to the right-hand side:

log(n) + log(n-1) + log(n-2) + ... + log(n/2 + 1) + log(n/2) + ... + log(2) + log(1) >= n/2 * log(n/2) + n/2 * log(1)
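Written out term by term (a worked restatement of that inequality, assuming n is even so the sum splits into two blocks of n/2 terms each):

$$\log(n!) \;=\; \underbrace{\log(n) + \cdots + \log\!\left(\tfrac{n}{2}+1\right)}_{n/2\ \text{terms, each}\ \ge\ \log(n/2)} \;+\; \underbrace{\log\!\left(\tfrac{n}{2}\right) + \cdots + \log(1)}_{n/2\ \text{terms, each}\ \ge\ \log(1)} \;\ge\; \frac{n}{2}\log\frac{n}{2} + \frac{n}{2}\log 1.$$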
 
  • #3
Thanks for clarifying that it's an inequality, but I'm still confused about the reasoning behind all the n/2 terms, as well as why those curly braces are chosen in their entirety (that seems to imply the sum is exactly equal to the thing under each respective curly brace).

If I am being unclear, please tell me.
 
  • #4
s3a said:
all the n/2 terms
It's because the list of n terms from 1 to n was split into two lists: {1 to n/2} and {n/2 + 1 to n}.

s3a said:
seems to imply that it's exactly equal to the thing under each respective curly brace.
Not equal but greater than or equal. For example {log(1) + log(2) + ... + log(n/2)} is greater than {log(1) + log(1) + ... + log(1)}.
 
  • #5
Okay, I now see that the goal was to choose the smallest (or even just a smaller) term for each entry in each curly brace, so that the resulting sum is still less than or equal to the initial log(n!).

I am now stuck at the step:
n/2 [log(n) - 1] >= n/2 [log(n) - 1/2 * log(n)]

How do I justify that step?
 
  • #6
It's ok up to this point:

n/2 log(n/2) + n/2 log(1), since log(1) == 0, this becomes
n/2 log(n/2) + 0 == n/2 log(n/2)

n/2 log(n/2) ==
n/2 (log(n) - log(2))

Then there's an issue:

n/2 (log(n) - log(2)) ?= n/2 (log(n) - 1)

This means log(2) == 1, which is only true if this is log base 2 (log2).

So ignoring this substitution and continuing

n/2 (log(n) - log(2))

note that 1/2 log(4) == log(2), { since 1/2 log(4) == log(4^(1/2)) == log(sqrt(4)) == log(2) }

if n >= 4, then 1/2 log(n) >= log(2) and

n/2 (log(n) - log(2)) >= n/2(log(n) - 1/2 log(n))

since log(n) - 1/2 log(n) = 1/2 log(n),
n/2(log(n) - 1/2 log(n)) = n/2(1/2 log(n)) = 1/4 n log n
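A quick numerical sanity check of that final bound (a hypothetical script, not part of the thread; it uses natural logs, but since log(2) <= 1/2 log(n) for n >= 4 holds in any base, the base only rescales both sides of the inequality):

import math

# Verify (1/4) * n * log(n) <= log(n!) <= n * log(n) for n >= 4.
# math.lgamma(n + 1) returns log(n!) without computing the huge factorial.
for n in range(4, 2001):
    log_factorial = math.lgamma(n + 1)
    assert 0.25 * n * math.log(n) <= log_factorial <= n * math.log(n), n
print("1/4 * n * log(n) <= log(n!) <= n * log(n) holds for n = 4..2000")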
 
  • #7
Basically, the question asks you to show that log(n!)/(n log(n)) stays between two positive constants for large n.
For the upper bound, n! = n*(n-1)*(n-2)*...*1 <= n*n*...*n = n^n, so log(n!) <= n log(n); for large n the biggest factors dominate the product, which is what the n/2 argument above makes precise for the lower bound.
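For what it's worth, a small script (hypothetical, natural logs) shows the ratio log(n!) / (n log(n)) creeping toward 1 as n grows:

import math

# Print log(n!) / (n * log(n)) for increasing n.
for n in (10, 100, 1000, 10**6):
    print(n, round(math.lgamma(n + 1) / (n * math.log(n)), 4))
# prints values around 0.66, 0.79, 0.86, 0.93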
 
  • #8
Thanks for the answers.

rcgldr, the whole 1/2 log(4) == log(2) thing was just an attempt to match my teacher's solution, right?

For example, if I replaced log(2) with 1/4 * log(16) instead of replacing it with 1/2 * log(4), and then replaced log(16) with log(n) as follows:
n/2 * [log(n) - 1/4 * log(n)]
= 4n/8 log(n) - n/8 log(n)
= 3/8 * n * log(n)

it would be a correct yet slightly different approach to what the teacher did, right?
 
  • #9
s3a said:
3/8 * n * log(n) ...
it would be a correct yet slightly different approach to what the teacher did, right?
Yes, but in this case only if n >= 16.
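And a quick check of this 3/8 variant (again a hypothetical script, natural logs), confirming it holds from n = 16 on, which is where the derivation's step log(16) <= log(n) is valid:

import math

# Verify (3/8) * n * log(n) <= log(n!) for n >= 16.
for n in range(16, 2001):
    assert 0.375 * n * math.log(n) <= math.lgamma(n + 1), n
print("3/8 * n * log(n) <= log(n!) holds for n = 16..2000")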
 
  • #10
I have another mini-question: Is the ending of the solution provided a typo?

More specifically, should n log(n) <= 4 n log(n) be n log(n) <= 4 log(n!) instead?
 
  • #11
s3a said:
Should n log(n) <= 4 n log(n) be n log(n) <= 4 log(n!) instead?
That's what I think it should be.
 

FAQ: Prove that log(n!) ∈ Θ( n log(n) )

What is the definition of log(n!) ∈ Θ( n log(n) )?

The notation log(n!) ∈ Θ( n log(n) ) means that log(n!) grows at the same rate as n log(n), up to constant factors. In other words, for all sufficiently large n, log(n!) is bounded above and below by constant multiples of n log(n).

How do you prove that log(n!) ∈ Θ( n log(n) )?

To prove that log(n!) ∈ Θ( n log(n) ), we need to show that log(n!) is both O(n log(n)) and Ω(n log(n)). This is done by finding positive constants c1 and c2 and a threshold n0 such that for all n ≥ n0, c1·n·log(n) ≤ log(n!) ≤ c2·n·log(n).
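For the specific bound proved in this thread, one concrete choice is c1 = 1/4, c2 = 1, and n0 = 4 (the upper bound holds because each of the n terms in log(n) + log(n-1) + ... + log(1) is at most log(n)):

$$\frac{1}{4}\,n\log n \;\le\; \log(n!) \;\le\; n\log n \qquad \text{for all } n \ge 4.$$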

What does log(n!) ∈ Θ( n log(n) ) say about time complexity?

If an algorithm's running time is proportional to log(n!), its time complexity is Θ(n log(n)): up to constant factors it costs the same as n log(n), which is generally considered efficient.

Can you provide an example of a function that satisfies this bound?

log(n!) itself is one: for n ≥ 4, (1/4)·n·log(n) ≤ log(n!) ≤ n·log(n). Another function in Θ( n log(n) ) is f(n) = n·log(n) + 2, which differs from n·log(n) only by an additive constant.

What is the significance of proving log(n!) ∈ Θ( n log(n) )?

Proving that log(n!) ∈ Θ( n log(n) ) pins down exactly how fast log(n!) grows, which lets us compare it with other functions and reason about the cost of algorithms whose analysis involves log(n!).
