Proving n log n is Big-Oh of log(n!)

In summary, Big-Oh notation is used in computer science to analyze the efficiency of algorithms by describing the limiting behavior of a function as the input size approaches infinity. To prove that n log n is Big-Oh of log(n!), a positive constant c must be found such that n log n ≤ c log(n!) for all n greater than some threshold. Both n log n and log(n!) grow at the rate Θ(n log n), and establishing their relationship matters for understanding and comparing algorithm efficiency; in particular, it underlies the Ω(n log n) lower bound for comparison sorting. The two functions are not equal for n ≥ 2: log(n!) < n log n, and by Stirling's approximation the gap between them grows roughly linearly in n, even though both have the same asymptotic order.
  • #1
Kartik Yadav
Moved from a non-homework section, so missing the template
To prove that n log n is Big-Oh of log(n!), I did:

n log n ≤ C log(n!)
n log n / log(n!) ≤ C

Let k = 1.
For n > k, say n = 2:
2 log 2 / log 2 ≤ C
2 ≤ C
So C ∈ [2, ∞).
Taking C = 2 and k = 1, can we say n log n ≤ 2 log(n!), and hence that n log n is Big-Oh of log(n!)?
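For reference, a quick numeric check of the claimed bound with C = 2, k = 1 (this samples finitely many n, so it is a sanity check, not a proof — the inequality still has to be established for all n):

```python
import math

# Sanity-check n*log(n) <= C*log(n!) for sampled n > k.
# math.lgamma(n + 1) returns log(n!) without computing n! itself.
C, k = 2, 1
for n in range(k + 1, 10**6, 997):
    lhs = n * math.log(n)
    rhs = C * math.lgamma(n + 1)
    assert lhs <= rhs, f"inequality fails at n = {n}"
print("n log n <= 2 log(n!) held for every sampled n")
```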
 
  • #3
Hi,
You can use Stirling's approximation as follows:
log(n!) ≅ n log(n) − n ∈ O(n log(n)) .
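Written out, the Stirling route looks like this (a standard derivation, filling in the step the reply gestures at):

```latex
\log(n!) \;=\; \sum_{i=1}^{n}\log i \;=\; n\log n - n + O(\log n),
\qquad\text{so}\qquad
\frac{n\log n}{\log(n!)} \xrightarrow[n\to\infty]{} 1 ,
```

hence for any constant c > 1 there is an n₀ with n log n ≤ c log(n!) for all n ≥ n₀, which is exactly the Big-Oh statement.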
 

FAQ: Proving n log n is Big-Oh of log(n!)

What is the definition of Big-Oh notation?

Big-Oh notation is a mathematical notation used to describe the limiting behavior of a function when the input size grows towards infinity. It is commonly used in computer science to analyze the efficiency of algorithms.
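In symbols, the standard definition is:

```latex
f(n) \in O\bigl(g(n)\bigr)
\iff
\exists\, c > 0,\ \exists\, n_0 \ge 1 :\;
f(n) \le c\, g(n) \quad \text{for all } n \ge n_0 .
```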

How do you prove that n log n is Big-Oh of log(n!)?

To prove that n log n is Big-Oh of log(n!), we need to show that there exists a positive constant c such that n log n ≤ c log(n!) for all values of n greater than some minimum value. This can be done by using the properties of logarithms and the definition of factorial.
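One concrete way to carry this out (a standard pairing trick, one of several possible routes): write (n!)² as a product of pairs and bound each pair below by n.

```latex
(n!)^2 \;=\; \prod_{k=1}^{n} k\,(n+1-k) \;\ge\; \prod_{k=1}^{n} n \;=\; n^n
\quad\Longrightarrow\quad
2\log(n!) \;\ge\; n\log n \quad\text{for all } n \ge 1 ,
```

so c = 2 works with no restriction beyond n ≥ 1. (Each pair satisfies k(n+1−k) ≥ n because the quadratic k(n+1−k) takes its minimum over 1 ≤ k ≤ n at the endpoints, where it equals n.)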

What is the time complexity of n log n and log(n!)?

Viewed as functions of n, both grow at the same asymptotic rate, Θ(n log n). For n log n this is immediate; for log(n!) it follows from writing log(n!) = log 1 + log 2 + ⋯ + log n, a sum of n terms each at most log n, which is bounded above by n log n and below by (n/2) log(n/2). In Big-Oh terms, each function is Big-Oh of the other.
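A short script makes the common growth rate visible numerically (illustrative only; math.lgamma(n + 1) computes log(n!)):

```python
import math

# The ratio (n log n) / log(n!) approaches 1 from above,
# showing the two functions share the growth rate Theta(n log n).
for n in (10, 100, 10_000, 1_000_000):
    print(n, n * math.log(n) / math.lgamma(n + 1))
```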

Why is it important to prove the relationship between n log n and log(n!)?

Proving the relationship between n log n and log(n!) helps us understand the efficiency of algorithms whose running time is n log n, and it lets us compare algorithms by their asymptotic growth. The classic application is the comparison-sorting lower bound: any comparison sort needs at least log₂(n!) comparisons in the worst case, and knowing that log(n!) = Θ(n log n) is what turns this into the familiar Ω(n log n) bound.

Can n log n be equal to log(n!)?

For n ≥ 2 they are never equal. Since log(n!) = log 1 + log 2 + ⋯ + log n, every term is at most log n and the first term is 0, so log(n!) < n log n. Moreover, the difference grows without bound as n increases, even though the two functions have the same asymptotic order.
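The size of that gap follows directly from Stirling's approximation:

```latex
n\log n \;-\; \log(n!) \;=\; n + O(\log n) \;\longrightarrow\; \infty .
```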
