Determining the growth of two functions using Big-Oh definition

In summary, the Big-Oh definition is used to compare the growth rates of two functions by examining their behavior as the input grows toward infinity. If one function is eventually bounded above by a constant multiple of the other, it is said to be dominated by that function, and this is expressed with the Big-Oh notation. This gives a simplified way to analyze the efficiency of algorithms and to predict their performance as the input size increases.
  • #1
Haku
Homework Statement
I want to find the relationship between the growth of two functions, and define which grows faster when n gets large. g(n) = 6^n/n^5 , h(n) = (ln n)^84.
Relevant Equations
|g(n)| < C*h(n) for some C > 0 and all n > n0
My attempt involved using big-Oh notation; I think this should work, but I am not sure how to go about it. The two functions are g(n) = 6^n/n^5 and h(n) = (ln n)^84.
I thought that I could use the inequality 6^n < ln(n)^84 together with 6^n/|n^5| = |g(n)| < 6^n and chain those inequalities together.
But then would I choose C = 1? And by definition the bound has to hold for all n > n0 for some n0, so how would I pick an n0 such that this holds?
Am I even on the right track here?
Thanks.
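Before hunting for ##C## and ##n_0##, it can help to simply tabulate both functions and see which one is actually ahead. A minimal Python sketch (an aid to intuition, not part of any formal proof) working in log space, since 6**n overflows a float long before the trend is visible:
Code:
import math

def log_g(n):
    # log g(n) = log(6^n / n^5) = n*log(6) - 5*log(n)
    return n * math.log(6) - 5 * math.log(n)

def log_h(n):
    # log h(n) = log((ln n)^84) = 84*log(ln(n))
    return 84 * math.log(math.log(n))

for n in (10, 100, 1_000, 10_000):
    print(f"n={n:>6}  log g(n) = {log_g(n):12.1f}  log h(n) = {log_h(n):8.1f}")

# h(n) is ahead for small n, but g(n) overtakes it before n = 100 and
# then runs away -- so 6^n < (ln n)^84 cannot hold for large n.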
 
  • #2
If we take the logarithm, then ##\log g(n)=n\log 6 - 5\log (n),## which shows that ##\log g(n) \sim n.## The same done for ##h(n)## yields ##\log h(n)=84\log(\log(n)),## so ##\log h(n) \sim \log\log(n).##

But ##n## grows much faster than ##\log\log(n),## so how can
Haku said:
I thought that I could use the inequality 6^n < ln(n)^84 ...
hold? We have ##g(n)=O(6^n)=O(e^{n\log 6})## and ##h(n)=O((\log n)^{84})##. How do you want to compare them otherwise? As ##\dfrac{g(n)}{h(n)},## or by writing ##g## as a function of ##h(n)##?
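A quick numeric check of these two asymptotics (a sketch, assuming nothing beyond the formulas above): ##\log g(n)/n## should settle near ##\log 6 \approx 1.79,## while ##\log\log(n)## barely moves.
Code:
import math

for k in (2, 4, 8, 16):
    n = 10**k
    log_g = n * math.log(6) - 5 * math.log(n)   # log g(n)
    print(f"n = 1e{k:<2}  log g(n)/n = {log_g / n:.4f}   log log n = {math.log(math.log(n)):.3f}")

# log g(n)/n climbs toward log 6 ~ 1.7918, while log log n is still
# only ~3.6 even at n = 1e16.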
 
  • #3
I am a little confused by your reply. Was the first part meant to show that my claim that 6^n < ln(n)^84 is false?
As for the comparison of g and h, to show that f does not grow at a faster rate than g when n gets large, we would need to show the following:
[Attached screenshot: the formal definition, ##f = O(g)## iff ##|f(n)|\leq C\cdot |g(n)|## for all ##n>n_0## and some ##C>0##.]

Do you think using this definition with the functions g and h would be possible? I need to establish one of the two cases: either g grows faster than h or h grows faster than g when n gets large. I can't see how to tell intuitively which case is more likely to hold, so I arbitrarily chose the case where g grows faster than h.
Does any of my working in the original post look useful for concluding the growth comparison via this definition?
Thanks.
 
  • #4
Haku said:
I am a little confused by your reply. Was the first part meant to show that my claim that 6^n < ln(n)^84 is false?
As for the comparison of g and h, to show that f does not grow at a faster rate than g when n gets large, we would need to show the following: View attachment 281215
Do you think using this definition with the functions g and h would be possible? I need to establish one of the two cases: either g grows faster than h or h grows faster than g when n gets large. I can't see how to tell intuitively which case is more likely to hold, so I arbitrarily chose the case where g grows faster than h.
Does any of my working in the original post look useful for concluding the growth comparison via this definition?
Thanks.
Yes, ##6^n## grows a lot faster than ##(\log n)^{84}##. There is no constant ##C## such that ##|g(n)|<C\cdot h(n)## for large ##n.## If anything, it is the other way around, because ##6^n>n^c>(\log n)^c## for large ##n.##
Have a look at the Wikipedia article about the usage of the Landau symbols:
https://en.wikipedia.org/wiki/Big_O_notation

Your example is: ##g(n)=O(6^n)## and ##h(n)=O((\log n)^c)## with ##c=84##. If we write ##h(n)=C\cdot (\log n)^c,## then ##\log n = (h(n)/C)^{1/c}## and ##n\sim e^{C'\cdot h(n)^{1/c}},## i.e. ##g(n) = O\!\left(6^{e^{C'h(n)^{1/c}}}\right).## This would be a very unusual way to write it. The common notation would be ##g(n)=O(6^n)## and ##h(n)=O((\log n)^{84}).##
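The "no constant ##C##" claim can also be seen numerically: ##\log(g(n)/h(n))## grows without bound, so any fixed ##C## is eventually beaten. A short sketch in log space:
Code:
import math

def log_ratio(n):
    # log(g(n)/h(n)) = n*log(6) - 5*log(n) - 84*log(log(n))
    return n * math.log(6) - 5 * math.log(n) - 84 * math.log(math.log(n))

for n in (10, 50, 100, 500, 1_000):
    print(f"n={n:>5}  log(g(n)/h(n)) = {log_ratio(n):10.1f}")

# Negative at first (h is ahead for small n), positive before n = 100,
# then growing roughly linearly in n: any fixed C is eventually beaten.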
 
  • #5
But here we have the relationship log(n)^c < n^c < 6^n, where h(n) = log(n)^c for c = 84. But g(n) = 6^n/n^5, not just 6^n, so how can we extend the chain of inequalities to fit g(n) such that log(n)^c < g(n)?
Because that way we could pick C = 1 > 0 and some n0 such that the condition holds when n > n0.
 
  • #6
##n^5## is small compared with ##6^n##, so for large ##n## we have ##5^n < g(n)=\dfrac{6^n}{n^5} < 6^n,## and hence ##g(n)=O(6^n)##. The polynomial factor ##n^5## only shaves a sub-exponential amount off ##6^n##, so we don't have to bother with it.

##h(n)=(\log n)^{84}##, and ##(\log n)^{84} < n^{84} < 6^n## for ##n>260,## so ##h(n)=O(6^n)##; in particular, ##h## grows far more slowly than ##g##.
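The crossover ##n>260## for ##n^{84}<6^n## can be spot-checked: the sign of ##n\log 6 - 84\log n## flips exactly there. A small sketch:
Code:
import math

def margin(n):
    # log(6^n) - log(n^84): positive once 6^n has overtaken n^84
    return n * math.log(6) - 84 * math.log(n)

for n in (250, 260, 261, 270):
    print(f"n = {n}  n*log(6) - 84*log(n) = {margin(n):+8.2f}")

# The sign flips between n = 260 and n = 261, matching 'for n > 260'.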
 
  • #7
I'm confused: what does the notation O(log(n)^84) mean? To my understanding, you write something like f = O(g), which relates the growth of f and g and means that f grows no faster than g. What does your notation mean?
Also, does g(n) = O(6^n) really hold, given that g(n) is not 6^n?
 
  • #8
Haku said:
I'm confused: what does the notation O(log(n)^84) mean? To my understanding, you write something like f = O(g), which relates the growth of f and g and means that f grows no faster than g. What does your notation mean?
Also, does g(n) = O(6^n) really hold, given that g(n) is not 6^n?
##h(n)=O((\log n)^{84})## means ##h(n)\leq C\cdot (\log n)^{84}## for some constant ##C>0## and all sufficiently large ##n##.

##g(n)=O(6^n)## does hold: ##g(n)=\dfrac{6^n}{n^5}\leq 6^n## for all ##n\geq 1##, since dividing by ##n^5## only makes the function smaller. (A smaller base would not work: for any ##c<6##, ##(6/c)^n## outgrows ##n^5##, so ##g(n)\neq O(c^n)##.)
 

FAQ: Determining the growth of two functions using Big-Oh definition

What is the Big-Oh definition and how is it used to determine the growth of two functions?

The Big-Oh definition is a mathematical notation used to describe the asymptotic behavior of a function. It is commonly used in computer science to analyze the efficiency of algorithms. To determine the growth of two functions using Big-Oh, we compare the rate of growth of the functions as the input size approaches infinity. The function with the slower rate of growth is considered to have a lower Big-Oh complexity.

How do you calculate the Big-Oh complexity of a function?

To calculate the Big-Oh complexity of a function, we look at the highest order term in the function's equation. This term represents the dominant factor in the function's growth. We then drop any constants and lower order terms and use the remaining term as the Big-Oh notation. For example, if a function has a complexity of n^2 + 5n + 10, the Big-Oh complexity would be O(n^2).
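A small sketch of why the lower-order terms are dropped: the ratio of n^2 + 5n + 10 to its leading term tends to 1, so only n^2 matters for large n.
Code:
for n in (10, 100, 1_000, 10_000):
    ratio = (n**2 + 5*n + 10) / n**2
    print(f"n={n:>6}  (n^2 + 5n + 10) / n^2 = {ratio:.4f}")

# The ratio shrinks toward 1.0 as n grows, so the n^2 term dominates.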

Can two functions have the same Big-Oh complexity?

Yes, two functions can have the same Big-Oh complexity. This means they share an asymptotic upper bound, although it does not mean they are identical. For example, n^2 and n^2 + 10 are both O(n^2); the latter takes slightly larger values, but the two grow at the same rate asymptotically.

How does the Big-Oh definition help in analyzing the efficiency of algorithms?

The Big-Oh definition allows us to compare the efficiency of different algorithms by looking at their growth rates. A lower Big-Oh complexity indicates a more efficient algorithm, as it will have a slower growth rate as the input size increases. This helps in determining which algorithm is the most suitable for a given problem in terms of time and space complexity.

Are there any limitations to using the Big-Oh definition to analyze algorithm efficiency?

Yes, there are some limitations to using the Big-Oh definition. It is usually applied to the worst-case scenario and by itself says nothing about the best- or average-case performance of an algorithm. Additionally, the Big-Oh notation discards constant factors and lower-order terms, which can significantly affect the actual runtime. Therefore, it is important to also consider other factors when analyzing the efficiency of algorithms.
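A toy sketch of the constant-factor caveat, using hypothetical cost models (the costs 1000*n and n*n are made up purely for illustration): a routine with the better Big-Oh complexity can still lose on small inputs.
Code:
def cost_linear(n):
    return 1000 * n   # O(n), but with a large (hypothetical) constant

def cost_quadratic(n):
    return n * n      # O(n^2), constant factor 1

for n in (10, 100, 1_000, 10_000):
    cheaper = "linear" if cost_linear(n) < cost_quadratic(n) else "quadratic"
    print(f"n={n:>6}  linear={cost_linear(n):>10}  quadratic={cost_quadratic(n):>10}  cheaper: {cheaper}")

# Crossover near n = 1000: below it, the routine with the worse Big-Oh
# complexity is actually cheaper.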
