Justify $\Theta(n^7 \log n)$ for $g(n)$

In summary, the conversation discusses finding the asymptotic complexity of a function and justifying why $n^7 \log n$ is the dominant term. The participants go through the definition and show, term by term, that every remaining term of the function is bounded above by $n^7 \log n$ for all sufficiently large $n$. Finally, they conclude that $g(n)=\Theta(n^7 \log n)$ based on the definition of asymptotic complexity.
  • #1
evinda
Hi! (Smile)

I want to find the asymptotic complexity ($\Theta$) of the function $g(n)=8n^7 \log n+18 \log{13}+16 \log n^n+12 n^{\frac{5}{2}}$.

We see that the dominant term is $n^7 \log n$, but how could we justify it? (Thinking)

Does this mean that : $g(n)=\Theta(n^7 \log n)$ ? :confused:
 
  • #2
According to the definition, we have to find constants $c_1, c_2 > 0$ and $N_0$ such that

\(\displaystyle c_1 f(n) \leq g(n) \leq c_2 f(n) \)

holds for all $n \geq N_0$, where $f(n) = n^7 \log n$. Can you find such constants?
 
  • #3
ZaidAlyafey said:
According to the definition, we have to find constants $c_1, c_2 > 0$ and $N_0$ such that

\(\displaystyle c_1 f(n) \leq g(n) \leq c_2 f(n) \)

holds for all $n \geq N_0$, where $f(n) = n^7 \log n$. Can you find such constants?

Yes, but.. how can I explain that $n^7 \log n$ is the dominant term? (Thinking)
 
  • #4
You can try to verify it by definition, that is,
$$g(n) = \Theta(n^7 \log n) \Leftrightarrow \exists c_1, c_2 > 0, \exists n_0 \in \mathbb{N}, \forall n \geq n_0: 0 \leq c_1 n^7 \log n \leq g(n) \leq c_2 n^7 \log n$$

I would do the following: try to show that every term of $g(n)$ (except the first term, obviously) is at most $n^7 \log n$ for all $n$ beyond a certain $n_0 \in \mathbb{N}$.

First, note that $\forall n \geq 2: 18 \log 13 \leq n^7 \log n$. This holds because $18 \log 13$ is a constant while $n^7 \log n$ is increasing in $n$, and already at $n = 2$ we have $2^7 \log 2 \geq 18 \log 13$ (in any common base of the logarithm).

Second, we can write $16 \log n^n = 16n \log n$. We want to find for which values of $n \in \mathbb{N}_0$ we have $16 n \log n \leq n^7 \log n$, or equivalently $16 n \leq n^7$, i.e. $16 \leq n^6$. Since $2^6 = 64 \geq 16$, this holds for all $n \geq 2$, and therefore $\forall n \geq 2: 16n \log n \leq n^7 \log n$.

For the last term we can use a similar argument to show that $\forall n \geq 2: 12n^{\frac{5}{2}} \leq n^7 \log n$.

At this point we have shown that, for all $n \geq 2$, every term other than the first is at most $n^7 \log n$.
What's the conclusion now?
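
For concreteness, here is one way to package these bounds into explicit constants (one possible choice; the constants are not unique). Since every term of $g(n)$ is nonnegative for $n \geq 2$, and each of the three remaining terms is at most $n^7 \log n$ for $n \geq 2$,

$$\forall n \geq 2: \quad 8 n^7 \log n \;\leq\; g(n) \;\leq\; 8 n^7 \log n + 3\, n^7 \log n \;=\; 11\, n^7 \log n,$$

so the constants $c_1 = 8$, $c_2 = 11$ and $N_0 = 2$ from the definition above work.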
 
  • #5
At the beginning, before I verify that $g(n)=\Theta(n^7 \log n)$ can I just say the following?

We see that the dominant term of the function $g(n)$ is $n^7 \log n$, so: $g(n)=\Theta(n^7 \log n)$ ?

Or is there a better way to explain why we suppose that $g(n)=\Theta(n^7 \log n)$? (Thinking)
 
  • #6
A sufficient condition for $\Theta$ is the limit test:

$$\lim_{n \to \infty} \frac{T(n)}{G(n)} = c, \quad 0 < c < \infty \quad \implies \quad T(n) = \Theta(G(n))$$

Suppose that $T(n) = a(n) + b(n)$ where $a(n)$ is the dominant term, i.e. $\lim_{n \to \infty} b(n)/a(n) = 0$. Then

$$\lim_{n \to \infty } \frac{a(n)+b(n)}{a(n)} = 1+ 0 = 1,$$

so $T(n) = \Theta(a(n))$. $\blacksquare$
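
For the specific $g(n)$ in this thread, a quick numerical check of this ratio (a sketch in Python, assuming base-2 logarithms; any fixed base only rescales the values) shows it settling at the leading coefficient 8, a finite positive constant, exactly as the limit test requires:

[CODE]
import math

def g(n):
    # g(n) = 8 n^7 log n + 18 log 13 + 16 log(n^n) + 12 n^(5/2),
    # with logarithms taken base 2 (the choice of base only rescales constants).
    return (8 * n**7 * math.log2(n)
            + 18 * math.log2(13)
            + 16 * n * math.log2(n)   # log(n^n) = n log n
            + 12 * n**2.5)

# The ratio g(n) / (n^7 log n) should approach a finite positive constant
# (here the leading coefficient 8).
for n in (10, 100, 1_000, 10_000):
    print(f"n = {n:>6}: g(n) / (n^7 log n) = {g(n) / (n**7 * math.log2(n)):.6f}")
[/CODE]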
 

FAQ: Justify $\Theta(n^7 \log n)$ for $g(n)$

What is the definition of $\Theta(n^7 \log n)$?

The notation $\Theta(n^7 \log n)$ describes the time complexity of an algorithm whose runtime grows proportionally to $n^7 \log n$, up to constant factors. This means that for large inputs the runtime increases at the same rate as the function $n^7 \log n$.

How is the runtime of an algorithm determined to be $\Theta(n^7 \log n)$?

The runtime of an algorithm is determined by analyzing the number of basic operations it performs as the input size increases. In the case of $\Theta(n^7 \log n)$, the operation count is bounded above and below by constant multiples of $n^7 \log n$ for all sufficiently large $n$.
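
For intuition, here is a contrived routine whose basic-operation count grows as $\Theta(n^7 \log n)$: seven nested loops of size $n$ contribute the $n^7$ factor, and a doubling inner loop contributes the $\log n$ factor (a sketch only; it is far too slow to run for anything but tiny $n$):

[CODE]
from itertools import product

def count_ops(n):
    # Seven nested loops of size n give n^7 iterations; inside each,
    # k doubles until it exceeds n, which takes about log2(n) steps.
    # Total basic operations: Theta(n^7 * log n).
    ops = 0
    for _ in product(range(n), repeat=7):
        k = 1
        while k <= n:
            ops += 1
            k *= 2
    return ops

# Keep n tiny: n^7 blows up very quickly.
for n in (2, 3, 4):
    print(n, count_ops(n))
[/CODE]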

What does the $\Theta$ notation represent in $\Theta(n^7 \log n)$?

The $\Theta$ notation represents a tight (matching) bound on the runtime of an algorithm. In this case, it means that the runtime is bounded both above and below by constant multiples of $n^7 \log n$.
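
In symbols, $T(n) = \Theta(n^7 \log n)$ means there exist constants $c_1, c_2 > 0$ and a threshold $n_0$ such that

$$c_1\, n^7 \log n \;\leq\; T(n) \;\leq\; c_2\, n^7 \log n \qquad \text{for all } n \geq n_0.$$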

Can $\Theta(n^7 \log n)$ be simplified further?

No. Constant factors and lower-order terms have already been discarded, and neither factor ($n^7$ nor $\log n$) can be dropped without changing the growth rate, so $\Theta(n^7 \log n)$ is already in its simplest form.

How does the $\log n$ term affect the overall time complexity of an algorithm?

The $\log n$ factor has a much smaller impact on the overall time complexity than the polynomial factor $n^7$, because the logarithm grows more slowly than any positive power of $n$. Still, it is not negligible: it makes the runtime grow strictly faster than $n^7$ alone.
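
One standard way to make this precise is the limit

$$\lim_{n \to \infty} \frac{\log n}{n^{\varepsilon}} = 0 \qquad \text{for every fixed } \varepsilon > 0,$$

so, for example, $\log n$ grows more slowly than even $n^{0.01}$, while still tending to infinity.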
