EvLer
Hello, I am trying to understand how to solve problems about the time complexity of algorithms, especially problems of the following kind:
An algorithm takes 0.5 ms for input size 100. How long will it take
for input size 500 if the running time is the following:
linear, n log n, n^2, n^3
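One way to work the first problem (a minimal sketch, assuming the running time is exactly T(n) = c * f(n) for some constant c, which is the standard assumption in these exercises): solve for c from the measured point, then evaluate at the new input size.

```python
import math

def scaled_time(f, n0, t0, n):
    # Assume T(n) = c * f(n). The measured point (n0, t0) pins down c,
    # so T(n) = t0 * f(n) / f(n0).
    c = t0 / f(n0)
    return c * f(n)

# The four growth rates from the exercise.
growths = {
    "linear": lambda n: n,
    "n log n": lambda n: n * math.log2(n),
    "n^2": lambda n: n ** 2,
    "n^3": lambda n: n ** 3,
}

for name, f in growths.items():
    t = scaled_time(f, 100, 0.5, 500)  # 0.5 ms at n = 100, scale to n = 500
    print(f"{name}: {t:.2f} ms")
# linear: 2.50 ms      (input grew 5x, so time grows 5x)
# n log n: 3.37 ms     (5x times log2(500)/log2(100))
# n^2: 12.50 ms        (5^2 = 25x)
# n^3: 62.50 ms        (5^3 = 125x)
```

Note that the base of the logarithm does not matter here: it cancels in the ratio f(500)/f(100) up to a constant factor already absorbed into c.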
An algorithm takes 0.5 ms for input size 100. How large a problem
can be solved in 1 min if the running time is the following:
linear, n log n, n^2, n^3
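The second problem is the inverse of the first: under the same assumption T(n) = c * f(n), find the largest n with T(n) within the time budget. For linear, n^2, and n^3 this can be inverted in closed form, but for n log n there is no elementary inverse, so a binary search over n (sketched below, with `largest_n` as a hypothetical helper name) handles all four cases uniformly.

```python
import math

def largest_n(f, n0, t0, budget):
    # T(n) = c * f(n) with c fixed by the point (n0, t0), so
    # T(n) <= budget  iff  f(n) / f(n0) <= budget / t0.
    # f is assumed increasing, so binary search for the largest such n.
    ratio = budget / t0
    hi = 2
    while f(hi) / f(n0) <= ratio:  # exponential search for an upper bound
        hi *= 2
    lo = 1
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if f(mid) / f(n0) <= ratio:
            lo = mid
        else:
            hi = mid - 1
    return lo

budget_ms = 60 * 1000  # one minute, in the same units as the 0.5 ms measurement
growths = {
    "linear": lambda n: n,
    "n log n": lambda n: n * math.log2(n),
    "n^2": lambda n: n ** 2,
    "n^3": lambda n: n ** 3,
}
for name, f in growths.items():
    print(f"{name}: n = {largest_n(f, 100, 0.5, budget_ms)}")
# linear: n = 12000000   (budget is 120000x the measured time)
# n^2: n = 34641         (floor of sqrt(120000) * 100)
# n^3: n = 4932          (floor of cbrt(120000) * 100)
```

The spread between these answers is the whole point of the exercise: the same minute buys twelve million items at linear cost but fewer than five thousand at cubic cost.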
I think I have a general idea of what is asked but cannot figure out how to find it mathematically. Could someone be kind enough to explain it step by step?
Appreciate your help.