Analyzing Time Complexity and Memory of Binary Search Algorithm

In summary, the algorithm repeatedly halves the search interval: if the middle element equals the target, it returns mid; if the middle element is too large or too small, it recurses on the left or right half respectively; and it returns -1 once the interval is empty.
  • #1
evinda
Hello! (Wave)

I want to show that the running time $T(n)$ of the following algorithm is $O(\log n)$.
How can we do it, now that we have if/else conditions? (Thinking)

Code:
int BinarySearch(int A[1…n], int y, int low, int high) {
  if (high < low) {
    return -1; // not found
  }
  int mid = low + (high - low) / 2; // midpoint without overflow
  if (A[mid] > y) {
    return BinarySearch(A, y, low, mid - 1);
  }
  else if (A[mid] < y) {
    return BinarySearch(A, y, mid + 1, high);
  }
  else {
    return mid; // found
  }
}

Also, how can we find how much memory is required to execute the algorithm?
 
  • #2
evinda said:
I want to show that the running time $T(n)$ of the following algorithm is $O(\log n)$.
How can we do it, now that we have if/else conditions?
Why are you saying "now that we have if, else conditions"? Did you have a problem where there were no such conditions?

The bound $T(n)=O(\log n)$ holds because the number of elements in the (relevant portion of the) array is roughly halved with each recursive call.

evinda said:
Also, how can we find how much memory is required to execute the algorithm?
Does the algorithm allocate any memory besides the variable [m]mid[/m]? This variable does not really count because it is simply a convenience. There are also a couple of technical reasons why it does not count.
 
  • #3
In each call, exactly one of the branches is taken. Since we are interested in the worst-case scenario, we assume that the element is never found, so every call recurses. Under that assumption, each call keeps half of the elements and performs a constant number of comparisons. We can write this as

$$T(n) = T(n/2) + c$$

which implies

$$T(n) = T(n/2^k) + ck $$

Take $n/2^k = 1$, which implies that $k = \log_2(n)$.

So we have

$$T(n) = T(1) + c\log(n) = \mathcal{O}(\log(n)) $$
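
A quick way to check this bound concretely is a sketch like the following (the [m]calls[/m] counter and the [m]main[/m] driver are my additions for the experiment, not part of the original algorithm). Searching a sorted 1024-element array for an absent key hits the worst case and makes $\log_2 1024 + 1 = 11$ calls:

Code:
#include <stdio.h>

static int calls = 0; /* counts every invocation, including the failing one */

int BinarySearch(int A[], int y, int low, int high) {
    calls++;
    if (high < low) return -1;                    /* not found */
    int mid = low + (high - low) / 2;
    if (A[mid] > y)      return BinarySearch(A, y, low, mid - 1);
    else if (A[mid] < y) return BinarySearch(A, y, mid + 1, high);
    else                 return mid;              /* found */
}

int main(void) {
    enum { N = 1024 };
    int A[N];
    for (int i = 0; i < N; i++) A[i] = 2 * i;     /* sorted even numbers */
    BinarySearch(A, 1, 0, N - 1);                 /* 1 is odd: never found */
    printf("calls = %d for n = %d\n", calls, N);  /* prints calls = 11 */
    return 0;
}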
 
  • #4
Evgeny.Makarov said:
Does the algorithm allocate any memory besides the variable [m]mid[/m]? This variable does not really count because it is simply a convenience. There are also a couple of technical reasons why it does not count.

The prof told us that we can see how much memory is required by counting how many recursive calls are active simultaneously, i.e. how many activation records are open.

But, how can I see how many activation records are open simultaneously? (Thinking)
 
  • #5
evinda said:
The prof told us that we can see how much memory is required by counting how many recursive calls are active simultaneously, i.e. how many activation records are open.
Well, that's a subtle issue. The idea is that each time a program makes a function call, it obviously passes control to that function. But it also has to remember where to continue after the function returns a value. Thus, if $f_1$ calls $f_2$, one activation record is opened to remember the point in $f_1$ where the program will continue. If $f_2$ calls $f_3$, then another record is opened to remember the place in $f_2$, and so on. This also happens when a function calls itself.

Memory is spent not only on remembering the return address, but also on allocating parameters and local variables of the function being called. This is especially important for recursive calls. For example, in
Code:
int fact(int n) {
  int m;
  if (n == 0) return 1;
  else {
    m = n - 1;
    return n * fact(m);
  }
}
there are different copies of the variable [m]m[/m] in each instance of the called function. That is, when a recursive call occurs, a new variable [m]m[/m] is allocated. The recursive call does not overwrite the caller's [m]m[/m]!
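
To see the "one copy per call" point directly, here is a small variation (the [m]printf[/m] and the driver are my additions): each open activation record of [m]fact[/m] has its own [m]m[/m] at a distinct address.

Code:
#include <stdio.h>

int fact(int n) {
  int m;
  printf("n = %d, &m = %p\n", n, (void *)&m); /* a fresh m per call */
  if (n == 0) return 1;
  else {
    m = n - 1;
    return n * fact(m);
  }
}

int main(void) {
  fact(3); /* four simultaneously open calls, four distinct addresses */
  return 0;
}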

So, a constant amount of memory is spent on each function call. When a function returns, this memory is deallocated. But if there are $n$ active (not yet finished) calls, then the amount of allocated memory is proportional to $n$.

Now, there is a possible optimization (called "tail recursion") when the recursive call is the last instruction in the function. This is the case with [m]BinarySearch[/m]: the value returned from a recursive call is immediately returned; nothing is done to it. Such recursive calls are equivalent to loops because one does not have to remember the return point, and parameters and local variables can be overwritten. However, not all programming languages perform this optimization. As far as I know, it is guaranteed in functional programming languages such as Haskell and Scheme, but not in C (some C compilers perform it as an optimization, but the standard does not require it). With the optimization, tail recursion takes an amount of memory that is independent of the number of iterations, just like a loop does.
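
For comparison, this is a sketch of the loop that such an optimization effectively produces (my rewrite of [m]BinarySearch[/m], not code from the thread): the parameters are overwritten in place, so the extra memory used is constant regardless of the array size.

Code:
/* Iterative equivalent of the tail-recursive BinarySearch: low and high
   are updated in place instead of opening a new activation record. */
int BinarySearchIter(int A[], int y, int low, int high) {
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (A[mid] > y)      high = mid - 1;
        else if (A[mid] < y) low = mid + 1;
        else                 return mid;  /* found */
    }
    return -1;                            /* not found */
}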

Another subtlety is that activation records are allocated in the section of RAM called the "stack", as opposed to, say, dynamically allocated arrays, which live on the "heap". One may or may not count the stack when determining a program's memory requirement.

evinda said:
But, how can I see how many activation records are open simultaneously?
Count how many recursive calls have been made but have not yet finished.
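
If you want the program to do the counting for you, a sketch like this works (the [m]depth[/m] and [m]max_depth[/m] counters are hypothetical additions to the original algorithm): [m]depth[/m] goes up when a call starts and down when it finishes, so [m]max_depth[/m] ends up holding the largest number of simultaneously open activation records.

Code:
static int depth = 0, max_depth = 0; /* open calls now / peak so far */

int BinarySearch(int A[], int y, int low, int high) {
    int result;
    depth++;
    if (depth > max_depth) max_depth = depth;
    if (high < low)          result = -1;       /* not found */
    else {
        int mid = low + (high - low) / 2;
        if (A[mid] > y)      result = BinarySearch(A, y, low, mid - 1);
        else if (A[mid] < y) result = BinarySearch(A, y, mid + 1, high);
        else                 result = mid;      /* found */
    }
    depth--;  /* this call is finished */
    return result;
}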
 
  • #6
Evgeny.Makarov said:
Count how many recursive calls have been made but have not yet finished.

Is the number of recursive calls that have been made but have not yet finished equal to the number of levels of the recursion tree? (Thinking)
 
  • #7
evinda said:
Is the number of recursive calls that have been made but have not yet finished equal to the number of levels of the recursion tree?
Yes. On input $A[1\dots2^k]$ the number of recursive calls is $k$ or close to it.

Edit: The answer may depend on what you mean by the "recursion tree". In this case, each execution of the body of BinarySearch makes at most one recursive call, so the tree (as I understand it) is linear.
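
As a worked instance of this (my numbers): with $k=4$, a worst-case run on $A[1\dots 16]$ shrinks the interval through sizes of roughly $16, 8, 4, 2, 1$ and then reaches the empty interval, so there are about $k$ nested calls, each making at most one further call, and the "tree" is a single chain.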
 
  • #8
Evgeny.Makarov said:
Yes. On input $A[1\dots2^k]$ the number of recursive calls is $k$ or close to it.

Nice! (Smile)

So, when we have for example this algorithm:

Code:
int BinarySearch(int A[1…n], int y, int low, int high) {
  if (high < low) {
    return -1; // not found
  }
  int mid = low + (high - low) / 2; // midpoint without overflow
  if (A[mid] > y) {
    return BinarySearch(A, y, low, mid - 1);
  }
  else if (A[mid] < y) {
    return BinarySearch(A, y, mid + 1, high);
  }
  else {
    return mid; // found
  }
}

the recurrence relation is:

$$T(n)=\begin{cases}
T\left( \left\lfloor \frac{n}{2} \right\rfloor \right)+10, & n \geq 1 \\
2, & n=0
\end{cases}$$

I tried to make the recursion-tree like that:

View attachment 3391

Is it right? If so, does it take the base-case value at $N=1$, since then $\lfloor \frac{N}{2} \rfloor=0$?
Are there $\log N+1$ levels?

Or am I wrong? (Worried)
 

  • #9
evinda said:
So, when we have for example this algorithm:

the recurrence relation is:

$$T(n)=\begin{cases}
T\left( \left\lfloor \frac{n}{2} \right\rfloor \right)+10, & n \geq 1 \\
2, & n=0
\end{cases}$$
Does $T(n)$ denote time? Then why would we use it for memory? I don't see how numbers like 10 and 2 from the definition of $T$ have anything to do with memory.

evinda said:
I tried to make the recursion-tree like that:
Without defining the recursion tree, it does not make much sense to discuss what it looks like. What can be discussed precisely is the number of recursive calls made when [m]BinarySearch[/m] is called with arguments $\mathit{low}$ and $\mathit{high}$. (I stated it incorrectly in the previous post: the maximum number of iterations/recursive calls depends on $\mathit{high}-\mathit{low}$, not on $n$.) And yes, it is around $\log_2(\mathit{high}-\mathit{low})$ (whether exactly, +1 or -1 or something like that requires a more careful look).
 
  • #10
Evgeny.Makarov said:
What can be discussed precisely is the number of recursive calls made when [m]BinarySearch[/m] is called with arguments $\mathit{low}$ and $\mathit{high}$. (I stated it incorrectly in the previous post: the maximum number of iterations/recursive calls depends on $\mathit{high}-\mathit{low}$, not on $n$.) And yes, it is around $\log_2(\mathit{high}-\mathit{low})$ (whether exactly, +1 or -1 or something like that requires a more careful look).

Could you explain further how you concluded that the memory required is around $\log_2(\mathit{high}-\mathit{low})$? (Thinking)
 
  • #11
There is another thread (https://driven2services.com/staging/mh/index.php?threads/12609/) devoted to establishing the fact that the number of recursive calls is approximately $\log_2(\mathit{high}-\mathit{low})$.
 

FAQ: Analyzing Time Complexity and Memory of Binary Search Algorithm

What is the purpose of analyzing the time complexity and memory of a binary search algorithm?

The purpose of analyzing the time complexity and memory of a binary search algorithm is to understand how efficient the algorithm is in terms of time and space. This analysis helps in determining the algorithm's performance and can be used to compare it with other algorithms to choose the most efficient one for a given problem.

How is time complexity of a binary search algorithm calculated?

The time complexity of a binary search algorithm is calculated by counting the number of operations performed in the worst case, which occurs when the target element is not present in the array. Expressed in big-O notation, the time complexity of binary search is O(log n), where n is the size of the array.
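
As a worked instance (the numbers are my own, not from the thread): an unsuccessful search of a million-element array makes at most
$$\lceil \log_2(10^6 + 1) \rceil = 20$$
comparisons with array elements, since $2^{19} < 10^6 + 1 \leq 2^{20}$.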

What factors can affect the time complexity of a binary search algorithm?

The number of comparisons a binary search performs depends on the size of the input array and on where the target lies relative to the midpoints the search probes. A larger array raises the worst case, and the worst case itself occurs when the target is absent or is found only after the interval has shrunk to a single element; the element at the first midpoint, by contrast, is found in one comparison.

How is the memory usage of a binary search algorithm determined?

The memory usage of a binary search algorithm is determined by the space required to store the input array, any auxiliary variables used for the search, and, for a recursive implementation, the stack of activation records. The recursive version uses O(log n) stack space in the worst case, while an iterative version needs only O(1) auxiliary space beyond the array itself.

Is it possible for a binary search algorithm to have a better time complexity than O(log n)?

No, it is not possible for a binary search algorithm to have a better time complexity than O(log n). The algorithm always divides the search space in half, giving a logarithmic number of comparisons, which is optimal among comparison-based searches of a sorted array.
