- #1
evinda
Gold Member
MHB
- 3,836
- 0
Hello! (Wave)
Let a matrix with $n$ integers in the interval $[1,100]$. I want to write an algorithm that sorts the elements of the matrix in $O(n)$. [Hint: Count and use the number of times that each number appears]
I have thought the following:
We create a matrix with $100$ entries. If the matrix contains for example 1 $1$ we put 1 at the first cell. I have written this as follows:
The first and second for are executed $100$ times and the third $100 \cdot j=100 \cdot n$ times, right?
So the complexity is $O(100)+O(100 \cdot n)=O(n)$.
Or have I done something wrong? (Thinking)
Let a matrix with $n$ integers in the interval $[1,100]$. I want to write an algorithm that sorts the elements of the matrix in $O(n)$. [Hint: Count and use the number of times that each number appears]
I have thought the following:
We create a matrix with $100$ entries. If the matrix contains for example 1 $1$ we put 1 at the first cell. I have written this as follows:
Code:
int matrix[100]
int count[100]
int new_matrix[n]
for i=0 to 99
count[i]=0
end
for i=0 to 99
if matrix[i]=i+1
then count[i]=count[i]+1
end
j=0
for i=0 to 99
if (count[i] not 0)
while (count[i] not 0)
new_matrix[j]=matrix[i]
j=j+1
count[i]=count[i]-1
end
end
for (i=0 to j-1)
print(new_matrix[i])
end
So the complexity is $O(100)+O(100 \cdot n)=O(n)$.
Or have I done something wrong? (Thinking)
Last edited: