secondprime
##A,B## are the (symmetric) adjacency matrices of graphs ##G,H## respectively. For ##x \in G##, consider the graph ##(G-x)##; its vertices split into those adjacent to ##x## and those not adjacent to ##x##. ##A_x## is the adjacency matrix of ##(G-x)##, where ##C## is the adjacency matrix of the subgraph induced by the vertices of ##(G-x)## adjacent to ##x##, and ##D## is the adjacency matrix of the subgraph induced by the vertices of ##(G-x)## not adjacent to ##x##.
## A_x =
\left( \begin{array}{cc}
C & E \\
E^{T} & D\\
\end{array} \right) ##
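The block decomposition above can be sketched in Python. This is a minimal illustration, assuming the graph is given as a 0/1 adjacency matrix (list of lists); the name `block_decompose` is mine, not from the post.

```python
def block_decompose(A, x):
    """Delete vertex x and reorder the remaining vertices so that the
    neighbours of x come first, yielding the blocks C, E, D of A_x."""
    n = len(A)
    rest = [v for v in range(n) if v != x]
    nbrs = [v for v in rest if A[x][v] == 1]   # adjacent to x     -> block C
    non  = [v for v in rest if A[x][v] == 0]   # non-adjacent to x -> block D
    order = nbrs + non
    # A_x written in the block form [[C, E], [E^T, D]]
    Ax = [[A[i][j] for j in order] for i in order]
    k = len(nbrs)
    C = [row[:k] for row in Ax[:k]]
    E = [row[k:] for row in Ax[:k]]
    D = [row[k:] for row in Ax[k:]]
    return C, E, D

# Example: the path 0-1-2-3, removing x = 1
A = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
C, E, D = block_decompose(A, 1)
```

Here ##C## is the (edgeless) subgraph on the neighbours ##\{0,2\}## of vertex 1, and ##D## the subgraph on the single non-neighbour ##\{3\}##.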
It should be noted that:
1. Swapping any two rows (or columns) of ##C## does not affect the matrix ##D## (and vice versa).
2. Any rearrangement of ##C## or ##D## (or both) permutes the rows or columns of ##E##.
If ##G \simeq H##, then there exists ##u \in H## such that ##(G-x)\simeq (H-u)##. The matrix of ##(H-u)## is ##B_u##, where ## B_u= \left( \begin{array}{cc}
S & R \\
R^{T} & Q\\
\end{array} \right)##
Consider the situation where ##S \neq C## and ##D \neq Q##. Since ##(G-x)\simeq (H-u)##, there exists a permutation matrix ##P## such that ##PQP^{T}=D##.
Once ##Q## has been rearranged to equal ##D##, ##C## can be recovered in linear/polynomial time using the rows of ##E##: after ##Q=D##, the rows of ##E## and ##R## are the same, only listed in a different order. So reorder the rows of ##R## to match the rows of ##E##; this forces ##S = C##.
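The row-matching step can be sketched as follows. This is an illustrative fragment (the name `match_rows` is mine), and it assumes the rows of ##E## are pairwise distinct; with repeated rows there are several candidate matchings to try.

```python
def match_rows(E, R):
    """Given that R's rows are a permutation of E's rows (after Q has
    been arranged to equal D), return perm with R[perm[i]] == E[i]."""
    index = {tuple(row): i for i, row in enumerate(R)}  # row -> position in R
    return [index[tuple(row)] for row in E]

# Toy example: R is E with its two rows swapped
E = [[0, 1, 0], [1, 0, 1]]
R = [[1, 0, 1], [0, 1, 0]]
perm = match_rows(E, R)   # reordering R by perm reproduces E
```

The dictionary lookup makes this linear in the number of matrix entries, consistent with the claim that this step grows more slowly than ##K##.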
One of ##S,Q## has at most ##\frac{n}{2}## vertices, where ##n=|(G-x)|=|(H-u)|##; use the smaller of the two to find the permutation matrix ##P##.
So, to determine whether ##(G-x)\simeq (H-u)## holds, there are ##n## candidate vertices ##u## to check. Let ##K(n/2)## denote the time complexity of finding the permutation matrix ##P##. The overall complexity is then
##n \cdot K(n/2)## (omitting the cost of rearranging ##E##, since it grows more slowly than ##K##).
But the above method can be applied recursively to the smaller matrix, say ##Q##, to reduce the cost ##K(n/2)##; in that case the complexity becomes
##n \cdot \frac{n}{2} \cdot K(n/4)##.
Unrolling the recursion gives the bound ##\frac{n^{\log_2 n}}{2^{\sum \log_2 n}}##. Is this bound correct? If not, what is the complexity of the above algorithm?
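As a numeric sanity check (my sketch, not from the post): the recursion described above is ##T(n) = n\,T(n/2)## with ##T(1)=1##, whose closed form for ##n## a power of two is ##n^{\log_2 n} / 2^{0+1+\dots+(\log_2 n - 1)}##. The snippet below compares the two; it only checks the solution of the recurrence, not whether the recurrence itself models the algorithm correctly.

```python
import math

def T(n):
    """Unrolled recurrence T(n) = n * T(n // 2), T(1) = 1."""
    return 1 if n <= 1 else n * T(n // 2)

def closed_form(n):
    """n^{log2 n} / 2^{0 + 1 + ... + (log2 n - 1)} for n a power of two."""
    k = int(math.log2(n))
    return n**k // 2**(k * (k - 1) // 2)

for n in [2, 4, 8, 16, 32]:
    print(n, T(n), closed_form(n))   # the two columns agree
```

For these values the two expressions agree (e.g. ##T(16) = 16 \cdot 8 \cdot 4 \cdot 2 = 1024 = 16^4/2^6##), which suggests the ##\sum## in the posted bound should be read as ##0+1+\dots+(\log_2 n - 1)##.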