- #1
SeM
Hi, I am trying to prove that the eigenvalues, elements, eigenfunctions and/or eigenvectors of a matrix A form a Hilbert space. Can one apply the inner product formula:
\begin{equation}
\langle x, y\rangle = \int_a^b x(t)\,\overline{y(t)}\,dt
\end{equation}
to the x and y coordinates of the eigenvectors [x_1, y_1] and [x_2, y_2]:
\begin{equation}
\int_a^b x_1\,\overline{y_1}\,dt
\end{equation}
\begin{equation}
\int_a^b x_2\,\overline{y_2}\,dt
\end{equation}
and thereby prove that the space of the matrix eigenvectors is complete under this inner product, and therefore forms a Hilbert space L^2[a,b]?
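To be concrete about what I mean: for two constant eigenvectors v_1 = [x_1, y_1] and v_2 = [x_2, y_2] (assuming their entries do not depend on t), the standard Hermitian inner product of the eigenvectors themselves would just be
\begin{equation}
\langle v_1, v_2\rangle = x_1\overline{x_2} + y_1\overline{y_2},
\end{equation}
so my question is whether the integral formula above can play the same role here.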
The reason I am asking is that I am looking for a way to prove that the matrix elements, its eigenvectors and its general solution belong to a Hilbert space, L^2[a,b].
The general solution has the form:
\begin{equation}
\psi = \alpha_1 v_1 e^{\lambda_1 t} + \alpha_2 v_2 e^{\lambda_2 t}
\end{equation}
where v_1 and v_2 are the eigenvectors of the matrix. In Kreyszig's Functional Analysis, p. 132, he says: "In Example 2.2-7 the functions were assumed to be real-valued. In certain cases that restriction can be removed, so that complex-valued functions are considered. These functions form a vector space, which becomes an inner product space if we define"
\begin{equation}
\langle x, y\rangle = \int_a^b x(t)\,\overline{y(t)}\,dt
\end{equation}
This also gives the corresponding norm:
\begin{equation}
\|x\| = \Big(\int_a^b |x(t)|^2\,dt\Big)^{1/2}
\end{equation}
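If I understand this correctly, that is just the norm induced by the inner product, since
\begin{equation}
\langle x, x\rangle = \int_a^b x(t)\,\overline{x(t)}\,dt = \int_a^b |x(t)|^2\,dt,
\end{equation}
so \|x\| = \langle x, x\rangle^{1/2}.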
And then Kreyszig ends with: "The completion of the metric space corresponding to the inner product for complex functions (which I just gave above) is the complex space L^2[a,b]."
I would like to prove that "my" matrix satisfies this condition too. So, because the general solution given above does not explicitly show whether it satisfies this inner product, can I use the eigenvectors in the inner product formula to conclude that the matrix eigenvectors form a Hilbert space?
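For concreteness, if the scalar inner product above is extended entry-wise to vector-valued functions (my assumption, since \psi takes values in C^2), checking \psi on [a,b] would amount to evaluating
\begin{equation}
\langle \psi, \psi\rangle = \int_a^b \psi(t)^{*}\,\psi(t)\,dt = \int_a^b \big\|\alpha_1 v_1 e^{\lambda_1 t} + \alpha_2 v_2 e^{\lambda_2 t}\big\|^2\,dt,
\end{equation}
where \psi^{*} is the conjugate transpose and \|\cdot\| is the Euclidean norm on C^2. Is this the right way to connect the general solution to the L^2[a,b] inner product?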