Hermitian adjoint operators (simple "proofs")

In summary: the adjoint of ##T:V\to W## is a map ##T^*:W\to V##, so ##T## and ##T^*## don't even act on the same space; the defining equation relates the action of ##T## on ##V## to the action of ##T^*## on ##W##. There is no need to introduce a new space ##U##: the stated identities follow from the definition of the adjoint, the linearity of ##T## and ##T^*##, and the properties of the scalar product. For question 4, the hint is that ##Tv## is a vector.
  • #1
Incand

Homework Statement


I'm having some trouble with questions asking me to "show" or "prove" instead of computing an answer, so I'm looking for some input on whether I'm actually doing what I'm supposed to or not (and for the last one I don't really know where to get started).
1. Show that ##T^*## is linear.
2. Show that ##(T^*)^* = T##.
3. Show that ##\langle v,Tw\rangle = \langle T^*v,w\rangle##.
4. Show that ##(ST)^* = T^*S^*##.

I realize this is quite a lot, so even taking a look at one of the above would be very kind!

Homework Equations


The adjoint to the linear operator ##T:V \to W## is the mapping ##T^*:W \to V## that is defined by ##\langle Tv,w \rangle = \langle v,T^*w\rangle## for all ##v\in V## and ##w \in W##.
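(Not part of the problem, but a concrete sanity check of this definition: for matrices with the standard inner product, the adjoint is the conjugate transpose. A minimal NumPy sketch, assuming the thread's convention that the scalar product is conjugate-linear in its second argument:)

```python
# Illustrative numerical check (not a proof): for matrices, the adjoint with
# respect to the standard inner product is the conjugate transpose.
import numpy as np

rng = np.random.default_rng(0)

def inner(u, v):
    # Scalar product used in this thread: linear in the first argument,
    # conjugate-linear in the second.
    return np.sum(u * np.conj(v))

# A random complex T : V -> W with dim V = 3, dim W = 4.
T = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))
T_star = T.conj().T  # candidate adjoint, W -> V

v = rng.standard_normal(3) + 1j * rng.standard_normal(3)  # v in V
w = rng.standard_normal(4) + 1j * rng.standard_normal(4)  # w in W

# <Tv, w> should equal <v, T* w>
print(np.isclose(inner(T @ v, w), inner(v, T_star @ w)))  # True
```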

The Attempt at a Solution


1. Using the definition for adjoint and the scalar product we have
##\langle v, T^*(w+u)\rangle = \langle Tv,w+u \rangle = \langle Tv,w\rangle + \langle Tv,u\rangle = \langle v,T^*w\rangle + \langle v,T^*u\rangle = \langle v,T^*w +T^* u \rangle##.
And for scalars, ##\langle v, T^*(\alpha w)\rangle = \langle Tv,\alpha w\rangle = \bar \alpha \langle Tv,w\rangle = \bar \alpha \langle v, T^* w \rangle = \langle v, \alpha T^*w\rangle##.
Since ##v## is any vector in ##V##, we must have ##T^*(w+u) = T^*w+T^*u## and ##T^*(\alpha w) = \alpha T^*w##, and that shows the linearity.
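(To spell out the "since ##v## is arbitrary" step via nondegeneracy of the scalar product:)
$$\langle v,\; T^*(w+u) - T^*w - T^*u\rangle = 0 \quad \text{for all } v\in V,$$
so choosing ##v = T^*(w+u) - T^*w - T^*u## gives ##\|T^*(w+u) - T^*w - T^*u\|^2 = 0##, hence ##T^*(w+u) = T^*w + T^*u##; the scalar case is handled the same way.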

2. From the definition and using the conjugate symmetry of the scalar product we have
##\langle T^*v,w \rangle = \langle v, (T^*)^*w\rangle = \overline{ \langle (T^*)^*w,v \rangle }##
But we also have
##\langle T^*v,w \rangle = \overline{\langle w, T^*v\rangle}= \overline{\langle Tw,v\rangle } ##. Again, since ##v## is any vector in ##W##, we have ##(T^*)^* = T##.

3. As I understand the question, this means that the operators switch roles, i.e. ##T: W\to V## and ##T^*:V\to W##. Is this true, or am I supposed to show this when the operators don't even operate on the same space? If it's as I imagine, doesn't this follow immediately from (2)?

4. This is the one I'm not sure how to do. Writing out the definition and using (2) I have
##\langle STv , w\rangle = \langle v, (ST)^*w\rangle = \langle (S^*T^*)^*v,w \rangle##, and I don't know how to push ahead.
 
  • #2
Your answers to 1-2 look good to me. For 3, there's no law that says that T must act on V into W. An arbitrary linear operator T can just as well act on W into V. And, for 3, T is just such an operator.

For 4, hint ##Tv## is a vector.
 
  • #3
PeroK said:
Your answers to 1-2 look good to me. For 3, there's no law that says that T must act on V into W. An arbitrary linear operator T can just as well act on W into V. And, for 3, T is just such an operator.

For 4, hint ##Tv## is a vector.
Thank you for going over the exercises! So for (3) is ##T## an operator ##T:U\to U## where ##U = V\cup W##? Or did I misunderstand you?
If this is true then ##\langle v, Tw \rangle = \overline{\langle Tw,v \rangle } = \overline{\langle w,T^*v \rangle} = \langle T^*v,w\rangle##.

That hint made it a whole lot easier: ##\langle v,(ST)^*w \rangle = \langle STv,w\rangle = \langle Tv,S^*w \rangle = \langle v,T^*S^*w \rangle##.
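For what it's worth, a quick matrix-level check of ##(ST)^* = T^*S^*## (illustrative only; for matrices the adjoint is the conjugate transpose, and the shapes below are just an example):

```python
import numpy as np

rng = np.random.default_rng(1)
# Example shapes: T maps C^3 -> C^4 and S maps C^4 -> C^2, so ST maps C^3 -> C^2.
T = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))
S = rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))

lhs = (S @ T).conj().T           # (ST)*
rhs = T.conj().T @ S.conj().T    # T* S*
print(np.allclose(lhs, rhs))     # True
```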
 
  • #4
Incand said:
Thank you for going over the exercises! So for (3) is ##T## an operator ##T:U\to U## where ##U = V\cup W##? Or did I misunderstand you?
If this is true then ##\langle v, Tw \rangle = \overline{\langle Tw,v \rangle } = \overline{\langle w,T^*v \rangle} = \langle T^*v,w\rangle##.

That hint made it a whole lot easier: ##\langle v,(ST)^*w \rangle = \langle STv,w\rangle = \langle Tv,S^*w \rangle = \langle v,T^*S^*w \rangle##.

In 3, you can infer that ##T: W \rightarrow V##
 
  • #5
PeroK said:
In 3, you can infer that ##T: W \rightarrow V##
Alright so it's enough that ##T: W \rightarrow V## with ##T:U\to U## being a special case. But other than that it's correct?
 
  • #6
Incand said:
Alright so it's enough that ##T: W \rightarrow V## with ##T:U\to U## being a special case. But other than that it's correct?
Yes. In general, the linear operator may map one inner product space to another, or the two spaces could be the same.
 

FAQ: Hermitian adjoint operators (simple "proofs")

1. What is a Hermitian adjoint operator?

The Hermitian adjoint of a linear operator ##T## is the operator ##T^*## defined by ##\langle Tv,w\rangle = \langle v,T^*w\rangle## for all vectors ##v## and ##w##. For a matrix, the adjoint is the complex conjugate transpose; it is also known simply as the adjoint.

2. How is the Hermitian adjoint operator related to Hermitian matrices?

The adjoint of a matrix is its conjugate transpose, and a Hermitian matrix is one that is equal to its own adjoint.
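For example,
$$A = \begin{pmatrix} 1 & 2+i \\ 2-i & 3 \end{pmatrix} = \overline{A}^{\,T},$$
so ##A## is Hermitian: taking the conjugate transpose gives back the same matrix.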

3. What is the significance of Hermitian adjoint operators?

The adjoint is used to define several important classes of operators: self-adjoint (Hermitian) operators, which equal their own adjoint and have real eigenvalues, and unitary operators, whose adjoint is their inverse. Adjoints are used in quantum mechanics to represent observables and in signal processing for filtering and noise reduction.
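In symbols, for an operator ##T## and an operator ##U## on an inner product space:
$$T^* = T \;\;(\text{self-adjoint}), \qquad U^*U = UU^* = I \;\;(\text{unitary}).$$
For a self-adjoint ##T## with ##Tv = \lambda v## and ##v \neq 0##, we get ##\lambda\langle v,v\rangle = \langle Tv,v\rangle = \langle v,Tv\rangle = \overline{\lambda}\langle v,v\rangle##, so ##\lambda## is real.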

4. How can you prove that an operator is Hermitian adjoint?

To show that an operator ##S## is the adjoint of ##T##, verify that it satisfies the defining relation ##\langle Tv,w\rangle = \langle v,Sw\rangle## for all vectors ##v## and ##w##. This is usually done using the properties of the inner product (linearity in one argument and conjugate symmetry) and, for matrices, the properties of the conjugate transpose.
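For example, with the standard inner product ##\langle x,y\rangle = \sum_i x_i\overline{y_i}## on ##\mathbb{C}^n## (conjugate-linear in the second argument, as in the thread above) and a matrix ##A##, the conjugate transpose ##A^*## satisfies the defining relation:
$$\langle Ax, y\rangle = \sum_{i,j} A_{ij}x_j\overline{y_i} = \sum_j x_j\,\overline{\sum_i \overline{A_{ij}}\,y_i} = \langle x, A^*y\rangle.$$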

5. Are there any simple examples of Hermitian adjoint operators?

Yes. The identity operator is its own adjoint. On a real inner product space the adjoint of a matrix operator is its transpose (real numbers are their own complex conjugates), so any symmetric real matrix is its own adjoint.
