Is the result of a vector inner product retained after matrix multiplication?

In summary, the conversation discusses whether the inner product of two transformed vectors remains positive when the original vectors have a positive inner product and the transformation matrix has real entries. For parallel vectors the transformed inner product is guaranteed to be non-negative, because ##W^T W## is positive semi-definite, but for linearly independent vectors it can become negative, as a counterexample in the thread shows. The conclusion is that the sign of the transformed inner product depends on the chosen transformation and cannot be guaranteed to be positive.
  • #1
Master1022
Homework Statement
Let us imagine that we have two vectors ## \vec{a} ## and ## \vec{b} ## that point in similar directions, so that their inner product evaluates to a positive number. If we now multiply both vectors by a matrix ## W ## with real entries, will the inner product of the 'transformed' vectors also be positive?
Relevant Equations
Inner product
Hi,

I was thinking about the following problem, but I couldn't think of any conclusive reasons to support my idea.

Question:
Let us imagine that we have two vectors ## \vec{a} ## and ## \vec{b} ## that point in similar directions, so that their inner product evaluates to a positive number. If we now multiply both vectors by a matrix ## W ## with real entries, will the inner product of the 'transformed' vectors also be positive?

Attempt:

Intuitively, I think along these lines: if we imagine the operation as transforming a vector in some way, then the two vectors ## \vec{a}## and ## \vec{b}##, which pointed in similar directions, should be transformed into vectors that still point in similar directions?

Mathematically, I can write the following:
[tex] \langle W \vec{a} , W \vec{b} \rangle = (W \vec{a}) \cdot (W \vec{b}) = (W \vec{a})^{T} (W \vec{b}) = \vec{a}^{T} W^{T} W \vec{b} [/tex]

- ## W^{T} W ## is positive semi-definite.
- I suppose if ## \vec{a} ## and ## \vec{b} ## are not parallel, then whether (and how strongly) positive the outcome is will depend on some kind of sensitivity of ## W ##? That is, we could view ## \vec{b} = \vec{a} + \vec{\epsilon} ## and reason about that?
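
As a quick numerical sanity check of the identity above, here is a minimal NumPy sketch (the particular vectors and matrix are made-up examples, not part of the original problem):

[code=python]
# Verify that <Wa, Wb> equals a^T W^T W b for arbitrary example inputs.
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([1.5, 1.8])     # points in a similar direction to a
W = np.array([[2.0, -1.0],
              [0.5,  3.0]])  # an arbitrary real matrix

lhs = np.dot(W @ a, W @ b)   # <Wa, Wb>
rhs = a @ W.T @ W @ b        # a^T W^T W b
print(lhs, rhs)              # both print the same value
[/code]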

Any help is greatly appreciated.
 
Physics news on Phys.org
  • #2
Orodruin
If ##\vec a## and ##\vec b## are parallel, then the assertion follows directly from ##W^T W## being positive semi-definite (as long as you change the statement to be non-negative instead of positive - it should be pretty obvious that the result can be zero if ##W^T W## happens to have eigenvalues that are zero, such as when ##W = 0##).
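
To spell out that first step: writing ##\vec b = \lambda \vec a## with ##\lambda > 0## (which is forced by the original inner product being positive), we get
[tex] \langle W \vec{a} , W \vec{b} \rangle = \lambda \, \vec{a}^{T} W^{T} W \vec{a} = \lambda \, \| W \vec{a} \|^{2} \geq 0. [/tex]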

However, consider what happens when you let ##\vec a## and ##\vec b## be linearly independent and ask yourself the following questions:
  • Taking any set of two vectors ##\vec c## and ##\vec d##, can you find a linear transformation ##W## such that ##W\vec a = \vec c## and ##W \vec b = \vec d##?
  • What does this mean for your assertion?
 
  • #3
Master1022
Thanks for the reply, @Orodruin!
Orodruin said:
If ##\vec a## and ##\vec b## are parallel, then the assertion follows directly from ##W^T W## being positive semi-definite (as long as you change the statement to be non-negative instead of positive - it should be pretty obvious that the result can be zero if ##W^T W## happens to have eigenvalues that are zero, such as when ##W = 0##).
Agreed

Orodruin said:
However, consider what happens when you let ##\vec a## and ##\vec b## be linearly independent and ask yourself the following questions:
  • Taking any set of two vectors ##\vec c## and ##\vec d##, can you find a linear transformation ##W## such that ##W\vec a = \vec c## and ##W \vec b = \vec d##?
I think so... I could imagine forming some matrix equation like:
[tex] W [ \vec{a} \quad \vec{b}] = [\vec{c} \quad \vec{d}] [/tex]
and if ## \vec{a} ## and ## \vec{b} ## are linearly independent, then the matrix is full-rank and that can be inverted to find ## W ##.
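
For instance, here is a minimal NumPy sketch of this construction (the vectors are illustrative choices):

[code=python]
# Construct W with W a = c and W b = d by inverting the matrix [a b].
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])      # a and b are linearly independent
c = np.array([2.0, 3.0])      # arbitrary target vectors
d = np.array([-1.0, 4.0])

AB = np.column_stack([a, b])  # the matrix [a  b]
CD = np.column_stack([c, d])  # the matrix [c  d]
W = CD @ np.linalg.inv(AB)    # solves W [a b] = [c d]

print(np.allclose(W @ a, c), np.allclose(W @ b, d))  # True True
[/code]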

Orodruin said:
  • What does this mean for your assertion?
That might suggest that the outcome of the transformed inner product is still positive? That doesn't necessarily seem correct, but I am still confused...
 
  • #4
PeroK
Master1022 said:
That might suggest that the outcome of the transformed inner product is still positive? That doesn't necessarily seem correct, but I am still confused...
It took me about 30 seconds to find a counterexample. Hint: let the vectors be ##(1,0)## and ##(1, 1)##. Look for a 2 x 2 matrix that maps ##(1, 0)## to itself and ##(1, 1)## to something in approximately the opposite direction.
 
  • #5
Orodruin
Master1022 said:
That might suggest that the outcome of the transformed inner product is still positive? That doesn't necessarily seem correct, but I am still confused...
So you are free to choose any ##\vec c## and ##\vec d##. Are there any such pairs with a negative inner product?
 
  • #6
Master1022
PeroK said:
It took me about 30 seconds to find a counterexample. Hint: let the vectors be ##(1,0)## and ##(1, 1)##. Look for a 2 x 2 matrix that maps ##(1, 0)## to itself and ##(1, 1)## to something in approximately the opposite direction.
Okay, thanks @PeroK! That makes sense, and yes, I can see how the answer to my question was 'not necessarily'.
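
For completeness, one concrete matrix realizing the hint (a small check; these particular entries are one choice consistent with the hint, obtained by requiring ##W(1,0)^T = (1,0)^T## and ##W(1,1)^T = (-1,0)^T##):

[code=python]
# A counterexample: W fixes (1, 0) and sends (1, 1) to (-1, 0),
# so the inner product flips from positive to negative.
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
W = np.array([[1.0, -2.0],
              [0.0,  0.0]])

print(a @ b)              # 1.0  (positive before the transformation)
print((W @ a) @ (W @ b))  # -1.0 (negative afterwards)
[/code]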
 

FAQ: Is the result of a vector inner product retained after matrix multiplication?

What is a vector inner product?

A vector inner product, also known as a dot product, is a mathematical operation that takes two vectors and returns a scalar value by multiplying their corresponding components and summing the results.
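
In symbols, for vectors ##\vec{a}, \vec{b} \in \mathbb{R}^{n}##:
[tex] \vec{a} \cdot \vec{b} = \sum_{i=1}^{n} a_i b_i = a_1 b_1 + a_2 b_2 + \dots + a_n b_n [/tex]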

How is a vector inner product used in matrix multiplication?

In matrix multiplication, each element of the resulting matrix is computed as a dot product: the element in row ##i## and column ##j## of the product is the dot product of row ##i## of the first matrix with column ##j## of the second.
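
In symbols, if ##C = AB##:
[tex] C_{ij} = \sum_{k} A_{ik} B_{kj} [/tex]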

Is the result of a vector inner product retained after matrix multiplication?

Not in general. As the discussion above shows, multiplying both vectors by a matrix ##W## changes the inner product to ##\vec{a}^{T} W^{T} W \vec{b}##, whose value, and even its sign, depends on ##W##. For parallel vectors the transformed inner product stays non-negative because ##W^{T} W## is positive semi-definite, but for linearly independent vectors it can become negative, as the counterexample in the thread demonstrates.

How does the order of matrix multiplication affect the result of the vector inner product?

The dot product itself is symmetric, so swapping the two vectors does not change the result. Matrix multiplication, on the other hand, is not commutative in general, so the order in which matrices are applied to the vectors does matter.
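
In symbols:
[tex] \vec{a} \cdot \vec{b} = \vec{b} \cdot \vec{a}, \qquad \text{but in general } AB \neq BA. [/tex]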

Can the vector inner product be used to multiply matrices of different dimensions?

No. The vector inner product is only defined for two vectors of the same dimension. In matrix multiplication, this appears as the requirement that the number of columns of the first matrix equal the number of rows of the second; matrices whose dimensions do not match in this way cannot be multiplied.
