Linear Algebra - Multiplying a matrix and vector.

In summary, the conversation discusses a possible mistake in a book diagram showing a column vector premultiplying a square matrix, and the correct convention for matrix multiplication. It is also noted that matrix multiplication can be defined in different ways, such as the standard row-by-column rule and more general index-based rules.
  • #1
jinksys
I am reading OpenGL SuperBible 5th Edition and noticed something off.
[Image: diagram from the book (2jMWHl.jpg) showing the vector written on the left of the matrix.]


I just took linear algebra and was always told to make sure that the rows and columns match when multiplying two matrices. The vector should be on the right side of the matrix, correct?
 
  • #2
Since the matrix is diagonal, it doesn't matter.
 
  • #3
mathman said:
Since the matrix is diagonal, it doesn't matter.


How so?
 
  • #4
You are correct. That's a mistake; the vector should be on the right.
 
  • #5
Maybe that book defines matrix multiplication in a different way, but your example is too simple to tell. Do you have a less trivial example?
 
  • #6
jinksys said:
I am reading OpenGL SuperBible 5th Edition and noticed something off.
[Image: diagram from the book (2jMWHl.jpg) showing the vector written on the left of the matrix.]


I just took linear algebra and was always told to make sure that the rows and columns match when multiplying two matrices. The vector should be on the right side of the matrix, correct?

You are correct. The diagram incorrectly places the column vector on the left, as if it were premultiplying the square matrix.
A matrix can be premultiplied by a row vector to yield another row vector, but there is no standard definition for premultiplying a matrix by a column vector.
 
  • #7
I believe mathman was referring to
[tex]\begin{bmatrix}a & b & c & d \end{bmatrix}\begin{bmatrix}t & 0 & 0 & 0 \\ 0 & u & 0 & 0 \\ 0 & 0 & v & 0 \\ 0 & 0 & 0 & w\end{bmatrix}[/tex]
and
[tex]\begin{bmatrix}t & 0 & 0 & 0 \\ 0 & u & 0 & 0 \\ 0 & 0 & v & 0 \\ 0 & 0 & 0 & w\end{bmatrix}\begin{bmatrix}a \\ b\\ c \\ d\end{bmatrix}[/tex]

Of course, that is not what is shown in the first post.
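A quick numerical check of mathman's point (a sketch, not from the thread; the values of D and v are made up for illustration): when the matrix is diagonal, the row-vector product and the column-vector product come out with the same components.

```python
# For a diagonal matrix D, compare the row-vector product v*D with the
# column-vector product D*v (v read as a column). Components agree.
D = [[2, 0, 0, 0],
     [0, 3, 0, 0],
     [0, 0, 5, 0],
     [0, 0, 0, 7]]
v = [1, 10, 100, 1000]

# Row vector times matrix: (v*D)_j = sum_i v_i * D[i][j]
row_times_D = [sum(v[i] * D[i][j] for i in range(4)) for j in range(4)]

# Matrix times column vector: (D*v)_i = sum_j D[i][j] * v_j
D_times_col = [sum(D[i][j] * v[j] for j in range(4)) for i in range(4)]

print(row_times_D)  # [2, 30, 500, 7000]
print(D_times_col)  # [2, 30, 500, 7000]
```

Because D has only diagonal entries, each sum collapses to a single term, so both products just scale each component of v by the matching diagonal entry. For a non-diagonal matrix the two products generally differ.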
 
  • #8
Hi, jinksys!

Nice of you to spot that error.

You should first learn the standard row-by-column multiplication, although it is perfectly possible to define more general multiplication rules.
(This is done, for example, with the highly general index-notation method.)
 

FAQ: Linear Algebra - Multiplying a matrix and vector.

What is the process for multiplying a matrix and vector in linear algebra?

In linear algebra, multiplying a matrix by a vector means taking the dot product of each row of the matrix with the vector: the elements of the row are multiplied by the corresponding elements of the vector and the products are summed. The result is a new vector with one entry per row of the matrix.
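The row-by-row dot-product rule can be sketched in a few lines (an illustrative helper, not code from the thread; the name `mat_vec` and the sample values are assumptions):

```python
def mat_vec(A, x):
    """Each entry of the result is the dot product of one row of A with x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

A = [[1, 2],
     [3, 4],
     [5, 6]]   # 3x2 matrix
x = [10, 1]    # length-2 column vector

print(mat_vec(A, x))  # [12, 34, 56] -- one entry per row of A
```

Note that the 3x2 matrix maps a length-2 vector to a length-3 vector, matching the "same number of rows as the matrix" rule above.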

What is the difference between multiplying a matrix by a vector and multiplying two matrices?

Multiplying a matrix by a vector produces a new vector, while multiplying two matrices produces a new matrix. The underlying rule is the same in both cases: each entry of the result is the dot product of a row of the left factor with a column of the right factor. A matrix-vector product is simply the special case in which the right factor is an n×1 matrix.

Can a matrix and vector with different dimensions be multiplied together?

No, a matrix and a vector can only be multiplied together if their dimensions are compatible: the number of columns in the matrix must equal the number of entries in the column vector. Otherwise, the multiplication is not defined.
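The dimension rule can be made explicit with a guard clause (a minimal sketch, assuming a hypothetical helper `mat_vec`; not code from the thread):

```python
def mat_vec(A, x):
    # Defined only when the number of columns of A equals the length of x.
    if any(len(row) != len(x) for row in A):
        raise ValueError("matrix columns must match vector length")
    return [sum(a * b for a, b in zip(row, x)) for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]               # 2x3 matrix

print(mat_vec(A, [1, 0, 1]))  # [4, 10] -- 3 columns, length-3 vector: defined

try:
    mat_vec(A, [1, 0])        # length-2 vector: product is not defined
except ValueError as e:
    print("error:", e)
```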

What is the purpose of multiplying a matrix and vector in linear algebra?

Multiplying a matrix and vector is a fundamental operation in linear algebra and is used to transform the vector into a new vector with potentially different values. This is useful in various applications, such as solving systems of linear equations and performing transformations in geometric spaces.

Are there any special rules or properties for multiplying a matrix and vector in linear algebra?

Yes, there are a few special rules and properties. The order of multiplication matters: Av is defined for a column vector v on the right, while vA requires a row vector on the left. The resulting vector has the same number of entries as the matrix has rows. Matrix multiplication is associative and distributive over addition, but it is not commutative.
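Non-commutativity is easy to demonstrate with two small matrices (an illustrative example, not from the thread; the helper `mat_mul` and the sample matrices are assumptions):

```python
def mat_mul(A, B):
    """Standard row-by-column matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]  # permutation matrix

print(mat_mul(A, B))  # [[2, 1], [4, 3]] -- B on the right swaps A's columns
print(mat_mul(B, A))  # [[3, 4], [1, 2]] -- B on the left swaps A's rows
```

The two products differ, so AB ≠ BA in general.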
