Understanding Matrices: Questions & Answers

In summary, matrix theory extends to complex numbers with essentially no change to the rules, and some beautiful theorems hold for complex matrices that fail for real matrices. It is possible to define 4D-matrices by taking matrices of matrices, but such structures lack many good properties, and there is no standard multiplication for 3D-matrices, although a ternary product can be defined. Tensors can also be rewritten as matrices, but this does not correspond to a "matrix of matrices" structure.
  • #1
Char. Limit
I have a few questions relating to matrices.

1. All of the matrices I've worked with so far dealt with real numbers or real functions of real numbers. Can you work instead in complex numbers, and do you have to add or remove any rules because of this?

2. All of the matrices I've worked with so far have been in 2d, that is, they've all been m-by-n matrices. However, I was wondering if mathematicians explored the idea of m-by-n-by-o matrices, or three-dimensional matrices.
 
  • #2
Char. Limit said:
I have a few questions relating to matrices.
1. All of the matrices I've worked with so far dealt with real numbers or real functions of real numbers. Can you work instead in complex numbers, and do you have to add or remove any rules because of this?

Yes, the theory of complex matrices is very similar to that of real matrices. There are no extra rules that I know of. In fact, working with complex matrices simplifies a lot, in the sense that some beautiful theorems hold for complex matrices but not for real matrices. One thing that comes to mind is that every complex matrix can be transformed into a triangular matrix (this is Schur's theorem), while real matrices do not always have that property.
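To see this concretely, here is a minimal Python sketch (assuming NumPy and SciPy are available; scipy.linalg.schur computes the Schur form). A real rotation matrix has no real eigenvalues, so it cannot be triangularized over the reals, but its complex Schur form is upper triangular:

[code]
import numpy as np
from scipy.linalg import schur

# A real rotation by 90 degrees: no real eigenvalues, hence no real
# triangularization.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# Over the complex numbers, Schur's theorem gives A = Z T Z^H with T
# upper triangular; the eigenvalues +i and -i appear on its diagonal.
T, Z = schur(A, output='complex')
print(np.round(T, 10))
print(np.allclose(Z @ T @ Z.conj().T, A))  # True
[/code]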

Sometimes, however, you do need to add some rules. For example, when you start working with inner products, you will make separate definitions for complex and real matrices. Another example is Hermitian matrices, which are the complex analogue of real symmetric matrices.
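Both points can be illustrated with a small NumPy sketch (note that np.vdot conjugates its first argument, which is exactly the complex inner product convention):

[code]
import numpy as np

# Complex inner product: <x, y> = x^H y, conjugating one argument.
x = np.array([1 + 1j, 2 - 1j])
y = np.array([2j, 3 + 0j])
print(np.vdot(x, y))           # np.vdot conjugates its first argument
print(np.vdot(x, x).real > 0)  # <x, x> is real and positive

# A Hermitian matrix satisfies A = A^H, the complex analogue of A = A^T.
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
print(np.allclose(A, A.conj().T))  # True
print(np.linalg.eigvalsh(A))       # real eigenvalues, as in the symmetric case
[/code]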

In fact, almost all of the matrix theory that you've seen with real numbers carries over to matrices over a general field (or even a commutative ring). There is no need for complex or real numbers. One can also look at rational matrices, for instance (but this theory is less satisfactory than that of real or complex matrices).

I actually find it strange that you've not worked with complex matrices so far. I find it very likely that they will be introduced soon, because there is no reason to restrict attention to real matrices.

2. All of the matrices I've worked with so far have been in 2d, that is, they've all been m-by-n matrices. However, I was wondering if mathematicians explored the idea of m-by-n-by-o matrices, or three-dimensional matrices.

This is a very good question; I have asked myself the same question many times. But sadly, I don't know how you would define multiplication.

I'm pretty sure that the theory exists, but I don't know what it is. The reason it isn't well known is probably that there is no need for 3D-matrices. Matrices are handy because they represent linear and bilinear maps, because they represent linear systems of equations, etc. I don't see what extra benefits 3D-matrices would give. Plus, it's quite hard to visualize 3D-matrices...

But I'm hoping that somebody else will comment on this, because I too want to know the answer to this question.
 
  • #3
Extra: while I don't know how you would define 3D-matrices, I do know how to define 4D-matrices:

Just take a matrix whose entries are matrices themselves. Thus you take [tex]M_n(M_m(\mathbb{K}))[/tex]. This can be visualized as a 4D-matrix. However, since matrix multiplication is not commutative, matrices over matrices don't have a lot of good properties. For example, there is no easy notion of a determinant for these matrices (this is actually an active field of research).
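As a sanity check on this picture, here is a minimal NumPy sketch: a matrix of matrices stored as a 4D array, with blockwise multiplication. Flattening the blocks recovers ordinary matrix multiplication, so the 4D structure really is just an (nm)-by-(nm) matrix in disguise:

[code]
import numpy as np

# An n-by-n matrix whose entries are m-by-m matrices: a 4D array of
# shape (n, n, m, m), where A[i, j] is the block in row i, column j.
n, m = 2, 3
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n, m, m))
B = rng.standard_normal((n, n, m, m))

# Block multiplication: (A*B)[i,j] = sum_k A[i,k] @ B[k,j] -- the usual
# matrix product, but with matrix-valued entries.
C = np.einsum('ikab,kjbc->ijac', A, B)

# Flattening the blocks into an (n*m)-by-(n*m) matrix gives the same
# result as ordinary matrix multiplication.
def flatten(M):
    return M.transpose(0, 2, 1, 3).reshape(n * m, n * m)

print(np.allclose(flatten(C), flatten(A) @ flatten(B)))  # True
[/code]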
 
  • #4
micromass said:
Extra: while I don't know how you would define 3D-matrices, I do know how to define 4D-matrices: just take a matrix whose entries are matrices themselves, [tex]M_n(M_m(\mathbb{K}))[/tex]. [...]

Well, if we can take matrices of matrices to get 4-d matrices, is it perhaps possible to take a vector (which could be considered a 1-d matrix), and then have each element of the vector be a 2-d matrix? Or perhaps the reverse option?
 
  • #5
Yes, this is of course possible, but I don't see a way to define a suitable multiplication on it. If you have a vector of matrices, then the only ways you could multiply those would give you an n-by-n matrix of matrices or a 1-by-1 matrix of matrices. However, you would want to get a vector of matrices again.
 
  • #6
In some ways you can consider a tensor as a sort of "n dimensional matrix".

In engineering, tensor equations are often rewritten as matrix equations, possibly because engineers don't like tensors, but also to make it easy to use numerical methods and computer software that already exist for matrices.

For example, the stress-strain relationship for a material is really an equation involving two second-order tensors and one fourth-order tensor, but it is often written as a matrix equation involving a 6x6 matrix, even though the matrix form obscures how the elements transform between coordinate systems, compared with the tensor form.

The fourth-order tensor in the stress-strain relationship has 81 elements, but for an isotropic material the large number of symmetries means that there are only two independent quantities. So the whole tensor can be defined by two physical parameters; for example, one way to do it is using Young's modulus and Poisson's ratio. Even for the most general anisotropic material, the symmetry conditions (the stress and strain tensors are symmetric, and a strain-energy function exists) mean there are "only" 21 independent material properties, not 81.
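As an illustration, here is a short NumPy sketch with made-up, steel-like values for E and nu. The isotropic stiffness tensor is [tex]C_{ijkl}=\lambda\delta_{ij}\delta_{kl}+\mu(\delta_{ik}\delta_{jl}+\delta_{il}\delta_{jk})[/tex], so all 81 entries come from two parameters, and collapsing the symmetric index pairs gives the 6x6 Voigt matrix:

[code]
import numpy as np

# Two material parameters (hypothetical, steel-like values).
E, nu = 200e9, 0.3
lam = E * nu / ((1 + nu) * (1 - 2 * nu))  # Lame's first parameter
mu = E / (2 * (1 + nu))                   # shear modulus

# Isotropic fourth-order stiffness tensor built from the two parameters.
d = np.eye(3)
C = (lam * np.einsum('ij,kl->ijkl', d, d)
     + mu * (np.einsum('ik,jl->ijkl', d, d) + np.einsum('il,jk->ijkl', d, d)))
print(C.size)  # 81 entries in the full tensor

# Voigt mapping: the index pairs (0,0),(1,1),(2,2),(1,2),(0,2),(0,1)
# become the 6 rows/columns of a 6x6 matrix.
pairs = [(0, 0), (1, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
V = np.array([[C[i, j, k, l] for (k, l) in pairs] for (i, j) in pairs])
print(V.shape)  # (6, 6)
[/code]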

But this rewriting of tensors as matrices doesn't look like a "matrix of matrices".

As micromass said, the problem with the idea of an "m x n x o matrix" is that it is hard to define anything corresponding to matrix multiplication, except by "slicing" it into a vector of conventional matrices - but as soon as you do that, you are treating one of the three "dimensions" as special compared with the other two.
 
  • #7
What you can do on 3D-matrices is perform some kind of ternary multiplication (that is, a multiplication with 3 arguments).

Remember that multiplication for matrices is defined as

[tex](A\cdot B)_{i,j}=\sum_{k=1}^n{A_{i,k}B_{k,j}}[/tex]

This generalizes to 3D-matrices in the following way: for 3D-matrices A, B and C, we define [A,B,C] as

[tex][A,B,C]_{i,j,l}=\sum_{k=1}^n{A_{i,j,k}B_{i,k,l}C_{k,j,l}}[/tex]

and this can be generalized to multidimensional matrices.

Now, what good properties does this ternary multiplication have? Well, the binary operation [tex]\cdot[/tex] is associative, so it is natural to ask for an analogous property of the ternary multiplication.
We call this property para-associativity, and it says that

[tex][[a,b,c],d,e]=[a,[b,c,d],e]=[a,b,[c,d,e]][/tex]

I strongly suspect that ternary multiplication satisfies this.

For more information about ternary operations and the structures that have this, see http://en.wikipedia.org/wiki/Heap_(mathematics)
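Here is a minimal NumPy sketch of this ternary product (the function name ternary is just for illustration; np.einsum performs the sum over k):

[code]
import numpy as np

def ternary(A, B, C):
    # [A,B,C]_{i,j,l} = sum_k A_{i,j,k} * B_{i,k,l} * C_{k,j,l}
    return np.einsum('ijk,ikl,kjl->ijl', A, B, C)

rng = np.random.default_rng(0)
n = 3
A, B, C = (rng.standard_normal((n, n, n)) for _ in range(3))
print(ternary(A, B, C).shape)  # (3, 3, 3): the product is again a 3D-matrix
[/code]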
 
  • #8
micromass said:
What you can do on 3D-matrices is perform some kind of ternary multiplication (that is, a multiplication with 3 arguments). [...] I strongly suspect that ternary multiplication satisfies this.

You suspect, yes. But how would we prove that? I mean, we have a formula for the entries of the ternary product. Can we use that to prove para-associativity?

I'm going to go try that now.
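One cheap experiment to run before attempting a proof: evaluate the three bracketings on random 3D-matrices and compare them. Agreement on random inputs is only evidence, not a proof, but a mismatch would be an outright counterexample:

[code]
import numpy as np

def ternary(A, B, C):
    # [A,B,C]_{i,j,l} = sum_k A_{i,j,k} * B_{i,k,l} * C_{k,j,l}
    return np.einsum('ijk,ikl,kjl->ijl', A, B, C)

rng = np.random.default_rng(1)
a, b, c, d, e = (rng.standard_normal((3, 3, 3)) for _ in range(5))

left = ternary(ternary(a, b, c), d, e)
middle = ternary(a, ternary(b, c, d), e)
right = ternary(a, b, ternary(c, d, e))

# Do the three bracketings of [a,b,c,d,e] agree?
print(np.allclose(left, middle), np.allclose(middle, right))
[/code]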
 
  • #9
Well, yeah, use the formula to prove it and do a lot of manipulation. I'm really bad at such things, so I won't try it :biggrin: But do tell us the result!

After para-associativity, there are some other things that ternary multiplication could satisfy:
- Is there a zero element? That is, is there a 0 such that [0,x,y]=[x,y,0]=0? This should be the zero matrix, but is it unique?
- Is there an identity? That is, is there a matrix 1 such that [1,1,x]=[x,1,1]=x for all [tex]x\neq 0[/tex]? And is this 1 unique?
- How does ternary multiplication behave w.r.t. addition and scalar multiplication? That is, is it true that [tex][\alpha x+\beta y,c,d]=\alpha[x,c,d]+\beta[y,c,d][/tex]?

These are some exciting questions you could ask about this multiplication; the last one is checked numerically below.
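The last question is the easiest to settle: the defining formula is a sum of terms in which each argument appears exactly once, so the product should be linear in every slot. A quick numerical confirmation, reusing the ternary sketch from above:

[code]
import numpy as np

def ternary(A, B, C):
    # [A,B,C]_{i,j,l} = sum_k A_{i,j,k} * B_{i,k,l} * C_{k,j,l}
    return np.einsum('ijk,ikl,kjl->ijl', A, B, C)

rng = np.random.default_rng(2)
n = 3
x, y, c, d = (rng.standard_normal((n, n, n)) for _ in range(4))
alpha, beta = 1.5, -0.5

lhs = ternary(alpha * x + beta * y, c, d)
rhs = alpha * ternary(x, c, d) + beta * ternary(y, c, d)
print(np.allclose(lhs, rhs))  # True: the product is linear in each slot
[/code]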
 

FAQ: Understanding Matrices: Questions & Answers

1. What is a matrix?

A matrix is a rectangular array of numbers, symbols, or expressions arranged in rows and columns. It is commonly used in mathematics and other fields such as computer science, physics, and engineering to represent and manipulate data.

2. What are the components of a matrix?

A matrix is made up of rows and columns of elements, also known as entries. The numbers of rows and columns in a matrix are called its dimensions. The element in the i-th row and j-th column is denoted by [tex]a_{ij}[/tex].

3. What are the different types of matrices?

There are several types of matrices, including square matrices, rectangular matrices, zero matrices, identity matrices, diagonal matrices, upper and lower triangular matrices, and symmetric matrices. Each type has its own unique properties and uses.

4. How do you perform operations on matrices?

To add or subtract matrices, they must have the same dimensions; addition and subtraction are done by adding or subtracting corresponding elements. Matrix multiplication is defined when the number of columns of the first matrix equals the number of rows of the second: each entry of the product is the dot product of a row of the first matrix with a column of the second. Other operations, such as transposing, finding the determinant, and finding the inverse, can also be performed on matrices.
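For instance, in Python with NumPy (a minimal sketch):

[code]
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

print(A + B)             # entrywise addition (same dimensions required)
print(A @ B)             # matrix multiplication (rows dotted with columns)
print(A.T)               # transpose
print(np.linalg.det(A))  # determinant: 1*4 - 2*3 = -2
print(np.linalg.inv(A))  # inverse, which exists because det(A) != 0
[/code]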

5. What are some real-world applications of matrices?

Matrices are used in various fields and applications, such as computer graphics, data analysis, population genetics, quantum mechanics, and economic forecasting. They are also used in solving systems of linear equations, which have many practical applications in engineering, physics, and economics.
